@sweepbright/sbimport v1.3.7
sbimport
Installation
This tool is built using Node.js. You can install it globally using npm or yarn as follows:
For npm
npm install -g @sweepbright/sbimport

For yarn

yarn global add @sweepbright/sbimport

Once the installation finishes, you can start using it. Let's try

sbimport --help

Another approach is to use npx (or its yarn analogue) to run the tool without installing it.
For npm
npx @sweepbright/sbimport --help

For yarn berry

yarn dlx @sweepbright/sbimport --help

Usage
Currently, sbimport supports two commands:

- init: to initialise the import folder
- import: to import the datasets in an existing import folder

Let's look in more detail at what these commands do.
init command
The init command expects a folder name as an argument. When executed, it will ask you for the API key and the environment. Provide them and it’s done!
You can now inspect the folder the tool created. If you have tree installed, you should get similar output.
sbimport init ImportCustomerX
tree ImportCustomerX
ImportCustomerX
├── company.config.yaml
├── labels
├── contacts
├── logs
└── properties
└── files
└── README.txt
5 directories, 2 files

The folder is organised by the entity types that can be imported. Currently, we support importing properties (properties subfolder), labels (labels subfolder) and contacts (contacts subfolder).
We also have:
- company.config.yaml: the configuration file that contains the information specified during init (we reserve the possibility of expanding this in the future with more options).
- cache.sqlite: the file the tool uses to keep track of what has been imported and to avoid importing the same data multiple times.
- logs: the folder where all the logs are persisted.
import command
The import command expects a folder name as an argument and supports a few options. For a full overview of the available options, run

sbimport import --help

The import command will attempt to import every record it finds in the import folder provided as an argument. This behaviour can be changed with specific options. Being able to select what to import is useful when you just need to import specific records.
The sbimport tool keeps track of what has been processed, so consecutive executions on the same dataset do not generate unnecessary import operations. It does that by storing a hash for each record and asset in the datasets and comparing the hashes during subsequent executions. When the comparison fails (the hashes differ), the record is considered changed and is imported again.
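The idea behind this change detection can be sketched as follows. This is a minimal illustration, not the tool's actual code (which persists its cache in cache.sqlite rather than an in-memory dict):

```python
import hashlib
import json


def record_hash(record: dict) -> str:
    # Serialise deterministically so the same record always yields the same hash.
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()


def needs_import(record: dict, cache: dict) -> bool:
    # Import only when the stored hash is missing or differs from the current one.
    h = record_hash(record)
    changed = cache.get(record["id"]) != h
    cache[record["id"]] = h
    return changed


cache = {}
rec = {"id": "1", "name": "John Doe"}
print(needs_import(rec, cache))  # first run: record is not in the cache yet
print(needs_import(rec, cache))  # unchanged record: skipped
```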
All options are:
- --entities: Import only the selected entity type. Available options are property, propertyAsset, contact and label. You can specify multiple entities by repeating the option.
- --force: By default, sbimport remembers what was processed to avoid importing the same data twice. Use this option to bypass that cache.
- --batchSize: The number of records to import in a single batch. Default is 100.
- --files: The number of async file operations scheduled together. Only for property assets. Default is 5.
- --silent: Do not print any output to the console. Default is false.
- --maxAttempts: The number of attempts to retry a failed request. Default is 10.
- --retryWaitMs: The time to wait between retries, in milliseconds. Default is 1000. It doubles at each retry.
- --uri: The URI of the cloud storage. Used to import data from cloud storage. Read more in the Cloud import section.
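Taken together, --maxAttempts and --retryWaitMs describe an exponential backoff. A sketch of the resulting wait schedule, assuming the wait doubles after each failed attempt:

```python
def retry_waits(max_attempts: int = 10, retry_wait_ms: int = 1000) -> list:
    # One wait between each pair of attempts, doubling every time
    # (there is no wait before the first attempt).
    return [retry_wait_ms * 2 ** i for i in range(max_attempts - 1)]


print(retry_waits(4, 1000))  # [1000, 2000, 4000]
```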
Environment variables
| CLI option | Environment variable | Defaults |
|---|---|---|
| uri | SBIMPORT_IMPORT_URI | See the Cloud import section |
| force | SBIMPORT_IMPORT_NO_CACHE | false |
| silent | SBIMPORT_IMPORT_SILENT | false |
| maxAttempts | SBIMPORT_IMPORT_MAX_RETRY_ATTEMPTS | 10 |
| retryWaitMs | SBIMPORT_IMPORT_RETRY_WAIT_MS | 1000 |
| batchSize | SBIMPORT_IMPORT_BATCH_SIZE | 100 |
| files | SBIMPORT_IMPORT_ASYNC_FILE_OPERATION_COUNT | 5 |
| tasks | SBIMPORT_IMPORT_ASYNC_TASKS_COUNT | 1 |
| entities | SBIMPORT_IMPORT_ENTITIES | label,contact,property,propertyAsset (all) |
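CLI tools that accept both flags and environment variables typically resolve them with the precedence CLI option, then environment variable, then built-in default. A sketch of that resolution (an illustration of the convention, not sbimport's actual code):

```python
import os


def resolve(cli_value, env_name, default):
    # CLI option wins, then the environment variable, then the default.
    if cli_value is not None:
        return cli_value
    return os.environ.get(env_name, default)


os.environ["SBIMPORT_IMPORT_BATCH_SIZE"] = "50"
print(resolve(None, "SBIMPORT_IMPORT_BATCH_SIZE", 100))  # "50" from the environment
print(resolve(25, "SBIMPORT_IMPORT_BATCH_SIZE", 100))    # 25 from the CLI
```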
Examples
To import everything in the import folder test
sbimport import test

To import just the properties datasets in the import folder test

sbimport import test --entities property

To import the properties datasets and the related images in the import folder test

sbimport import test --entities property --entities propertyAsset

Preparing the datasets
The sbimport tool expects data in JSONL format. In short, a JSONL file contains one valid JSON object per line.
To convert a regular JSON file to the JSONL format the tool expects, you can use jq as follows:

jq -c '.[]' < dataset.json > dataset.jsonl

Once your datasets are ready, move them to the respective folders:
- Properties datasets in the properties folder
- Contacts datasets in the contacts folder
- Labels datasets in the labels folder
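If jq is not available, the same JSON-array-to-JSONL conversion can be sketched in Python. Compact separators reproduce the output of `jq -c '.[]'`:

```python
import json


def json_to_jsonl(records):
    # One compact JSON object per line, matching `jq -c '.[]'`.
    return "\n".join(json.dumps(r, separators=(",", ":")) for r in records)


data = [{"id": "1", "name": "John"}, {"id": "2", "name": "Jane"}]
print(json_to_jsonl(data))
```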
Company config file
The company.config.yaml file is used to store the configuration of the import process. It’s a YAML file that contains the following:
env: production
key: SB************************

The key property stores the API key used to authenticate the requests.
Batch deduplication
The tool automatically deduplicates the datasets for entities that have an updated_at attribute in their schema.
Duplicates are identified by the id attribute, and the updated_at attribute determines which record
is the most recent one to keep. The import stats don't count the duplicates in any metric.
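The deduplication described above can be sketched as: group records by id and keep the one with the latest updated_at. With uniformly formatted ISO 8601 timestamps, plain string comparison gives the right ordering:

```python
def deduplicate(batch):
    # Keep, per id, the record with the most recent updated_at.
    latest = {}
    for record in batch:
        current = latest.get(record["id"])
        if current is None or record["updated_at"] > current["updated_at"]:
            latest[record["id"]] = record
    return list(latest.values())


batch = [
    {"id": "1", "updated_at": "2024-12-01T00:00:00Z", "name": "John Doe"},
    {"id": "1", "updated_at": "2024-12-02T00:00:00Z", "name": "John Doe"},
    {"id": "2", "updated_at": "2024-12-01T00:00:00Z", "name": "Jane Doe"},
]
print(len(deduplicate(batch)))  # 2
```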
Batch example:
{"id": "1", "updated_at": "2024-12-01T00:00:00Z", "name": "John Doe"} // This will be excluded
{"id": "1", "updated_at": "2024-12-02T00:00:00Z", "name": "John Doe"} // This record will be kept
{"id": "2", "updated_at": "2024-12-01T00:00:00Z", "name": "Jane Doe"}Preparing properties files
To support SweepBright's properties file options, we decided to organise the properties files in the import folder using the following hierarchy (it’s also described in the README file inside the files folder).
This directory is used to store the properties files.
The structure is as follows:
{property-reference}/
├── documents/
│ ├── private/
│ └── public/
├── images/
│ ├── private/
│ └── public/
└── plans/
├── private/
└── public/
Replace {property-reference} with your property ID.

For example, assuming you are importing one property with ID a3155152-3cb3-4878-b1e6-39466844328c, and this property has:
- doc1.pdf public and doc2.pdf private
- img1.png public and img2.png private
- no plans
Important note: the order of folders and files matters because the file paths are used to determine the ordinal of the property assets. Both local and cloud imports rely on the natural sorting of the file paths, which is alphanumeric ascending.
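Under that assumption, the ordinal assignment can be sketched as: sort the asset paths alphanumerically ascending and number them in order (an illustration of the rule, not the tool's actual code):

```python
def asset_ordinals(paths):
    # Assets receive ordinals from the alphanumeric ascending sort of their paths.
    return {path: i for i, path in enumerate(sorted(paths))}


paths = [
    "images/public/img1.png",
    "images/private/img2.png",
]
# "private" sorts before "public", so img2.png gets the lower ordinal.
print(asset_ordinals(paths))
```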
The files folder should look like this:
files
└── a3155152-3cb3-4878-b1e6-39466844328c
├── documents
│ ├── private
│ │ └── doc2.pdf
│ └── public
│ └── doc1.pdf
├── images
│ ├── private
│ │ └── img2.png
│ └── public
│ └── img1.png
└── plans
├── private
└── publicCloud import
The sbimport tool can also import data from cloud storage. The tool supports the following cloud storage providers:
- AWS S3
To use the cloud import feature, you need to provide additional environment variables or CLI options.
URI to the cloud storage
- Argument: --uri
- Environment variable: SBIMPORT_URI
| Cloud storage | URI format |
|---|---|
| AWS S3 | https://<BUCKET_NAME>.s3.<REGION>.amazonaws.com/ |
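Based on the format above, the S3 URI can be composed from the bucket name and region. The bucket and region below are placeholder values:

```python
def s3_import_uri(bucket, region):
    # Matches the "AWS S3" URI format from the table above.
    return f"https://{bucket}.s3.{region}.amazonaws.com/"


print(s3_import_uri("my-import-bucket", "eu-west-1"))
# https://my-import-bucket.s3.eu-west-1.amazonaws.com/
```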