0.0.1 • Published 7 years ago
# google-cloud-storage-bulk

Google Cloud Storage bulk upload.
## Install

npm:

```sh
npm install google-cloud-storage-bulk
```

yarn:

```sh
yarn add google-cloud-storage-bulk
```

## Usage
```js
const GCS = require('google-cloud-storage-bulk');

const gcs = new GCS({
  projectId,
  bucketName: 'my-bucket',
  concurrency: 100,
  hashStrategy: 'file',
  retries: 3,
  subdirectory: 'application-name',
  uploadOptions: {}
});

// ... in an async function
await gcs.uploadFiles(someDirectory);
```

## Options
- `projectId` (required) - Google Cloud project ID (see the project reference for more details)
- `bucketName` (required) - Cloud Storage bucket name
- `concurrency` (optional, default: `250`) - upload concurrency limit
- `retries` (optional, default: `3`) - number of upload retry attempts
- `hashStrategy` (required, options: `none`, `file`, `subdirectory`) - described in detail below
- `subdirectory` (optional) - subdirectory within the Cloud Storage bucket to push content into
- `uploadOptions` (optional) - extends `@google-cloud/storage` upload options
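As an example, a configuration that compresses uploads and sets long-lived caching could look like the following. The `gzip` and `metadata.cacheControl` fields are standard `@google-cloud/storage` upload options; that they are passed through unchanged for each file is an assumption, and the project/bucket names are placeholders.

```js
// Hypothetical configuration object for google-cloud-storage-bulk.
// uploadOptions is assumed to be forwarded to @google-cloud/storage's
// upload call for every file.
const options = {
  projectId: 'my-project',   // placeholder project id
  bucketName: 'my-bucket',   // placeholder bucket name
  concurrency: 250,          // the documented default
  retries: 3,
  hashStrategy: 'file',
  subdirectory: 'my-app',
  uploadOptions: {
    gzip: true,              // compress object contents on upload
    metadata: {
      // long-lived caching is safe when filenames carry content hashes
      cacheControl: 'public, max-age=31536000'
    }
  }
};

// then: const gcs = new GCS(options); await gcs.uploadFiles(dir);
console.log(options.uploadOptions.metadata.cacheControl);
```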
### hashStrategy

#### none

No hashing is applied to the file structure; content is pushed as-is. Google Cloud Storage does track object metageneration, which you can take advantage of for versioning.
Before:

```
|_directory_to_upload
  |_a.js
  |_b.js
  |_c.css
```

After:

```
|_subdirectory
  |_a.js
  |_b.js
  |_c.css
```

#### file
Each file is hashed individually, currently using the SHA-1 algorithm, and the hash is embedded in the uploaded filename.
Before:

```
|_directory_to_upload
  |_a.js
  |_b.js
  |_c.css
```

After:

```
|_subdirectory
  |_a.a9993e364706816aba3e25717850c26c9cd0d89d.js
  |_b.924f61661a3472da74307a35f2c8d22e07e84a4d.js
  |_c.bcb8c41b803b91661b5e6ee45362f47df368a731.css
  |_asset-manifest.json <-- maps each original filename to its hashed complement
```

#### subdirectory
The contents of the directory being uploaded are hashed, and that directory hash is used as the subdirectory within the bucket.
Before:

```
|_directory_to_upload
  |_a.js
  |_b.js
  |_c.css
```

After:

```
|_subdirectory
  |_924f61661a3472da74307a35f2c8d22e07e84a4d
    |_a.js
    |_b.js
    |_c.css
```

## Contributors
| Name | Website |
|---|---|
| Shaun Warman | https://shaunwarman.com |
## License