archive-stream-to-s3 v1.1.3

A stream for writing the contents of a tar archive to S3.

Usage

With Writable

const { ArchiveStreamToS3 } = require("archive-stream-to-s3");
const gunzip = require("gunzip-maybe");
const fs = require("fs");

// `s3` is an S3 client instance created elsewhere.
const toS3 = new ArchiveStreamToS3("my-bucket", "some/prefix/to/add", s3, [
  /.*foo.txt$/
]);

toS3.on("finish", () => {
  console.log("upload completed");
});

toS3.on("error", e => {
  console.error(e);
});

const archive = fs.createReadStream("archive.tgz");

// Note: if the archive is compressed, you can decompress it with gunzip-maybe.
archive.pipe(gunzip()).pipe(toS3);
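
Note that plain pipe() calls do not forward errors between streams, so a failure while reading or decompressing the archive won't reach the error handler on toS3. Below is a minimal sketch of the same wiring using Node's stream.pipeline (Node 10+), which reports errors from any stream in the chain through a single callback; as above, the s3 client is assumed to be constructed elsewhere.

const { pipeline } = require("stream");
const fs = require("fs");
const gunzip = require("gunzip-maybe");
const { ArchiveStreamToS3 } = require("archive-stream-to-s3");

// Same constructor arguments as the example above;
// `s3` is assumed to be an S3 client instance created elsewhere.
const toS3 = new ArchiveStreamToS3("my-bucket", "some/prefix/to/add", s3, [
  /.*foo.txt$/
]);

// pipeline forwards errors from any stream in the chain to the callback.
pipeline(fs.createReadStream("archive.tgz"), gunzip(), toS3, err => {
  if (err) {
    console.error("upload failed", err);
  } else {
    console.log("upload completed");
  }
});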

With Promise

You can also use the promise function, which has gunzip built in, so you can pass a compressed archive directly.

const { promise } = require("archive-stream-to-s3");
const fs = require("fs");

const archive = fs.createReadStream("archive.tgz");

promise("my-bucket", "prefix", s3, archive).then(result => {
  console.log(result); //=> { keys: [...]}
});
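
Because promise returns a Promise, it also works with async/await. A minimal sketch, assuming the same s3 client as above and the { keys: [...] } result shape shown in the previous example:

const fs = require("fs");
const { promise } = require("archive-stream-to-s3");

async function upload() {
  // `s3` is assumed to be an S3 client instance created elsewhere.
  const archive = fs.createReadStream("archive.tgz");
  const result = await promise("my-bucket", "prefix", s3, archive);
  console.log(result.keys); // the keys written to the bucket
}

upload().catch(console.error);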

Contributing

Unit tests

npm run test

Integration tests

For the integration tests to run, you'll need to set the following environment variables:

name | purpose
ARCHIVE_STREAM_TO_S3_TEST_BUCKET | the S3 bucket to use (this bucket is removed and recreated)
ARCHIVE_STREAM_TO_S3_TEST_PREFIX | the key prefix to use

You'll also need the AWS CLI (aws) commands on your PATH.

Run

npm run it

Release

npm run release