
elasticsearch-batch-stream


A write stream that creates batches of elasticsearch bulk operations.

Example

The ElasticSearch library has a bulk function for writing many documents in a single request, but because a stream emits a separate write for each document, operations are not grouped together by default.

This package wraps the bulk function in a write stream that buffers the incoming operations and passes them on to the bulk function in batches. For example, with batches of 500 docs each, writing 100,000 documents takes 200 API calls to ElasticSearch instead of 100,000, which improves speed.

  const through2 = require('through2')
  const elasticsearch = require('elasticsearch')
  const bulkWriteStream = require('elasticsearch-batch-stream') // the module's exported factory
  const client = new elasticsearch.Client({ host: 'localhost:9200' })

  const docTransformStream = through2.obj(function (chunk, enc, callback) {
    // convert chunk => doc
    const doc = { index: 'myindex', type: 'mytype', id: '12345', action: 'index', doc: { name: 'test' } }
    callback(null, doc)
  })

  // sourceReadStream() is assumed to return an object-mode readable stream of source data
  sourceReadStream().pipe(docTransformStream).pipe(bulkWriteStream({ client, size: 500 }))
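
Since the returned stream is a regular Node.js writable stream, you can listen for the standard 'finish' event to know when the last batch has been handed to the bulk function. The sketch below assumes the same client, docTransformStream and sourceReadStream() as above; whether bulk failures surface as 'error' events depends on the library, so treat the error handler as a sketch rather than documented behaviour.

  const writeStream = bulkWriteStream({ client, size: 500 })

  writeStream.on('error', (err) => console.error('bulk write failed', err))
  writeStream.on('finish', () => console.log('all batches flushed'))

  sourceReadStream().pipe(docTransformStream).pipe(writeStream)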

Installation

$ npm install elasticsearch-batch-stream

API

bulkWriteStream(options = { client, size })

Creates the write stream to ElasticSearch.

options

The options object argument is required and should at least include the ElasticSearch client object.

client

An instance of the ElasticSearch client, e.g. new elasticsearch.Client()

size

Number of stream operations to group together in the bulk command (default = 100).
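
As a rough illustration of these options, the sketch below constructs a client and a batch stream that flushes every 250 operations, then writes documents to it directly instead of piping. The document shape follows the Example section above; the host, index, type and ids are placeholders, and it is assumed that ending the stream flushes any remaining partial batch.

  const elasticsearch = require('elasticsearch')
  const bulkWriteStream = require('elasticsearch-batch-stream')

  const client = new elasticsearch.Client({ host: 'localhost:9200' })
  const stream = bulkWriteStream({ client, size: 250 }) // group 250 operations per bulk call

  stream.write({ index: 'myindex', type: 'mytype', id: '1', action: 'index', doc: { name: 'one' } })
  stream.write({ index: 'myindex', type: 'mytype', id: '2', action: 'index', doc: { name: 'two' } })
  stream.end() // assumed to flush the remaining partial batch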

Maintainers

Osmond van Hemert (GitHub, Web)

Contributing

If you would like to help out with some code, check the details.

Not a coder, but still want to support? Have a look at the options available to donate.

License

Licensed under MIT.
