pino-kafka

This module provides a "transport" for pino that forwards log messages to Kafka.

You should install pino-kafka globally for ease of use:

$ npm install --production -g pino-kafka
# or with yarn
$ yarn global add pino-kafka

Requirements

This library depends on node-rdkafka. Have a look at node-rdkafka requirements.

Usage

CLI

Given an application foo that logs via pino, and a Kafka broker listening on 10.10.10.5:9200, you would use pino-kafka as:

$ node foo | pino-kafka -b 10.10.10.5:9200

Programmatic Usage

Initialize pino-kafka and pass it to pino.

const pino = require('pino')
const pkafka = require('pino-kafka')

// Create a pino logger whose destination stream forwards messages to Kafka
const logger = pino({}, pkafka({ brokers: "10.10.10.5:9200" }))
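
Once created, the logger is used like any other pino logger; each log line pino writes is forwarded to the broker(s) by the stream. The calls below are only illustrative:

// Standard pino usage; these lines are serialized by pino and forwarded to Kafka
logger.info('hello from pino-kafka')
logger.warn({ latencyMs: 150 }, 'slow response')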

Options

  • --brokers (-b): broker list for the kafka producer. Comma-separated hosts.
  • --defaultTopic (-d): default topic name for kafka.
  • --timeout (-t): timeout for the initial broker connection in milliseconds. Default: 10000.
  • --echo (-e): echo the received messages to stdout. Default: false.
  • --settings: path to a config JSON file. Have a look at the Settings JSON File section for details and examples.
  • --kafka.$config: any kafka configuration can be passed with the kafka. prefix. Please visit the node-rdkafka configuration for available options. Note that only producer and global configuration properties are used. Have a look at the Kafka Settings section for details and examples, and at the combined example below.
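
For example, a combined invocation using several of these switches might look like the following (the topic name, timeout value, and compression codec are illustrative, not defaults):

$ node foo | pino-kafka -b 10.10.10.5:9200 -d blackbox -t 5000 --kafka.compression.codec=gzip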

Settings JSON File

The --settings switch can be used to specify a JSON file that contains a hash of settings for the application. A full settings file looks like this:

{
  "brokers": "10.6.25.11:9092, 10.6.25.12:9092",
  "defaultTopic": "blackbox",
  "kafka": {
    "compression.codec": "none",
    "enable.idempotence": "true",
    "max.in.flight.requests.per.connection": 4,
    "message.send.max.retries": 10000000,
    "acks": "all"
  }
}
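
The programmatic API appears to accept the same shape of options; a minimal sketch, assuming pkafka() takes the same keys as the settings file (this is an assumption, not documented behaviour):

const pino = require('pino')
const pkafka = require('pino-kafka')

// Assumption: pkafka() accepts the same keys as the settings JSON file
const logger = pino({}, pkafka({
  brokers: "10.6.25.11:9092,10.6.25.12:9092",
  defaultTopic: "blackbox",
  kafka: {
    "compression.codec": "none",
    "acks": "all"
  }
}))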

Note that command line switches take precedence over settings in a settings file. For example, given the settings file:

{
  "brokers": "my.broker",
  "defaultTopic": "test"
}

And the command line:

$ yes | pino-kafka -s ./settings.json -b 10.10.10.11:9200

The connection will be made to address 10.10.10.11:9200 with the default topic test.

Kafka Settings

You can pass node-rdkafka producer configuration on the command line by giving the property name a kafka. prefix. For example:

$ yes | pino-kafka --kafka.retries=5 --kafka.retry.backoff.ms=500

In the Settings JSON File you can use the following:

{
  "kafka": {
    "retries": "5",
    "retry.backoff.ms": "500"
  }
}

The following nested form also works:

{
  "kafka": {
    "retries": "5",
    "retry":{
      "backoff": {
        "ms":  "500"
      }
    }
  }
}

Accessing Internal Kafka Producer

You can access the underlying node-rdkafka producer from the pino stream via the _kafka property.

For example:

const pino = require('pino')
const pkafka = require('pino-kafka')

const logger = pino({}, pkafka({ brokers: "10.10.10.5:9200" }))

// Reach the underlying node-rdkafka producer through pino's stream symbol
logger[pino.symbols.streamSym]._kafka.getMetadata({}, (err, data) => {
    //...
})
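
The same handle can be used for other producer-level operations. A minimal sketch, assuming _kafka exposes a standard node-rdkafka producer (flush() is part of the node-rdkafka producer API):

// Assumption: _kafka is a standard node-rdkafka producer
const producer = logger[pino.symbols.streamSym]._kafka

// Flush any buffered messages, e.g. before shutting down
producer.flush(10000, (err) => {
  if (err) console.error('flush failed', err)
})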

Testing

To run the tests, make sure you have installed the dependencies with npm install or yarn and have a running Kafka. If you have docker and docker-compose installed, you can start one with the following:

$ cd pino-kafka
$ docker-compose up -d

Look at the docker-compose file for more details.

Once everything is set up, run the tests with:

$ npm run test
# or with yarn
$ yarn test

NOTE: If you use your own Kafka setup, you may need to adjust the test configuration to match your environment (IP, topic, etc.).

License

MIT