@meeshkanml/http-types-kafka v0.0.2
http-types-kafka
Tools for writing ts-http-types to Kafka in Node.js, powered by kafkajs.
The library is a wrapper around kafkajs, which must be installed as a peer dependency.
Installation
kafkajs must be installed as a peer dependency.
$ yarn add kafkajs http-types-kafka
# or
$ npm i kafkajs http-types-kafka
Quick start
First create the topic you're writing to:
$ kafka-topics.sh --bootstrap-server localhost:9092 --topic express_recordings --create --partitions 3 --replication-factor 1
Note that you may need to change the script name depending on how you installed Kafka.
Create an HttpTypesKafkaProducer and connect to Kafka:
import { CompressionTypes, KafkaConfig, ProducerConfig } from "kafkajs";
import { HttpTypesKafkaProducer } from "http-types-kafka";
// Create a `KafkaConfig` instance (from kafkajs)
const brokers = ["localhost:9092"];
const kafkaConfig: KafkaConfig = {
clientId: "client-id",
brokers,
};
const producerConfig: ProducerConfig = { idempotent: false };
// Specify the topic
const kafkaTopic = "express_recordings";
// Create the producer
const producer = HttpTypesKafkaProducer.create({
compressionType: CompressionTypes.GZIP,
kafkaConfig,
producerConfig,
topic: kafkaTopic,
});
// Connect to Kafka
await producer.connect();
Send a single HttpExchange to Kafka:
const exchange: HttpExchange = ...;
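The `...` above is left for you to fill in. As a rough sketch of what such an exchange might look like (the field names follow the http-types format as we understand it; treat them as assumptions and consult the http-types documentation for the authoritative shape):

```typescript
// Illustrative only: the exact HttpExchange fields are defined by the
// http-types format; the names below are assumptions, not the spec.
const sampleExchange = {
  request: {
    method: "get",
    protocol: "http",
    host: "example.com",
    pathname: "/v1/users",
    query: {},
    headers: { accept: "application/json" },
    body: "",
  },
  response: {
    statusCode: 200,
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ users: [] }),
  },
};
```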
await producer.send(exchange);
Send multiple HttpExchanges:
const exchanges: HttpExchange[] = ...;
await producer.sendMany(exchanges);
Send recordings from a JSON lines file, where every line is a JSON-encoded HttpExchange:
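For example, `recordings.jsonl` might hold one exchange per line, as in the stdlib-only sketch below (the field names are our best guess at the http-types format, and the parsing shown is conceptual, not the library's actual implementation):

```typescript
// A stdlib-only sketch: each line of a .jsonl file is one JSON-encoded
// exchange. Here the JSONL text is held in a string; in practice it would
// come from the file, e.g. fs.readFileSync("recordings.jsonl", "utf8").
const jsonl = [
  '{"request":{"method":"get","protocol":"http","host":"example.com","pathname":"/","query":{},"headers":{},"body":""},"response":{"statusCode":200,"headers":{},"body":"ok"}}',
  '{"request":{"method":"post","protocol":"http","host":"example.com","pathname":"/users","query":{},"headers":{},"body":"{}"},"response":{"statusCode":201,"headers":{},"body":""}}',
].join("\n");

const parsed = jsonl
  .split("\n")
  .filter((line) => line.trim().length > 0) // skip blank lines
  .map((line) => JSON.parse(line)); // one exchange per line
```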
await producer.sendFromFile("recordings.jsonl");
Finally, disconnect:
await producer.disconnect();
Delete the topic if you're done:
$ kafka-topics.sh --bootstrap-server localhost:9092 --topic express_recordings --delete
Command-line interface
See available commands:
$ http-types-kafka
Producer
First create the destination topic in Kafka.
To send recordings from recordings.jsonl to Kafka, run:
$ http-types-kafka producer --file=recordings.jsonl --topic=my_recordings
Development
Install dependencies:
$ yarn
Build a package in lib:
$ yarn compile
Run tests:
$ ./docker-start.sh # Start Kafka and zookeeper
$ yarn test
$ ./docker-stop.sh # Once you're done
Package for npm:
$ npm pack
Publish to npm:
$ yarn publish --access public
Push git tags:
$ TAG=v`cat package.json | grep version | awk 'BEGIN { FS = "\"" } { print $4 }'`
# Tagging done by `yarn publish`
# git tag -a $TAG -m $TAG
$ git push origin $TAG
Working with local Kafka
First start Kafka and ZooKeeper:
# See `docker-compose.yml`
docker-compose up
Create a topic called http_types_kafka_test:
docker exec kafka1 kafka-topics --bootstrap-server kafka1:9092 --topic http_types_kafka_test --create --partitions 3 --replication-factor 1
Check that the topic exists:
docker exec kafka1 kafka-topics --bootstrap-server localhost:9092 --list
Describe the topic:
docker exec kafka1 kafka-topics --bootstrap-server localhost:9092 --describe --topic http_types_kafka_test
Using kafkacat
List topics:
kafkacat -b localhost:9092 -L
Push data to the topic from a file with snappy compression:
tail -f tests/resources/recordings.jsonl | kafkacat -b localhost:9092 -t http_types_kafka_test -z snappy
Consume messages from the topic to the console:
kafkacat -b localhost:9092 -t http_types_kafka_test -C