@hoekma/redpopconsumer v1.0.10

# RedPop Consumer
RedPopConsumer represents a complete, ready-to-go consumer. To use it:

- Edit `src/config.js` to meet your requirements.
- Edit `src/consumer.js` to meet your requirements.
- If you don't have a running Redis 5.0 server, run `docker-compose up -d` to start a Redis server (Docker must be installed and running, of course).
- Run `npm start`.
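The keys referenced later in this README (`config.waitTimeMs`, `config.consumer.batchSize`, `config.consumer.eventMaximumReplays`) suggest a config shape along these lines. This is a sketch, not the actual `src/config.js`; the `server` and `stream` keys are assumptions added for illustration.

```javascript
// Hypothetical sketch of src/config.js. Only waitTimeMs,
// consumer.batchSize, and consumer.eventMaximumReplays are named in
// this README; the server and stream keys are illustrative guesses.
const config = {
  server: { address: 'localhost', port: 6379 }, // Redis 5.0 server (assumption)
  stream: 'protoEventStream',                   // stream to listen on (assumption)
  waitTimeMs: 2000, // how long each poll waits for a batch of events
  consumer: {
    batchSize: 50,         // max events Redis assigns per batch
    eventMaximumReplays: 3 // replays before giving up on an event
  }
};

module.exports = config;
```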
See full documentation at https://www.github.com/hoekma/redpop
The lifecycle of the RedPop Consumer is the following:

## RedPop Consumer Startup

When `npm start` runs, Node runs `src/index.js`, which:

- Instantiates ProtoConsumer, passing `src/config.js` to its constructor. ProtoConsumer is a subclass of RedPop's Consumer class.
- Runs `Consumer::start()` to start the consumer, which is an infinite loop listening for events.
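The startup sequence above can be sketched with stand-in classes. The real `Consumer` comes from the redpop package; these stubs exist only to illustrate the flow, not to reproduce the library's actual code.

```javascript
// Stand-in for RedPop's Consumer class -- illustration only.
class Consumer {
  constructor(config) {
    this.config = config;
  }
  async start() {
    await this.init(); // init() runs exactly once, before listening
    // ...the real start() then enters an infinite listen loop...
  }
  async init() {} // default hook: no-op
}

// Stand-in for the ProtoConsumer subclass in src/consumer.js.
class ProtoConsumer extends Consumer {
  async init() {
    console.log('Starting ProtoConsumer');
  }
}

// What src/index.js effectively does:
const consumer = new ProtoConsumer({ waitTimeMs: 2000 });
consumer.start();
```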
The first event hook available to the developer is `Consumer::init()`. This runs once, before the consumer begins listening for events; it never runs again. The default ProtoConsumer behavior is to output `Starting ProtoConsumer` to the console.

ProtoConsumer then starts listening for events published onto the Redis stream. It waits `config.waitTimeMs` milliseconds for a batch of events from the Redis stream defined in `src/config.js`.

If it does not receive a batch of events, the next event hook available to the developer is `Consumer::onBatchesComplete()`. This runs each time the Consumer polls for events and doesn't receive anything.
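The polling behavior just described can be sketched as one iteration of a conceptual listen loop. This is not RedPop's actual implementation; `fetchBatch` is a stand-in for the Redis stream read.

```javascript
// One iteration of the conceptual listen loop. fetchBatch stands in
// for the Redis stream read; hooks.onBatchesComplete mirrors the
// Consumer::onBatchesComplete() hook described above.
async function pollOnce(config, fetchBatch, hooks) {
  // Wait up to config.waitTimeMs for a batch of events.
  const batch = await fetchBatch(config.waitTimeMs);
  if (!batch || batch.length === 0) {
    await hooks.onBatchesComplete(); // nothing arrived this poll
    return [];
  }
  return batch; // events go on to be processed one at a time
}
```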
## RedPop Consumer Event Processing
Because RedPop is an event-driven system, nothing happens until an external event occurs. This takes place in the form of a Publisher putting events into the stream defined in `src/config.js`.

When ProtoConsumer listens for events and the Redis server has unplayed events, the Redis server assigns a batch of events to the instance of ProtoConsumer. The size of the batch is limited by `config.consumer.batchSize`, which can be tuned for performance in high-volume implementations.

RedPop's Consumer will process each event in the batch one at a time. The next event hook available to the developer is `Consumer::processEvent()`. For each event in the batch, processEvent will receive a JSON object in the following format:
```
{
  id: '1234567890123-0', // event ID
  data: { ...name-value-pairs }
}
```

`processEvent()` is where your business logic acts on the data in the event payload. For instance, you may update a database record, send an SMS message, or update a log file. Return `true` to signal the event was successfully processed or `false` to indicate it needs to be reprocessed. It will be reprocessed up to `config.consumer.eventMaximumReplays` times.

After RedPop's Consumer processes all of the events in the batch, the next event, `BatchComplete`, runs, and an event hook is available to the developer: `onBatchComplete()`.

Then the consumer goes back into listening mode, waiting for another batch. If one is immediately available, it will play the events to `processEvent()`. If not, it runs `onBatchesComplete()` and resumes listening.
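A hypothetical `processEvent()` implementation might look like this. The `{ id, data }` event shape and the true/false return convention come from this README; the standalone class and the `handleData` helper are illustrative inventions (in the real project, ProtoConsumer extends RedPop's Consumer).

```javascript
// Sketch of a processEvent override for src/consumer.js -- shown as a
// standalone class purely for illustration.
class ProtoConsumer {
  async processEvent(event) {
    try {
      // Business logic goes here: update a database record,
      // send an SMS message, append to a log file, etc.
      await this.handleData(event.data);
      return true; // success -- the event will not be replayed
    } catch (err) {
      // Failure -- the event is replayed, up to
      // config.consumer.eventMaximumReplays times.
      return false;
    }
  }

  async handleData(data) {
    // Illustrative stand-in for real business logic.
    console.log('received:', JSON.stringify(data));
  }
}
```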
The full event processing lifecycle is:

`EVENT BATCH RECEIVED => processEvent() => onBatchComplete() => onBatchesComplete()`
## See it in action

This test will require at least two terminal windows. One will contain your consumer, waiting for events. The other will run an npm script that publishes events. You will be able to see the consumer playing the events.
- Open two terminal windows.
- `cd` to the directory with the RedPop consumer.
- In the first terminal window, run `npm i`.
- When that finishes, run `npm start` to start the consumer.
- You should see a message saying `Starting ProtoConsumer`, along with some capital "B"s that will start showing the consumer polling for batches of events.
- In the second terminal window, type `npm run publishTestEvents`.
- Observe the event IDs being published in that window.
- Observe the consumer output showing the event IDs.
I recommend you play with the consumer's `processEvent` method to see how you can act on the event that is passed in. Here are some ideas:
- See if you can make the consumer output the message instead of the event ID.
- Try increasing the number of events published in the `publishTestEvent.js` file to 100,000. Start another consumer in a third terminal window and publish 100K events to see your consumer scale horizontally. Notice the small `b` in front of the event ID. This is an indicator that the consumer has finished processing a batch, which means that the consumers are processing the events as fast as the publisher is publishing them.
- Run your publishers at least 10 times to build up over 1 million events in the stream. Taking too long? Create more publisher windows to simulate horizontal load.
- Depending on your computer (primarily the cores available to scale horizontally), this might take a number of minutes. You can check in a new window by running `npm run getEventCount` to see how many events have been added to the stream.
- After your consumer finishes processing (or you run out of patience and Ctrl-C the publishers), launch 10 or more consumer windows. Here you are simulating your server consumer environment scaling out servers to handle huge workloads.
- In a publisher window, type `npm run replayTestEvents`. This will replay all of the events in the stream -- over a million events (or however many messages you saw when you ran `npm run getEventCount`). You will notice that replaying these events takes a fraction of the time now that they are available for replay, which makes transaction-intensive operations like data science easy to iterate on.
## Conclusion

That's it! At a bare minimum, all you really have to do is add some basic logic to `Consumer::processEvent()`; the rest is up to your imagination.