A modern Apache Kafka client for Node.js. This library is compatible with Kafka 0.10+. Native support for Kafka 0.11 features.

KafkaJS is battle-tested and ready for production.
- Producer
- Consumer groups with pause, resume, and seek
- Transactional support for producers and consumers
- Message headers
- GZIP compression
- Snappy and LZ4 compression through plugins
- Plain, SSL and SASL_SSL implementations (see the connection sketch after this list)
- Support for SCRAM-SHA-256 and SCRAM-SHA-512
- Support for AWS IAM authentication
- Admin client
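
For the SSL and SASL items above, a minimal connection sketch; the broker address and credentials are placeholders, and the mechanism should match what your cluster is configured for:

```javascript
const { Kafka } = require('kafkajs')

// Connect over TLS and authenticate with SASL/SCRAM.
// Broker address, username, and password below are placeholders.
const kafka = new Kafka({
  clientId: 'my-app',
  brokers: ['kafka1:9093'],
  ssl: true,
  sasl: {
    mechanism: 'scram-sha-256', // or 'plain', 'scram-sha-512', 'aws'
    username: 'my-username',
    password: 'my-password'
  }
})
```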
Read something on the website that didn't work with the latest stable version?
Check the pre-release versions - the website is updated on every merge to master.
```sh
npm install kafkajs
# yarn add kafkajs
```
```javascript
const { Kafka } = require('kafkajs')

const kafka = new Kafka({
  clientId: 'my-app',
  brokers: ['kafka1:9092', 'kafka2:9092']
})

const producer = kafka.producer()
const consumer = kafka.consumer({ groupId: 'test-group' })

const run = async () => {
  // Producing
  await producer.connect()
  await producer.send({
    topic: 'test-topic',
    messages: [
      { value: 'Hello KafkaJS user!' },
    ],
  })

  // Consuming
  await consumer.connect()
  await consumer.subscribe({ topic: 'test-topic', fromBeginning: true })

  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log({
        partition,
        offset: message.offset,
        value: message.value.toString(),
      })
    },
  })
}

run().catch(console.error)
```
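
The producer can also attach message headers and use GZIP compression, as listed in the features above. A minimal sketch that builds on the example; the topic name, key, and header values are placeholders:

```javascript
const { CompressionTypes } = require('kafkajs')

// Inside an async function, e.g. the run() above:
// send a GZIP-compressed batch where each message carries a header.
// Topic name, key, and header values are placeholders.
await producer.send({
  topic: 'test-topic',
  compression: CompressionTypes.GZIP,
  messages: [
    {
      key: 'key1',
      value: 'Hello KafkaJS user!',
      headers: { 'correlation-id': 'some-request-id' },
    },
  ],
})
```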
Learn more about using KafkaJS on the official site!
KafkaJS is an open-source project where development takes place in the open on GitHub. Although the project is maintained by a small group of dedicated volunteers, we are grateful to the community for bugfixes, feature development and other contributions.
See Developing KafkaJS for information on how to run and develop KafkaJS.
We welcome contributions to KafkaJS, but we also want to see a thriving third-party ecosystem. If you would like to create an open-source project that builds on top of KafkaJS, please get in touch and we'd be happy to provide feedback and support.
Here are some projects that we would like to build, but haven't yet been able to prioritize:
- Dead Letter Queue - Automatically reprocess messages
- Schema Registry - Seamless integration with the schema registry to encode and decode AVRO. Now available, thanks to @erikengervall!
- Metrics - Integrate with the instrumentation events to expose commonly used metrics (a sketch follows below)
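
As a starting point for the metrics idea above, KafkaJS already emits instrumentation events that a metrics integration could subscribe to. A minimal sketch using the consumer heartbeat event; the `console.log` call stands in for whatever metrics backend you use:

```javascript
// Subscribe to an instrumentation event on the consumer created earlier.
// console.log is a placeholder for reporting to a metrics backend.
const { HEARTBEAT } = consumer.events
const removeListener = consumer.on(HEARTBEAT, event =>
  console.log(`heartbeat at ${event.timestamp}`)
)

// Later, stop listening:
removeListener()
```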
Thanks to Sebastian Norde for the V1 logo ❤️
Thanks to Tracy (Tan Yun) for the V2 logo ❤️
See LICENSE for more details.