Kafka Connect, Kafka, Avro, NiFi and Data Pipeline
With Kafka Connect, Avro, Kafka, and NiFi working together, we can build a near-real-time data pipeline for system integration. Here is how it works.
The data pipeline works as follows:
- Kafka Connect captures data changes in a traditional RDBMS such as MySQL or SQL Server in real time, using a JDBC connector.
- Kafka Connect serializes the captured changes with Avro and publishes them to a Kafka topic.
- NiFi consumes the messages with its built-in ConsumeKafka processor and deserializes them with Avro.
- NiFi transforms, enriches, and persists the data to MongoDB or other big data storage.
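To illustrate the capture-and-publish side of the steps above, here is a sketch of a Confluent JDBC source connector configuration that polls a MySQL table and publishes Avro-encoded records via the Schema Registry. The connector class and property names are from the Confluent JDBC connector; the connection URL, credentials, table name, and Schema Registry address are placeholders for this example:

```json
{
  "name": "mysql-orders-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://localhost:3306/shop",
    "connection.user": "connect",
    "connection.password": "secret",
    "mode": "timestamp+incrementing",
    "timestamp.column.name": "updated_at",
    "incrementing.column.name": "id",
    "table.whitelist": "orders",
    "topic.prefix": "mysql-",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://localhost:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081"
  }
}
```

Such a configuration would typically be submitted to the Kafka Connect REST API (for example, POSTing the JSON to the `/connectors` endpoint of a running Connect worker); NiFi's ConsumeKafka processor can then subscribe to the resulting `mysql-orders` topic.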