Kafka and MQTT are two complementary technologies. Together, they allow us to build IoT end-to-end integration from the edge to the data center, whether on-premises or in the public cloud. I talk about this topic in greater detail at Kafka Summit San Francisco in October 2018: “Processing IoT Data from End to End with MQTT and Apache Kafka”.
As preparation for the talk, I created two demo projects on GitHub. I want to share these here. The full talk (slide deck + video recording + demo) is available on the website of Kafka Summit: Kafka MQTT Integration.
Motivation for Using Apache Kafka and MQTT Together
MQTT is a widely used ISO standard (ISO/IEC 20922) publish/subscribe messaging protocol with many implementations, such as Mosquitto or HiveMQ. For a good overview, see Wikipedia’s comparison of MQTT brokers. MQTT is mainly used in Internet of Things scenarios (like connected cars or smart home devices), but it is also increasingly used on mobile devices due to its support for WebSockets.
However, MQTT is not built for high scalability, long-term storage, or easy integration with legacy systems. Apache Kafka, on the other hand, is a highly scalable distributed streaming platform that ingests, stores, processes, and forwards high volumes of data from thousands of IoT devices.
Therefore, MQTT and Apache Kafka are a perfect combination for end-to-end IoT integration from the edge to the data center (and back, of course, i.e. bi-directional).
Let’s take a look at two different alternatives for building this integration. Both solutions allow highly scalable and mission-critical integration of IoT scenarios. However, they use different concepts and come with different trade-offs.
MQTT Broker, Apache Kafka, and Kafka Connect MQTT Connector
This project focuses on the integration of MQTT sensor data into Kafka via MQTT Broker and Kafka Connect for further processing:
In this approach, you pull the data from the MQTT Broker into the Kafka broker via Kafka Connect. You can leverage all the features of Kafka Connect, such as built-in fault tolerance, load balancing, Converters, Simple Message Transformations (SMTs) for routing, filtering, and similar tasks, and the ability to scale several connectors within one Connect worker instance.
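As a rough illustration of what such a setup looks like, here is a minimal connector configuration that could be posted to the Kafka Connect REST API (`POST /connectors`). The connector class, URIs, and topic names below are placeholder assumptions based on Confluent’s MQTT source connector; check the documentation of the connector you actually use for the exact property names:

```json
{
  "name": "mqtt-source",
  "config": {
    "connector.class": "io.confluent.connect.mqtt.MqttSourceConnector",
    "tasks.max": "1",
    "mqtt.server.uri": "tcp://localhost:1883",
    "mqtt.topics": "temperature",
    "kafka.topic": "mqtt.temperature"
  }
}
```

With a configuration like this, Kafka Connect subscribes to the MQTT topic `temperature` on the broker and writes every message to the Kafka topic `mqtt.temperature`; SMTs can be added to the same config to route or filter messages on the way.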
Here is the GitHub project, including demo script, to try it out by yourself:
MQTT Proxy and Apache Kafka (no MQTT Broker)
As an alternative to using Kafka Connect, you can leverage Confluent MQTT Proxy, which lets you integrate data from IoT devices directly, without the need for an MQTT Broker:
In this approach, you push the data directly to the Kafka broker via Confluent MQTT Proxy. You can scale the MQTT Proxy easily behind a load balancer (similar to Confluent REST Proxy). The huge advantage is that you do not need an additional MQTT Broker in the middle, which reduces effort and cost significantly. It is a better approach for pushing data into Kafka.
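To sketch the idea: the MQTT Proxy is configured with a mapping from MQTT topic patterns to Kafka topics and started as its own process. The property names and values below are placeholder assumptions for a minimal local setup; refer to the Confluent MQTT Proxy documentation for the exact configuration:

```properties
# kafka-mqtt.properties (illustrative example)
listeners=0.0.0.0:1883
bootstrap.servers=PLAINTEXT://localhost:9092
# map Kafka topics to regexes over incoming MQTT topic names
topic.regex.list=temperature:.*temperature, brightness:.*brightness
confluent.topic.replication.factor=1
```

With such a mapping, an IoT device publishes via the standard MQTT protocol to port 1883, and the proxy produces each message into the matching Kafka topic; no MQTT Broker is required in between.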
Here is the GitHub project, including demo script, to try it out by yourself:
Both approaches have their trade-offs. The good thing is that you have the choice. Choose the right tool and architecture for your use case!
Conclusion: MQTT and Apache Kafka Are Complementary and Integrate Well
Both MQTT and Apache Kafka offer great benefits for their own use cases, but neither is an all-rounder for everything. The combination of the two is very powerful and a great solution for building IoT end-to-end scenarios from the edge to the data center and back.
At Kafka Summit, I will share many more details and content, including the following:
- How to integrate MQTT and Apache Kafka end-to-end (not just talking about connectivity, but also about data processing, filtering, routing, etc.)
- When to choose Kafka Connect and MQTT Broker vs. MQTT Proxy without MQTT Broker
- Slides and video recording of talk and live demo
By the way, if you want to see an IoT example with Kafka sink applications, like Elasticsearch/Grafana, please take a look at the project “Kafka and KSQL for Streaming IoT Data from MQTT to the Real Time UI.” This example demonstrates how to realize the integration with Elasticsearch and Grafana via Kafka Connect.
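For a sense of what that sink side could look like, here is a minimal Elasticsearch sink connector configuration, again for the Kafka Connect REST API. Topic names and URLs are placeholder assumptions; consult the Confluent Elasticsearch connector documentation for the exact properties:

```json
{
  "name": "elasticsearch-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "tasks.max": "1",
    "topics": "mqtt.temperature",
    "connection.url": "http://localhost:9200",
    "key.ignore": "true",
    "schema.ignore": "true"
  }
}
```

This continuously indexes the sensor events from the Kafka topic into Elasticsearch, where a Grafana dashboard can visualize them in near real time.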