This demo and its accompanying playbook show how to deploy an Apache Kafka® event streaming application that uses KSQL and Kafka Streams for stream processing. All components in the Confluent Platform have security enabled end-to-end. Run the demo by following the playbook.
The use case is an event streaming application that processes live edits to real Wikipedia pages. The Wikimedia Foundation has IRC channels that publish edits to real wiki pages (e.g. #en.wikipedia, #en.wiktionary) in real time. Using Kafka Connect, the kafka-connect-irc source connector streams raw messages from these IRC channels, and a custom Kafka Connect transform, kafka-connect-transform-wikiedit, parses these messages before they are written to a Kafka cluster. The demo uses KSQL and Kafka Streams for data enrichment. The kafka-connect-elasticsearch sink connector then streams the data out of Kafka, applying another custom Kafka Connect transform called NullFilter. The data is materialized into Elasticsearch for analysis with Kibana. Confluent Control Center provides management and monitoring.
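To illustrate the kind of KSQL enrichment step this pipeline performs, here is a minimal sketch of registering the parsed edits topic as a stream and deriving a filtered stream from it. The topic, stream, and column names below are illustrative assumptions, not the demo's actual schema:

```sql
-- Hypothetical sketch: register the parsed Wikipedia edits topic as a KSQL stream.
-- Topic name, stream names, and columns are assumptions for illustration only.
CREATE STREAM wikipedia (
    createdat BIGINT,
    channel   VARCHAR,
    username  VARCHAR,
    wikipage  VARCHAR,
    diffurl   VARCHAR,
    isbot     BOOLEAN
) WITH (KAFKA_TOPIC = 'wikipedia.parsed', VALUE_FORMAT = 'AVRO');

-- Derive an enriched stream containing only edits made by humans (non-bots).
-- KSQL writes the results continuously to a backing Kafka topic.
CREATE STREAM wikipedianobot AS
    SELECT * FROM wikipedia WHERE isbot <> true;
```

A sink connector such as kafka-connect-elasticsearch can then subscribe to the derived stream's backing topic and materialize it into Elasticsearch for visualization in Kibana.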
You can find the documentation for running this demo and its accompanying playbook at https://docs.confluent.io/current/tutorials/cp-demo/docs/index.html.
For additional examples that showcase streaming applications within an event streaming platform, please refer to the examples GitHub repository.