Small image-processing demo that exercises Kafka (with Avro) integration in a full frontend/backend application.
A Kafka integration in a minimal but "realistic" environment: a client/backend pair handles requests and interacts with Kafka, while additional backends process the Kafka streams. The Kafka data is serialized in Avro format.
- CycleJS for reactive frontend
- Akka Streams Kafka integration
- Avro integration
- Schema evolution example
- Avro serialization/deserialization using auto-generated case classes from Avro schemas
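As a sketch of what schema evolution can look like (the record and field names here are hypothetical, not taken from this repo), an evolved Avro schema stays backward compatible by giving any newly added field a default value, so consumers on the new schema can still read records written with the old one:

```json
{
  "type": "record",
  "name": "ImageRequest",
  "namespace": "demo.avro",
  "doc": "v2: 'filter' was added after v1; its default keeps v1 records readable",
  "fields": [
    { "name": "url",    "type": "string" },
    { "name": "filter", "type": ["null", "string"], "default": null }
  ]
}
```

Tools such as sbt-avrohydrate or avrohugger can then generate Scala case classes from schemas like this one, which is how the auto-generated case classes mentioned above are typically produced.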
All the services are dockerized and orchestrated through docker-compose.
- docker
- docker-compose
- Free ports: the following ports must be free on your system to run the application:
- 3000: Frontend
- 8083: Avro Schema registry
- 8084: Kafka topics / messages
- Build project dockerfiles:
$ docker-compose -f ./docker/docker-compose.yml build
- Run project:
$ docker-compose -f ./docker/docker-compose.yml up
- Point browser to http://localhost:3000
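A rough sketch of the compose topology these commands drive (service and image names here are illustrative, not copied from the repo's actual compose file):

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper
  kafka:
    image: confluentinc/cp-kafka
    depends_on: [zookeeper]
  schema-registry:
    image: confluentinc/cp-schema-registry
    depends_on: [kafka]
    ports: ["8083:8081"]   # exposed on 8083 as listed above
  backend:
    build: ../backend      # Play backend (Kafka producer)
    depends_on: [kafka, schema-registry]
  processor:
    build: ../processor    # stream-processing backend
    depends_on: [kafka, schema-registry]
  frontend:
    build: ../frontend     # CycleJS UI
    ports: ["3000:3000"]
```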
- CycleJS reactive UI that communicates with the backend via WebSockets
- More details
- Scala Play backend that connects to the client via WebSockets and receives processing requests from it.
- It forwards the data to Kafka to be processed by another backend, so it behaves as a Kafka producer
- More details
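A minimal sketch of how such a Play controller might bridge a WebSocket to a Kafka producer using Alpakka (Akka Streams Kafka). The topic name, bootstrap address, and String serialization are assumptions for illustration; the real service presumably uses the Avro serializers:

```scala
import javax.inject._
import akka.actor.ActorSystem
import akka.stream.Materializer
import akka.stream.scaladsl.{Flow, Sink, Source}
import akka.kafka.ProducerSettings
import akka.kafka.scaladsl.Producer
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer
import play.api.mvc._

@Singleton
class ImageController @Inject()(cc: ControllerComponents)
                               (implicit system: ActorSystem, mat: Materializer)
  extends AbstractController(cc) {

  // Hypothetical broker address; in compose setups this is usually
  // the Kafka service name rather than localhost.
  private val producerSettings =
    ProducerSettings(system, new StringSerializer, new StringSerializer)
      .withBootstrapServers("kafka:9092")

  // Each incoming WebSocket message (an image URL) becomes a Kafka record.
  def ws: WebSocket = WebSocket.accept[String, String] { _ =>
    val toKafka: Sink[String, _] =
      Flow[String]
        .map(url => new ProducerRecord[String, String]("image-requests", url))
        .to(Producer.plainSink(producerSettings))
    // Send client messages to Kafka; keep the socket open without
    // pushing anything back by pairing the sink with a pending source.
    Flow.fromSinkAndSource(toKafka, Source.maybe[String])
  }
}
```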
- A Kafka stream-processing backend that connects to Kafka and transforms the streams (it processes images fetched from image URLs)
- More details
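The processing backend's consume-transform-produce loop could be sketched with Alpakka (Akka Streams Kafka) roughly as follows. Topic names, the group id, and the placeholder `processImage` step are assumptions for illustration, and String (de)serializers stand in for the Avro ones:

```scala
import akka.actor.ActorSystem
import akka.kafka.scaladsl.{Consumer, Producer}
import akka.kafka.{ConsumerSettings, ProducerSettings, Subscriptions}
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

object ImageProcessor extends App {
  implicit val system: ActorSystem = ActorSystem("image-processor")

  val consumerSettings =
    ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
      .withBootstrapServers("kafka:9092")
      .withGroupId("image-processor")
      .withProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")

  val producerSettings =
    ProducerSettings(system, new StringSerializer, new StringSerializer)
      .withBootstrapServers("kafka:9092")

  // Read URLs from one topic, run the (placeholder) image processing,
  // and publish the results to another topic.
  Consumer
    .plainSource(consumerSettings, Subscriptions.topics("image-requests"))
    .map(record => processImage(record.value))
    .map(result => new ProducerRecord[String, String]("image-results", result))
    .runWith(Producer.plainSink(producerSettings))

  // Placeholder for the real image transformation.
  def processImage(url: String): String = s"processed:$url"
}
```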
A Kafka/ZooKeeper pair that provides the event system for the application