A Kafka Connect Sink plugin that ingests events from Kafka into Neo4j via templated Cypher statements. (docs, article)
- Source: a Change-Data-Capture (CDC) implementation that sends change data to Kafka topics (docs)
- Sink: a Neo4j extension that ingests data from Kafka topics into Neo4j via templated Cypher statements (docs)
- Neo4j Streams Procedures (Read & Write): procedures to write to and read from topics interactively/programmatically (docs)
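As a sketch of how the procedures are used (based on the docs; the topic name `my-topic` and the payload fields are illustrative assumptions, and signatures may differ between versions):

```cypher
// Publish a payload (a map, node, or scalar value) to a Kafka topic
CALL streams.publish('my-topic', {name: 'Alice', age: 42})

// Consume events from a topic interactively, waiting up to 5 seconds
CALL streams.consume('my-topic', {timeout: 5000})
YIELD event
RETURN event.data
```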
Read more at http://r.neo4j.com/kafka
Here are articles introducing the Neo4j extension and the Kafka Connect Neo4j Connector, as well as practical applications of the extension for Building Data Pipelines with Kafka, Spark, Neo4j & Zeppelin (part 2) and for exchanging results of Neo4j Graph Algorithms within a Neo4j Cluster.
Please raise issues on GitHub. We also love contributions, so don’t be shy about sending a Pull Request.
We would also love you to fill out our survey to learn more about your Kafka + Neo4j use-cases and deployments.
You can run/test the extension locally with Docker, or install it manually into your existing Neo4j server.
- Download the jar file from the latest release
- Copy neo4j-streams-<VERSION>.jar into $NEO4J_HOME/plugins
- Update $NEO4J_HOME/conf/neo4j.conf with the necessary configuration
- Restart Neo4j
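As a sketch of the configuration step, a sink topic can be bound to a templated Cypher statement in neo4j.conf; the topic name `my-topic` and the event fields used here are illustrative assumptions (see the docs for the full list of settings):

```properties
# Enable the sink and bind topic "my-topic" to a Cypher template;
# "event" exposes the fields of each incoming Kafka message
streams.sink.enabled=true
streams.sink.topic.cypher.my-topic=MERGE (p:Person {id: event.id}) SET p.name = event.name

# Kafka connection (adjust to your broker address)
kafka.bootstrap.servers=localhost:9092
```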
To build it yourself, run mvn clean install. You’ll find the build artifact in <project_dir>/target/neo4j-streams-<VERSION>.jar
Testing the Kafka Connect Neo4j Connector locally with Docker.
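A minimal sketch of such a local setup, assuming the Confluent Platform images shown below (service names, tags, and credentials are illustrative; adjust them to your environment):

```yaml
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  broker:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  neo4j:
    image: neo4j:latest
    ports:
      - "7474:7474"
      - "7687:7687"
    environment:
      NEO4J_AUTH: neo4j/password
```

A complete Kafka Connect setup would additionally run a Kafka Connect worker with the connector jar on its plugin path.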