This repository contains examples of use cases built with the Decodable streaming platform, as well as demos for related open-source projects such as Apache Flink, Debezium, and Postgres.
Examples are presented "as-is" and are maintained on a best-effort basis. PRs for updating existing (or adding new) examples are welcome!
For help with any of the examples, or using Decodable in general, please join our Slack group.
Decodable radically simplifies real-time data processing, making it easier to access fresh, high-quality data. Reduce infrastructure overhead, connect data sources, transform data, and deliver it reliably to any destination.
Learn more here, and sign up for a free trial today!
Example | Description |
---|---|
Change Streams | Using change streams to build materialized views in Postgres |
XML Processing | Parse XML and transform to JSON |
Masking | Ways to mask data |
Array Aggregation (1) | Demonstrating how to aggregate the elements of the many side of a 1:n join into an array, with data from Postgres |
Array Aggregation (2) | Using the array_agg() UDF for denormalizing data in a pipeline from MySQL to OpenSearch |
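As a rough illustration of the pattern behind the two array-aggregation examples above (the table and column names here are hypothetical, and the exact syntax depends on your Flink version and the `array_agg()` UDF in use):

```sql
-- Hypothetical schema: orders (1) -> order_lines (n).
-- Collapse each order's line items into a single array column,
-- e.g. to write a denormalized document to OpenSearch.
SELECT
  o.order_id,
  o.customer_name,
  ARRAY_AGG(ROW(l.product_id, l.quantity)) AS lines
FROM orders o
JOIN order_lines l ON l.order_id = o.order_id
GROUP BY o.order_id, o.customer_name;
```

The linked examples show the full setup, including how updates on the "many" side propagate into the aggregated array.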
Example | Description |
---|---|
Opinionated Data Pipelines | Building data pipelines with schema-on-write streams |
Postman | Building data pipelines with Postman |
Decodable provides a managed PyFlink service. Learn more here.
Example | Description |
---|---|
PyFlink | Running a basic PyFlink job on Kubernetes |
PyFlink on Decodable | Running a PyFlink job as a Custom Pipeline on Decodable |
PyFlink and MongoDB Vector Search | End-to-end example for PyFlink Vector Ingestion on Decodable with MongoDB Vector Search |
Example | Description |
---|---|
Apache Druid | Sending COVID-19 data to Decodable using its REST API, cleaning it with SQL, and then sending it to Apache Druid |
Apache Kafka / Flink / Iceberg | Integrating Apache Kafka with Apache Iceberg through Apache Flink. As presented at Kafka Summit London 2024 |
Apache Kafka / Flink / Iceberg (with Decodable) | Streaming from Apache Kafka to Apache Iceberg with Decodable |
Apache Kafka Upsert connector | Explaining the difference between the Flink Kafka and Kafka Upsert connectors |
Apache Kafka mTLS | Installing Apache Kafka on EC2 and configuring it with mTLS |
Apache Kafka with ngrok | Using Docker Compose for running Apache Kafka locally, accessible from the internet using ngrok |
Apache Kafka | Installing Apache Kafka on EC2 and writing to S3 with Decodable |
Apache Pinot | Transforming osquery logs to Apache Pinot and Superset |
AsyncAPI | Publishing Data Products with AsyncAPI |
Confluent | Clickstream from Confluent Cloud joined with CDC user data from Postgres |
Delta Lake / Flink | Writing to Delta Lake with Apache Flink |
GitHub Webhooks | Processing GitHub Webhook events using the Decodable REST source connector |
OSQuery Routing | Routing osquery logs with SQL |
Redpanda | Reading and writing data to Redpanda from Flink |
S3 Events in a Lambda Function | Configuring an S3 bucket with a Lambda notification to send data to Kinesis to be processed in Decodable |
Tinybird | Writing data to Tinybird from Decodable |
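To complement the Kafka Upsert connector example listed above, a minimal sketch of the distinction (topic names, schemas, and option values are illustrative): the plain `kafka` connector produces an append-only stream, while `upsert-kafka` interprets records as upserts or deletes keyed by the declared primary key.

```sql
-- Append-only: every Kafka record becomes an insert row.
CREATE TABLE clicks_append (
  user_id STRING,
  url     STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'clicks',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json'
);

-- Upsert: a record with an existing key updates that row;
-- a record with a null value deletes it.
CREATE TABLE clicks_latest (
  user_id STRING,
  url     STRING,
  PRIMARY KEY (user_id) NOT ENFORCED
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'clicks-latest',
  'properties.bootstrap.servers' = 'localhost:9092',
  'key.format' = 'json',
  'value.format' = 'json'
);
```

The linked example walks through the changelog semantics in detail.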
Example | Description |
---|---|
MSSQL CDC | Enabling MSSQL in Docker with CDC, reading from it with Debezium, writing change events into AWS Kinesis |
Oracle CDC | Configuring Oracle AWS RDS with LogMiner, reading from it with Debezium, writing change events into AWS Kinesis |
DynamoDB CDC | Configuring DynamoDB to send change data to Kinesis, and reading the changes into Decodable for transformation or replication |
Logical Decoding Message Examples | How to retrieve logical decoding messages from the Postgres WAL |
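Related to the logical decoding example above: Postgres can write arbitrary messages into the WAL with the built-in `pg_logical_emit_message()` function, which downstream consumers such as Debezium can then pick up (the prefix and JSON payload below are illustrative).

```sql
-- Emit a transactional logical decoding message into the WAL.
-- Arguments: transactional (boolean), prefix (text), content (text).
SELECT pg_logical_emit_message(
  true,
  'app-audit',
  '{"event": "order_created", "id": 42}'
);
```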
Example | Description |
---|---|
Flink SQL Troubleshooting | A set of Docker Compose environments for demonstrating various Flink SQL troubleshooting scenarios (see related blog) |
Example | Description |
---|---|
Decodable CI/CD | An example of using Decodable with GitHub Actions |
Decodable CLI Docker image | An example Dockerfile for running the Decodable CLI under Docker |
This code base is available under the Apache License, Version 2.0.