This example shows how to create a Confluent Cloud Kafka producer/consumer in Go and deploy it to Kubernetes.
- Linux (Ubuntu) or macOS
- gvm with Go 1.23

  ```shell
  gvm install go1.23 --prefer-binary --with-build-tools --with-protobuf
  gvm use go1.23 --default
  ```
- Cross compilation on Ubuntu with CGO (optional)

  ```shell
  # Install LLVM Compiler Infrastructure, release 17
  ./scripts/install-clang-17-ubuntu.sh
  # Install libraries for cross compilation (Windows, etc.)
  ./scripts/install-cross-libs-ubuntu.sh
  # Install osxcross for macOS cross compilation
  ./scripts/install-osxcross-ubuntu.sh
  ```
- Build ARM images on x86 hosts (optional)

  ```shell
  sudo apt-get update && sudo apt-get install -y --no-install-recommends qemu-user-static binfmt-support
  update-binfmts --enable qemu-arm
  update-binfmts --display qemu-arm
  docker buildx build --platform linux/arm64 --file Dockerfile.consumer -t kafka-confluent-go-consumer:latest .
  ```
- Confluent CLI

  ```shell
  curl -sL --http1.1 https://cnfl.io/cli | sh -s -- latest
  confluent update
  # log in to a Confluent Cloud organization
  confluent login --save
  ```
- docker (optional)
- GoReleaser (optional)
- kubectl (optional)
The following steps are required.
- Create an environment and cluster: https://confluent.cloud/home
- Export the Confluent environment ID as CONFLUENT_ENV

  ```shell
  xdg-open https://confluent.cloud/environments
  export CONFLUENT_ENV=
  ```
- Export the Confluent cluster ID as CONFLUENT_CLUSTER

  ```shell
  xdg-open https://confluent.cloud/environments/$CONFLUENT_ENV/clusters
  export CONFLUENT_CLUSTER=
  ```
- Select the environment

  ```shell
  confluent environment use $CONFLUENT_ENV
  ```
- Select the cluster

  ```shell
  confluent kafka cluster use $CONFLUENT_CLUSTER
  confluent login --save
  ```
- Create a new API key and secret pair

  ```shell
  confluent api-key create --resource $CONFLUENT_CLUSTER
  ```
- Export the previously created key and secret

  ```shell
  export CONFLUENT_API_KEY=
  export CONFLUENT_API_SECRET=
  ```
- Use the API key and secret in the CLI

  ```shell
  confluent api-key use $CONFLUENT_API_KEY --resource $CONFLUENT_CLUSTER
  ```
- Export the Confluent Kafka cluster bootstrap server (Cluster settings -> Endpoints -> Bootstrap server), then create kafka.properties from the template and add it to the git repository

  ```shell
  xdg-open https://confluent.cloud/environments/$CONFLUENT_ENV/clusters/$CONFLUENT_CLUSTER/settings/kafka
  export CONFLUENT_BOOTSTRAP_SERVER=
  sed -e "s%BTSTRP%$CONFLUENT_BOOTSTRAP_SERVER%g" ./tmpl/kafka.properties.tmpl > ./kafka.properties
  ```
- Create the .env file

  ```shell
  sed -e "s%BTSTRP%$CONFLUENT_BOOTSTRAP_SERVER%g" -e "s%APIKEY%$CONFLUENT_API_KEY%g" -e "s%APISECRET%$CONFLUENT_API_SECRET%g" ./tmpl/.env.tmpl > ./.env
  ```
- Create a Confluent Kafka topic and verify it

  ```shell
  confluent kafka topic create test-topic
  confluent kafka topic list
  confluent kafka topic produce test-topic
  confluent kafka topic consume -b test-topic
  ```
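The kafka.properties file generated earlier is a plain Java-style key=value file. As a rough sketch of how the Go apps could load it with the standard library alone (this helper is illustrative; the repo's actual loader may differ):

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// parseProperties reads a key=value properties payload into a map,
// skipping blank lines and # comments. Illustrative helper only.
func parseProperties(s string) map[string]string {
	props := make(map[string]string)
	sc := bufio.NewScanner(strings.NewReader(s))
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		if line == "" || strings.HasPrefix(line, "#") {
			continue // skip blanks and comments
		}
		if k, v, ok := strings.Cut(line, "="); ok {
			props[strings.TrimSpace(k)] = strings.TrimSpace(v)
		}
	}
	return props
}

func main() {
	// placeholder content standing in for the generated kafka.properties
	sample := "# Confluent Cloud connection\nbootstrap.servers=<CONFLUENT_BOOTSTRAP_SERVER>\nsecurity.protocol=SASL_SSL\n"
	p := parseProperties(sample)
	fmt.Println(p["security.protocol"])
}
```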
To deploy on Kubernetes, create a ConfigMap and a Secret:

```shell
# create a ConfigMap from the Kafka properties file
kubectl create configmap kafka-config --from-file kafka.properties -o yaml --dry-run=client > ./k8s/cm.yaml
# store $CONFLUENT_API_KEY and $CONFLUENT_API_SECRET as a k8s Secret
sed -e "s%USR%`echo -n $CONFLUENT_API_KEY | base64 -w0`%g" -e "s%PWD%`echo -n $CONFLUENT_API_SECRET | base64 -w0`%g" ./tmpl/sc.yaml.tmpl > ./k8s/sc.yaml
```
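The `base64 -w0` calls exist because values under a Kubernetes Secret's `data` field must be base64-encoded. The same encoding in Go (an illustrative helper, not part of the repo):

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// encodeSecret mirrors `echo -n "$VALUE" | base64 -w0`:
// standard base64, no trailing newline, no line wrapping.
func encodeSecret(v string) string {
	return base64.StdEncoding.EncodeToString([]byte(v))
}

func main() {
	// "my-api-key" is a placeholder, not a real credential
	fmt.Println(encodeSecret("my-api-key"))
}
```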
and then run:

```shell
make k8s-deploy
make consumer-image-run
make runp
```
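`make runp` starts the producer. As a minimal sketch of what producing to Confluent Cloud from Go can look like, assuming the confluent-kafka-go v2 client (the angle-bracket placeholders stand for the values from the .env / kafka.properties files above; the repo's actual code may differ):

```go
package main

import (
	"fmt"

	"github.com/confluentinc/confluent-kafka-go/v2/kafka"
)

func main() {
	// Placeholder credentials; in the real app these come from
	// kafka.properties / .env created in the steps above.
	p, err := kafka.NewProducer(&kafka.ConfigMap{
		"bootstrap.servers": "<CONFLUENT_BOOTSTRAP_SERVER>",
		"security.protocol": "SASL_SSL",
		"sasl.mechanisms":   "PLAIN",
		"sasl.username":     "<CONFLUENT_API_KEY>",
		"sasl.password":     "<CONFLUENT_API_SECRET>",
	})
	if err != nil {
		panic(err)
	}
	defer p.Close()

	topic := "test-topic"
	// Produce is asynchronous; delivery reports arrive on p.Events().
	if err := p.Produce(&kafka.Message{
		TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
		Value:          []byte("hello from Go"),
	}, nil); err != nil {
		panic(err)
	}
	// Wait up to 15s for outstanding deliveries before exiting.
	if n := p.Flush(15 * 1000); n > 0 {
		fmt.Printf("%d message(s) not delivered\n", n)
	}
}
```

The consumer side mirrors this with `kafka.NewConsumer`, `SubscribeTopics`, and a `ReadMessage` loop. Running this sketch requires a reachable Confluent Cloud cluster and real credentials.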
Make commands:

- help - List available tasks
- clean - Cleanup
- build - Build
- test - Run tests
- update - Update dependency packages to latest versions
- get - Download and install dependency packages
- release - Create and push a new tag
- version - Print the current version (tag)
- consumer-image-build - Build the consumer Docker image
- consumer-image-run - Run the consumer Docker image
- consumer-image-stop - Stop the consumer Docker container
- runp - Run the producer
- runc - Run the consumer
- k8s-deploy - Deploy to Kubernetes
- k8s-undeploy - Undeploy from Kubernetes