resolveFewTODOCommentsInKafka (#42715)
* resolve TODO comments

---------

Co-authored-by: annie-mac <xinlian@microsoft.com>
xinlian12 and annie-mac authored Nov 1, 2024
1 parent 78b753b commit ac46a73
Showing 20 changed files with 666 additions and 84 deletions.
@@ -9,7 +9,7 @@ This guide walks through setting up Confluent Cloud using Docker containers.
- Java 11+ ([download](https://www.oracle.com/java/technologies/javase-jdk11-downloads.html))
- Maven ([download](https://maven.apache.org/download.cgi))
- Docker ([download](https://www.docker.com/products/docker-desktop))
- CosmosDB [Setting up an Azure Cosmos DB Instance]<!--(CosmosDB_Setup.md)-->
- CosmosDB [Setting up an Azure Cosmos DB Instance](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/CosmosDB_Setup.md)

## Setup

50 changes: 27 additions & 23 deletions sdk/cosmos/azure-cosmos-kafka-connect/dev/README_Sink.md
@@ -16,9 +16,9 @@ The connector polls data from Kafka to write to container(s) in the database bas

### Prerequisites

- It is recommended to start with the Confluent Platform (recommended to use this [setup]<!--(Confluent_Platform_Setup.md))--> as this gives you a complete environment to work with.
- It is recommended to start with the Confluent Platform (recommended to use this [setup](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/Confluent_Platform_Setup.md)) as this gives you a complete environment to work with.
- If you do not wish to use Confluent Platform, then you need to install and configure Zookeeper, Apache Kafka, and Kafka Connect yourself. You will also need to install and configure the Cosmos DB connectors manually.
- Cosmos DB Instance ([setup guide]<!--(CosmosDB_Setup.md))-->
- Cosmos DB Instance ([setup guide](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/CosmosDB_Setup.md))
- Bash shell
- Will not work in Cloud Shell or WSL1
- Java 11+ ([download](https://www.oracle.com/java/technologies/javase-jdk11-downloads.html))
@@ -33,6 +33,10 @@ Otherwise, you can download the JAR file from the latest [Release](https://mvnre
You can also package a new JAR file from the source code.

```bash
# clone the azure-cosmos repo if you haven't done so already
git clone https://github.com/Azure/azure-sdk-for-java.git
cd sdk/cosmos

mvn -e -DskipTests -Dgpg.skip -Dmaven.javadoc.skip=true -Dcodesnippet.skip=true -Dspotbugs.skip=true -Dcheckstyle.skip=true -Drevapi.skip=true -pl ,azure-cosmos,azure-cosmos-tests -am clean install
mvn -e -DskipTests -Dgpg.skip -Dmaven.javadoc.skip=true -Dcodesnippet.skip=true -Dspotbugs.skip=true -Dcheckstyle.skip=true -Drevapi.skip=true -pl ,azure-cosmos-kafka-connect clean install

@@ -87,28 +91,28 @@ Create the Cosmos DB Sink Connector in Kafka Connect

The following JSON body defines the config for the Cosmos DB Sink Connector.

> Note: You will need to fill out the values for `azure.cosmos.account.endpoint` and `azure.cosmos.account.key`, which you should have saved from the [Cosmos DB setup guide]<!--(CosmosDB_Setup.md)-->.
> Note: You will need to fill out the values for `azure.cosmos.account.endpoint` and `azure.cosmos.account.key`, which you should have saved from the [Cosmos DB setup guide](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/CosmosDB_Setup.md).
Refer to the [sink properties](#sink-configuration-properties) section for more information on each of these configuration properties.

```json

{
"name": "cosmos-kafka-connectv2",
"config": {
"connector.class": "com.azure.cosmos.kafka.connect.CosmosSinkConnector",
"tasks.max": "5",
"topics": "GreenTaxiRecords",
"value.converter": "org.apache.kafka.connect.json.JsonConverter",
"value.converter.schemas.enable": "false",
"key.converter": "org.apache.kafka.connect.json.JsonConverter",
"key.converter.schemas.enable": "false",
"azure.cosmos.account.endpoint":"",
"azure.cosmos.account.key":"",
"azure.cosmos.applicationName": "",
"azure.cosmos.sink.database.name":"SampleDatabase",
"azure.cosmos.sink.containers.topicMap":"SampleTestContainer#SampleTestContainerSink"
}
"name": "cosmosdb-sink-connector-v2",
"config": {
"connector.class": "com.azure.cosmos.kafka.connect.CosmosSinkConnector",
"tasks.max": "5",
"topics": "{topic}",
"value.converter": "org.apache.kafka.connect.json.JsonConverter",
"value.converter.schemas.enable": "false",
"key.converter": "org.apache.kafka.connect.json.JsonConverter",
"key.converter.schemas.enable": "false",
"azure.cosmos.account.endpoint":"{endpoint}",
"azure.cosmos.account.key":"{masterKey}",
"azure.cosmos.applicationName": "{applicationName}",
"azure.cosmos.sink.database.name":"{databaseName}",
"azure.cosmos.sink.containers.topicMap":"{topic}#{container}"
}
}

```
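The registration flow can be sketched end-to-end in a shell snippet. This is a minimal sketch, not the authoritative procedure: the trimmed config body, the `/tmp` file path, and the `localhost:8083` Connect endpoint are all assumptions for illustration, and the placeholder account properties are omitted.

```shell
# Hypothetical sketch: write a trimmed-down sink config to a file, validate the
# JSON locally, then (commented out) register it with a Kafka Connect worker
# that is assumed to be listening on localhost:8083.
cat > /tmp/cosmosdb-sink.json <<'EOF'
{
  "name": "cosmosdb-sink-connector-v2",
  "config": {
    "connector.class": "com.azure.cosmos.kafka.connect.CosmosSinkConnector",
    "tasks.max": "5",
    "topics": "SampleTopic",
    "azure.cosmos.sink.database.name": "SampleDatabase",
    "azure.cosmos.sink.containers.topicMap": "SampleTopic#SampleContainer"
  }
}
EOF

# Catch JSON syntax errors before submitting the config to the REST API.
python3 -m json.tool /tmp/cosmosdb-sink.json > /dev/null && echo "config OK"

# curl -H "Content-Type: application/json" -X POST \
#   -d @/tmp/cosmosdb-sink.json http://localhost:8083/connectors
```

Validating the file first avoids a round trip to the worker for a malformed config; the commented `curl` shows where the real submission would happen.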
@@ -157,14 +161,14 @@ Alternatively, use the Connect REST API.
curl -X DELETE http://localhost:8083/connectors/cosmosdb-sink-connector
```

To delete the created Azure Cosmos DB service and its resource group using Azure CLI, refer to these [steps]<!--(CosmosDB_Setup.md#cleanup)-->.
To delete the created Azure Cosmos DB service and its resource group using Azure CLI, refer to these [steps](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/CosmosDB_Setup.md#cleanup).

## Sink configuration properties

The following settings are used to configure the Cosmos DB Kafka Sink Connector. These configuration values determine which Kafka topics data is consumed, which Cosmos DB containers data is written into and formats to serialize the data. For an example configuration file with the default values, refer to [this config]<!--(../src/docker/resources/sink.example.json)-->.
The following settings are used to configure the Cosmos DB Kafka Sink Connector. These configuration values determine which Kafka topics data is consumed from, which Cosmos DB containers data is written into, and the formats used to serialize the data. For an example configuration file with the default values, refer to [this config](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/src/docker/resources/sink.example.json).

- [Generic Configs For Sink And Source]<!--(/doc/configuration-reference.md#generic-configurations)-->
- [Configs only for Sink]<!--(/doc/configuration-reference.md#sink-connector-configurations)-->
- [Generic Configs For Sink And Source](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/doc/configuration-reference.md#generic-configurations)
- [Configs only for Sink](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/doc/configuration-reference.md#sink-connector-configurations)

Data will always be written to the Cosmos DB as JSON without any schema.

@@ -245,7 +249,7 @@ Refer to the [InsertUUID repository](https://github.com/confluentinc/kafka-conne

Using both the `InsertField` and `Cast` SMTs, you can specify the TTL on each item created in Cosmos DB.

> Note: You will need to enable TTL on the Cosmos DB container to enable TTL at an item level. Refer to the [Cosmos DB setup guide]<!--(CosmosDB_Setup.md)--> or the [Cosmos DB docs](https://learn.microsoft.com/azure/cosmos-db/nosql/how-to-time-to-live?tabs=dotnet-sdk-v3#enable-time-to-live-on-a-container-using-azure-portal) for more information on setting the TTL.
> Note: You will need to enable TTL on the Cosmos DB container to enable TTL at an item level. Refer to the [Cosmos DB setup guide](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/CosmosDB_Setup.md) or the [Cosmos DB docs](https://learn.microsoft.com/azure/cosmos-db/nosql/how-to-time-to-live?tabs=dotnet-sdk-v3#enable-time-to-live-on-a-container-using-azure-portal) for more information on setting the TTL.
Inside your Sink connector config, add the following properties to set the TTL (in seconds). In the following example, the TTL is set to 100 seconds.
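Such a configuration might look like the fragment below. This is a sketch built from the standard Kafka Connect `InsertField$Value` and `Cast$Value` transforms; the transform aliases (`insertTTL`, `castTTLInt`) and the `ttl` field name are illustrative assumptions, not values mandated by the connector.

```json
{
  "transforms": "insertTTL,castTTLInt",
  "transforms.insertTTL.type": "org.apache.kafka.connect.transforms.InsertField$Value",
  "transforms.insertTTL.static.field": "ttl",
  "transforms.insertTTL.static.value": "100",
  "transforms.castTTLInt.type": "org.apache.kafka.connect.transforms.Cast$Value",
  "transforms.castTTLInt.spec": "ttl:int32"
}
```

The `Cast` step matters because `InsertField` writes the static value as a string, while a TTL field needs to reach Cosmos DB as an integer.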

54 changes: 29 additions & 25 deletions sdk/cosmos/azure-cosmos-kafka-connect/dev/README_Source.md
@@ -14,7 +14,7 @@ The Azure Cosmos DB Source connector provides the capability to read data from t

- It is recommended to start with the Confluent Platform (recommended to use this setup) as this gives you a complete environment to work with.
- If you do not wish to use Confluent Platform, then you need to install and configure Zookeeper, Apache Kafka, and Kafka Connect yourself. You will also need to install and configure the Cosmos DB connectors manually.
- Cosmos DB Instance ([setup guide]<!--(CosmosDB_Setup.md-->)
- Cosmos DB Instance ([setup guide](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/CosmosDB_Setup.md))
- Bash shell
- Will not work in Cloud Shell or WSL1
- Java 11+ ([download](https://www.oracle.com/java/technologies/javase-jdk11-downloads.html))
@@ -29,6 +29,10 @@ Otherwise, you can use the JAR file from the latest [Release](https://mvnrepository.com/
You can also package a new JAR file from the source code.

```bash
# clone the azure-cosmos repo if you haven't done so already
git clone https://github.com/Azure/azure-sdk-for-java.git
cd sdk/cosmos

mvn -e -DskipTests -Dgpg.skip -Dmaven.javadoc.skip=true -Dcodesnippet.skip=true -Dspotbugs.skip=true -Dcheckstyle.skip=true -Drevapi.skip=true -pl ,azure-cosmos,azure-cosmos-tests -am clean install
mvn -e -DskipTests -Dgpg.skip -Dmaven.javadoc.skip=true -Dcodesnippet.skip=true -Dspotbugs.skip=true -Dcheckstyle.skip=true -Drevapi.skip=true -pl ,azure-cosmos-kafka-connect clean install

@@ -49,32 +53,32 @@ To create a topic inside Control Center, see [here](https://docs.confluent.io/pl

The following JSON body defines the config for the Cosmos DB Source Connector.

>Note: You will need to replace placeholder values for below properties which you should have saved from the [Cosmos DB setup guide]<!--(CosmosDB_Setup.md)-->.
>Note: You will need to replace the placeholder values for the properties below, which you should have saved from the [Cosmos DB setup guide](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/CosmosDB_Setup.md).
>`azure.cosmos.account.endpoint`
>`azure.cosmos.account.key`
Refer to the [source properties](#source-configuration-properties) section for more information on each of these configuration properties.

```json
{
"name": "cosmosdb-source-connectorv2",
"config": {
"connector.class": "com.azure.cosmos.kafka.connect.CosmosSourceConnector",
"tasks.max": "5",
"value.converter": "org.apache.kafka.connect.json.JsonConverter",
"value.converter.schemas.enable": "false",
"key.converter": "org.apache.kafka.connect.json.JsonConverter",
"key.converter.schemas.enable": "false",
"azure.cosmos.account.endpoint":"{endpoint}",
"azure.cosmos.account.key":"{masterKey}",
"azure.cosmos.application.name": "{applicationName}",
"azure.cosmos.source.database.name":"{database}",
"azure.cosmos.source.containers.includedList":"{container}",
"azure.cosmos.source.changeFeed.maxItemCountHint":"500",
"azure.cosmos.source.containers.topicMap":"{topic}#{container}",
"azure.cosmos.source.metadata.storage.type":"Cosmos",
"azure.cosmos.source.metadata.storage.name":"{metadataContainerName}"
}
"name": "cosmosdb-source-connector-v2",
"config": {
"connector.class": "com.azure.cosmos.kafka.connect.CosmosSourceConnector",
"tasks.max": "5",
"value.converter": "org.apache.kafka.connect.json.JsonConverter",
"value.converter.schemas.enable": "false",
"key.converter": "org.apache.kafka.connect.json.JsonConverter",
"key.converter.schemas.enable": "false",
"azure.cosmos.account.endpoint":"{endpoint}",
"azure.cosmos.account.key":"{masterKey}",
"azure.cosmos.application.name": "{applicationName}",
"azure.cosmos.source.database.name":"{database}",
"azure.cosmos.source.containers.includedList":"{container}",
"azure.cosmos.source.changeFeed.maxItemCountHint":"500",
"azure.cosmos.source.containers.topicMap":"{topic}#{container}",
"azure.cosmos.source.metadata.storage.type":"Cosmos",
"azure.cosmos.source.metadata.storage.name":"{metadataContainerName}"
}
}

```
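Once registered, a connector's health can be checked through the Connect REST API's `/status` endpoint. The snippet below is a sketch: the sample payload is illustrative (a real response comes from the worker), and the commented `curl` assumes a worker on `localhost:8083`.

```shell
# A worker assumed to run at localhost:8083 would be queried like this:
#   curl -s http://localhost:8083/connectors/cosmosdb-source-connector-v2/status
# A typical status response resembles the sample below; python3 pulls out the
# connector state from the JSON payload.
STATUS='{"name":"cosmosdb-source-connector-v2","connector":{"state":"RUNNING","worker_id":"127.0.0.1:8083"},"tasks":[{"id":0,"state":"RUNNING"}]}'
echo "$STATUS" | python3 -c 'import json,sys; print(json.load(sys.stdin)["connector"]["state"])'
# prints RUNNING
```

Checking `connector.state` (and each entry under `tasks`) is a quick way to confirm the connector started before looking for documents in the target topic.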
@@ -104,7 +108,7 @@ curl -H "Content-Type: application/json" -X POST -d @<path-to-JSON-config-file>

### Insert a document into Cosmos DB

Use [Cosmos DB setup guide]<!--(CosmosDB_Setup.md)--> to create and set up Cosmos DB Instance.
Use the [Cosmos DB setup guide](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/CosmosDB_Setup.md) to create and set up a Cosmos DB instance.

- Sign into the [Azure portal](https://portal.azure.com/learn.docs.microsoft.com) using the account you activated.
- On the Azure portal menu (left hand side blue lines at the top), select All services.
@@ -157,11 +161,11 @@ curl -X DELETE http://localhost:8083/connectors/cosmosdb-source-connector

```

To delete the created Azure Cosmos DB service and its resource group using Azure CLI, refer to these [steps]<!--(CosmosDB_Setup.md#cleanup)-->.
To delete the created Azure Cosmos DB service and its resource group using Azure CLI, refer to these [steps](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/CosmosDB_Setup.md#cleanup).

## Source configuration properties

The following settings are used to configure the Cosmos DB Kafka Source Connector. These configuration values determine which Cosmos DB container is consumed, which Kafka topics data is written into and formats to serialize the data. For an example configuration file with the default values, refer to [this config]<!--(../src/docker/resources/source.example.json)-->.
The following settings are used to configure the Cosmos DB Kafka Source Connector. These configuration values determine which Cosmos DB container is consumed, which Kafka topics data is written into, and the formats used to serialize the data. For an example configuration file with the default values, refer to [this config](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/src/docker/resources/source.example.json).

- [Generic Configs For Sink And Source]<!--(/doc/configuration-reference.md#generic-configurations)-->
- [Configs only for Source]<!--(/doc/configuration-reference.md#source-connector-configurations)-->
- [Generic Configs For Sink And Source](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/doc/configuration-reference.md#generic-configurations)
- [Configs only for Source](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/doc/configuration-reference.md#source-connector-configurations)
12 changes: 6 additions & 6 deletions sdk/cosmos/azure-cosmos-kafka-connect/dev/setup.md
@@ -50,16 +50,16 @@ CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR=3
CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR=3
CONNECT_STATUS_STORAGE_REPLICATION_FACTOR=3
```
Please follow [Confluent_Cloud_Setup]<!--(Confluent_Cloud_Setup.md)--> to setup a confluent cloud cluster.
Please follow [Confluent_Cloud_Setup](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/Confluent_Cloud_Setup.md) to set up a Confluent Cloud cluster.
Integration tests have the ITest suffix. Use the following command to run them:
```bash
mvn -e -Dgpg.skip -Dmaven.javadoc.skip=true -Dcodesnippet.skip=true -Dspotbugs.skip=true -Dcheckstyle.skip=true -Drevapi.skip=true -pl ,azure-cosmos-kafka-connect test package -Pkafka-integration
```

### Configure Confluent Platform, Cosmos DB and validate Kafka Connectors

- [Confluent Platform Setup]<!--(Confluent_Platform_Setup.md)-->
- [Confluent Cloud Setup]<!--(Confluent_Cloud_Setup.md)-->
- [Setting up an Azure Cosmos DB Instance]<!--(CosmosDB_Setup.md)-->
- [Kafka Connect Cosmos DB Sink Connector]<!--(README_Sink.md)-->
- [Kafka Connect Cosmos DB Source Connector]<!--(README_Source.md)-->
- [Confluent Platform Setup](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/Confluent_Platform_Setup.md)
- [Confluent Cloud Setup](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/Confluent_Cloud_Setup.md)
- [Setting up an Azure Cosmos DB Instance](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/CosmosDB_Setup.md)
- [Kafka Connect Cosmos DB Sink Connector](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/README_Sink.md)
- [Kafka Connect Cosmos DB Source Connector](https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-cosmos-kafka-connect/dev/README_Source.md)