- The following executables must be copied, generated, or linked into these locations. `sync_diff_inspector` can be downloaded from tidb-community-toolkit, and tidb-server related binaries can be downloaded from tidb-community-server:
  - `bin/tidb-server` # version >= 4.0.0-rc.1
  - `bin/tikv-server` # version >= 4.0.0-rc.1
  - `bin/pd-server` # version >= 4.0.0-rc.1
  - `bin/pd-ctl` # version >= 4.0.0-rc.1
  - `bin/tiflash` # needs tiflash binary and some necessary so files
  - `bin/sync_diff_inspector`
  - `bin/go-ycsb`
  - `bin/etcdctl`
  - `bin/jq`
  - `bin/minio`

  If you are running tests on macOS, tidb related binaries can be downloaded from tiup mirrors, such as https://tiup-mirrors.pingcap.com/tidb-v4.0.2-darwin-amd64.tar.gz, and `sync_diff_inspector` can be compiled by yourself from the tidb-tools source.

  All TiFlash required files can be found in tidb-community-server packages. You should put `flash_cluster_manager`, `libtiflash_proxy.so` and `tiflash` into the `bin` directory in the TiCDC code base.
- The following programs must be installed:
  - mysql (the MySQL CLI client; mysql client 8.0 is currently not supported)
- The user used to execute the tests must have permission to create the folder `/tmp/tidb_cdc_test`. All test artifacts will be written into this folder.

To run the integration tests in docker, docker and docker-compose must be installed. We recommend that you provide docker with at least 6 cores and 8 GB of memory. Of course, the more resources, the better.
- Unit tests do not need any dependencies. Just run `make unit_test` in the root directory of the source code, or cd into the directory of a test case and run a single case via `GO111MODULE=on go test -check.f TestXXX`.
- Run `make integration_test_build` to generate the TiCDC related binaries for the integration tests.
- Run `make integration_test` to execute the integration tests. This command will:
  - Check that all required executables exist.
  - Execute `tests/integration_tests/run.sh`.

  If you want to run only one integration test case, pass the `CASE` parameter, such as `make integration_test CASE=simple`. If you want to run the integration test cases starting from a specified one, pass the `START_AT` parameter, such as `make integration_test START_AT=simple`. There are some environment variables that you can set by yourself; variable details can be found in test_prepare. The MySQL sink will be used by default; if you want to test the Kafka sink, run with `make integration_test_kafka CASE=simple`.
- After executing the tests, run `make coverage` to get a coverage report at `/tmp/tidb_cdc_test/all_cov.html`.
Warning: These scripts and files may not work under the ARM architecture, and we have not tested against it. Also, we currently use the PingCAP intranet address in our download scripts, so if you do not have access to the PingCAP intranet you will not be able to use these scripts. We will try to resolve these issues as soon as possible.
- If you want to run the kafka tests, run `START_AT="clustered_index" docker-compose -f ./deployments/ticdc/docker-compose/docker-compose-kafka-integration.yml up --build`
- If you want to run the MySQL tests, run `CASE="clustered_index" docker-compose -f ./deployments/ticdc/docker-compose/docker-compose-mysql-integration.yml up --build`
- Use the command `docker-compose -f ./deployments/ticdc/docker-compose/docker-compose-kafka-integration.yml down -v` to clean up the corresponding environment.
Some useful tips:

- The log files for the tests are mounted in the `./deployments/ticdc/docker-compose/logs` directory.
- You can specify multiple tests to run in `CASE`, for example: `CASE="clustered_index kafka_messages"`. You can even use `CASE="*"` to run all tests.
- You can specify in integration-test.Dockerfile the versions of the other dependencies that you want to download, such as tidb, tikv, pd, etc. For example, you can change `RUN ./download-integration-test-binaries.sh master` to `RUN ./download-integration-test-binaries.sh release-5.2` to use the release-5.2 dependencies. Then rebuild the image with the `--no-cache` flag.
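The no-cache rebuild could look like the following (a sketch; the compose file path is the one used by the commands above, and the image can be rebuilt the same way with the MySQL compose file):

```shell
# Rebuild the integration-test image from scratch after editing
# integration-test.Dockerfile, ignoring all cached layers:
docker-compose -f ./deployments/ticdc/docker-compose/docker-compose-kafka-integration.yml build --no-cache
```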
New integration tests can be written as shell scripts in `tests/integration_tests/TEST_NAME/run.sh`. The script should exit with a nonzero error code on failure.