Console Output

Skipping 2,372 KB..
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
table test.finish not exists for 197-th check, retry later
table test.finish_mark not exists for 1-th check, retry later
+ set +x
+ run_sql 'USE TEST;Create table t1(a int primary key, b int);insert into t1 values(1,2);insert into t1 values(2,3);'
+ check_table_exists test.t1 127.0.0.1 3306
table test.t1 not exists for 1-th check, retry later
11:33AM INF > Run case=sql/debezium/timestamp_column_test.sql
table test.finish_mark not exists for 2-th check, retry later
table test.finish not exists for 198-th check, retry later
table test.t1 exists
+ sleep 5
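The repeated "table ... not exists for N-th check, retry later" lines come from a table-existence poll against the downstream TiDB. A minimal sketch of such a check, assuming a mysql client on the host/port shown in the trace; the retry limit, interval and credentials below are placeholders, not the test harness's actual values:

# Hedged sketch of a check_table_exists-style poll (placeholder limits/credentials).
check_table_exists() {
  local table=$1 host=$2 port=$3
  for ((i = 1; i <= 60; i++)); do
    if mysql -h"$host" -P"$port" -uroot -e "DESC $table" >/dev/null 2>&1; then
      echo "table $table exists"
      return 0
    fi
    echo "table $table not exists for $i-th check, retry later"
    sleep 2
  done
  return 1
}

check_table_exists test.t1 127.0.0.1 3306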
table test.finish_mark not exists for 3-th check, retry later
table test.finish not exists for 199-th check, retry later
table test.finish_mark not exists for 4-th check, retry later
table test.finish not exists for 200-th check, retry later
table test.finish_mark exists
check diff successfully
wait process cdc.test exit for 1-th time...
+ kill_tikv
table test.finish not exists for 201-th check, retry later
++ ps aux
++ grep tikv-server
++ grep /tmp/tidb_cdc_test/synced_status
+ info='jenkins    19815 30.7  0.5 4728872 2274184 ?     Sl   11:33   0:07 tikv-server --pd 127.0.0.1:2379 -A 127.0.0.1:20160 --status-addr 127.0.0.1:20181 --log-file /tmp/tidb_cdc_test/synced_status/tikv1.log --log-level debug -C /tmp/tidb_cdc_test/synced_status/tikv-config.toml -s /tmp/tidb_cdc_test/synced_status/tikv1
jenkins    19816 23.7  0.5 4690984 2202908 ?     Sl   11:33   0:05 tikv-server --pd 127.0.0.1:2379 -A 127.0.0.1:20161 --status-addr 127.0.0.1:20182 --log-file /tmp/tidb_cdc_test/synced_status/tikv2.log --log-level debug -C /tmp/tidb_cdc_test/synced_status/tikv-config.toml -s /tmp/tidb_cdc_test/synced_status/tikv2
jenkins    19817 24.0  0.5 4691496 2223720 ?     Sl   11:33   0:05 tikv-server --pd 127.0.0.1:2379 -A 127.0.0.1:20162 --status-addr 127.0.0.1:20183 --log-file /tmp/tidb_cdc_test/synced_status/tikv3.log --log-level debug -C /tmp/tidb_cdc_test/synced_status/tikv-config.toml -s /tmp/tidb_cdc_test/synced_status/tikv3
jenkins    19819 29.8  0.5 4723236 2263512 ?     Sl   11:33   0:07 tikv-server --pd 127.0.0.1:2479 -A 127.0.0.1:21160 --status-addr 127.0.0.1:21180 --log-file /tmp/tidb_cdc_test/synced_status/tikv_down.log --log-level debug -C /tmp/tidb_cdc_test/synced_status/tikv-config.toml -s /tmp/tidb_cdc_test/synced_status/tikv_down'
++ ps aux
++ grep tikv-server
++ grep /tmp/tidb_cdc_test/synced_status
++ awk '{print $2}'
++ xargs kill -9
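The ++ trace above is the kill_tikv step: it lists the tikv-server processes started under this test's workdir and kills them by pid. A minimal sketch reconstructed from the traced pipeline; the real helper in the test suite may differ:

# Hedged reconstruction of the kill_tikv step traced above.
kill_tikv() {
  # keep the matching process list for the log, then kill by pid
  info=$(ps aux | grep tikv-server | grep /tmp/tidb_cdc_test/synced_status)
  echo "$info"
  ps aux | grep tikv-server | grep /tmp/tidb_cdc_test/synced_status \
    | awk '{print $2}' | xargs kill -9
}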
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   243  100   243    0     0   2542      0 --:--:-- --:--:-- --:--:--  2557
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 11:34:06.131","puller_resolved_ts":"2024-05-05 11:34:00.081","last_synced_ts":"2024-05-05 11:34:00.131","now_ts":"2024-05-05 11:34:07.000","info":"The data syncing is not finished, please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '11:34:06.131","puller_resolved_ts":"2024-05-05' '11:34:00.081","last_synced_ts":"2024-05-05' '11:34:00.131","now_ts":"2024-05-05' '11:34:07.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '11:34:06.131","puller_resolved_ts":"2024-05-05' '11:34:00.081","last_synced_ts":"2024-05-05' '11:34:00.131","now_ts":"2024-05-05' '11:34:07.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq -r .info
+ info='The data syncing is not finished, please wait'
+ target_message='The data syncing is not finished, please wait'
+ '[' 'The data syncing is not finished, please wait' '!=' 'The data syncing is not finished, please wait' ']'
+ sleep 130
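The block above queries the changefeed's synced-status API and asserts on the synced flag and the info message before sleeping. A minimal sketch of that assertion, using the endpoint, jq filters and expected message taken from this trace:

# Hedged reconstruction of the synced-status assertion traced above.
synced_status=$(curl -s -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced)
status=$(echo "$synced_status" | jq .synced)
info=$(echo "$synced_status" | jq -r .info)
target_message='The data syncing is not finished, please wait'
if [ "$status" != "false" ] || [ "$info" != "$target_message" ]; then
  echo "synced-status check failed: $synced_status"
  exit 1
fi
sleep 130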
11:34AM INF > Run case=sql/debezium/tinyint_test.sql
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 11:34:08 CST 2024] <<<<<< run test case canal_json_basic success! >>>>>>
table test.finish not exists for 202-th check, retry later
table test.finish not exists for 203-th check, retry later
11:34AM INF > Run case=sql/debezium/topic_name_sanitization_test.sql
+ '[' kafka == mysql ']'
+ stop_tidb_cluster
table test.finish not exists for 204-th check, retry later
table test.finish not exists for 205-th check, retry later
table test.finish not exists for 206-th check, retry later
table test.finish not exists for 207-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/canal_json_content_compatible/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
+ stop_tidb_cluster
table test.finish not exists for 208-th check, retry later
start tidb cluster in /tmp/tidb_cdc_test/canal_json_content_compatible
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1855/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
table test.finish not exists for 209-th check, retry later
[Pipeline] // stage
[Pipeline] }
11:34AM INF > Run case=sql/debezium/unsigned_integer_test.sql
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
table test.finish not exists for 210-th check, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish not exists for 211-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish not exists for 212-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish not exists for 213-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1b4538cc000d	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1855-wqs8r-vmgg0, pid:28931, start at 2024-05-05 11:34:33.278717031 +0800 CST m=+5.115274847	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-11:36:33.285 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-11:34:33.267 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-11:24:33.267 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1b4538cc000d	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1855-wqs8r-vmgg0, pid:28931, start at 2024-05-05 11:34:33.278717031 +0800 CST m=+5.115274847	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-11:36:33.285 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-11:34:33.267 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-11:24:33.267 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1b4538b00014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1855-wqs8r-vmgg0, pid:29025, start at 2024-05-05 11:34:33.286767258 +0800 CST m=+5.068348527	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-11:36:33.293 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-11:34:33.260 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-11:24:33.260 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/canal_json_content_compatible/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/canal_json_content_compatible/tiflash/log/error.log
arg matches is ArgMatches { args: {"advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/canal_json_content_compatible/tiflash/db/proxy"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/canal_json_content_compatible/tiflash/log/proxy.log"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/canal_json_content_compatible/tiflash-proxy.toml"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table test.finish not exists for 214-th check, retry later
[Sun May  5 11:34:36 CST 2024] <<<<<< START cdc server in canal_json_content_compatible case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_content_compatible.3048630488.out server --log-file /tmp/tidb_cdc_test/canal_json_content_compatible/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/canal_json_content_compatible/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table test.finish not exists for 215-th check, retry later
table test.finish not exists for 216-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 03:34:39 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/aeedd5c4-e8bc-47a6-ac73-b89edfd7fdda
	{"id":"aeedd5c4-e8bc-47a6-ac73-b89edfd7fdda","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714880076}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f46d1284bc3
	aeedd5c4-e8bc-47a6-ac73-b89edfd7fdda

/tidb/cdc/default/default/upstream/7365353798631202403
	{"id":7365353798631202403,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/aeedd5c4-e8bc-47a6-ac73-b89edfd7fdda
	{"id":"aeedd5c4-e8bc-47a6-ac73-b89edfd7fdda","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714880076}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f46d1284bc3
	aeedd5c4-e8bc-47a6-ac73-b89edfd7fdda

/tidb/cdc/default/default/upstream/7365353798631202403
	{"id":7365353798631202403,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/aeedd5c4-e8bc-47a6-ac73-b89edfd7fdda
	{"id":"aeedd5c4-e8bc-47a6-ac73-b89edfd7fdda","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714880076}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f46d1284bc3
	aeedd5c4-e8bc-47a6-ac73-b89edfd7fdda

/tidb/cdc/default/default/upstream/7365353798631202403
	{"id":7365353798631202403,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
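The curl loop traced above is the CDC server readiness probe: it polls /debug/info up to 50 times, treating 'failed to get info:' in the response as failure and 'etcd info' as the ready marker. A minimal sketch of that loop, with the endpoint, markers and retry budget taken from this trace:

# Hedged sketch of the readiness loop traced above.
for ((i = 0; i <= 50; i++)); do
  res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
  if ! echo "$res" | grep -q 'failed to get info:' && echo "$res" | grep -q 'etcd info'; then
    break
  fi
  if [ "$i" -eq 50 ]; then
    echo "cdc server failed to start in time" && exit 1
  fi
  sleep 3
done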
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_content_compatible.cli.30539.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-canal-json-content-compatible?protocol=canal-json&enable-tidb-extension=true&content-compatible=true'
Create changefeed successfully!
ID: ca0f9f56-ac29-4254-b3b8-d35f095a60d8
Info: {"upstream_id":7365353798631202403,"namespace":"default","id":"ca0f9f56-ac29-4254-b3b8-d35f095a60d8","sink_uri":"kafka://127.0.0.1:9092/ticdc-canal-json-content-compatible?protocol=canal-json\u0026enable-tidb-extension=true\u0026content-compatible=true","create_time":"2024-05-05T11:34:40.00209909+08:00","start_ts":449545523656654853,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":true,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449545523656654853,"checkpoint_ts":449545523656654853,"checkpoint_time":"2024-05-05 11:34:39.867"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
table test.finish not exists for 217-th check, retry later
+ set +x
table test.finish not exists for 218-th check, retry later
table test.finish not exists for 219-th check, retry later
[Sun May  5 11:34:46 CST 2024] <<<<<< START kafka consumer in canal_json_content_compatible case >>>>>>
table test.finish not exists for 220-th check, retry later
table test.finish not exists for 221-th check, retry later
table test.finish not exists for 222-th check, retry later
table test.finish not exists for 223-th check, retry later
table test.finish not exists for 224-th check, retry later
table test.finish not exists for 225-th check, retry later
table test.finish not exists for 226-th check, retry later
table test.finish not exists for 227-th check, retry later
table test.finish not exists for 228-th check, retry later
table test.finish not exists for 229-th check, retry later
table test.finish_mark not exists for 1-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
table test.finish not exists for 230-th check, retry later
table test.finish_mark not exists for 3-th check, retry later
table test.finish not exists for 231-th check, retry later
table test.finish_mark not exists for 4-th check, retry later
table test.finish not exists for 232-th check, retry later
table test.finish_mark not exists for 5-th check, retry later
table test.finish not exists for 233-th check, retry later
table test.finish not exists for 234-th check, retry later
table test.finish_mark not exists for 6-th check, retry later
table test.finish_mark not exists for 7-th check, retry later
table test.finish not exists for 235-th check, retry later
table test.finish_mark not exists for 8-th check, retry later
table test.finish not exists for 236-th check, retry later
table test.finish_mark not exists for 9-th check, retry later
table test.finish not exists for 237-th check, retry later
table test.finish not exists for 238-th check, retry later
table test.finish_mark not exists for 10-th check, retry later
table test.finish not exists for 239-th check, retry later
table test.finish_mark exists
check diff successfully
table test.finish not exists for 240-th check, retry later
table test.finish not exists for 241-th check, retry later
table test.finish not exists for 242-th check, retry later
table test.finish not exists for 243-th check, retry later
11:35AM INF > Run case=sql/dml.sql
table test.finish not exists for 244-th check, retry later
table test.finish_mark not exists for 1-th check, retry later
table test.finish not exists for 245-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
table test.finish not exists for 246-th check, retry later
table test.finish_mark not exists for 3-th check, retry later
table test.finish not exists for 247-th check, retry later
table test.finish_mark not exists for 4-th check, retry later
table test.finish not exists for 248-th check, retry later
table test.finish_mark exists
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 11:35:45 CST 2024] <<<<<< run test case canal_json_content_compatible success! >>>>>>
table test.finish not exists for 249-th check, retry later
table test.finish not exists for 250-th check, retry later
table test.finish not exists for 251-th check, retry later
table test.finish not exists for 252-th check, retry later
table test.finish not exists for 253-th check, retry later
table test.finish not exists for 254-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/multi_topics/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
table test.finish not exists for 255-th check, retry later
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   221  100   221    0     0   2545      0 --:--:-- --:--:-- --:--:--  2540
100   221  100   221    0     0   2543      0 --:--:-- --:--:-- --:--:--  2540
+ synced_status='{"synced":true,"sink_checkpoint_ts":"2024-05-05 11:35:44.909","puller_resolved_ts":"2024-05-05 11:35:38.909","last_synced_ts":"2024-05-05 11:33:29.258","now_ts":"2024-05-05 11:35:46.000","info":"Data syncing is finished"}'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '11:35:44.909","puller_resolved_ts":"2024-05-05' '11:35:38.909","last_synced_ts":"2024-05-05' '11:33:29.258","now_ts":"2024-05-05' '11:35:46.000","info":"Data' syncing is 'finished"}'
++ jq .synced
+ status=true
+ '[' true '!=' true ']'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '11:35:44.909","puller_resolved_ts":"2024-05-05' '11:35:38.909","last_synced_ts":"2024-05-05' '11:33:29.258","now_ts":"2024-05-05' '11:35:46.000","info":"Data' syncing is 'finished"}'
++ jq -r .info
+ info='Data syncing is finished'
+ target_message='Data syncing is finished'
+ '[' 'Data syncing is finished' '!=' 'Data syncing is finished' ']'
+ cleanup_process cdc.test
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
+ stop_tidb_cluster
+ run_case_with_failpoint conf/changefeed-redo.toml
+ rm -rf /tmp/tidb_cdc_test/synced_status_with_redo
+ mkdir -p /tmp/tidb_cdc_test/synced_status_with_redo
+ start_tidb_cluster --workdir /tmp/tidb_cdc_test/synced_status_with_redo
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
The 1 times to try to start tidb cluster...
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
table test.finish not exists for 256-th check, retry later
table test.finish not exists for 257-th check, retry later
start tidb cluster in /tmp/tidb_cdc_test/multi_topics
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
start tidb cluster in /tmp/tidb_cdc_test/synced_status_with_redo
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table test.finish not exists for 258-th check, retry later
table test.finish not exists for 259-th check, retry later
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
table test.finish not exists for 260-th check, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish not exists for 261-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish not exists for 262-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish not exists for 263-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1b4b61100010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1855-wqs8r-vmgg0, pid:31897, start at 2024-05-05 11:36:14.165922713 +0800 CST m=+5.074044255	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-11:38:14.172 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-11:36:14.148 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-11:26:14.148 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1b4b61100010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1855-wqs8r-vmgg0, pid:31897, start at 2024-05-05 11:36:14.165922713 +0800 CST m=+5.074044255	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-11:38:14.172 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-11:36:14.148 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-11:26:14.148 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1b4b63480006	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1855-wqs8r-vmgg0, pid:31984, start at 2024-05-05 11:36:14.295723038 +0800 CST m=+5.155141057	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-11:38:14.302 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-11:36:14.290 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-11:26:14.290 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/multi_topics/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/multi_topics/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/multi_topics/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/multi_topics/tiflash/db/proxy"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/multi_topics/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1b4b6acc0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1855-lf7fd-n125f, pid:19172, start at 2024-05-05 11:36:14.813014816 +0800 CST m=+5.051059073	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-11:38:14.820 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-11:36:14.820 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-11:26:14.820 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1b4b6acc0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1855-lf7fd-n125f, pid:19172, start at 2024-05-05 11:36:14.813014816 +0800 CST m=+5.051059073	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-11:38:14.820 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-11:36:14.820 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-11:26:14.820 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1b4b6cfc000d	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1855-lf7fd-n125f, pid:19258, start at 2024-05-05 11:36:14.922989616 +0800 CST m=+5.104443558	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-11:38:14.932 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-11:36:14.911 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-11:26:14.911 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/error.log
arg matches is ArgMatches { args: {"log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/proxy.log"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table test.finish not exists for 264-th check, retry later
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_topics.cli.33367.out cli tso query --pd=http://127.0.0.1:2379
+ cd /tmp/tidb_cdc_test/synced_status_with_redo
+ export 'GO_FAILPOINTS=github.com/pingcap/tiflow/cdc/owner/ChangefeedOwnerNotUpdateCheckpoint=return(true)'
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/ChangefeedOwnerNotUpdateCheckpoint=return(true)'
++ run_cdc_cli_tso_query 127.0.0.1 2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.cli.20711.out cli tso query --pd=http://127.0.0.1:2379
table test.finish not exists for 265-th check, retry later
+ set +x
+ tso='449545549302464513
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449545549302464513 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
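The set +x blocks above and below show how a start-ts is obtained: the cli tso query output also carries PASS/coverage lines, so the scripts echo it unquoted and keep only the first awk field. A minimal sketch of that extraction, assuming the same cli invocation seen in the trace:

# Hedged sketch of the start-ts extraction traced above.
tso=$(cdc.test cli tso query --pd=http://127.0.0.1:2379)
# unquoted echo flattens the multi-line output, so $1 is the TSO itself
start_ts=$(echo $tso | awk -F ' ' '{print $1}')
echo "start_ts=$start_ts"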
[Sun May  5 11:36:19 CST 2024] <<<<<< START cdc server in multi_topics case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_topics.3341233414.out server --log-file /tmp/tidb_cdc_test/multi_topics/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/multi_topics/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ set +x
+ tso='449545549478887425
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449545549478887425 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ start_ts=449545549478887425
+ run_cdc_server --workdir /tmp/tidb_cdc_test/synced_status_with_redo --binary cdc.test
[Sun May  5 11:36:19 CST 2024] <<<<<< START cdc server in synced_status_with_redo case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/ChangefeedOwnerNotUpdateCheckpoint=return(true)'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.2074620748.out server --log-file /tmp/tidb_cdc_test/synced_status_with_redo/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/synced_status_with_redo/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table test.finish not exists for 266-th check, retry later
table test.finish not exists for 267-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 03:36:22 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/adab61fd-3419-4206-b1d9-6c4f9063730c
	{"id":"adab61fd-3419-4206-b1d9-6c4f9063730c","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714880179}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f46d2acb2cf
	adab61fd-3419-4206-b1d9-6c4f9063730c

/tidb/cdc/default/default/upstream/7365354225416783616
	{"id":7365354225416783616,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/adab61fd-3419-4206-b1d9-6c4f9063730c
	{"id":"adab61fd-3419-4206-b1d9-6c4f9063730c","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714880179}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f46d2acb2cf
	adab61fd-3419-4206-b1d9-6c4f9063730c

/tidb/cdc/default/default/upstream/7365354225416783616
	{"id":7365354225416783616,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/adab61fd-3419-4206-b1d9-6c4f9063730c
	{"id":"adab61fd-3419-4206-b1d9-6c4f9063730c","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714880179}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f46d2acb2cf
	adab61fd-3419-4206-b1d9-6c4f9063730c

/tidb/cdc/default/default/upstream/7365354225416783616
	{"id":7365354225416783616,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_topics.cli.33471.out cli changefeed create --start-ts=449545549302464513 '--sink-uri=kafka://127.0.0.1:9092/multi_topics?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1' --config /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/multi_topics/conf/changefeed.toml
Create changefeed successfully!
ID: 0a9f33ea-0731-4999-a8d1-23df4e0db2bb
Info: {"upstream_id":7365354225416783616,"namespace":"default","id":"0a9f33ea-0731-4999-a8d1-23df4e0db2bb","sink_uri":"kafka://127.0.0.1:9092/multi_topics?protocol=canal-json\u0026enable-tidb-extension=true\u0026kafka-version=2.4.1","create_time":"2024-05-05T11:36:22.70889231+08:00","start_ts":449545549302464513,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"dispatchers":[{"matcher":["workload.*"],"topic":"workload"},{"matcher":["test.*"],"topic":"{schema}_{table}"}],"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449545549302464513,"checkpoint_ts":449545549302464513,"checkpoint_time":"2024-05-05 11:36:17.698"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
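The Info JSON above shows the dispatcher rules used by this multi_topics changefeed: workload.* tables go to a fixed "workload" topic and test.* tables to a per-table "{schema}_{table}" topic. A hedged reconstruction of the corresponding [sink] section of conf/changefeed.toml, based only on that JSON; the real file may contain more settings:

# Hedged reconstruction of the dispatcher rules from the Info JSON above.
cat > changefeed.toml <<'EOF'
[sink]
dispatchers = [
  { matcher = ["workload.*"], topic = "workload" },
  { matcher = ["test.*"], topic = "{schema}_{table}" },
]
EOF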
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 03:36:23 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/a43c6804-e023-4b99-bbe8-ef551648d4b3
	{"id":"a43c6804-e023-4b99-bbe8-ef551648d4b3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714880180}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f46d2af51ce
	a43c6804-e023-4b99-bbe8-ef551648d4b3

/tidb/cdc/default/default/upstream/7365354232683723579
	{"id":7365354232683723579,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/a43c6804-e023-4b99-bbe8-ef551648d4b3
	{"id":"a43c6804-e023-4b99-bbe8-ef551648d4b3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714880180}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f46d2af51ce
	a43c6804-e023-4b99-bbe8-ef551648d4b3

/tidb/cdc/default/default/upstream/7365354232683723579
	{"id":7365354232683723579,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/a43c6804-e023-4b99-bbe8-ef551648d4b3
	{"id":"a43c6804-e023-4b99-bbe8-ef551648d4b3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714880180}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f46d2af51ce
	a43c6804-e023-4b99-bbe8-ef551648d4b3

/tidb/cdc/default/default/upstream/7365354232683723579
	{"id":7365354232683723579,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ config_path=conf/changefeed-redo.toml
+ SINK_URI='mysql://root@127.0.0.1:3306/?max-txn-row=1'
+ run_cdc_cli changefeed create --start-ts=449545549478887425 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/conf/changefeed-redo.toml
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.cli.20807.out cli changefeed create --start-ts=449545549478887425 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/conf/changefeed-redo.toml
Create changefeed successfully!
ID: test-1
Info: {"upstream_id":7365354232683723579,"namespace":"default","id":"test-1","sink_uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create_time":"2024-05-05T11:36:23.540693435+08:00","start_ts":449545549478887425,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"eventual","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"storage":"file:///tmp/tidb_cdc_test/synced_status/redo","use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":120,"checkpoint_interval":20}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449545549478887425,"checkpoint_ts":449545549478887425,"checkpoint_time":"2024-05-05 11:36:18.371"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
+ set +x
table test.finish not exists for 268-th check, retry later
+ set +x
+ sleep 20
table test.finish not exists for 269-th check, retry later
table test.finish not exists for 270-th check, retry later
11:36AM INF > All tests pass failed=0 passed=219
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   723  100   723    0     0   8860      0 --:--:-- --:--:-- --:--:--  8817
100   723  100   723    0     0   8851      0 --:--:-- --:--:-- --:--:--  8817
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 11:34:07.131","puller_resolved_ts":"2024-05-05 11:34:07.131","last_synced_ts":"2024-05-05 11:34:00.131","now_ts":"2024-05-05 11:36:17.000","info":"Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' \u003e '\''Resolved-Ts'\'' \u003e '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '11:34:07.131","puller_resolved_ts":"2024-05-05' '11:34:07.131","last_synced_ts":"2024-05-05' '11:34:00.131","now_ts":"2024-05-05' '11:36:17.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '11:34:07.131","puller_resolved_ts":"2024-05-05' '11:34:07.131","last_synced_ts":"2024-05-05' '11:34:00.131","now_ts":"2024-05-05' '11:36:17.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq -r .info
+ info='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ target_message='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ '[' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' '!=' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' ']'
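The comparison traced above is the recurring pattern in these synced-status cases: fetch the changefeed's synced endpoint, then assert on the "synced" flag and the "info" message. A minimal sketch of that check in bash, assuming the endpoint and changefeed id shown in this log (127.0.0.1:8300, test-1), could look like:

    # Hypothetical helper mirroring the check in this trace; the endpoint,
    # changefeed id, and expected strings are assumptions taken from the log.
    check_synced_status() {
        local expected_synced=$1
        local expected_info=$2
        local body synced info
        body=$(curl -s -X GET "http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced")
        synced=$(echo "$body" | jq -r .synced)   # true / false
        info=$(echo "$body" | jq -r .info)       # human-readable status message
        if [ "$synced" != "$expected_synced" ]; then
            echo "unexpected synced status: $synced (expected $expected_synced)"
            exit 1
        fi
        if [ "$info" != "$expected_info" ]; then
            echo "unexpected info message: $info"
            exit 1
        fi
    }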
+ cleanup_process cdc.test
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
+ stop_tidb_cluster
+ run_case_with_unavailable_tidb conf/changefeed.toml
+ rm -rf /tmp/tidb_cdc_test/synced_status
+ mkdir -p /tmp/tidb_cdc_test/synced_status
+ start_tidb_cluster --workdir /tmp/tidb_cdc_test/synced_status
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
The 1 times to try to start tidb cluster...
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
table test.finish not exists for 271-th check, retry later
table test.finish not exists for 272-th check, retry later
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
start tidb cluster in /tmp/tidb_cdc_test/synced_status
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table test.finish not exists for 273-th check, retry later
table test.finish not exists for 274-th check, retry later
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
[Sun May  5 11:36:36 CST 2024] <<<<<< run test case debezium success! >>>>>>
table test.finish not exists for 275-th check, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish not exists for 276-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish not exists for 277-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish not exists for 278-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1b4d2f5c0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1855-3nsj2-f79k3, pid:23285, start at 2024-05-05 11:36:43.757466871 +0800 CST m=+5.075736767	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-11:38:43.766 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-11:36:43.735 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-11:26:43.735 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1b4d2f5c0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1855-3nsj2-f79k3, pid:23285, start at 2024-05-05 11:36:43.757466871 +0800 CST m=+5.075736767	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-11:38:43.766 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-11:36:43.735 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-11:26:43.735 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1b4d30100014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1855-3nsj2-f79k3, pid:23361, start at 2024-05-05 11:36:43.797653339 +0800 CST m=+5.064382354	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-11:38:43.805 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-11:36:43.780 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-11:26:43.780 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/synced_status/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/synced_status/tiflash/log/error.log
arg matches is ArgMatches { args: {"pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash/log/proxy.log"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash/db/proxy"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table test.finish not exists for 279-th check, retry later
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   723  100   723    0     0   9397      0 --:--:-- --:--:-- --:--:--  9513
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 11:36:18.371","puller_resolved_ts":"1970-01-01 08:00:00.000","last_synced_ts":"1970-01-01 08:00:00.000","now_ts":"2024-05-05 11:36:45.000","info":"Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' \u003e '\''Resolved-Ts'\'' \u003e '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait"}'
++ jq .synced
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '11:36:18.371","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '11:36:45.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '11:36:18.371","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '11:36:45.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq -r .info
+ info='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ target_message='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ '[' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' '!=' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' ']'
+ export GO_FAILPOINTS=
+ GO_FAILPOINTS=
+ cleanup_process cdc.test
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
+ stop_tidb_cluster
+ cd /tmp/tidb_cdc_test/synced_status
++ run_cdc_cli_tso_query 127.0.0.1 2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.cli.24810.out cli tso query --pd=http://127.0.0.1:2379
table test.finish not exists for 280-th check, retry later
+ set +x
+ tso='449545557071888385
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449545557071888385 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ start_ts=449545557071888385
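The trace above shows how the test derives a start-ts: it keeps the first field of the "cli tso query" output and drops the trailing PASS/coverage lines. A hedged restatement, assuming the same PD endpoint and using "cdc" in place of the coverage-instrumented cdc.test binary used by the harness:

    # Sketch only: extract the TSO from the first line of the CLI output.
    tso_output=$(cdc cli tso query --pd=http://127.0.0.1:2379)
    start_ts=$(echo "$tso_output" | head -n1 | awk '{print $1}')
    echo "using start_ts=$start_ts"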
+ run_cdc_server --workdir /tmp/tidb_cdc_test/synced_status --binary cdc.test
[Sun May  5 11:36:48 CST 2024] <<<<<< START cdc server in synced_status case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.2484924851.out server --log-file /tmp/tidb_cdc_test/synced_status/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/synced_status/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/lossy_ddl/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 11:36:48 CST 2024] <<<<<< run test case lossy_ddl success! >>>>>>
table test.finish not exists for 281-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/storage_csv_update/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 11:36:51 CST 2024] <<<<<< run test case storage_csv_update success! >>>>>>
table test.finish not exists for 282-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 03:36:51 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/6d562892-1b3a-4083-a59d-a3a5751c9e3a
	{"id":"6d562892-1b3a-4083-a59d-a3a5751c9e3a","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714880209}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f46d320a8ce
	6d562892-1b3a-4083-a59d-a3a5751c9e3a

/tidb/cdc/default/default/upstream/7365354354337617031
	{"id":7365354354337617031,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/6d562892-1b3a-4083-a59d-a3a5751c9e3a
	{"id":"6d562892-1b3a-4083-a59d-a3a5751c9e3a","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714880209}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f46d320a8ce
	6d562892-1b3a-4083-a59d-a3a5751c9e3a

/tidb/cdc/default/default/upstream/7365354354337617031
	{"id":7365354354337617031,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/6d562892-1b3a-4083-a59d-a3a5751c9e3a
	{"id":"6d562892-1b3a-4083-a59d-a3a5751c9e3a","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714880209}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f46d320a8ce
	6d562892-1b3a-4083-a59d-a3a5751c9e3a

/tidb/cdc/default/default/upstream/7365354354337617031
	{"id":7365354354337617031,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
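The loop traced above waits for the CDC server by polling /debug/info with basic auth until the response contains "etcd info", giving up after 50 attempts. A compact sketch of that readiness check, with the credentials, URL, and retry bound taken from this log:

    # Sketch of the readiness loop; a real failure path would also dump logs.
    for i in $(seq 1 50); do
        res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info \
            --user ticdc:ticdc_secret 2>/dev/null)
        if echo "$res" | grep -q 'etcd info'; then
            break                         # server is up and registered in etcd
        fi
        if [ "$i" -eq 50 ]; then
            echo "cdc server did not become ready" && exit 1
        fi
        sleep 3                           # retry interval used by the harness
    done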
+ config_path=conf/changefeed.toml
+ SINK_URI='mysql://root@127.0.0.1:3306/?max-txn-row=1'
+ run_cdc_cli changefeed create --start-ts=449545557071888385 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/conf/changefeed.toml
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.cli.24911.out cli changefeed create --start-ts=449545557071888385 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/conf/changefeed.toml
Create changefeed successfully!
ID: test-1
Info: {"upstream_id":7365354354337617031,"namespace":"default","id":"test-1","sink_uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create_time":"2024-05-05T11:36:52.305498738+08:00","start_ts":449545557071888385,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":120,"checkpoint_interval":20}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449545557071888385,"checkpoint_ts":449545557071888385,"checkpoint_time":"2024-05-05 11:36:47.336"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ set +x
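The changefeed is then created from a TOML config with an explicit start-ts. A sketch of the equivalent command, assuming the sink URI, changefeed id, and relative config path shown in this trace:

    # Sketch only: create the changefeed used by the synced_status case.
    cdc cli changefeed create \
        --start-ts=${start_ts} \
        --sink-uri="mysql://root@127.0.0.1:3306/?max-txn-row=1" \
        --changefeed-id=test-1 \
        --config=conf/changefeed.toml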
+ run_sql 'USE TEST;Create table t1(a int primary key, b int);insert into t1 values(1,2);insert into t1 values(2,3);'
+ check_table_exists test.t1 127.0.0.1 3306
table test.t1 not exists for 1-th check, retry later
table test.finish not exists for 283-th check, retry later
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1855/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
table test.finish not exists for 284-th check, retry later
table test.t1 exists
+ sleep 5
+ check_logs /tmp/tidb_cdc_test/synced_status_with_redo
++ date
+ echo '[Sun May  5 11:36:56 CST 2024] <<<<<< run test case synced_status_with_redo success! >>>>>>'
[Sun May  5 11:36:56 CST 2024] <<<<<< run test case synced_status_with_redo success! >>>>>>
+ stop_tidb_cluster
table test.finish not exists for 285-th check, retry later
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1855/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
table test.finish not exists for 286-th check, retry later
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
+ kill_tidb
++ ps aux
++ grep tidb-server
++ grep /tmp/tidb_cdc_test/synced_status
+ info='jenkins    23285 13.5  0.0 2730440 252812 ?      Sl   11:36   0:02 tidb-server -P 4000 -config /tmp/tidb_cdc_test/synced_status/tidb-config-1714880198675340784.toml --store tikv --path 127.0.0.1:2379 --status=10080 --log-file /tmp/tidb_cdc_test/synced_status/tidb.log
jenkins    23289  3.7  0.0 2416748 192152 ?      Sl   11:36   0:00 tidb-server -P 4001 -config /tmp/tidb_cdc_test/synced_status/tidb-config-1714880198678252366.toml --store tikv --path 127.0.0.1:2379 --status=10081 --log-file /tmp/tidb_cdc_test/synced_status/tidb_other.log
jenkins    23361 13.5  0.0 2634316 275292 ?      Sl   11:36   0:02 tidb-server -P 3306 -config /tmp/tidb_cdc_test/synced_status/tidb-config-1714880198726039768.toml --store tikv --path 127.0.0.1:2479 --status=20080 --log-file /tmp/tidb_cdc_test/synced_status/tidb_down.log'
++ ps aux
++ grep tidb-server
++ grep /tmp/tidb_cdc_test/synced_status
++ awk '{print $2}'
++ xargs kill -9
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   243  100   243    0     0   5519      0 --:--:-- --:--:-- --:--:--  5651
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 11:36:59.736","puller_resolved_ts":"2024-05-05 11:36:53.736","last_synced_ts":"2024-05-05 11:36:53.835","now_ts":"2024-05-05 11:37:00.000","info":"The data syncing is not finished, please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '11:36:59.736","puller_resolved_ts":"2024-05-05' '11:36:53.736","last_synced_ts":"2024-05-05' '11:36:53.835","now_ts":"2024-05-05' '11:37:00.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '11:36:59.736","puller_resolved_ts":"2024-05-05' '11:36:53.736","last_synced_ts":"2024-05-05' '11:36:53.835","now_ts":"2024-05-05' '11:37:00.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq -r .info
+ info='The data syncing is not finished, please wait'
+ target_message='The data syncing is not finished, please wait'
+ '[' 'The data syncing is not finished, please wait' '!=' 'The data syncing is not finished, please wait' ']'
+ sleep 130
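At this point the case kills every tidb-server under the test workdir, confirms the status is still reported as not finished, and then sleeps past the configured synced_check_interval (120s in this changefeed) before querying again. A restatement of the kill-and-wait step, with the workdir filter taken from the log:

    # Kill all tidb-server processes belonging to this test's workdir.
    ps aux | grep tidb-server | grep /tmp/tidb_cdc_test/synced_status \
        | awk '{print $2}' | xargs kill -9
    # Wait longer than synced_check_interval (120s here) so the changefeed
    # can transition to the synced state before the next status query.
    sleep 130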
table test.finish not exists for 287-th check, retry later
table test.finish not exists for 288-th check, retry later
table test.finish not exists for 289-th check, retry later
table test.finish not exists for 290-th check, retry later
table test.finish not exists for 291-th check, retry later
table test.finish not exists for 292-th check, retry later
table test.finish not exists for 293-th check, retry later
table test.finish not exists for 294-th check, retry later
table test.finish not exists for 295-th check, retry later
table test.finish not exists for 296-th check, retry later
table test.finish not exists for 297-th check, retry later
table test.finish not exists for 298-th check, retry later
table test.finish not exists for 299-th check, retry later
table test.finish not exists for 300-th check, retry later
table test.finish not exists at last check
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
Post stage
[Pipeline] sh
+ ls /tmp/tidb_cdc_test/
consistent_partition_table
consistent_replicate_ddl
consistent_replicate_gbk
consistent_replicate_storage_file
consistent_replicate_storage_file_large_value
consistent_replicate_storage_s3
cov.kafka_big_messages_v2.36373639.out
cov.multi_tables_ddl_v2.74627464.out
cov.multi_topics_v2.cli.10519.out
cov.multi_topics_v2.cli.10614.out
kafka_big_messages
kafka_big_messages_v2
multi_tables_ddl
multi_tables_ddl_v2
multi_topics
multi_topics_v2
sql_res.kafka_big_messages_v2.txt
sql_res.multi_tables_ddl_v2.txt
sql_res.multi_topics_v2.txt
++ find /tmp/tidb_cdc_test/ -type f -name '*.log'
+ tar -cvzf log-G02.tar.gz /tmp/tidb_cdc_test/multi_topics/output/sync_diff.log /tmp/tidb_cdc_test/multi_topics_v2/cdc_kafka_consumer.log /tmp/tidb_cdc_test/multi_topics_v2/tikv2/db/000005.log /tmp/tidb_cdc_test/multi_topics_v2/sync_diff_inspector.log /tmp/tidb_cdc_test/multi_topics_v2/tiflash/log/server.log /tmp/tidb_cdc_test/multi_topics_v2/tiflash/log/error.log /tmp/tidb_cdc_test/multi_topics_v2/tiflash/log/proxy.log /tmp/tidb_cdc_test/multi_topics_v2/tiflash/db/proxy/db/000005.log /tmp/tidb_cdc_test/multi_topics_v2/tidb_other.log /tmp/tidb_cdc_test/multi_topics_v2/tidb.log /tmp/tidb_cdc_test/multi_topics_v2/tidb-slow.log /tmp/tidb_cdc_test/multi_topics_v2/pd1.log /tmp/tidb_cdc_test/multi_topics_v2/down_pd.log /tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0002/000002.log /tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0005/000002.log /tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0006/000002.log /tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0004/000002.log /tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0003/000002.log /tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0001/000002.log /tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0007/000002.log /tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0000/000002.log /tmp/tidb_cdc_test/multi_topics_v2/tikv1/db/000005.log /tmp/tidb_cdc_test/multi_topics_v2/cdc_kafka_consumer_stdout.log /tmp/tidb_cdc_test/multi_topics_v2/tikv3/db/000005.log /tmp/tidb_cdc_test/multi_topics_v2/tikv_down.log /tmp/tidb_cdc_test/multi_topics_v2/tidb_down.log /tmp/tidb_cdc_test/multi_topics_v2/down_pd/region-meta/000001.log /tmp/tidb_cdc_test/multi_topics_v2/down_pd/hot-region/000001.log /tmp/tidb_cdc_test/multi_topics_v2/tikv3.log /tmp/tidb_cdc_test/multi_topics_v2/tikv2.log /tmp/tidb_cdc_test/multi_topics_v2/stdout.log /tmp/tidb_cdc_test/multi_topics_v2/pd1/region-meta/000001.log /tmp/tidb_cdc_test/multi_topics_v2/pd1/hot-region/000001.log /tmp/tidb_cdc_test/multi_topics_v2/tikv_down/db/000005.log /tmp/tidb_cdc_test/multi_topics_v2/cdc.log /tmp/tidb_cdc_test/multi_topics_v2/tikv1.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/cdc_kafka_consumer.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/sync_diff_inspector.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/tidb_other.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/tidb.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/tidb-slow.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/pd1.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/down_pd.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/cdc_kafka_consumer_stdout.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/tikv_down.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/tidb_down.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/tikv3.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/tikv2.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/stdout.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/cdc.log /tmp/tidb_cdc_test/multi_tables_ddl_v2/tikv1.log /tmp/tidb_cdc_test/kafka_big_messages_v2/cdc_kafka_consumer.log /tmp/tidb_cdc_test/kafka_big_messages_v2/sync_diff_inspector.log /tmp/tidb_cdc_test/kafka_big_messages_v2/tidb_other.log /tmp/tidb_cdc_test/kafka_big_messages_v2/tidb.log /tmp/tidb_cdc_test/kafka_big_messages_v2/tidb-slow.log /tmp/tidb_cdc_test/kafka_big_messages_v2/pd1.log /tmp/tidb_cdc_test/kafka_big_messages_v2/down_pd.log /tmp/tidb_cdc_test/kafka_big_messages_v2/cdc_kafka_consumer_stdout.log /tmp/tidb_cdc_test/kafka_big_messages_v2/tikv_down.log /tmp/tidb_cdc_test/kafka_big_messages_v2/tidb_down.log /tmp/tidb_cdc_test/kafka_big_messages_v2/tikv3.log 
/tmp/tidb_cdc_test/kafka_big_messages_v2/tikv2.log /tmp/tidb_cdc_test/kafka_big_messages_v2/stdout.log /tmp/tidb_cdc_test/kafka_big_messages_v2/cdc.log /tmp/tidb_cdc_test/kafka_big_messages_v2/tikv1.log
tar: Removing leading `/' from member names
/tmp/tidb_cdc_test/multi_topics/output/sync_diff.log
/tmp/tidb_cdc_test/multi_topics_v2/cdc_kafka_consumer.log
/tmp/tidb_cdc_test/multi_topics_v2/tikv2/db/000005.log
/tmp/tidb_cdc_test/multi_topics_v2/sync_diff_inspector.log
/tmp/tidb_cdc_test/multi_topics_v2/tiflash/log/server.log
/tmp/tidb_cdc_test/multi_topics_v2/tiflash/log/error.log
/tmp/tidb_cdc_test/multi_topics_v2/tiflash/log/proxy.log
/tmp/tidb_cdc_test/multi_topics_v2/tiflash/db/proxy/db/000005.log
/tmp/tidb_cdc_test/multi_topics_v2/tidb_other.log
/tmp/tidb_cdc_test/multi_topics_v2/tidb.log
/tmp/tidb_cdc_test/multi_topics_v2/tidb-slow.log
/tmp/tidb_cdc_test/multi_topics_v2/pd1.log
/tmp/tidb_cdc_test/multi_topics_v2/down_pd.log
/tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0002/000002.log
/tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0005/000002.log
/tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0006/000002.log
/tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0004/000002.log
/tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0003/000002.log
/tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0001/000002.log
/tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0007/000002.log
/tmp/tidb_cdc_test/multi_topics_v2/cdc_data/tmp/sorter/0000/000002.log
/tmp/tidb_cdc_test/multi_topics_v2/tikv1/db/000005.log
/tmp/tidb_cdc_test/multi_topics_v2/cdc_kafka_consumer_stdout.log
/tmp/tidb_cdc_test/multi_topics_v2/tikv3/db/000005.log
/tmp/tidb_cdc_test/multi_topics_v2/tikv_down.log
/tmp/tidb_cdc_test/multi_topics_v2/tidb_down.log
/tmp/tidb_cdc_test/multi_topics_v2/down_pd/region-meta/000001.log
/tmp/tidb_cdc_test/multi_topics_v2/down_pd/hot-region/000001.log
/tmp/tidb_cdc_test/multi_topics_v2/tikv3.log
/tmp/tidb_cdc_test/multi_topics_v2/tikv2.log
/tmp/tidb_cdc_test/multi_topics_v2/stdout.log
/tmp/tidb_cdc_test/multi_topics_v2/pd1/region-meta/000001.log
/tmp/tidb_cdc_test/multi_topics_v2/pd1/hot-region/000001.log
/tmp/tidb_cdc_test/multi_topics_v2/tikv_down/db/000005.log
/tmp/tidb_cdc_test/multi_topics_v2/cdc.log
/tmp/tidb_cdc_test/multi_topics_v2/tikv1.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/cdc_kafka_consumer.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/sync_diff_inspector.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/tidb_other.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/tidb.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/tidb-slow.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/pd1.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/down_pd.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/cdc_kafka_consumer_stdout.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/tikv_down.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/tidb_down.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/tikv3.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/tikv2.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/stdout.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/cdc.log
/tmp/tidb_cdc_test/multi_tables_ddl_v2/tikv1.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/cdc_kafka_consumer.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/sync_diff_inspector.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/tidb_other.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/tidb.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/tidb-slow.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/pd1.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/down_pd.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/cdc_kafka_consumer_stdout.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/tikv_down.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/tidb_down.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/tikv3.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/tikv2.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/stdout.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/cdc.log
/tmp/tidb_cdc_test/kafka_big_messages_v2/tikv1.log
+ ls -alh log-G02.tar.gz
-rw-r--r-- 1 jenkins jenkins 15M May  5 11:37 log-G02.tar.gz
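The post stage packs every *.log file under the test directory into a single archive that Jenkins records as a build artifact. A sketch of that collection step; the group name G02 is specific to this matrix branch:

    # Collect all logs produced by the integration tests into one tarball.
    log_files=$(find /tmp/tidb_cdc_test/ -type f -name '*.log')
    tar -czvf log-G02.tar.gz $log_files
    ls -alh log-G02.tar.gz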
[Pipeline] archiveArtifacts
Archiving artifacts
Recording fingerprints
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G02'
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
{"level":"warn","ts":1714880278.0637343,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0016f9500/127.0.0.1:2379","attempt":0,"error":"rpc error: code = Unavailable desc = error reading from server: EOF"}
{"level":"warn","ts":1714880280.0642738,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0016f9500/127.0.0.1:2379","attempt":1,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
script returned exit code 143
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G01'
++ stop_tidb_cluster
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/run.sh: line 1: 24988 Terminated              sleep 130
script returned exit code 143
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G09'
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE