
Console Output

Skipping 2,875 KB of earlier console output...
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Mon Apr 29 20:41:18 CST 2024] <<<<<< run test case force_replicate_table success! >>>>>>
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
check diff failed 13-th time, retry later
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
check diff failed 14-th time, retry later
check diff failed 15-th time, retry later
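
Note: the repeated "check diff failed N-th time, retry later" lines are the harness polling upstream/downstream data consistency until the diff passes or a retry budget is exhausted. A minimal sketch of that retry pattern, assuming a hypothetical check_diff.sh wrapper and diff_config.toml (illustrative names, not the harness's real helper):
#!/usr/bin/env bash
# Hedged sketch: retry a data-consistency diff until it succeeds or retries run out.
max_retries=60
for ((i = 1; i <= max_retries; i++)); do
    if ./check_diff.sh --config ./diff_config.toml; then
        echo "check diff successfully"
        exit 0
    fi
    echo "check diff failed $i-th time, retry later"
    sleep 3
done
echo "check diff failed after $max_retries retries"
exit 1
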
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_cdc_integration_mysql_test/tiflow/tests/integration_tests/simple/run.sh using Sink-Type: mysql... <<=================
The 1st attempt to start the tidb cluster...
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-release-7.5-pull_cdc_integration_mysql_test-348/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
check diff failed 16-th time, retry later
[Pipeline] // timeout
[Pipeline] }
start tidb cluster in /tmp/tidb_cdc_test/simple
Starting Upstream PD...
Release Version: v7.5.1-5-g584533652
Edition: Community
Git Commit Hash: 58453365285465cd90bc4472cff2bad7ce4d764b
Git Branch: release-7.5
UTC Build Time:  2024-04-03 10:04:14
Starting Downstream PD...
Release Version: v7.5.1-5-g584533652
Edition: Community
Git Commit Hash: 58453365285465cd90bc4472cff2bad7ce4d764b
Git Branch: release-7.5
UTC Build Time:  2024-04-03 10:04:14
Verifying upstream PD is started...
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
check diff failed 17-th time, retry later
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-release-7.5-pull_cdc_integration_mysql_test-348/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
check diff failed 18-th time, retry later
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_cdc_integration_mysql_test/tiflow/tests/integration_tests/consistent_partition_table/run.sh using Sink-Type: mysql... <<=================
The 1st attempt to start the tidb cluster...
Starting Upstream TiDB...
Release Version: v7.5.1-46-g3df1fe2cb9
Edition: Community
Git Commit Hash: 3df1fe2cb94fcc572aaaf15efed0a26269743a0d
Git Branch: release-7.5
UTC Build Time: 2024-04-29 09:35:42
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v7.5.1-46-g3df1fe2cb9
Edition: Community
Git Commit Hash: 3df1fe2cb94fcc572aaaf15efed0a26269743a0d
Git Branch: release-7.5
UTC Build Time: 2024-04-29 09:35:42
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
start tidb cluster in /tmp/tidb_cdc_test/consistent_partition_table
Starting Upstream PD...
Release Version: v7.5.1-5-g584533652
Edition: Community
Git Commit Hash: 58453365285465cd90bc4472cff2bad7ce4d764b
Git Branch: release-7.5
UTC Build Time:  2024-04-03 10:04:14
Starting Downstream PD...
Release Version: v7.5.1-5-g584533652
Edition: Community
Git Commit Hash: 58453365285465cd90bc4472cff2bad7ce4d764b
Git Branch: release-7.5
UTC Build Time:  2024-04-03 10:04:14
Verifying upstream PD is started...
check diff failed 19-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
check diff failed 20-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v7.5.1-46-g3df1fe2cb9
Edition: Community
Git Commit Hash: 3df1fe2cb94fcc572aaaf15efed0a26269743a0d
Git Branch: release-7.5
UTC Build Time: 2024-04-29 09:35:42
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v7.5.1-46-g3df1fe2cb9
Edition: Community
Git Commit Hash: 3df1fe2cb94fcc572aaaf15efed0a26269743a0d
Git Branch: release-7.5
UTC Build Time: 2024-04-29 09:35:42
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff failed 21-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff failed 22-th time, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca77fe2300012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-348-j2h8x, pid:20742, start at 2024-04-29 20:41:35.144453106 +0800 CST m=+5.477798716	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-20:43:35.152 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-20:41:35.116 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-20:31:35.116 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca77fe2300012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-348-j2h8x, pid:20742, start at 2024-04-29 20:41:35.144453106 +0800 CST m=+5.477798716	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-20:43:35.152 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-20:41:35.116 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-20:31:35.116 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca77fe474000a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-348-j2h8x, pid:20815, start at 2024-04-29 20:41:35.274129094 +0800 CST m=+5.532963810	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-20:43:35.283 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-20:41:35.261 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-20:31:35.261 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
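
Note: the VARIABLE_NAME / VARIABLE_VALUE / COMMENT blocks above come from the mysql.tidb system table, which the harness reads to verify that each TiDB instance has bootstrapped and its GC worker is running. A minimal sketch of that query, assuming the default test port and a passwordless root user:
# Hedged sketch: dump TiDB bootstrap and GC state from the mysql.tidb table.
mysql -h 127.0.0.1 -P 4000 -u root -e "SELECT VARIABLE_NAME, VARIABLE_VALUE, COMMENT FROM mysql.tidb;"
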
Starting Upstream TiFlash...
TiFlash
Release Version: v7.5.1-12-g9002cc34d
Edition:         Community
Git Commit Hash: 9002cc34d3b593a718b6c5260ba18f30a45ab314
Git Branch:      HEAD
UTC Build Time:  2024-04-18 07:24:48
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO

Raft Proxy
Git Commit Hash:   521fd9dbc55e58646045d88f91c3c35db50b5981
Git Commit Branch: HEAD
UTC Build Time:    2024-04-18 07:28:40
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:    portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/simple/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/simple/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v7.5.1-12-g9002cc34d"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/simple/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/simple/tiflash-proxy.toml"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["9002cc34d3b593a718b6c5260ba18f30a45ab314"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/simple/tiflash/db/proxy"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff failed 23-th time, retry later
+ pd_host=127.0.0.1
+ pd_port=2379
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.simple.cli.22070.out cli tso query --pd=http://127.0.0.1:2379
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff failed 24-th time, retry later
+ set +x
+ tso='449418231890837505
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449418231890837505 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
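
Note: the trace above captures a start-ts for the changefeed: "cdc cli tso query" prints the TSO followed by PASS and a coverage line (the test binary is coverage-instrumented), and the script keeps only the first field. A minimal sketch of that extraction, assuming the same output shape:
# Hedged sketch: query a TSO from PD via the cdc CLI and keep only the number.
tso_output=$(cdc cli tso query --pd=http://127.0.0.1:2379)
# Unquoted echo flattens the multi-line output, so awk's first field is the TSO.
start_ts=$(echo $tso_output | awk -F ' ' '{print $1}')
echo "start-ts is $start_ts"
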
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca780233c000e	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-348-mzgrm, pid:18475, start at 2024-04-29 20:41:39.295859948 +0800 CST m=+5.367356310	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-20:43:39.305 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-20:41:39.279 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-20:31:39.279 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca780233c000e	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-348-mzgrm, pid:18475, start at 2024-04-29 20:41:39.295859948 +0800 CST m=+5.367356310	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-20:43:39.305 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-20:41:39.279 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-20:31:39.279 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca780271c0013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-348-mzgrm, pid:18558, start at 2024-04-29 20:41:39.553577928 +0800 CST m=+5.535437227	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-20:43:39.562 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-20:41:39.527 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-20:31:39.527 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v7.5.1-12-g9002cc34d
Edition:         Community
Git Commit Hash: 9002cc34d3b593a718b6c5260ba18f30a45ab314
Git Branch:      HEAD
UTC Build Time:  2024-04-18 07:24:48
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO

Raft Proxy
Git Commit Hash:   521fd9dbc55e58646045d88f91c3c35db50b5981
Git Commit Branch: HEAD
UTC Build Time:    2024-04-18 07:28:40
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:    portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/consistent_partition_table/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/consistent_partition_table/tiflash/log/error.log
arg matches is ArgMatches { args: {"pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/consistent_partition_table/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/consistent_partition_table/tiflash-proxy.toml"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["9002cc34d3b593a718b6c5260ba18f30a45ab314"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/consistent_partition_table/tiflash/log/proxy.log"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v7.5.1-12-g9002cc34d"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[Mon Apr 29 20:41:42 CST 2024] <<<<<< START cdc server in simple case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
+ GO_FAILPOINTS=
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.simple.2211322115.out server --log-file /tmp/tidb_cdc_test/simple/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/simple/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
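
Note: each "curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info" / "Connection refused" / "sleep 3" sequence above is one iteration of the harness waiting for the cdc server it just launched in the background. A minimal sketch of that readiness loop, assuming the same endpoint and the same 'etcd info' marker:
# Hedged sketch: poll the cdc server's debug endpoint until it reports etcd info.
for ((i = 0; i <= 50; i++)); do
    res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info)
    if echo "$res" | grep -q 'etcd info'; then
        break    # server is up and registered in etcd
    fi
    if [ "$i" -eq 50 ]; then
        echo "cdc server failed to become ready"
        exit 1
    fi
    sleep 3
done
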
check diff failed 25-th time, retry later
[Mon Apr 29 20:41:44 CST 2024] <<<<<< START cdc server in consistent_partition_table case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.consistent_partition_table.1991719919.out server --log-file /tmp/tidb_cdc_test/consistent_partition_table/cdcpartition_table.server1.log --log-level debug --data-dir /tmp/tidb_cdc_test/consistent_partition_table/cdc_datapartition_table.server1 --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
check diff failed 26-th time, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
> GET /debug/info HTTP/1.1
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Mon, 29 Apr 2024 12:41:45 GMT
< Content-Length: 613
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/cfc6178c-491f-4964-bf0b-576cd2272449
	{"id":"cfc6178c-491f-4964-bf0b-576cd2272449","address":"127.0.0.1:8300","version":"v7.5.1-21-g88db1a842"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f29dfcd5b0c
	cfc6178c-491f-4964-bf0b-576cd2272449

/tidb/cdc/default/default/upstream/7363268250354914670
	{"id":7363268250354914670,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/cfc6178c-491f-4964-bf0b-576cd2272449
	{"id":"cfc6178c-491f-4964-bf0b-576cd2272449","address":"127.0.0.1:8300","version":"v7.5.1-21-g88db1a842"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f29dfcd5b0c
	cfc6178c-491f-4964-bf0b-576cd2272449

/tidb/cdc/default/default/upstream/7363268250354914670
	{"id":7363268250354914670,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/cfc6178c-491f-4964-bf0b-576cd2272449
	{"id":"cfc6178c-491f-4964-bf0b-576cd2272449","address":"127.0.0.1:8300","version":"v7.5.1-21-g88db1a842"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f29dfcd5b0c
	cfc6178c-491f-4964-bf0b-576cd2272449

/tidb/cdc/default/default/upstream/7363268250354914670
	{"id":7363268250354914670,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
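
Note: the "*** etcd info ***" dump above is the CDC metadata kept in PD's embedded etcd: the capture registration, meta-version, owner election key, and upstream record. A minimal sketch of listing those keys directly, assuming an etcd v3 client pointed at PD's client port:
# Hedged sketch: list CDC metadata keys stored in PD's embedded etcd.
ETCDCTL_API=3 etcdctl --endpoints=http://127.0.0.1:2379 get /tidb/cdc/default/ --prefix --keys-only
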
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.simple.cli.22165.out cli changefeed create --start-ts=449418231890837505 --sink-uri=mysql+ssl://normal:123456@127.0.0.1:3306/
Create changefeed successfully!
ID: 1dc776f8-b17e-4e4c-9016-1bdd3749b38e
Info: {"upstream_id":7363268250354914670,"namespace":"default","id":"1dc776f8-b17e-4e4c-9016-1bdd3749b38e","sink_uri":"mysql+ssl://normal:xxxxx@127.0.0.1:3306/","create_time":"2024-04-29T20:41:46.355270046+08:00","start_ts":449418231890837505,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64"},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50,"event_cache_percentage":0}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"sql_mode":"ONLY_FULL_GROUP_BY,STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION","synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v7.5.1-21-g88db1a842","resolved_ts":449418231890837505,"checkpoint_ts":449418231890837505,"checkpoint_time":"2024-04-29 20:41:40.316"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
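
Note: with the server ready and the start-ts captured earlier, the script creates the changefeed against the MySQL sink; the lines above echo back the resulting changefeed config. A minimal sketch of the same call; the credentials and start-ts are the ones printed in this log, and --changefeed-id is an optional flag added here purely for illustration:
# Hedged sketch: create a changefeed replicating into a MySQL-compatible sink.
cdc cli changefeed create \
    --changefeed-id="simple-test" \
    --start-ts=449418231890837505 \
    --sink-uri="mysql+ssl://normal:123456@127.0.0.1:3306/"
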
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
> GET /debug/info HTTP/1.1
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
check diff failed 27-th time, retry later
< HTTP/1.1 200 OK
< Date: Mon, 29 Apr 2024 12:41:47 GMT
< Content-Length: 613
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/41fb0c01-d4c1-483a-bad6-bb43f0d041e9
	{"id":"41fb0c01-d4c1-483a-bad6-bb43f0d041e9","address":"127.0.0.1:8300","version":"v7.5.1-21-g88db1a842"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f29dfe5d4f2
	41fb0c01-d4c1-483a-bad6-bb43f0d041e9

/tidb/cdc/default/default/upstream/7363268268542343018
	{"id":7363268268542343018,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/41fb0c01-d4c1-483a-bad6-bb43f0d041e9
	{"id":"41fb0c01-d4c1-483a-bad6-bb43f0d041e9","address":"127.0.0.1:8300","version":"v7.5.1-21-g88db1a842"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f29dfe5d4f2
	41fb0c01-d4c1-483a-bad6-bb43f0d041e9

/tidb/cdc/default/default/upstream/7363268268542343018
	{"id":7363268268542343018,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/41fb0c01-d4c1-483a-bad6-bb43f0d041e9
	{"id":"41fb0c01-d4c1-483a-bad6-bb43f0d041e9","address":"127.0.0.1:8300","version":"v7.5.1-21-g88db1a842"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f29dfe5d4f2
	41fb0c01-d4c1-483a-bad6-bb43f0d041e9

/tidb/cdc/default/default/upstream/7363268268542343018
	{"id":7363268268542343018,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ set +x
succeed to verify meta placement rules
TEST FAILED: OUTPUT DOES NOT CONTAIN 'id: 1'
____________________________________
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
check data failed 1-th time, retry later
check diff failed 28-th time, retry later
TEST FAILED: OUTPUT DOES NOT CONTAIN 'id: 1'
____________________________________
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
check data failed 2-th time, retry later
check diff failed 29-th time, retry later
table partition_table2.t2 not exists for 1-th check, retry later
check data successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
table partition_table2.t2 not exists for 2-th check, retry later
check diff failed 30-th time, retry later
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Mon Apr 29 20:41:53 CST 2024] <<<<<< run test case simple success! >>>>>>
table partition_table2.t2 not exists for 3-th check, retry later
check diff failed 31-th time, retry later
table partition_table2.t2 not exists for 4-th check, retry later
check diff failed 32-th time, retry later
table partition_table2.t2 not exists for 5-th check, retry later
check diff failed 33-th time, retry later
table partition_table2.t2 not exists for 6-th check, retry later
check diff failed 34-th time, retry later
table partition_table2.t2 not exists for 7-th check, retry later
check diff failed 35-th time, retry later
table partition_table2.t2 not exists for 8-th check, retry later
check diff failed 36-th time, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_cdc_integration_mysql_test/tiflow/tests/integration_tests/cdc_server_tips/run.sh using Sink-Type: mysql... <<=================
The 1st attempt to start the tidb cluster...
table partition_table2.t2 exists
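
Note: the "table partition_table2.t2 not exists for N-th check, retry later" lines are the harness polling the downstream database until the replicated DDL has created the table. A minimal sketch of that check, assuming the downstream listens on 127.0.0.1:3306 with a passwordless root user:
# Hedged sketch: wait until a replicated table shows up downstream.
for ((i = 1; i <= 60; i++)); do
    n=$(mysql -h 127.0.0.1 -P 3306 -u root -N -e "SELECT COUNT(*) FROM information_schema.tables WHERE table_schema='partition_table2' AND table_name='t2';")
    if [ "$n" -eq 1 ]; then
        echo "table partition_table2.t2 exists"
        break
    fi
    echo "table partition_table2.t2 not exists for $i-th check, retry later"
    sleep 2
done
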
check diff failed 37-th time, retry later
start tidb cluster in /tmp/tidb_cdc_test/cdc_server_tips
Starting Upstream PD...
Release Version: v7.5.1-5-g584533652
Edition: Community
Git Commit Hash: 58453365285465cd90bc4472cff2bad7ce4d764b
Git Branch: release-7.5
UTC Build Time:  2024-04-03 10:04:14
Starting Downstream PD...
Release Version: v7.5.1-5-g584533652
Edition: Community
Git Commit Hash: 58453365285465cd90bc4472cff2bad7ce4d764b
Git Branch: release-7.5
UTC Build Time:  2024-04-03 10:04:14
Verifying upstream PD is started...
check diff failed 38-th time, retry later
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
check diff failed 39-th time, retry later
Starting Upstream TiDB...
Release Version: v7.5.1-46-g3df1fe2cb9
Edition: Community
Git Commit Hash: 3df1fe2cb94fcc572aaaf15efed0a26269743a0d
Git Branch: release-7.5
UTC Build Time: 2024-04-29 09:35:42
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v7.5.1-46-g3df1fe2cb9
Edition: Community
Git Commit Hash: 3df1fe2cb94fcc572aaaf15efed0a26269743a0d
Git Branch: release-7.5
UTC Build Time: 2024-04-29 09:35:42
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
wait process cdc.test exit for 3-th time...
check diff failed 40-th time, retry later
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Mon Apr 29 20:42:14 CST 2024] <<<<<< START cdc server in consistent_partition_table case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/sink/dmlsink/txn/mysql/MySQLSinkHangLongTime=return(true);github.com/pingcap/tiflow/cdc/sink/ddlsink/mysql/MySQLSinkExecDDLDelay=return(true)'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.consistent_partition_table.2010320105.out server --log-file /tmp/tidb_cdc_test/consistent_partition_table/cdcpartition_table.server2.log --log-level debug --data-dir /tmp/tidb_cdc_test/consistent_partition_table/cdc_datapartition_table.server2 --cluster-id default
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff failed 41-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff failed 42-th time, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca7827cf00010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-348-j2h8x, pid:23174, start at 2024-04-29 20:42:17.799616331 +0800 CST m=+5.406504449	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-20:44:17.806 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-20:42:17.788 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-20:32:17.788 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
> GET /debug/info HTTP/1.1
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Mon, 29 Apr 2024 12:42:18 GMT
< Content-Type: text/plain; charset=utf-8
< Transfer-Encoding: chunked
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:

changefeedID: default/0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a
{UpstreamID:7363268268542343018 Namespace:default ID:0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a SinkURI:mysql://normal:123456@127.0.0.1:3306/ CreateTime:2024-04-29 20:41:47.636411637 +0800 CST StartTs:449418233768837126 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc002b57170 State:normal Error:<nil> Warning:<nil> CreatorVersion:v7.5.1-21-g88db1a842 Epoch:449418233808158722}
{CheckpointTs:449418241239941126 MinTableBarrierTs:449418241777336328 AdminJobType:noop}
span: {table_id:107,start_key:7480000000000000ff6b5f720000000000fa,end_key:7480000000000000ff6b5f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:123,start_key:7480000000000000ff7b5f720000000000fa,end_key:7480000000000000ff7b5f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:112,start_key:7480000000000000ff705f720000000000fa,end_key:7480000000000000ff705f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:114,start_key:7480000000000000ff725f720000000000fa,end_key:7480000000000000ff725f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:106,start_key:7480000000000000ff6a5f720000000000fa,end_key:7480000000000000ff6a5f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:113,start_key:7480000000000000ff715f720000000000fa,end_key:7480000000000000ff715f730000000000fa}, resolvedTs: 449418241777336322, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:109,start_key:7480000000000000ff6d5f720000000000fa,end_key:7480000000000000ff6d5f730000000000fa}, resolvedTs: 449418241763966982, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:116,start_key:7480000000000000ff745f720000000000fa,end_key:7480000000000000ff745f730000000000fa}, resolvedTs: 449418241763966982, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:119,start_key:7480000000000000ff775f720000000000fa,end_key:7480000000000000ff775f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:108,start_key:7480000000000000ff6c5f720000000000fa,end_key:7480000000000000ff6c5f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:105,start_key:7480000000000000ff695f720000000000fa,end_key:7480000000000000ff695f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:117,start_key:7480000000000000ff755f720000000000fa,end_key:7480000000000000ff755f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b1afa41f-b815-4421-8ba5-f0188c393f33
	{"id":"b1afa41f-b815-4421-8ba5-f0188c393f33","address":"127.0.0.1:8300","version":"v7.5.1-21-g88db1a842"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f29dfe5d5d2
	b1afa41f-b815-4421-8ba5-f0188c393f33

/tidb/cdc/default/default/changefeed/info/0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a
	{"upstream-id":7363268268542343018,"namespace":"default","changefeed-id":"0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a","sink-uri":"mysql://normal:123456@127.0.0.1:3306/","create-time":"2024-04-29T20:41:47.636411637+08:00","start-ts":449418233768837126,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64"},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true},"consistent":{"level":"eventual","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"file:///tmp/tidb_cdc_test/consistent_partition_table/redo","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50,"event-cache-percentage":0}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"sql-mode":"ONLY_FULL_GROUP_BY,STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION","synced-status":{"synced-check-interval":300,"checkpoint-interval":15}},"state":"normal","error":null,"warning":null,"creator-version":"v7.5.1-21-g88db1a842","epoch":449418233808158722}

/tidb/cdc/default/default/changefeed/status/0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a
	{"checkpoint-ts":449418241239941126,"min-table-barrier-ts":449418241777336328,"admin-job-type":0}

/tidb/cdc/default/default/task/position/b1afa41f-b815-4421-8ba5-f0188c393f33/0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7363268268542343018
	{"id":7363268268542343018,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:

changefeedID: default/0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a
{UpstreamID:7363268268542343018 Namespace:default ID:0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a SinkURI:mysql://normal:123456@127.0.0.1:3306/ CreateTime:2024-04-29 20:41:47.636411637 +0800 CST StartTs:449418233768837126 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc002b57170 State:normal Error:<nil> Warning:<nil> CreatorVersion:v7.5.1-21-g88db1a842 Epoch:449418233808158722}
{CheckpointTs:449418241239941126 MinTableBarrierTs:449418241777336328 AdminJobType:noop}
span: {table_id:107,start_key:7480000000000000ff6b5f720000000000fa,end_key:7480000000000000ff6b5f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:123,start_key:7480000000000000ff7b5f720000000000fa,end_key:7480000000000000ff7b5f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:112,start_key:7480000000000000ff705f720000000000fa,end_key:7480000000000000ff705f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:114,start_key:7480000000000000ff725f720000000000fa,end_key:7480000000000000ff725f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:106,start_key:7480000000000000ff6a5f720000000000fa,end_key:7480000000000000ff6a5f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:113,start_key:7480000000000000ff715f720000000000fa,end_key:7480000000000000ff715f730000000000fa}, resolvedTs: 449418241777336322, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:109,start_key:7480000000000000ff6d5f720000000000fa,end_key:7480000000000000ff6d5f730000000000fa}, resolvedTs: 449418241763966982, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:116,start_key:7480000000000000ff745f720000000000fa,end_key:7480000000000000ff745f730000000000fa}, resolvedTs: 449418241763966982, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:119,start_key:7480000000000000ff775f720000000000fa,end_key:7480000000000000ff775f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:108,start_key:7480000000000000ff6c5f720000000000fa,end_key:7480000000000000ff6c5f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:105,start_key:7480000000000000ff695f720000000000fa,end_key:7480000000000000ff695f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:117,start_key:7480000000000000ff755f720000000000fa,end_key:7480000000000000ff755f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b1afa41f-b815-4421-8ba5-f0188c393f33
	{"id":"b1afa41f-b815-4421-8ba5-f0188c393f33","address":"127.0.0.1:8300","version":"v7.5.1-21-g88db1a842"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f29dfe5d5d2
	b1afa41f-b815-4421-8ba5-f0188c393f33

/tidb/cdc/default/default/changefeed/info/0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a
	{"upstream-id":7363268268542343018,"namespace":"default","changefeed-id":"0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a","sink-uri":"mysql://normal:123456@127.0.0.1:3306/","create-time":"2024-04-29T20:41:47.636411637+08:00","start-ts":449418233768837126,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64"},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true},"consistent":{"level":"eventual","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"file:///tmp/tidb_cdc_test/consistent_partition_table/redo","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50,"event-cache-percentage":0}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"sql-mode":"ONLY_FULL_GROUP_BY,STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION","synced-status":{"synced-check-interval":300,"checkpoint-interval":15}},"state":"normal","error":null,"warning":null,"creator-version":"v7.5.1-21-g88db1a842","epoch":449418233808158722}

/tidb/cdc/default/default/changefeed/status/0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a
	{"checkpoint-ts":449418241239941126,"min-table-barrier-ts":449418241777336328,"admin-job-type":0}

/tidb/cdc/default/default/task/position/b1afa41f-b815-4421-8ba5-f0188c393f33/0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7363268268542343018
	{"id":7363268268542343018,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:

changefeedID: default/0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a
{UpstreamID:7363268268542343018 Namespace:default ID:0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a SinkURI:mysql://normal:123456@127.0.0.1:3306/ CreateTime:2024-04-29 20:41:47.636411637 +0800 CST StartTs:449418233768837126 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc002b57170 State:normal Error:<nil> Warning:<nil> CreatorVersion:v7.5.1-21-g88db1a842 Epoch:449418233808158722}
{CheckpointTs:449418241239941126 MinTableBarrierTs:449418241777336328 AdminJobType:noop}
span: {table_id:107,start_key:7480000000000000ff6b5f720000000000fa,end_key:7480000000000000ff6b5f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:123,start_key:7480000000000000ff7b5f720000000000fa,end_key:7480000000000000ff7b5f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:112,start_key:7480000000000000ff705f720000000000fa,end_key:7480000000000000ff705f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:114,start_key:7480000000000000ff725f720000000000fa,end_key:7480000000000000ff725f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:106,start_key:7480000000000000ff6a5f720000000000fa,end_key:7480000000000000ff6a5f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:113,start_key:7480000000000000ff715f720000000000fa,end_key:7480000000000000ff715f730000000000fa}, resolvedTs: 449418241777336322, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:109,start_key:7480000000000000ff6d5f720000000000fa,end_key:7480000000000000ff6d5f730000000000fa}, resolvedTs: 449418241763966982, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:116,start_key:7480000000000000ff745f720000000000fa,end_key:7480000000000000ff745f730000000000fa}, resolvedTs: 449418241763966982, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:119,start_key:7480000000000000ff775f720000000000fa,end_key:7480000000000000ff775f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:108,start_key:7480000000000000ff6c5f720000000000fa,end_key:7480000000000000ff6c5f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:105,start_key:7480000000000000ff695f720000000000fa,end_key:7480000000000000ff695f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating
span: {table_id:117,start_key:7480000000000000ff755f720000000000fa,end_key:7480000000000000ff755f730000000000fa}, resolvedTs: 449418241777336328, checkpointTs: 449418241239941126, state: Replicating



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b1afa41f-b815-4421-8ba5-f0188c393f33
	{"id":"b1afa41f-b815-4421-8ba5-f0188c393f33","address":"127.0.0.1:8300","version":"v7.5.1-21-g88db1a842"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f29dfe5d5d2
	b1afa41f-b815-4421-8ba5-f0188c393f33

/tidb/cdc/default/default/changefeed/info/0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a
	{"upstream-id":7363268268542343018,"namespace":"default","changefeed-id":"0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a","sink-uri":"mysql://normal:123456@127.0.0.1:3306/","create-time":"2024-04-29T20:41:47.636411637+08:00","start-ts":449418233768837126,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"c+ grep -q 'etcd info'
sv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64"},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true},"consistent":{"level":"eventual","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"file:///tmp/tidb_cdc_test/consistent_partition_table/redo","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50,"event-cache-percentage":0}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"sql-mode":"ONLY_FULL_GROUP_BY,STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION","synced-status":{"synced-check-interval":300,"checkpoint-interval":15}},"state":"normal","error":null,"warning":null,"creator-version":"v7.5.1-21-g88db1a842","epoch":449418233808158722}

/tidb/cdc/default/default/changefeed/status/0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a
	{"checkpoint-ts":449418241239941126,"min-table-barrier-ts":449418241777336328,"admin-job-type":0}

/tidb/cdc/default/default/task/position/b1afa41f-b815-4421-8ba5-f0188c393f33/0da0b3c2-5cdf-4b1f-9ab0-271c3e92130a
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7363268268542343018
	{"id":7363268268542343018,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ break
+ set +x
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca7827cf00010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-348-j2h8x, pid:23174, start at 2024-04-29 20:42:17.799616331 +0800 CST m=+5.406504449	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-20:44:17.806 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-20:42:17.788 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-20:32:17.788 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca7827dd00014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-348-j2h8x, pid:23247, start at 2024-04-29 20:42:17.889911116 +0800 CST m=+5.435766673	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-20:44:17.898 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-20:42:17.895 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-20:32:17.895 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
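The two VARIABLE_NAME dumps above are printed while the harness waits for the upstream and downstream TiDB instances to finish bootstrapping; the tikv_gc_* rows come from the mysql.tidb table. A minimal sketch of such a readiness probe, assuming the stock mysql client and the ports used in this run (4000 upstream, 3306 downstream); the harness's real check may differ:

# sketch only: poll each TiDB until the bootstrap/GC variables are queryable
for port in 4000 3306; do
    while ! mysql -h 127.0.0.1 -P "$port" -u root \
        -e 'SELECT VARIABLE_NAME, VARIABLE_VALUE, COMMENT FROM mysql.tidb;' >/dev/null 2>&1; do
        sleep 1
    done
done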
check diff failed 43-th time, retry later
Starting Upstream TiFlash...
TiFlash
Release Version: v7.5.1-12-g9002cc34d
Edition:         Community
Git Commit Hash: 9002cc34d3b593a718b6c5260ba18f30a45ab314
Git Branch:      HEAD
UTC Build Time:  2024-04-18 07:24:48
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO

Raft Proxy
Git Commit Hash:   521fd9dbc55e58646045d88f91c3c35db50b5981
Git Commit Branch: HEAD
UTC Build Time:    2024-04-18 07:28:40
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:    portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/cdc_server_tips/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/cdc_server_tips/tiflash/log/error.log
arg matches is ArgMatches { args: {"advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/cdc_server_tips/tiflash/db/proxy"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v7.5.1-12-g9002cc34d"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/cdc_server_tips/tiflash-proxy.toml"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["9002cc34d3b593a718b6c5260ba18f30a45ab314"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/cdc_server_tips/tiflash/log/proxy.log"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
check diff failed 44-th time, retry later
+ pd_host=127.0.0.1
+ pd_port=2379
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc_server_tips.cli.24570.out cli tso query --pd=http://127.0.0.1:2379
check diff failed 45-th time, retry later
+ set +x
+ tso='449418243103260673
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449418243103260673 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
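The trace above grabs a start TSO from PD through the instrumented cdc.test binary; because the coverage-enabled binary appends a PASS/coverage footer, only the first whitespace-separated field is kept. A condensed sketch of the same extraction, assuming a plain `cdc` binary on PATH:

tso=$(cdc cli tso query --pd=http://127.0.0.1:2379 | awk -F ' ' 'NR==1 {print $1}')
echo "start-ts: $tso"   # e.g. 449418243103260673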
try a VALID cdc server command
[Mon Apr 29 20:42:24 CST 2024] <<<<<< START cdc server in cdc_server_tips case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
+ GO_FAILPOINTS=
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc_server_tips.2461024612.out server --log-file /tmp/tidb_cdc_test/cdc_server_tips/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/cdc_server_tips/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
check diff failed 46-th time, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
> GET /debug/info HTTP/1.1
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Mon, 29 Apr 2024 12:42:27 GMT
< Content-Length: 613
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/5f144a46-371a-4975-b371-cf290a22190b
	{"id":"5f144a46-371a-4975-b371-cf290a22190b","address":"127.0.0.1:8300","version":"v7.5.1-21-g88db1a842"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f29e07a19ff
	5f144a46-371a-4975-b371-cf290a22190b

/tidb/cdc/default/default/upstream/7363268435023435607
	{"id":7363268435023435607,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/5f144a46-371a-4975-b371-cf290a22190b
	{"id":"5f144a46-371a-4975-b371-cf290a22190b","address":"127.0.0.1:8300","version":"v7.5.1-21-g88db1a842"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f29e07a19ff
	5f144a46-371a-4975-b371-cf290a22190b

/tidb/cdc/default/default/upstream/7363268435023435607
	{"id":7363268435023435607,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/5f144a46-371a-4975-b371-cf290a22190b
	{"id":"5f144a46-371a-4975-b371-cf290a22190b","address":"127.0.0.1:8300","version":"v7.5.1-21-g88db1a842"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f29e07a19ff
	5f144a46-371a-4975-b371-cf290a22190b

/tidb/cdc/default/default/upstream/7363268435023435607
	{"id":7363268435023435607,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
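The polling loop traced above hits the new server's /debug/info endpoint until the body contains "etcd info", retrying up to 50 times with a 3-second pause. A condensed sketch of that readiness loop under the same assumptions (same endpoint, same retry budget):

for i in $(seq 0 50); do
    res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info 2>&1)
    if echo "$res" | grep -q 'etcd info' && ! echo "$res" | grep -q 'failed to get info:'; then
        break    # server is up and serving owner/processor/etcd info
    fi
    [ "$i" -eq 50 ] && { echo "cdc server did not become ready"; exit 1; }
    sleep 3
done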
check diff failed 47-th time, retry later
check diff failed 48-th time, retry later
check diff failed 49-th time, retry later
check diff failed 50-th time, retry later
check diff failed 51-th time, retry later
check diff failed 52-th time, retry later
check diff failed 53-th time, retry later
check diff failed 54-th time, retry later
check diff failed 55-th time, retry later
check diff failed 56-th time, retry later
check diff failed 57-th time, retry later
valid ~~~ running cdc  
Failed to start cdc, the usage tips should be printed
 1st test case cdc_server_tips success! 
try an INVALID cdc server command
[Mon Apr 29 20:42:47 CST 2024] <<<<<< START cdc server in cdc_server_tips case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info'
+ [[ true != \n\o ]]
+ set +x
+ GO_FAILPOINTS=
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc_server_tips.2471124713.out server --log-file /tmp/tidb_cdc_test/cdc_server_tips/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/cdc_server_tips/cdc_data --cluster-id default --pd None
check diff failed 58-th time, retry later
check diff failed 59-th time, retry later
check diff failed 60-th time, retry later
check diff failed at last
There is something error when initialize diff, please check log info in /tmp/tidb_cdc_test/sequence/sync_diff/output/sync_diff.log

[2024/04/29 20:42:55.208 +08:00] [INFO] [printer.go:46] ["Welcome to sync_diff_inspector"] ["Release Version"=v7.4.0] ["Git Commit Hash"=d671b0840063bc2532941f02e02e12627402844c] ["Git Branch"=heads/refs/tags/v7.4.0] ["UTC Build Time"="2023-09-22 03:51:56"] ["Go Version"=go1.21.1]
[2024/04/29 20:42:55.209 +08:00] [INFO] [main.go:101] [config="{\"check-thread-count\":4,\"split-thread-count\":5,\"export-fix-sql\":true,\"check-struct-only\":false,\"dm-addr\":\"\",\"dm-task\":\"\",\"data-sources\":{\"mysql1\":{\"host\":\"127.0.0.1\",\"port\":4000,\"user\":\"root\",\"password\":\"******\",\"sql-mode\":\"\",\"snapshot\":\"\",\"security\":null,\"route-rules\":null,\"Router\":{\"Selector\":{}},\"Conn\":null},\"tidb0\":{\"host\":\"127.0.0.1\",\"port\":3306,\"user\":\"root\",\"password\":\"******\",\"sql-mode\":\"\",\"snapshot\":\"\",\"security\":null,\"route-rules\":null,\"Router\":{\"Selector\":{}},\"Conn\":null}},\"routes\":null,\"table-configs\":null,\"task\":{\"source-instances\":[\"mysql1\"],\"source-routes\":null,\"target-instance\":\"tidb0\",\"target-check-tables\":[\"sequence_test.t1\"],\"target-configs\":null,\"output-dir\":\"/tmp/tidb_cdc_test/sequence/sync_diff/output\",\"SourceInstances\":[{\"host\":\"127.0.0.1\",\"port\":4000,\"user\":\"root\",\"password\":\"******\",\"sql-mode\":\"\",\"snapshot\":\"\",\"security\":null,\"route-rules\":null,\"Router\":{\"Selector\":{}},\"Conn\":null}],\"TargetInstance\":{\"host\":\"127.0.0.1\",\"port\":3306,\"user\":\"root\",\"password\":\"******\",\"sql-mode\":\"\",\"snapshot\":\"\",\"security\":null,\"route-rules\":null,\"Router\":{\"Selector\":{}},\"Conn\":null},\"TargetTableConfigs\":null,\"TargetCheckTables\":[{}],\"FixDir\":\"/tmp/tidb_cdc_test/sequence/sync_diff/output/fix-on-tidb0\",\"CheckpointDir\":\"/tmp/tidb_cdc_test/sequence/sync_diff/output/checkpoint\",\"HashFile\":\"\"},\"ConfigFile\":\"/home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_cdc_integration_mysql_test/tiflow/tests/integration_tests/sequence/conf/diff_config.toml\",\"PrintVersion\":false}"]
[2024/04/29 20:42:55.209 +08:00] [DEBUG] [diff.go:842] ["set tidb cfg"]
[2024/04/29 20:42:55.212 +08:00] [DEBUG] [common.go:386] ["query tables"] [query="SHOW FULL TABLES IN `sequence_test` WHERE Table_Type = 'BASE TABLE';"]
[2024/04/29 20:42:55.213 +08:00] [DEBUG] [common.go:386] ["query tables"] [query="SHOW FULL TABLES IN `test` WHERE Table_Type = 'BASE TABLE';"]
[2024/04/29 20:42:55.213 +08:00] [DEBUG] [source.go:326] ["match target table"] [table=`sequence_test`.`t1`]
[2024/04/29 20:42:55.214 +08:00] [FATAL] [main.go:120] ["failed to initialize diff process"] [error="get table sequence_test.t1's information error line 3 column 31 near \"nextval(`sequence_test`.`seq0`)),\n  PRIMARY KEY (`id`) /*T![clustered_index] NONCLUSTERED */\n) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_bin\" \ngithub.com/pingcap/errors.AddStack\n\t/go/pkg/mod/github.com/pingcap/errors@v0.11.5-0.20221009092201-b66cddb77c32/errors.go:174\ngithub.com/pingcap/errors.Trace\n\t/go/pkg/mod/github.com/pingcap/errors@v0.11.5-0.20221009092201-b66cddb77c32/juju_adaptor.go:15\ngithub.com/pingcap/tidb/parser.(*Parser).ParseSQL\n\t/go/pkg/mod/github.com/pingcap/tidb/parser@v0.0.0-20230823131104-05aa17143df8/yy_parser.go:170\ngithub.com/pingcap/tidb/parser.(*Parser).ParseOneStmt\n\t/go/pkg/mod/github.com/pingcap/tidb/parser@v0.0.0-20230823131104-05aa17143df8/yy_parser.go:191\ngithub.com/pingcap/tidb-tools/pkg/dbutil.getTableInfoBySQL\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/pkg/dbutil/table.go:149\ngithub.com/pingcap/tidb-tools/pkg/dbutil.GetTableInfoBySQLWithSessionContext\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/pkg/dbutil/table.go:140\ngithub.com/pingcap/tidb-tools/pkg/dbutil.GetTableInfoWithVersion\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/pkg/dbutil/table.go:121\ngithub.com/pingcap/tidb-tools/sync_diff_inspector/source.initTables\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/source/source.go:328\ngithub.com/pingcap/tidb-tools/sync_diff_inspector/source.NewSources\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/source/source.go:121\nmain.(*Diff).init\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/diff.go:137\nmain.NewDiff\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/diff.go:95\nmain.checkSyncState\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:117\nmain.main\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:104\nruntime.main\n\t/usr/local/go/src/runtime/proc.go:267\nruntime.goexit\n\t/usr/local/go/src/runtime/asm_amd64.s:1650"] [errorVerbose="get table sequence_test.t1's information error line 3 column 31 near \"nextval(`sequence_test`.`seq0`)),\n  PRIMARY KEY (`id`) /*T![clustered_index] NONCLUSTERED */\n) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_bin\" 
\ngithub.com/pingcap/errors.AddStack\n\t/go/pkg/mod/github.com/pingcap/errors@v0.11.5-0.20221009092201-b66cddb77c32/errors.go:174\ngithub.com/pingcap/errors.Trace\n\t/go/pkg/mod/github.com/pingcap/errors@v0.11.5-0.20221009092201-b66cddb77c32/juju_adaptor.go:15\ngithub.com/pingcap/tidb/parser.(*Parser).ParseSQL\n\t/go/pkg/mod/github.com/pingcap/tidb/parser@v0.0.0-20230823131104-05aa17143df8/yy_parser.go:170\ngithub.com/pingcap/tidb/parser.(*Parser).ParseOneStmt\n\t/go/pkg/mod/github.com/pingcap/tidb/parser@v0.0.0-20230823131104-05aa17143df8/yy_parser.go:191\ngithub.com/pingcap/tidb-tools/pkg/dbutil.getTableInfoBySQL\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/pkg/dbutil/table.go:149\ngithub.com/pingcap/tidb-tools/pkg/dbutil.GetTableInfoBySQLWithSessionContext\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/pkg/dbutil/table.go:140\ngithub.com/pingcap/tidb-tools/pkg/dbutil.GetTableInfoWithVersion\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/pkg/dbutil/table.go:121\ngithub.com/pingcap/tidb-tools/sync_diff_inspector/source.initTables\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/source/source.go:328\ngithub.com/pingcap/tidb-tools/sync_diff_inspector/source.NewSources\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/source/source.go:121\nmain.(*Diff).init\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/diff.go:137\nmain.NewDiff\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/diff.go:95\nmain.checkSyncState\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:117\nmain.main\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:104\nruntime.main\n\t/usr/local/go/src/runtime/proc.go:267\nruntime.goexit\n\t/usr/local/go/src/runtime/asm_amd64.s:1650\ngithub.com/pingcap/tidb-tools/sync_diff_inspector/source.initTables\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/source/source.go:330\ngithub.com/pingcap/tidb-tools/sync_diff_inspector/source.NewSources\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/source/source.go:121\nmain.(*Diff).init\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/diff.go:137\nmain.NewDiff\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/diff.go:95\nmain.checkSyncState\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:117\nmain.main\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:104\nruntime.main\n\t/usr/local/go/src/runtime/proc.go:267\nruntime.goexit\n\t/usr/local/go/src/runtime/asm_amd64.s:1650"] [stack="main.checkSyncState\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:120\nmain.main\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:104\nruntime.main\n\t/usr/local/go/src/runtime/proc.go:267"]
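The FATAL above is what the repeated "check diff failed" retries in this group were hitting: sync_diff_inspector v7.4.0 cannot parse the nextval(...) column default that TiDB emits in SHOW CREATE TABLE for sequence-backed columns, so the diff never gets past initialization. A hypothetical reproduction of the kind of schema involved, with names taken from the error message; the test's actual DDL may differ:

# hypothetical schema sketch, not the test's exact DDL
mysql -h 127.0.0.1 -P 4000 -u root -e "
  CREATE DATABASE IF NOT EXISTS sequence_test;
  CREATE SEQUENCE sequence_test.seq0;
  CREATE TABLE sequence_test.t1 (
    id INT DEFAULT NEXT VALUE FOR sequence_test.seq0,
    v  INT,
    PRIMARY KEY (id) /*T![clustered_index] NONCLUSTERED */
  );"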

invalid ~~~ running cdc  
Failed to start cdc, the usage tips should be printed
 2nd test case cdc_server_tips success! 
[Mon Apr 29 20:43:07 CST 2024] <<<<<< run all test cases cdc_server_tips success! >>>>>> 
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
Post stage
[Pipeline] sh
+ ls /tmp/tidb_cdc_test/
availability
cov.availability.26302632.out
cov.availability.27922794.out
cov.availability.29682970.out
cov.availability.30823084.out
cov.availability.32213223.out
cov.availability.36173619.out
cov.availability.38543856.out
cov.availability.39663968.out
cov.availability.41114113.out
cov.availability.46214623.out
cov.availability.47464748.out
cov.availability.53195321.out
cov.availability.55005502.out
cov.availability.56125614.out
cov.availability.58005802.out
cov.availability.59885990.out
cov.availability.cli.2576.out
cov.availability.cli.2682.out
cov.http_proxies.cli.12086.out
cov.http_proxies.cli.12170.out
cov.sequence.cli.14586.out
http_proxies
sequence
sql_res.availability.txt
sql_res.sequence.txt
++ find /tmp/tidb_cdc_test/ -type f -name '*.log'
+ tar -cvzf log-G18.tar.gz /tmp/tidb_cdc_test/availability/tidb.log /tmp/tidb_cdc_test/availability/cdctest_expire_capture.server1.log /tmp/tidb_cdc_test/availability/cdctest_owner_cleanup_stale_tasks.server2.log /tmp/tidb_cdc_test/availability/stdouttest_expire_capture.server1.log /tmp/tidb_cdc_test/availability/cdctest_hang_up_capture.server2.log /tmp/tidb_cdc_test/availability/cdctest_hang_up_owner.server2.log /tmp/tidb_cdc_test/availability/stdouttest_owner_cleanup_stale_tasks.server3.log /tmp/tidb_cdc_test/availability/cdctest_gap_between_watch_capture.server1.log /tmp/tidb_cdc_test/availability/stdouttest_hang_up_owner.server2.log /tmp/tidb_cdc_test/availability/cdctest_stop_processor.log /tmp/tidb_cdc_test/availability/cdctest_kill_owner.server2.log /tmp/tidb_cdc_test/availability/cdctest_gap_between_watch_capture.server2.log /tmp/tidb_cdc_test/availability/stdouttest_owner_cleanup_stale_tasks.server1.log /tmp/tidb_cdc_test/availability/cdctest_owner_retryable_error.server2.log /tmp/tidb_cdc_test/availability/stdouttest_hang_up_owner.server1.log /tmp/tidb_cdc_test/availability/cdctest_owner_cleanup_stale_tasks.server3.log /tmp/tidb_cdc_test/availability/stdouttest_hang_up_capture.server1.log /tmp/tidb_cdc_test/availability/tikv_down.log /tmp/tidb_cdc_test/availability/tikv3.log /tmp/tidb_cdc_test/availability/stdouttest_kill_owner.server2.log /tmp/tidb_cdc_test/availability/cdctest_kill_capture.server2.log /tmp/tidb_cdc_test/availability/stdouttest_gap_between_watch_capture.server2.log /tmp/tidb_cdc_test/availability/stdouttest_owner_retryable_error.server1.log /tmp/tidb_cdc_test/availability/stdouttest_kill_capture.server1.log /tmp/tidb_cdc_test/availability/stdouttest_kill_capture.server2.log /tmp/tidb_cdc_test/availability/cdctest_expire_owner.server1.log /tmp/tidb_cdc_test/availability/cdctest_kill_capture.server1.log /tmp/tidb_cdc_test/availability/cdctest_hang_up_capture.server1.log /tmp/tidb_cdc_test/availability/cdctest_kill_owner.server1.log /tmp/tidb_cdc_test/availability/pd1.log /tmp/tidb_cdc_test/availability/tidb_other.log /tmp/tidb_cdc_test/availability/cdctest_owner_retryable_error.server1.log /tmp/tidb_cdc_test/availability/stdouttest_stop_processor.log /tmp/tidb_cdc_test/availability/stdouttest_gap_between_watch_capture.server1.log /tmp/tidb_cdc_test/availability/tikv1.log /tmp/tidb_cdc_test/availability/cdctest_owner_cleanup_stale_tasks.server1.log /tmp/tidb_cdc_test/availability/tidb_down.log /tmp/tidb_cdc_test/availability/stdouttest_expire_owner.server1.log /tmp/tidb_cdc_test/availability/tikv2.log /tmp/tidb_cdc_test/availability/stdouttest_owner_cleanup_stale_tasks.server2.log /tmp/tidb_cdc_test/availability/tidb-slow.log /tmp/tidb_cdc_test/availability/stdouttest_owner_retryable_error.server2.log /tmp/tidb_cdc_test/availability/cdctest_hang_up_owner.server1.log /tmp/tidb_cdc_test/availability/stdouttest_kill_owner.server1.log /tmp/tidb_cdc_test/availability/down_pd.log /tmp/tidb_cdc_test/availability/stdouttest_hang_up_capture.server2.log /tmp/tidb_cdc_test/sequence/tidb.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0003/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0001/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0004/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0000/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0006/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0005/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0002/000002.log 
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0007/000002.log /tmp/tidb_cdc_test/sequence/tikv_down/db/000005.log /tmp/tidb_cdc_test/sequence/tiflash/log/proxy.log /tmp/tidb_cdc_test/sequence/tiflash/log/server.log /tmp/tidb_cdc_test/sequence/tiflash/log/error.log /tmp/tidb_cdc_test/sequence/tiflash/db/proxy/db/000005.log /tmp/tidb_cdc_test/sequence/tikv_down.log /tmp/tidb_cdc_test/sequence/tikv3.log /tmp/tidb_cdc_test/sequence/stdout.log /tmp/tidb_cdc_test/sequence/sync_diff/output/sync_diff.log /tmp/tidb_cdc_test/sequence/pd1.log /tmp/tidb_cdc_test/sequence/tidb_other.log /tmp/tidb_cdc_test/sequence/sync_diff_inspector.log /tmp/tidb_cdc_test/sequence/tikv1.log /tmp/tidb_cdc_test/sequence/tikv3/db/000005.log /tmp/tidb_cdc_test/sequence/tidb_down.log /tmp/tidb_cdc_test/sequence/tikv2.log /tmp/tidb_cdc_test/sequence/pd1/region-meta/000001.log /tmp/tidb_cdc_test/sequence/pd1/hot-region/000001.log /tmp/tidb_cdc_test/sequence/tidb-slow.log /tmp/tidb_cdc_test/sequence/tikv2/db/000005.log /tmp/tidb_cdc_test/sequence/down_pd.log /tmp/tidb_cdc_test/sequence/down_pd/region-meta/000001.log /tmp/tidb_cdc_test/sequence/down_pd/hot-region/000001.log /tmp/tidb_cdc_test/sequence/cdc.log /tmp/tidb_cdc_test/sequence/tikv1/db/000005.log /tmp/tidb_cdc_test/http_proxies/tidb.log /tmp/tidb_cdc_test/http_proxies/test_proxy.log /tmp/tidb_cdc_test/http_proxies/tikv_down.log /tmp/tidb_cdc_test/http_proxies/tikv3.log /tmp/tidb_cdc_test/http_proxies/stdout.log /tmp/tidb_cdc_test/http_proxies/pd1.log /tmp/tidb_cdc_test/http_proxies/tidb_other.log /tmp/tidb_cdc_test/http_proxies/tikv1.log /tmp/tidb_cdc_test/http_proxies/tidb_down.log /tmp/tidb_cdc_test/http_proxies/tikv2.log /tmp/tidb_cdc_test/http_proxies/tidb-slow.log /tmp/tidb_cdc_test/http_proxies/down_pd.log /tmp/tidb_cdc_test/http_proxies/cdc.log
tar: Removing leading `/' from member names
/tmp/tidb_cdc_test/availability/tidb.log
/tmp/tidb_cdc_test/availability/cdctest_expire_capture.server1.log
/tmp/tidb_cdc_test/availability/cdctest_owner_cleanup_stale_tasks.server2.log
/tmp/tidb_cdc_test/availability/stdouttest_expire_capture.server1.log
/tmp/tidb_cdc_test/availability/cdctest_hang_up_capture.server2.log
/tmp/tidb_cdc_test/availability/cdctest_hang_up_owner.server2.log
/tmp/tidb_cdc_test/availability/stdouttest_owner_cleanup_stale_tasks.server3.log
/tmp/tidb_cdc_test/availability/cdctest_gap_between_watch_capture.server1.log
/tmp/tidb_cdc_test/availability/stdouttest_hang_up_owner.server2.log
/tmp/tidb_cdc_test/availability/cdctest_stop_processor.log
/tmp/tidb_cdc_test/availability/cdctest_kill_owner.server2.log
/tmp/tidb_cdc_test/availability/cdctest_gap_between_watch_capture.server2.log
/tmp/tidb_cdc_test/availability/stdouttest_owner_cleanup_stale_tasks.server1.log
/tmp/tidb_cdc_test/availability/cdctest_owner_retryable_error.server2.log
/tmp/tidb_cdc_test/availability/stdouttest_hang_up_owner.server1.log
/tmp/tidb_cdc_test/availability/cdctest_owner_cleanup_stale_tasks.server3.log
/tmp/tidb_cdc_test/availability/stdouttest_hang_up_capture.server1.log
/tmp/tidb_cdc_test/availability/tikv_down.log
/tmp/tidb_cdc_test/availability/tikv3.log
/tmp/tidb_cdc_test/availability/stdouttest_kill_owner.server2.log
/tmp/tidb_cdc_test/availability/cdctest_kill_capture.server2.log
/tmp/tidb_cdc_test/availability/stdouttest_gap_between_watch_capture.server2.log
/tmp/tidb_cdc_test/availability/stdouttest_owner_retryable_error.server1.log
/tmp/tidb_cdc_test/availability/stdouttest_kill_capture.server1.log
/tmp/tidb_cdc_test/availability/stdouttest_kill_capture.server2.log
/tmp/tidb_cdc_test/availability/cdctest_expire_owner.server1.log
/tmp/tidb_cdc_test/availability/cdctest_kill_capture.server1.log
/tmp/tidb_cdc_test/availability/cdctest_hang_up_capture.server1.log
/tmp/tidb_cdc_test/availability/cdctest_kill_owner.server1.log
/tmp/tidb_cdc_test/availability/pd1.log
/tmp/tidb_cdc_test/availability/tidb_other.log
/tmp/tidb_cdc_test/availability/cdctest_owner_retryable_error.server1.log
/tmp/tidb_cdc_test/availability/stdouttest_stop_processor.log
/tmp/tidb_cdc_test/availability/stdouttest_gap_between_watch_capture.server1.log
/tmp/tidb_cdc_test/availability/tikv1.log
/tmp/tidb_cdc_test/availability/cdctest_owner_cleanup_stale_tasks.server1.log
/tmp/tidb_cdc_test/availability/tidb_down.log
/tmp/tidb_cdc_test/availability/stdouttest_expire_owner.server1.log
/tmp/tidb_cdc_test/availability/tikv2.log
/tmp/tidb_cdc_test/availability/stdouttest_owner_cleanup_stale_tasks.server2.log
/tmp/tidb_cdc_test/availability/tidb-slow.log
/tmp/tidb_cdc_test/availability/stdouttest_owner_retryable_error.server2.log
/tmp/tidb_cdc_test/availability/cdctest_hang_up_owner.server1.log
/tmp/tidb_cdc_test/availability/stdouttest_kill_owner.server1.log
/tmp/tidb_cdc_test/availability/down_pd.log
/tmp/tidb_cdc_test/availability/stdouttest_hang_up_capture.server2.log
/tmp/tidb_cdc_test/sequence/tidb.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0003/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0001/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0004/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0000/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0006/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0005/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0002/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0007/000002.log
/tmp/tidb_cdc_test/sequence/tikv_down/db/000005.log
/tmp/tidb_cdc_test/sequence/tiflash/log/proxy.log
/tmp/tidb_cdc_test/sequence/tiflash/log/server.log
/tmp/tidb_cdc_test/sequence/tiflash/log/error.log
/tmp/tidb_cdc_test/sequence/tiflash/db/proxy/db/000005.log
/tmp/tidb_cdc_test/sequence/tikv_down.log
/tmp/tidb_cdc_test/sequence/tikv3.log
/tmp/tidb_cdc_test/sequence/stdout.log
/tmp/tidb_cdc_test/sequence/sync_diff/output/sync_diff.log
/tmp/tidb_cdc_test/sequence/pd1.log
/tmp/tidb_cdc_test/sequence/tidb_other.log
/tmp/tidb_cdc_test/sequence/sync_diff_inspector.log
/tmp/tidb_cdc_test/sequence/tikv1.log
/tmp/tidb_cdc_test/sequence/tikv3/db/000005.log
/tmp/tidb_cdc_test/sequence/tidb_down.log
/tmp/tidb_cdc_test/sequence/tikv2.log
/tmp/tidb_cdc_test/sequence/pd1/region-meta/000001.log
/tmp/tidb_cdc_test/sequence/pd1/hot-region/000001.log
/tmp/tidb_cdc_test/sequence/tidb-slow.log
/tmp/tidb_cdc_test/sequence/tikv2/db/000005.log
/tmp/tidb_cdc_test/sequence/down_pd.log
/tmp/tidb_cdc_test/sequence/down_pd/region-meta/000001.log
/tmp/tidb_cdc_test/sequence/down_pd/hot-region/000001.log
/tmp/tidb_cdc_test/sequence/cdc.log
/tmp/tidb_cdc_test/sequence/tikv1/db/000005.log
/tmp/tidb_cdc_test/http_proxies/tidb.log
/tmp/tidb_cdc_test/http_proxies/test_proxy.log
/tmp/tidb_cdc_test/http_proxies/tikv_down.log
/tmp/tidb_cdc_test/http_proxies/tikv3.log
/tmp/tidb_cdc_test/http_proxies/stdout.log
/tmp/tidb_cdc_test/http_proxies/pd1.log
/tmp/tidb_cdc_test/http_proxies/tidb_other.log
/tmp/tidb_cdc_test/http_proxies/tikv1.log
/tmp/tidb_cdc_test/http_proxies/tidb_down.log
/tmp/tidb_cdc_test/http_proxies/tikv2.log
/tmp/tidb_cdc_test/http_proxies/tidb-slow.log
/tmp/tidb_cdc_test/http_proxies/down_pd.log
/tmp/tidb_cdc_test/http_proxies/cdc.log
+ ls -alh log-G18.tar.gz
-rw-r--r-- 1 jenkins jenkins 9.7M Apr 29 20:43 log-G18.tar.gz
[Pipeline] archiveArtifacts
Archiving artifacts
Recording fingerprints
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G18'
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
script returned exit code 143
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G10'
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   723  100   723    0     0   9064      0 --:--:-- --:--:-- --:--:--  9151
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-04-29 20:41:02.208","puller_resolved_ts":"2024-04-29 20:41:02.208","last_synced_ts":"2024-04-29 20:40:56.609","now_ts":"2024-04-29 20:43:13.000","info":"Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' \u003e '\''Resolved-Ts'\'' \u003e '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-04-29' '20:41:02.208","puller_resolved_ts":"2024-04-29' '20:41:02.208","last_synced_ts":"2024-04-29' '20:40:56.609","now_ts":"2024-04-29' '20:43:13.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-04-29' '20:41:02.208","puller_resolved_ts":"2024-04-29' '20:41:02.208","last_synced_ts":"2024-04-29' '20:40:56.609","now_ts":"2024-04-29' '20:43:13.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq -r .info
+ info='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ target_message='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ '[' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' '!=' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' ']'
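The check above queries TiCDC's v2 open API for the changefeed's synced status and verifies both the synced boolean and the advisory message. A compact sketch of the same probe, assuming the same changefeed id (test-1) and that jq is installed:

synced_status=$(curl -sX GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced)
status=$(echo "$synced_status" | jq -r .synced)
info=$(echo "$synced_status" | jq -r .info)
[ "$status" = false ] || { echo "unexpected synced status: $status"; exit 1; }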
+ cleanup_process cdc.test
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
wait process cdc.test exit for 3-th time...
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
+ stop_tidb_cluster
++ stop_tidb_cluster
/home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_cdc_integration_mysql_test/tiflow/tests/integration_tests/synced_status/run.sh: line 1: 20194 Terminated              stop_tidb_cluster
script returned exit code 143
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G09'
{"level":"warn","ts":1714394597.7731671,"caller":"v3@v3.5.10/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0028c01c0/127.0.0.1:2479","attempt":0,"error":"rpc error: code = Unavailable desc = error reading from server: EOF"}
{"level":"warn","ts":1714394597.7737186,"caller":"v3@v3.5.10/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0020b76c0/127.0.0.1:2379","attempt":0,"error":"rpc error: code = Unavailable desc = error reading from server: EOF"}
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)

{"level":"warn","ts":1714394599.7733183,"caller":"v3@v3.5.10/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0020b76c0/127.0.0.1:2379","attempt":1,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
script returned exit code 143
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G02'
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE