Console Output

Skipping 2,619 KB..
[2024/04/29 17:28:48.431 +08:00] [INFO] [worker.go:105] ["Transaction dmlSink worker starts"] [changefeedID=.] [workerID=11]
[2024/04/29 17:28:48.431 +08:00] [INFO] [worker.go:105] ["Transaction dmlSink worker starts"] [changefeedID=.] [workerID=2]
[2024/04/29 17:28:48.431 +08:00] [INFO] [worker.go:105] ["Transaction dmlSink worker starts"] [changefeedID=.] [workerID=12]
[2024/04/29 17:28:48.431 +08:00] [INFO] [worker.go:105] ["Transaction dmlSink worker starts"] [changefeedID=.] [workerID=13]
[2024/04/29 17:28:48.431 +08:00] [INFO] [worker.go:105] ["Transaction dmlSink worker starts"] [changefeedID=.] [workerID=6]
[2024/04/29 17:28:48.431 +08:00] [INFO] [worker.go:105] ["Transaction dmlSink worker starts"] [changefeedID=.] [workerID=10]
[2024/04/29 17:28:48.431 +08:00] [INFO] [worker.go:105] ["Transaction dmlSink worker starts"] [changefeedID=.] [workerID=15]
[2024/04/29 17:28:48.432 +08:00] [INFO] [tz.go:35] ["Use the timezone of the TiCDC server machine"] [timezoneName=System] [timezone=Asia/Shanghai]
[2024/04/29 17:28:48.432 +08:00] [WARN] [config.go:380] ["Because time-zone is not specified, the timezone of the TiCDC server will be used. We recommend that you specify the time-zone explicitly. Please make sure that the timezone of the TiCDC server, sink-uri and the downstream database are consistent. If the downstream database does not load the timezone information, you can refer to https://dev.mysql.com/doc/refman/8.0/en/mysql-tzinfo-to-sql.html."] [timezone=Asia/Shanghai]
[2024/04/29 17:28:48.438 +08:00] [INFO] [db_helper.go:175] ["sink uri is configured"] [dsn="normal:******@tcp(127.0.0.1:3306)/?interpolateParams=true&multiStatements=true&allow_auto_random_explicit_insert=1&charset=utf8mb4&foreign_key_checks=0&maxAllowedPacket=0&readTimeout=2m&sql_mode=%22NO_ENGINE_SUBSTITUTION%2CIGNORE_SPACE%2CONLY_FULL_GROUP_BY%2CALLOW_INVALID_DATES%2CNO_AUTO_VALUE_ON_ZERO%22&tidb_enable_external_ts_read=%22OFF%22&tidb_placement_mode=%22ignore%22&tidb_txn_mode=optimistic&time_zone=%22Asia%2FShanghai%22&timeout=2m&transaction_isolation=%22READ-COMMITTED%22&writeTimeout=2m"]
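The DSN above is what the MySQL sink assembles from the changefeed sink-uri; its query parameters pin downstream session behavior (time zone, isolation level, SQL mode, disabled foreign-key checks). As a hedged illustration only, a rough equivalent with the plain mysql client, assuming the downstream these tests use (127.0.0.1:3306, user normal):

    # sketch: reproduce the sink's key session settings with the mysql client
    mysql -h 127.0.0.1 -P 3306 -u normal -p \
      --init-command="SET time_zone='Asia/Shanghai', transaction_isolation='READ-COMMITTED', foreign_key_checks=0"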
[2024/04/29 17:28:48.440 +08:00] [INFO] [mysql_ddl_sink.go:99] ["MySQL DDL sink is created"] [namespace=] [changefeed=]
[2024/04/29 17:28:48.514 +08:00] [INFO] [file.go:114] ["succeed to download and sort redo logs"] [type=ddl] [duration=96.468647ms]
check diff failed 48-th time, retry later
[2024/04/29 17:28:49.339 +08:00] [INFO] [file.go:114] ["succeed to download and sort redo logs"] [type=row] [duration=921.315863ms]
[2024/04/29 17:28:49.339 +08:00] [ERROR] [redo.go:268] ["ignore unsupported DDL"] [ddl="{\"StartTs\":449415190686728202,\"CommitTs\":449415190699835396,\"Query\":\"ALTER TABLE `consistent_replicate_storage_file`.`t1` EXCHANGE PARTITION `p3` WITH TABLE `consistent_replicate_storage_file`.`t2`\",\"TableInfo\":{\"SchemaID\":0,\"TableName\":{\"db-name\":\"consistent_replicate_storage_file\",\"tbl-name\":\"t1\",\"tbl-id\":108,\"is-partition\":true},\"Version\":0,\"RowColumnsOffset\":null,\"ColumnsFlag\":null,\"HandleIndexID\":0,\"IndexColumnsOffset\":null},\"PreTableInfo\":null,\"Type\":42,\"Done\":{},\"Charset\":\"\",\"Collate\":\"\",\"IsBootstrap\":false}"]
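The statement the redo applier skips above is an EXCHANGE PARTITION (Type 42 in the JSON), reconstructed here verbatim from the Query field; the host/port and client invocation are assumptions based on the upstream TiDB these tests typically run on 127.0.0.1:4000:

    # the exact DDL the applier reports ignoring, replayed against upstream
    mysql -h 127.0.0.1 -P 4000 -u root -e \
      "ALTER TABLE consistent_replicate_storage_file.t1 EXCHANGE PARTITION p3 WITH TABLE consistent_replicate_storage_file.t2"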
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_cdc_integration_mysql_test/tiflow/tests/integration_tests/generate_column/run.sh using Sink-Type: mysql... <<=================
The 1 times to try to start tidb cluster...
[2024/04/29 17:28:50.435 +08:00] [INFO] [table_sink_impl.go:257] ["Table sink stopped"] [namespace=default] [changefeed=redo-applier] [span={table_id:116,start_key:7480000000000000ff745f720000000000fa,end_key:7480000000000000ff745f730000000000fa}] [checkpointTs=449415197568270347]
[2024/04/29 17:28:50.435 +08:00] [INFO] [table_sink_impl.go:239] ["Stopping table sink"] [namespace=default] [changefeed=redo-applier] [span={table_id:116,start_key:7480000000000000ff745f720000000000fa,end_key:7480000000000000ff745f730000000000fa}] [checkpointTs=449415197568270347]
[2024/04/29 17:28:51.339 +08:00] [INFO] [table_sink_impl.go:257] ["Table sink stopped"] [namespace=default] [changefeed=redo-applier] [span={table_id:118,start_key:7480000000000000ff765f720000000000fa,end_key:7480000000000000ff765f730000000000fa}] [checkpointTs=449415197568270347]
[2024/04/29 17:28:51.339 +08:00] [INFO] [table_sink_impl.go:239] ["Stopping table sink"] [namespace=default] [changefeed=redo-applier] [span={table_id:118,start_key:7480000000000000ff765f720000000000fa,end_key:7480000000000000ff765f730000000000fa}] [checkpointTs=449415197568270347]
[2024/04/29 17:28:51.440 +08:00] [INFO] [table_sink_impl.go:257] ["Table sink stopped"] [namespace=default] [changefeed=redo-applier] [span={table_id:113,start_key:7480000000000000ff715f720000000000fa,end_key:7480000000000000ff715f730000000000fa}] [checkpointTs=449415197568270347]
[2024/04/29 17:28:51.440 +08:00] [INFO] [table_sink_impl.go:239] ["Stopping table sink"] [namespace=default] [changefeed=redo-applier] [span={table_id:113,start_key:7480000000000000ff715f720000000000fa,end_key:7480000000000000ff715f730000000000fa}] [checkpointTs=449415197568270347]
check diff failed 49-th time, retry later
[2024/04/29 17:28:51.540 +08:00] [INFO] [redo.go:219] ["apply redo log finishes"] [appliedLogCount=5008] [appliedDDLCount=0] [currentCheckpoint=449415197568270347]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=9]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=15]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=8]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=13]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=7]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=5]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=14]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=6]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=3]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=12]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=4]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=2]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=0]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=10]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=1]
[2024/04/29 17:28:51.540 +08:00] [INFO] [worker.go:120] ["Transaction dmlSink worker exits as canceled"] [changefeedID=.] [workerID=11]
Apply redo log successfully
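The applier run that just finished (5008 rows, 0 DDLs applied) is driven by the cdc redo apply subcommand. A minimal sketch of such an invocation; the storage path and tmp-dir below are placeholders, since the test's actual redo storage directory is not shown in this excerpt:

    # hypothetical paths; sink-uri matches the downstream used elsewhere in this log
    cdc redo apply \
      --tmp-dir="/tmp/redo-apply" \
      --storage="file:///tmp/tidb_cdc_test/consistent_replicate_storage_file/redo" \
      --sink-uri="mysql://normal:123456@127.0.0.1:3306/"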
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-release-7.5-pull_cdc_integration_mysql_test-339/tiflow-cdc already exists)
check diff successfully
[Mon Apr 29 17:28:52 CST 2024] <<<<<< run test case consistent_replicate_storage_file success! >>>>>>
count(*) 5000
[Pipeline] // cache
[Pipeline] }
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
start tidb cluster in /tmp/tidb_cdc_test/generate_column
Starting Upstream PD...
Release Version: v7.5.1-5-g584533652
Edition: Community
Git Commit Hash: 58453365285465cd90bc4472cff2bad7ce4d764b
Git Branch: release-7.5
UTC Build Time:  2024-04-03 10:04:14
Starting Downstream PD...
Release Version: v7.5.1-5-g584533652
Edition: Community
Git Commit Hash: 58453365285465cd90bc4472cff2bad7ce4d764b
Git Branch: release-7.5
UTC Build Time:  2024-04-03 10:04:14
Verifying upstream PD is started...
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
check diff failed 50-th time, retry later
Starting Upstream TiDB...
Release Version: v7.5.1-45-gbf84e231e6
Edition: Community
Git Commit Hash: bf84e231e6ef26891d0cb524d938345f43aa047c
Git Branch: release-7.5
UTC Build Time: 2024-04-29 02:05:15
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v7.5.1-45-gbf84e231e6
Edition: Community
Git Commit Hash: bf84e231e6ef26891d0cb524d938345f43aa047c
Git Branch: release-7.5
UTC Build Time: 2024-04-29 02:05:15
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
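ERROR 2003 here is expected noise: TiDB has just been started and the harness polls the port until the server accepts connections. A hedged sketch of that polling pattern (the exact helper in tests/integration_tests/_utils is not shown in this excerpt, and the 60-second budget is an assumption):

    # poll upstream TiDB until it answers, as implied by the repeated ERROR 2003 lines
    for i in $(seq 1 60); do
      mysql -h 127.0.0.1 -P 4000 -u root -e 'SELECT 1' >/dev/null 2>&1 && break
      sleep 1
    done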
check diff failed 51-th time, retry later
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff failed 52-th time, retry later
Starting Upstream TiDB...
Release Version: v7.5.1-45-gbf84e231e6
Edition: Community
Git Commit Hash: bf84e231e6ef26891d0cb524d938345f43aa047c
Git Branch: release-7.5
UTC Build Time: 2024-04-29 02:05:15
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v7.5.1-45-gbf84e231e6
Edition: Community
Git Commit Hash: bf84e231e6ef26891d0cb524d938345f43aa047c
Git Branch: release-7.5
UTC Build Time: 2024-04-29 02:05:15
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
TEST FAILED: OUTPUT DOES NOT CONTAIN 'id: 1'
____________________________________
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
check data failed 1-th time, retry later
check data successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
check diff failed 53-th time, retry later
[Mon Apr 29 17:28:58 CST 2024] <<<<<< run test case ddl_puller_lag success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff failed 54-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca4be985c0006	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-339-4g6k2, pid:23486, start at 2024-04-29 17:28:59.678672772 +0800 CST m=+5.463812677	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-17:30:59.686 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-17:28:59.671 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-17:18:59.671 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca4be985c0006	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-339-4g6k2, pid:23486, start at 2024-04-29 17:28:59.678672772 +0800 CST m=+5.463812677	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-17:30:59.686 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-17:28:59.671 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-17:18:59.671 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca4be991c0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-339-4g6k2, pid:23564, start at 2024-04-29 17:28:59.758936811 +0800 CST m=+5.488945867	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-17:30:59.767 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-17:28:59.769 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-17:18:59.769 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
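The VARIABLE_NAME/VARIABLE_VALUE/COMMENT blocks above are the contents of the mysql.tidb bootstrap table, which the harness dumps to confirm each TiDB instance has bootstrapped and elected a GC leader. The same view can be taken manually; host/port are assumptions matching the clusters in this log:

    # query the bootstrap/GC state the harness prints above
    mysql -h 127.0.0.1 -P 4000 -u root -e \
      "SELECT VARIABLE_NAME, VARIABLE_VALUE, COMMENT FROM mysql.tidb;"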
Starting Upstream TiFlash...
TiFlash
Release Version: v7.5.1-12-g9002cc34d
Edition:         Community
Git Commit Hash: 9002cc34d3b593a718b6c5260ba18f30a45ab314
Git Branch:      HEAD
UTC Build Time:  2024-04-18 07:24:48
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO

Raft Proxy
Git Commit Hash:   521fd9dbc55e58646045d88f91c3c35db50b5981
Git Commit Branch: HEAD
UTC Build Time:    2024-04-18 07:28:40
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:    portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/cdc_server_tips/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/cdc_server_tips/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/cdc_server_tips/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v7.5.1-12-g9002cc34d"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/cdc_server_tips/tiflash/db/proxy"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/cdc_server_tips/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["9002cc34d3b593a718b6c5260ba18f30a45ab314"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
check diff failed 55-th time, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca4becc000004	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-339-4bpdf, pid:29966, start at 2024-04-29 17:29:02.984471556 +0800 CST m=+5.413844751	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-17:31:02.992 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-17:29:02.976 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-17:19:02.976 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ pd_host=127.0.0.1
+ pd_port=2379
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc_server_tips.cli.24850.out cli tso query --pd=http://127.0.0.1:2379
check diff failed 56-th time, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca4becc000004	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-339-4bpdf, pid:29966, start at 2024-04-29 17:29:02.984471556 +0800 CST m=+5.413844751	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-17:31:02.992 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-17:29:02.976 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-17:19:02.976 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca4beccc80014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-339-4bpdf, pid:30035, start at 2024-04-29 17:29:03.067231439 +0800 CST m=+5.436846593	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-17:31:03.075 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-17:29:03.075 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-17:19:03.075 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v7.5.1-12-g9002cc34d
Edition:         Community
Git Commit Hash: 9002cc34d3b593a718b6c5260ba18f30a45ab314
Git Branch:      HEAD
UTC Build Time:  2024-04-18 07:24:48
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO

Raft Proxy
Git Commit Hash:   521fd9dbc55e58646045d88f91c3c35db50b5981
Git Commit Branch: HEAD
UTC Build Time:    2024-04-18 07:28:40
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:    portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/generate_column/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/generate_column/tiflash/log/error.log
arg matches is ArgMatches { args: {"data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/generate_column/tiflash/db/proxy"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v7.5.1-12-g9002cc34d"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/generate_column/tiflash/log/proxy.log"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["9002cc34d3b593a718b6c5260ba18f30a45ab314"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/generate_column/tiflash-proxy.toml"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ set +x
+ tso='449415202726477825
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449415202726477825 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
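The set +x block above runs the tso query through the coverage-instrumented cdc.test binary and strips the trailing PASS/coverage lines with awk; condensed, and ignoring the coverage plumbing, the harness is effectively doing:

    # hedged condensation of the tso-query trace above
    tso=$(cdc cli tso query --pd=http://127.0.0.1:2379 | head -n1 | awk '{print $1}')
    echo "$tso"   # e.g. 449415202726477825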
try a VALID cdc server command
[Mon Apr 29 17:29:06 CST 2024] <<<<<< START cdc server in cdc_server_tips case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc_server_tips.2488424886.out server --log-file /tmp/tidb_cdc_test/cdc_server_tips/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/cdc_server_tips/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
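The curl trace above is one iteration of the server readiness loop: it polls /debug/info up to 50 times, fails the case if 'failed to get info:' ever matches or the retries run out, and proceeds once 'etcd info' appears. Condensed from the traced commands:

    # reconstruction of the /debug/info readiness loop traced above
    for ((i = 0; i <= 50; i++)); do
      res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info)
      echo "$res" | grep -q 'failed to get info:' && exit 1
      echo "$res" | grep -q 'etcd info' && break
      [ "$i" -eq 50 ] && exit 1
      sleep 3
    done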
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_cdc_integration_mysql_test/tiflow/tests/integration_tests/consistent_replicate_storage_file_large_value/run.sh using Sink-Type: mysql... <<=================
The 1 times to try to start tidb cluster...
+ pd_host=127.0.0.1
+ pd_port=2379
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.generate_column.cli.31273.out cli tso query --pd=http://127.0.0.1:2379
check diff failed 57-th time, retry later
+ set +x
+ tso='449415203592863745
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449415203592863745 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Mon Apr 29 17:29:09 CST 2024] <<<<<< START cdc server in generate_column case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.generate_column.3131031312.out server --log-file /tmp/tidb_cdc_test/generate_column/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/generate_column/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_cdc_integration_mysql_test/tiflow/tests/integration_tests/ddl_only_block_related_table/run.sh using Sink-Type: mysql... <<=================
The 1 times to try to start tidb cluster...
check diff failed 58-th time, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
> GET /debug/info HTTP/1.1
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Mon, 29 Apr 2024 09:29:10 GMT
< Content-Length: 613
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/1437cd67-b7ac-4c6c-8d0b-29797d34fc1e
	{"id":"1437cd67-b7ac-4c6c-8d0b-29797d34fc1e","address":"127.0.0.1:8300","version":"v7.5.1-21-g3ba37e9ae"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f292f78f200
	1437cd67-b7ac-4c6c-8d0b-29797d34fc1e

/tidb/cdc/default/default/upstream/7363218623087745414
	{"id":7363218623087745414,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/1437cd67-b7ac-4c6c-8d0b-29797d34fc1e
	{"id":"1437cd67-b7ac-4c6c-8d0b-29797d34fc1e","address":"127.0.0.1:8300","version":"v7.5.1-21-g3ba37e9ae"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f292f78f200
	1437cd67-b7ac-4c6c-8d0b-29797d34fc1e

/tidb/cdc/default/default/upstream/7363218623087745414
	{"id":7363218623087745414,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/1437cd67-b7ac-4c6c-8d0b-29797d34fc1e
	{"id":"1437cd67-b7ac-4c6c-8d0b-29797d34fc1e","address":"127.0.0.1:8300","version":"v7.5.1-21-g3ba37e9ae"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f292f78f200
	1437cd67-b7ac-4c6c-8d0b-29797d34fc1e

/tidb/cdc/default/default/upstream/7363218623087745414
	{"id":7363218623087745414,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
start tidb cluster in /tmp/tidb_cdc_test/consistent_replicate_storage_file_large_value
Starting Upstream PD...
Release Version: v7.5.1-5-g584533652
Edition: Community
Git Commit Hash: 58453365285465cd90bc4472cff2bad7ce4d764b
Git Branch: release-7.5
UTC Build Time:  2024-04-03 10:04:14
Starting Downstream PD...
Release Version: v7.5.1-5-g584533652
Edition: Community
Git Commit Hash: 58453365285465cd90bc4472cff2bad7ce4d764b
Git Branch: release-7.5
UTC Build Time:  2024-04-03 10:04:14
Verifying upstream PD is started...
check diff failed 59-th time, retry later
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
> GET /debug/info HTTP/1.1
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Mon, 29 Apr 2024 09:29:12 GMT
< Content-Length: 613
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/59b46a69-b9fd-4237-8d93-a994b400770a
	{"id":"59b46a69-b9fd-4237-8d93-a994b400770a","address":"127.0.0.1:8300","version":"v7.5.1-21-g3ba37e9ae"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f292f89f4f5
	59b46a69-b9fd-4237-8d93-a994b400770a

/tidb/cdc/default/default/upstream/7363218637967467375
	{"id":7363218637967467375,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/59b46a69-b9fd-4237-8d93-a994b400770a
	{"id":"59b46a69-b9fd-4237-8d93-a994b400770a","address":"127.0.0.1:8300","version":"v7.5.1-21-g3ba37e9ae"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f292f89f4f5
	59b46a69-b9fd-4237-8d93-a994b400770a

/tidb/cdc/default/default/upstream/7363218637967467375
	{"id":7363218637967467375,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/59b46a69-b9fd-4237-8d93-a994b400770a
	{"id":"59b46a69-b9fd-4237-8d93-a994b400770a","address":"127.0.0.1:8300","version":"v7.5.1-21-g3ba37e9ae"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f292f89f4f5
	59b46a69-b9fd-4237-8d93-a994b400770a

/tidb/cdc/default/default/upstream/7363218637967467375
	{"id":7363218637967467375,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.generate_column.cli.31353.out cli changefeed create --start-ts=449415203592863745 --sink-uri=mysql://normal:123456@127.0.0.1:3306/
start tidb cluster in /tmp/tidb_cdc_test/ddl_only_block_related_table
Starting Upstream PD...
Release Version: v7.5.1-5-g584533652
Edition: Community
Git Commit Hash: 58453365285465cd90bc4472cff2bad7ce4d764b
Git Branch: release-7.5
UTC Build Time:  2024-04-03 10:04:14
Starting Downstream PD...
Release Version: v7.5.1-5-g584533652
Edition: Community
Git Commit Hash: 58453365285465cd90bc4472cff2bad7ce4d764b
Git Branch: release-7.5
UTC Build Time:  2024-04-03 10:04:14
Verifying upstream PD is started...
Create changefeed successfully!
ID: f506701b-de9b-4e0b-b54a-874b917361aa
Info: {"upstream_id":7363218637967467375,"namespace":"default","id":"f506701b-de9b-4e0b-b54a-874b917361aa","sink_uri":"mysql://normal:xxxxx@127.0.0.1:3306/","create_time":"2024-04-29T17:29:13.355711891+08:00","start_ts":449415203592863745,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64"},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50,"event_cache_percentage":0}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"sql_mode":"ONLY_FULL_GROUP_BY,STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION","synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v7.5.1-21-g3ba37e9ae","resolved_ts":449415203592863745,"checkpoint_ts":449415203592863745,"checkpoint_time":"2024-04-29 17:29:08.276"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
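Once the changefeed above is created (state normal, checkpoint at the start-ts), its progress can be inspected with the query subcommand; a sketch, with the ID taken from the creation output:

    # hedged: inspect the changefeed created above
    cdc cli changefeed query --pd=http://127.0.0.1:2379 \
      --changefeed-id=f506701b-de9b-4e0b-b54a-874b917361aa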
Starting Upstream TiDB...
check diff failed 60-th time, retry later
+ set +x
table generate_column.t not exists for 1-th check, retry later
check diff failed at last
There is something error when initialize diff, please check log info in /tmp/tidb_cdc_test/sequence/sync_diff/output/sync_diff.log

[2024/04/29 17:29:13.842 +08:00] [INFO] [printer.go:46] ["Welcome to sync_diff_inspector"] ["Release Version"=v7.4.0] ["Git Commit Hash"=d671b0840063bc2532941f02e02e12627402844c] ["Git Branch"=heads/refs/tags/v7.4.0] ["UTC Build Time"="2023-09-22 03:51:56"] ["Go Version"=go1.21.1]
[2024/04/29 17:29:13.843 +08:00] [INFO] [main.go:101] [config="{\"check-thread-count\":4,\"split-thread-count\":5,\"export-fix-sql\":true,\"check-struct-only\":false,\"dm-addr\":\"\",\"dm-task\":\"\",\"data-sources\":{\"mysql1\":{\"host\":\"127.0.0.1\",\"port\":4000,\"user\":\"root\",\"password\":\"******\",\"sql-mode\":\"\",\"snapshot\":\"\",\"security\":null,\"route-rules\":null,\"Router\":{\"Selector\":{}},\"Conn\":null},\"tidb0\":{\"host\":\"127.0.0.1\",\"port\":3306,\"user\":\"root\",\"password\":\"******\",\"sql-mode\":\"\",\"snapshot\":\"\",\"security\":null,\"route-rules\":null,\"Router\":{\"Selector\":{}},\"Conn\":null}},\"routes\":null,\"table-configs\":null,\"task\":{\"source-instances\":[\"mysql1\"],\"source-routes\":null,\"target-instance\":\"tidb0\",\"target-check-tables\":[\"sequence_test.t1\"],\"target-configs\":null,\"output-dir\":\"/tmp/tidb_cdc_test/sequence/sync_diff/output\",\"SourceInstances\":[{\"host\":\"127.0.0.1\",\"port\":4000,\"user\":\"root\",\"password\":\"******\",\"sql-mode\":\"\",\"snapshot\":\"\",\"security\":null,\"route-rules\":null,\"Router\":{\"Selector\":{}},\"Conn\":null}],\"TargetInstance\":{\"host\":\"127.0.0.1\",\"port\":3306,\"user\":\"root\",\"password\":\"******\",\"sql-mode\":\"\",\"snapshot\":\"\",\"security\":null,\"route-rules\":null,\"Router\":{\"Selector\":{}},\"Conn\":null},\"TargetTableConfigs\":null,\"TargetCheckTables\":[{}],\"FixDir\":\"/tmp/tidb_cdc_test/sequence/sync_diff/output/fix-on-tidb0\",\"CheckpointDir\":\"/tmp/tidb_cdc_test/sequence/sync_diff/output/checkpoint\",\"HashFile\":\"\"},\"ConfigFile\":\"/home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_cdc_integration_mysql_test/tiflow/tests/integration_tests/sequence/conf/diff_config.toml\",\"PrintVersion\":false}"]
[2024/04/29 17:29:13.843 +08:00] [DEBUG] [diff.go:842] ["set tidb cfg"]
[2024/04/29 17:29:13.846 +08:00] [DEBUG] [common.go:386] ["query tables"] [query="SHOW FULL TABLES IN `sequence_test` WHERE Table_Type = 'BASE TABLE';"]
[2024/04/29 17:29:13.846 +08:00] [DEBUG] [common.go:386] ["query tables"] [query="SHOW FULL TABLES IN `test` WHERE Table_Type = 'BASE TABLE';"]
[2024/04/29 17:29:13.847 +08:00] [DEBUG] [source.go:326] ["match target table"] [table=`sequence_test`.`t1`]
[2024/04/29 17:29:13.848 +08:00] [FATAL] [main.go:120] ["failed to initialize diff process"] [error="get table sequence_test.t1's information error line 3 column 31 near \"nextval(`sequence_test`.`seq0`)),\n  PRIMARY KEY (`id`) /*T![clustered_index] NONCLUSTERED */\n) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_bin\" \ngithub.com/pingcap/errors.AddStack\n\t/go/pkg/mod/github.com/pingcap/errors@v0.11.5-0.20221009092201-b66cddb77c32/errors.go:174\ngithub.com/pingcap/errors.Trace\n\t/go/pkg/mod/github.com/pingcap/errors@v0.11.5-0.20221009092201-b66cddb77c32/juju_adaptor.go:15\ngithub.com/pingcap/tidb/parser.(*Parser).ParseSQL\n\t/go/pkg/mod/github.com/pingcap/tidb/parser@v0.0.0-20230823131104-05aa17143df8/yy_parser.go:170\ngithub.com/pingcap/tidb/parser.(*Parser).ParseOneStmt\n\t/go/pkg/mod/github.com/pingcap/tidb/parser@v0.0.0-20230823131104-05aa17143df8/yy_parser.go:191\ngithub.com/pingcap/tidb-tools/pkg/dbutil.getTableInfoBySQL\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/pkg/dbutil/table.go:149\ngithub.com/pingcap/tidb-tools/pkg/dbutil.GetTableInfoBySQLWithSessionContext\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/pkg/dbutil/table.go:140\ngithub.com/pingcap/tidb-tools/pkg/dbutil.GetTableInfoWithVersion\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/pkg/dbutil/table.go:121\ngithub.com/pingcap/tidb-tools/sync_diff_inspector/source.initTables\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/source/source.go:328\ngithub.com/pingcap/tidb-tools/sync_diff_inspector/source.NewSources\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/source/source.go:121\nmain.(*Diff).init\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/diff.go:137\nmain.NewDiff\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/diff.go:95\nmain.checkSyncState\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:117\nmain.main\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:104\nruntime.main\n\t/usr/local/go/src/runtime/proc.go:267\nruntime.goexit\n\t/usr/local/go/src/runtime/asm_amd64.s:1650"] [errorVerbose="get table sequence_test.t1's information error line 3 column 31 near \"nextval(`sequence_test`.`seq0`)),\n  PRIMARY KEY (`id`) /*T![clustered_index] NONCLUSTERED */\n) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_bin\" 
\ngithub.com/pingcap/errors.AddStack\n\t/go/pkg/mod/github.com/pingcap/errors@v0.11.5-0.20221009092201-b66cddb77c32/errors.go:174\ngithub.com/pingcap/errors.Trace\n\t/go/pkg/mod/github.com/pingcap/errors@v0.11.5-0.20221009092201-b66cddb77c32/juju_adaptor.go:15\ngithub.com/pingcap/tidb/parser.(*Parser).ParseSQL\n\t/go/pkg/mod/github.com/pingcap/tidb/parser@v0.0.0-20230823131104-05aa17143df8/yy_parser.go:170\ngithub.com/pingcap/tidb/parser.(*Parser).ParseOneStmt\n\t/go/pkg/mod/github.com/pingcap/tidb/parser@v0.0.0-20230823131104-05aa17143df8/yy_parser.go:191\ngithub.com/pingcap/tidb-tools/pkg/dbutil.getTableInfoBySQL\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/pkg/dbutil/table.go:149\ngithub.com/pingcap/tidb-tools/pkg/dbutil.GetTableInfoBySQLWithSessionContext\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/pkg/dbutil/table.go:140\ngithub.com/pingcap/tidb-tools/pkg/dbutil.GetTableInfoWithVersion\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/pkg/dbutil/table.go:121\ngithub.com/pingcap/tidb-tools/sync_diff_inspector/source.initTables\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/source/source.go:328\ngithub.com/pingcap/tidb-tools/sync_diff_inspector/source.NewSources\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/source/source.go:121\nmain.(*Diff).init\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/diff.go:137\nmain.NewDiff\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/diff.go:95\nmain.checkSyncState\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:117\nmain.main\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:104\nruntime.main\n\t/usr/local/go/src/runtime/proc.go:267\nruntime.goexit\n\t/usr/local/go/src/runtime/asm_amd64.s:1650\ngithub.com/pingcap/tidb-tools/sync_diff_inspector/source.initTables\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/source/source.go:330\ngithub.com/pingcap/tidb-tools/sync_diff_inspector/source.NewSources\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/source/source.go:121\nmain.(*Diff).init\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/diff.go:137\nmain.NewDiff\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/diff.go:95\nmain.checkSyncState\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:117\nmain.main\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:104\nruntime.main\n\t/usr/local/go/src/runtime/proc.go:267\nruntime.goexit\n\t/usr/local/go/src/runtime/asm_amd64.s:1650"] [stack="main.checkSyncState\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:120\nmain.main\n\t/home/jenkins/agent/workspace/build-common/go/src/github.com/pingcap/tidb-tools/sync_diff_inspector/main.go:104\nruntime.main\n\t/usr/local/go/src/runtime/proc.go:267"]
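The FATAL above is the root cause of the repeated diff-retry failures: sync_diff_inspector (v7.4.0) feeds SHOW CREATE TABLE output to its embedded TiDB parser, and that parser chokes on the nextval(...) column default that TiDB emits for a sequence-backed column. A hedged sketch of schema that produces exactly that unparsable default, assuming the sequence test's upstream TiDB on 127.0.0.1:4000:

    # repro of the schema whose SHOW CREATE TABLE output breaks the diff parser
    mysql -h 127.0.0.1 -P 4000 -u root -e "
      CREATE DATABASE IF NOT EXISTS sequence_test;
      CREATE SEQUENCE sequence_test.seq0;
      CREATE TABLE sequence_test.t1 (
        id INT NOT NULL DEFAULT nextval(sequence_test.seq0),
        PRIMARY KEY (id) NONCLUSTERED
      );"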

Release Version: v7.5.1-45-gbf84e231e6
Edition: Community
Git Commit Hash: bf84e231e6ef26891d0cb524d938345f43aa047c
Git Branch: release-7.5
UTC Build Time: 2024-04-29 02:05:15
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v7.5.1-45-gbf84e231e6
Edition: Community
Git Commit Hash: bf84e231e6ef26891d0cb524d938345f43aa047c
Git Branch: release-7.5
UTC Build Time: 2024-04-29 02:05:15
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   7.5.2
Edition:           Community
Git Commit Hash:   3478895c2a700e4824bb41940260b6b28013275e
Git Commit Branch: release-7.5
UTC Build Time:    2024-04-28 08:20:54
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Enable Features:   pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Profile:           dist_release
table generate_column.t exists
table generate_column.t1 not exists for 1-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v7.5.1-45-gbf84e231e6
Edition: Community
Git Commit Hash: bf84e231e6ef26891d0cb524d938345f43aa047c
Git Branch: release-7.5
UTC Build Time: 2024-04-29 02:05:15
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v7.5.1-45-gbf84e231e6
Edition: Community
Git Commit Hash: bf84e231e6ef26891d0cb524d938345f43aa047c
Git Branch: release-7.5
UTC Build Time: 2024-04-29 02:05:15
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table generate_column.t1 exists
check diff failed 1-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Mon Apr 29 17:29:22 CST 2024] <<<<<< run test case generate_column success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca4c019f00004	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-339-fr44z, pid:6432, start at 2024-04-29 17:29:24.352979005 +0800 CST m=+5.440452410	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-17:31:24.361 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-17:29:24.348 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-17:19:24.348 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
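The VARIABLE_NAME rows above are TiDB's GC bookkeeping, stored in the mysql.tidb system table; the startup check is effectively re-reading that table until the GC leader has registered. The same state can be queried directly (upstream port assumed):

    # Dump GC leader, lease, and safe-point state from the upstream TiDB.
    mysql -h 127.0.0.1 -P 4000 -u root -e 'SELECT * FROM mysql.tidb'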
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca4c01c64000f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-339-fr44z, pid:6492, start at 2024-04-29 17:29:24.530015353 +0800 CST m=+5.552252526	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-17:31:24.539 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-17:29:24.505 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-17:19:24.505 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v7.5.1-12-g9002cc34d
Edition:         Community
Git Commit Hash: 9002cc34d3b593a718b6c5260ba18f30a45ab314
Git Branch:      HEAD
UTC Build Time:  2024-04-18 07:24:48
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO

Raft Proxy
Git Commit Hash:   521fd9dbc55e58646045d88f91c3c35db50b5981
Git Commit Branch: HEAD
UTC Build Time:    2024-04-18 07:28:40
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:    portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/ddl_only_block_related_table/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/ddl_only_block_related_table/tiflash/log/error.log
arg matches is ArgMatches { args: {"pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/ddl_only_block_related_table/tiflash-proxy.toml"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/ddl_only_block_related_table/tiflash/db/proxy"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["9002cc34d3b593a718b6c5260ba18f30a45ab314"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v7.5.1-12-g9002cc34d"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/ddl_only_block_related_table/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
Post stage
[Pipeline] sh
pass test: get status
pass test: health
pass test: changefeed apis
pass test: delete changefeed apis
pass test: capture apis
pass test: processor apis
pass test: owner apis
pass test: set log level
+ cleanup_process cdc.test
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
++ date
+ echo '[Mon Apr 29 17:29:26 CST 2024] <<<<<< run test case api_v2 success! >>>>>>'
[Mon Apr 29 17:29:26 CST 2024] <<<<<< run test case api_v2 success! >>>>>>
+ stop_tidb_cluster
+ ls /tmp/tidb_cdc_test/
availability
cov.availability.26412643.out
cov.availability.27912793.out
cov.availability.29532955.out
cov.availability.30603062.out
cov.availability.31963198.out
cov.availability.35913593.out
cov.availability.38173819.out
cov.availability.39243926.out
cov.availability.40774079.out
cov.availability.45794581.out
cov.availability.46934695.out
cov.availability.52785280.out
cov.availability.54555457.out
cov.availability.55685570.out
cov.availability.57535755.out
cov.availability.59465948.out
cov.availability.cli.2595.out
cov.availability.cli.2693.out
cov.http_proxies.cli.11956.out
cov.http_proxies.cli.12034.out
cov.sequence.cli.14461.out
http_proxies
sequence
sql_res.availability.txt
sql_res.sequence.txt
++ find /tmp/tidb_cdc_test/ -type f -name '*.log'
+ tar -cvzf log-G18.tar.gz /tmp/tidb_cdc_test/availability/stdouttest_hang_up_owner.server2.log /tmp/tidb_cdc_test/availability/stdouttest_owner_cleanup_stale_tasks.server3.log /tmp/tidb_cdc_test/availability/down_pd.log /tmp/tidb_cdc_test/availability/cdctest_kill_capture.server1.log /tmp/tidb_cdc_test/availability/stdouttest_kill_capture.server1.log /tmp/tidb_cdc_test/availability/stdouttest_kill_capture.server2.log /tmp/tidb_cdc_test/availability/cdctest_owner_cleanup_stale_tasks.server2.log /tmp/tidb_cdc_test/availability/stdouttest_gap_between_watch_capture.server1.log /tmp/tidb_cdc_test/availability/stdouttest_gap_between_watch_capture.server2.log /tmp/tidb_cdc_test/availability/pd1.log /tmp/tidb_cdc_test/availability/stdouttest_expire_owner.server1.log /tmp/tidb_cdc_test/availability/tidb-slow.log /tmp/tidb_cdc_test/availability/stdouttest_kill_owner.server2.log /tmp/tidb_cdc_test/availability/tikv_down.log /tmp/tidb_cdc_test/availability/stdouttest_owner_cleanup_stale_tasks.server1.log /tmp/tidb_cdc_test/availability/stdouttest_stop_processor.log /tmp/tidb_cdc_test/availability/tidb.log /tmp/tidb_cdc_test/availability/cdctest_kill_capture.server2.log /tmp/tidb_cdc_test/availability/tikv3.log /tmp/tidb_cdc_test/availability/tikv2.log /tmp/tidb_cdc_test/availability/tidb_other.log /tmp/tidb_cdc_test/availability/stdouttest_kill_owner.server1.log /tmp/tidb_cdc_test/availability/stdouttest_owner_retryable_error.server2.log /tmp/tidb_cdc_test/availability/cdctest_owner_cleanup_stale_tasks.server3.log /tmp/tidb_cdc_test/availability/cdctest_kill_owner.server2.log /tmp/tidb_cdc_test/availability/stdouttest_owner_retryable_error.server1.log /tmp/tidb_cdc_test/availability/cdctest_stop_processor.log /tmp/tidb_cdc_test/availability/cdctest_hang_up_capture.server2.log /tmp/tidb_cdc_test/availability/stdouttest_hang_up_owner.server1.log /tmp/tidb_cdc_test/availability/cdctest_gap_between_watch_capture.server1.log /tmp/tidb_cdc_test/availability/stdouttest_owner_cleanup_stale_tasks.server2.log /tmp/tidb_cdc_test/availability/cdctest_expire_capture.server1.log /tmp/tidb_cdc_test/availability/tidb_down.log /tmp/tidb_cdc_test/availability/stdouttest_hang_up_capture.server1.log /tmp/tidb_cdc_test/availability/tikv1.log /tmp/tidb_cdc_test/availability/cdctest_kill_owner.server1.log /tmp/tidb_cdc_test/availability/cdctest_expire_owner.server1.log /tmp/tidb_cdc_test/availability/stdouttest_hang_up_capture.server2.log /tmp/tidb_cdc_test/availability/cdctest_owner_retryable_error.server1.log /tmp/tidb_cdc_test/availability/cdctest_hang_up_capture.server1.log /tmp/tidb_cdc_test/availability/cdctest_hang_up_owner.server1.log /tmp/tidb_cdc_test/availability/cdctest_owner_retryable_error.server2.log /tmp/tidb_cdc_test/availability/stdouttest_expire_capture.server1.log /tmp/tidb_cdc_test/availability/cdctest_owner_cleanup_stale_tasks.server1.log /tmp/tidb_cdc_test/availability/cdctest_gap_between_watch_capture.server2.log /tmp/tidb_cdc_test/availability/cdctest_hang_up_owner.server2.log /tmp/tidb_cdc_test/http_proxies/test_proxy.log /tmp/tidb_cdc_test/http_proxies/down_pd.log /tmp/tidb_cdc_test/http_proxies/pd1.log /tmp/tidb_cdc_test/http_proxies/tidb-slow.log /tmp/tidb_cdc_test/http_proxies/tikv_down.log /tmp/tidb_cdc_test/http_proxies/tidb.log /tmp/tidb_cdc_test/http_proxies/tikv3.log /tmp/tidb_cdc_test/http_proxies/tikv2.log /tmp/tidb_cdc_test/http_proxies/tidb_other.log /tmp/tidb_cdc_test/http_proxies/stdout.log /tmp/tidb_cdc_test/http_proxies/tidb_down.log /tmp/tidb_cdc_test/http_proxies/tikv1.log 
/tmp/tidb_cdc_test/http_proxies/cdc.log /tmp/tidb_cdc_test/sequence/down_pd.log /tmp/tidb_cdc_test/sequence/tikv3/db/000005.log /tmp/tidb_cdc_test/sequence/pd1.log /tmp/tidb_cdc_test/sequence/tidb-slow.log /tmp/tidb_cdc_test/sequence/tikv_down.log /tmp/tidb_cdc_test/sequence/sync_diff/output/sync_diff.log /tmp/tidb_cdc_test/sequence/tidb.log /tmp/tidb_cdc_test/sequence/tikv3.log /tmp/tidb_cdc_test/sequence/tikv2.log /tmp/tidb_cdc_test/sequence/tidb_other.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0000/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0002/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0005/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0001/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0007/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0003/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0006/000002.log /tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0004/000002.log /tmp/tidb_cdc_test/sequence/stdout.log /tmp/tidb_cdc_test/sequence/tidb_down.log /tmp/tidb_cdc_test/sequence/pd1/region-meta/000001.log /tmp/tidb_cdc_test/sequence/pd1/hot-region/000001.log /tmp/tidb_cdc_test/sequence/tikv1.log /tmp/tidb_cdc_test/sequence/tiflash/log/server.log /tmp/tidb_cdc_test/sequence/tiflash/log/proxy.log /tmp/tidb_cdc_test/sequence/tiflash/log/error.log /tmp/tidb_cdc_test/sequence/tiflash/db/proxy/db/000005.log /tmp/tidb_cdc_test/sequence/tikv1/db/000005.log /tmp/tidb_cdc_test/sequence/tikv2/db/000005.log /tmp/tidb_cdc_test/sequence/sync_diff_inspector.log /tmp/tidb_cdc_test/sequence/down_pd/region-meta/000001.log /tmp/tidb_cdc_test/sequence/down_pd/hot-region/000001.log /tmp/tidb_cdc_test/sequence/cdc.log /tmp/tidb_cdc_test/sequence/tikv_down/db/000005.log
tar: Removing leading `/' from member names
/tmp/tidb_cdc_test/availability/stdouttest_hang_up_owner.server2.log
/tmp/tidb_cdc_test/availability/stdouttest_owner_cleanup_stale_tasks.server3.log
/tmp/tidb_cdc_test/availability/down_pd.log
/tmp/tidb_cdc_test/availability/cdctest_kill_capture.server1.log
/tmp/tidb_cdc_test/availability/stdouttest_kill_capture.server1.log
/tmp/tidb_cdc_test/availability/stdouttest_kill_capture.server2.log
/tmp/tidb_cdc_test/availability/cdctest_owner_cleanup_stale_tasks.server2.log
/tmp/tidb_cdc_test/availability/stdouttest_gap_between_watch_capture.server1.log
/tmp/tidb_cdc_test/availability/stdouttest_gap_between_watch_capture.server2.log
/tmp/tidb_cdc_test/availability/pd1.log
/tmp/tidb_cdc_test/availability/stdouttest_expire_owner.server1.log
/tmp/tidb_cdc_test/availability/tidb-slow.log
/tmp/tidb_cdc_test/availability/stdouttest_kill_owner.server2.log
/tmp/tidb_cdc_test/availability/tikv_down.log
/tmp/tidb_cdc_test/availability/stdouttest_owner_cleanup_stale_tasks.server1.log
/tmp/tidb_cdc_test/availability/stdouttest_stop_processor.log
/tmp/tidb_cdc_test/availability/tidb.log
/tmp/tidb_cdc_test/availability/cdctest_kill_capture.server2.log
/tmp/tidb_cdc_test/availability/tikv3.log
/tmp/tidb_cdc_test/availability/tikv2.log
[Mon Apr 29 17:29:29 CST 2024] <<<<<< START cdc server in ddl_only_block_related_table case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_only_block_related_table.77447746.out server --log-file /tmp/tidb_cdc_test/ddl_only_block_related_table/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/ddl_only_block_related_table/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
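The trace above is one iteration of the server-readiness loop: curl the TiCDC debug endpoint, grep the response for 'etcd info', and sleep before retrying, up to 50 attempts. Reduced to its shape (same endpoint and bounds as the trace, error-message check omitted):

    # Poll the TiCDC debug endpoint until it reports etcd info; fail after 50 tries.
    for ((i = 0; i <= 50; i++)); do
        res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info)
        echo "$res" | grep -q 'etcd info' && break
        [ "$i" -eq 50 ] && exit 1
        sleep 3
    done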
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca4c049c80012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-339-j9rxd, pid:12505, start at 2024-04-29 17:29:27.452500907 +0800 CST m=+11.158575885	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-17:31:27.466 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-17:29:27.461 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-17:19:27.461 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	179	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63ca4c049a80014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:ap-tiflow-release-7-5-pull-cdc-integration-mysql-test-339-j9rxd, pid:12587, start at 2024-04-29 17:29:27.445683506 +0800 CST m=+11.094526199	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240429-17:31:27.455 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240429-17:29:27.452 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240429-17:19:27.452 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
/tmp/tidb_cdc_test/availability/tidb_other.log
/tmp/tidb_cdc_test/availability/stdouttest_kill_owner.server1.log
/tmp/tidb_cdc_test/availability/stdouttest_owner_retryable_error.server2.log
/tmp/tidb_cdc_test/availability/cdctest_owner_cleanup_stale_tasks.server3.log
/tmp/tidb_cdc_test/availability/cdctest_kill_owner.server2.log
/tmp/tidb_cdc_test/availability/stdouttest_owner_retryable_error.server1.log
/tmp/tidb_cdc_test/availability/cdctest_stop_processor.log
/tmp/tidb_cdc_test/availability/cdctest_hang_up_capture.server2.log
/tmp/tidb_cdc_test/availability/stdouttest_hang_up_owner.server1.log
/tmp/tidb_cdc_test/availability/cdctest_gap_between_watch_capture.server1.log
/tmp/tidb_cdc_test/availability/stdouttest_owner_cleanup_stale_tasks.server2.log
/tmp/tidb_cdc_test/availability/cdctest_expire_capture.server1.log
/tmp/tidb_cdc_test/availability/tidb_down.log
/tmp/tidb_cdc_test/availability/stdouttest_hang_up_capture.server1.log
/tmp/tidb_cdc_test/availability/tikv1.log
/tmp/tidb_cdc_test/availability/cdctest_kill_owner.server1.log
/tmp/tidb_cdc_test/availability/cdctest_expire_owner.server1.log
/tmp/tidb_cdc_test/availability/stdouttest_hang_up_capture.server2.log
/tmp/tidb_cdc_test/availability/cdctest_owner_retryable_error.server1.log
/tmp/tidb_cdc_test/availability/cdctest_hang_up_capture.server1.log
/tmp/tidb_cdc_test/availability/cdctest_hang_up_owner.server1.log
/tmp/tidb_cdc_test/availability/cdctest_owner_retryable_error.server2.log
/tmp/tidb_cdc_test/availability/stdouttest_expire_capture.server1.log
/tmp/tidb_cdc_test/availability/cdctest_owner_cleanup_stale_tasks.server1.log
/tmp/tidb_cdc_test/availability/cdctest_gap_between_watch_capture.server2.log
/tmp/tidb_cdc_test/availability/cdctest_hang_up_owner.server2.log
/tmp/tidb_cdc_test/http_proxies/test_proxy.log
/tmp/tidb_cdc_test/http_proxies/down_pd.log
/tmp/tidb_cdc_test/http_proxies/pd1.log
/tmp/tidb_cdc_test/http_proxies/tidb-slow.log
/tmp/tidb_cdc_test/http_proxies/tikv_down.log
/tmp/tidb_cdc_test/http_proxies/tidb.log
/tmp/tidb_cdc_test/http_proxies/tikv3.log
/tmp/tidb_cdc_test/http_proxies/tikv2.log
/tmp/tidb_cdc_test/http_proxies/tidb_other.log
/tmp/tidb_cdc_test/http_proxies/stdout.log
/tmp/tidb_cdc_test/http_proxies/tidb_down.log
/tmp/tidb_cdc_test/http_proxies/tikv1.log
/tmp/tidb_cdc_test/http_proxies/cdc.log
/tmp/tidb_cdc_test/sequence/down_pd.log
/tmp/tidb_cdc_test/sequence/tikv3/db/000005.log
/tmp/tidb_cdc_test/sequence/pd1.log
/tmp/tidb_cdc_test/sequence/tidb-slow.log
/tmp/tidb_cdc_test/sequence/tikv_down.log
Starting Upstream TiFlash...
TiFlash
Release Version: v7.5.1-12-g9002cc34d
Edition:         Community
Git Commit Hash: 9002cc34d3b593a718b6c5260ba18f30a45ab314
Git Branch:      HEAD
UTC Build Time:  2024-04-18 07:24:48
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO

Raft Proxy
Git Commit Hash:   521fd9dbc55e58646045d88f91c3c35db50b5981
Git Commit Branch: HEAD
UTC Build Time:    2024-04-18 07:28:40
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:    portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/consistent_replicate_storage_file_large_value/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/consistent_replicate_storage_file_large_value/tiflash/log/error.log
arg matches is ArgMatches { args: {"advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/consistent_replicate_storage_file_large_value/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/consistent_replicate_storage_file_large_value/tiflash-proxy.toml"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/consistent_replicate_storage_file_large_value/tiflash/db/proxy"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v7.5.1-12-g9002cc34d"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["9002cc34d3b593a718b6c5260ba18f30a45ab314"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
/tmp/tidb_cdc_test/sequence/sync_diff/output/sync_diff.log
/tmp/tidb_cdc_test/sequence/tidb.log
/tmp/tidb_cdc_test/sequence/tikv3.log
/tmp/tidb_cdc_test/sequence/tikv2.log
/tmp/tidb_cdc_test/sequence/tidb_other.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0000/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0002/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0005/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0001/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0007/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0003/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0006/000002.log
/tmp/tidb_cdc_test/sequence/cdc_data/tmp/sorter/0004/000002.log
/tmp/tidb_cdc_test/sequence/stdout.log
/tmp/tidb_cdc_test/sequence/tidb_down.log
/tmp/tidb_cdc_test/sequence/pd1/region-meta/000001.log
/tmp/tidb_cdc_test/sequence/pd1/hot-region/000001.log
/tmp/tidb_cdc_test/sequence/tikv1.log
/tmp/tidb_cdc_test/sequence/tiflash/log/server.log
/tmp/tidb_cdc_test/sequence/tiflash/log/proxy.log
/tmp/tidb_cdc_test/sequence/tiflash/log/error.log
/tmp/tidb_cdc_test/sequence/tiflash/db/proxy/db/000005.log
/tmp/tidb_cdc_test/sequence/tikv1/db/000005.log
/tmp/tidb_cdc_test/sequence/tikv2/db/000005.log
/tmp/tidb_cdc_test/sequence/sync_diff_inspector.log
/tmp/tidb_cdc_test/sequence/down_pd/region-meta/000001.log
/tmp/tidb_cdc_test/sequence/down_pd/hot-region/000001.log
/tmp/tidb_cdc_test/sequence/cdc.log
/tmp/tidb_cdc_test/sequence/tikv_down/db/000005.log
+ ls -alh log-G18.tar.gz
-rw-r--r-- 1 jenkins jenkins 9.6M Apr 29 17:29 log-G18.tar.gz
[Pipeline] archiveArtifacts
Archiving artifacts
valid ~~~ running cdc  
Failed to start cdc, the usage tips should be printed
 1st test case cdc_server_tips success! 
try an INVALID cdc server command
[Mon Apr 29 17:29:30 CST 2024] <<<<<< START cdc server in cdc_server_tips case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc_server_tips.2497124973.out server --log-file /tmp/tidb_cdc_test/cdc_server_tips/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/cdc_server_tips/cdc_data --cluster-id default --pd None
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info'
+ [[ true != \n\o ]]
+ set +x
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
> GET /debug/info HTTP/1.1
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
Recording fingerprints
[Pipeline] }
< HTTP/1.1 200 OK
< Date: Mon, 29 Apr 2024 09:29:32 GMT
< Content-Length: 613
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c0adc8fe-8a3f-43ee-ac01-67fa71bc8de0
	{"id":"c0adc8fe-8a3f-43ee-ac01-67fa71bc8de0","address":"127.0.0.1:8300","version":"v7.5.1-21-g3ba37e9ae"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f292fd958f3
	c0adc8fe-8a3f-43ee-ac01-67fa71bc8de0

/tidb/cdc/default/default/upstream/7363218726588629125
	{"id":7363218726588629125,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c0adc8fe-8a3f-43ee-ac01-67fa71bc8de0
	{"id":"c0adc8fe-8a3f-43ee-ac01-67fa71bc8de0","address":"127.0.0.1:8300","version":"v7.5.1-21-g3ba37e9ae"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f292fd958f3
	c0adc8fe-8a3f-43ee-ac01-67fa71bc8de0

/tidb/cdc/default/default/upstream/7363218726588629125
	{"id":7363218726588629125,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ grep -q 'etcd info'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c0adc8fe-8a3f-43ee-ac01-67fa71bc8de0
	{"id":"c0adc8fe-8a3f-43ee-ac01-67fa71bc8de0","address":"127.0.0.1:8300","version":"v7.5.1-21-g3ba37e9ae"}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f292fd958f3
	c0adc8fe-8a3f-43ee-ac01-67fa71bc8de0

/tidb/cdc/default/default/upstream/7363218726588629125
	{"id":7363218726588629125,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ break
+ set +x
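The /debug/info dump above mirrors the capture, owner, and upstream records that TiCDC keeps in etcd; the same keys can be read straight from etcd (endpoint taken from the upstream info above, etcdctl v3 assumed):

    # List TiCDC's metadata keys under its default-cluster prefix.
    etcdctl --endpoints=http://127.0.0.1:2379 get /tidb/cdc/default/ --prefix --keys-only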
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_only_block_related_table.cli.7789.out cli changefeed create --sink-uri=mysql://root@127.0.0.1:3306/ -c=ddl-only-block-related-table
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
Create changefeed successfully!
ID: ddl-only-block-related-table
Info: {"upstream_id":7363218726588629125,"namespace":"default","id":"ddl-only-block-related-table","sink_uri":"mysql://root@127.0.0.1:3306/","create_time":"2024-04-29T17:29:32.918424231+08:00","start_ts":449415210021158914,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64"},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50,"event_cache_percentage":0}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"sql_mode":"ONLY_FULL_GROUP_BY,STRICT_TRANS_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION","synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v7.5.1-21-g3ba37e9ae","resolved_ts":449415210021158914,"checkpoint_ts":449415210021158914,"checkpoint_time":"2024-04-29 17:29:32.798"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
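With the changefeed created, its state and checkpoint can be confirmed through the same cli binary the test drives (server address as used elsewhere in this run):

    # Query the changefeed just created; prints state, checkpoint-ts, and config.
    cdc cli changefeed query -c ddl-only-block-related-table --server=http://127.0.0.1:8300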
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Mon Apr 29 17:29:32 CST 2024] <<<<<< START cdc server in consistent_replicate_storage_file_large_value case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.consistent_replicate_storage_file_large_value.1385913861.out server --log-file /tmp/tidb_cdc_test/consistent_replicate_storage_file_large_value/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/consistent_replicate_storage_file_large_value/cdc_data --cluster-id default
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G18'
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
{"level":"warn","ts":1714382974.2674315,"caller":"v3@v3.5.10/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc002930540/127.0.0.1:2379","attempt":0,"error":"rpc error: code = Unavailable desc = error reading from server: EOF"}
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)

script returned exit code 143
+ set +x
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
script returned exit code 143
{"level":"warn","ts":1714382974.5432408,"caller":"v3@v3.5.10/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0021bcfc0/127.0.0.1:2379","attempt":0,"error":"rpc error: code = Unavailable desc = error reading from server: EOF"}
script returned exit code 143
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
<<< Run all test success >>>
kill finished with exit code 1
{"level":"warn","ts":1714382975.087722,"caller":"v3@v3.5.10/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc002289a40/127.0.0.1:2379","attempt":0,"error":"rpc error: code = Unavailable desc = error reading from server: read tcp 127.0.0.1:57978->127.0.0.1:2379: read: connection reset by peer"}
script returned exit code 143
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-release-7.5-pull_cdc_integration_mysql_test-339/tiflow-cdc already exists)
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] // cache
[Pipeline] // cache
[Pipeline] // cache
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] // withCredentials
[Pipeline] // withCredentials
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] // node
[Pipeline] // node
[Pipeline] // node
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G01'
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G02'
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G04'
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G10'
++ stop_tidb_cluster
/home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_cdc_integration_mysql_test/tiflow/tests/integration_tests/synced_status/run.sh: line 1: 17167 Terminated              sleep 130
script returned exit code 143
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G09'
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE