Console Output

Skipping 851 KB of earlier log output..
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test2"
rpc addr 127.0.0.1:8263 is alive
got=2 expected=2
[Thu May 23 23:44:37 CST 2024] <<<<<< start DM-035 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl1_1/conf/double-source-optimistic.yaml --remove-meta"
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
dmctl test cmd: "stop-task test"
check diff failed 3-th time, retry later
[Thu May 23 23:44:38 CST 2024] <<<<<< finish DM-122 optimistic >>>>>>
dmctl test cmd: "query-status test"
got=2 expected=2
run tidb sql failed 1-th time, retry later
[Thu May 23 23:44:39 CST 2024] <<<<<< start DM-123 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4/conf/double-source-pessimistic.yaml --remove-meta"
check diff failed 4-th time, retry later
dmctl test cmd: "query-status test"
got=2 expected=2
check diff failed 1-th time, retry later
dmctl test cmd: "stop-task test"
[Thu May 23 23:44:41 CST 2024] <<<<<< finish DM-035 optimistic >>>>>>
check diff failed 5-th time, retry later
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:44:43 CST 2024] <<<<<< finish DM-123 pessimistic >>>>>>
[Thu May 23 23:44:43 CST 2024] <<<<<< start DM-123 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
process dm-master.test already exit
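
The "wait process ... exit" lines above come from the harness polling until a stopped component disappears. A rough bash equivalent, assuming pgrep is available; the function name is made up for illustration, not the harness's real helper:

    wait_process_exit() {
        # Loop until no process whose command line matches the name is left.
        local name="$1"
        while pgrep -f "$name" >/dev/null 2>&1; do
            echo "wait process $name exit..."
            sleep 1
        done
        echo "process $name already exit"
    }

    wait_process_exit dm-master.test
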
check diff failed 6-th time, retry later
wait process dm-worker.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
check diff failed 1-th time, retry later
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:44:46 CST 2024] <<<<<< test case shardddl1_1 success! >>>>>>
start running case: [shardddl2] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:44:46 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
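
The "wait for rpc addr ... alive the N-th time" lines are a bounded port poll. A minimal bash sketch of such a loop, assuming a plain TCP connect is what "alive" means here; the helper name and retry bound are assumptions, not the harness's actual code:

    wait_rpc_alive() {
        # Poll host:port until a TCP connect succeeds, up to an assumed bound.
        local addr="$1" host="${1%:*}" port="${1#*:}" max=10
        for i in $(seq 1 "$max"); do
            echo "wait for rpc addr $addr alive the $i-th time"
            if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
                echo "rpc addr $addr is alive"
                return 0
            fi
            sleep 1
        done
        return 1
    }

    wait_rpc_alive 127.0.0.1:8261
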
check diff successfully
binlog_pos: 2356 relay_log_size: 2356
============== run_with_prepared_source_config success ===================
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
check diff successfully
dmctl test cmd: "stop-task test"
after restart dm-worker, task should resume automatically
dmctl test cmd: "start-task /tmp/dm_test/all_mode/dm-task.yaml"
wait process dm-master.test exit...
process dm-master.test already exit
[Thu May 23 23:44:47 CST 2024] <<<<<< finish DM-123 optimistic >>>>>>
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:44:47 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
[Thu May 23 23:44:48 CST 2024] <<<<<< start DM-124 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4/conf/double-source-optimistic.yaml --remove-meta"
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/shardddl2/source1.yaml"
wait process dm-worker.test exit...
[Thu May 23 23:44:49 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
HTTP 127.0.0.1:8261/apis/v1alpha1/status/t-Ë!s`t is alive
wait process dm-worker.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
dmctl test cmd: "stop-task test"
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/shardddl2/source2.yaml"
[Thu May 23 23:44:50 CST 2024] <<<<<< finish DM-124 optimistic >>>>>>
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:44:50 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/duplicate_event/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
[Thu May 23 23:44:51 CST 2024] <<<<<< start DM-DROP_COLUMN_EXEC_ERROR optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
[Thu May 23 23:44:51 CST 2024] <<<<<< start DM-125 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4/conf/double-source-optimistic.yaml --remove-meta"
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:44:51 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/duplicate_event/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
got=2 expected=2
restart dm-worker 1
wait process tidb-server exit...
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/duplicate_event/source1.yaml"
check diff failed 1-th time, retry later
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/duplicate_event/conf/dm-task.yaml --remove-meta"
wait process tidb-server exit...
wait process dm-worker1 exit...
process dm-worker1 already exit
[Thu May 23 23:44:53 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
check log contain failed 1-th time, retry later
wait process tidb-server exit...
rpc addr 127.0.0.1:8262 is alive
check log contain failed 1-th time, retry later
check diff successfully
dmctl test cmd: "shard-ddl-lock"
dmctl test cmd: "stop-task test"
[Thu May 23 23:44:55 CST 2024] <<<<<< finish DM-125 optimistic >>>>>>
wait process tidb-server exit...
check log contain failed 2-th time, retry later
[Thu May 23 23:44:56 CST 2024] <<<<<< start DM-126 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4/conf/double-source-optimistic.yaml --remove-meta"
wait process tidb-server exit...
dmctl test cmd: "query-status test"
got=1 expected=1
restart dm-master
dmctl test cmd: "query-status test"
wait process tidb-server exit...
got=2 expected=2
wait process dm-master exit...
process dm-master already exit
check diff failed 1-th time, retry later
wait process tidb-server exit...
check log contain failed 3-th time, retry later
wait process tidb-server exit...
check diff successfully
dmctl test cmd: "shard-ddl-lock"
dmctl test cmd: "stop-task test"
[Thu May 23 23:44:59 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
[Thu May 23 23:45:00 CST 2024] <<<<<< finish DM-126 optimistic >>>>>>
wait process tidb-server exit...
check log contain failed 4-th time, retry later
wait process tidb-server exit...
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
[Thu May 23 23:45:01 CST 2024] <<<<<< start DM-127 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4/conf/double-source-optimistic.yaml --remove-meta"
wait process tidb-server exit...
check log contain failed 5-th time, retry later
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "query-status test"
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
restart dm-worker 1
got=2 expected=2
check diff failed 1-th time, retry later
wait process tidb-server exit...
wait process dm-worker1 exit...
process dm-worker1 already exit
[Thu May 23 23:45:04 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
check log contain failed 6-th time, retry later
wait process tidb-server exit...
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "query-status test"
got=3 expected=3
check diff successfully
dmctl test cmd: "stop-task test"
wait process tidb-server exit...
[Thu May 23 23:45:05 CST 2024] <<<<<< finish DM-DROP_COLUMN_EXEC_ERROR optimistic >>>>>>
[Thu May 23 23:45:05 CST 2024] <<<<<< start DM-INIT_SCHEMA optimistic >>>>>>
check diff successfully
dmctl test cmd: "shard-ddl-lock"
dmctl test cmd: "stop-task test"
[Thu May 23 23:45:05 CST 2024] <<<<<< finish DM-127 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
check log contain failed 7-th time, retry later
wait process tidb-server exit...
[Thu May 23 23:45:06 CST 2024] <<<<<< start DM-128 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4/conf/double-source-pessimistic.yaml --remove-meta"
dmctl test cmd: "query-status test"
got=2 expected=2
check log contain failed 1-th time, retry later
wait process tidb-server exit...
process tidb-server already exit
dmctl test cmd: "query-status test"
got=2 expected=2
check diff failed 1-th time, retry later
check log contain failed 8-th time, retry later
restart dm-master
Starting TiDB on port 4000
Verifying TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff successfully
dmctl test cmd: "stop-task test"
wait process dm-master exit...
process dm-master already exit
check log contain failed 9-th time, retry later
[Thu May 23 23:45:10 CST 2024] <<<<<< finish DM-128 pessimistic >>>>>>
[Thu May 23 23:45:10 CST 2024] <<<<<< start DM-128 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4/conf/double-source-optimistic.yaml --remove-meta"
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	146	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ERROR 1396 (HY000) at line 1: Operation CREATE USER failed for 'test'@'%'
dmctl test cmd: "query-status test"
got=2 expected=2
[Thu May 23 23:45:12 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
check diff successfully
dmctl test cmd: "stop-task test"
check log contain failed 10-th time, retry later
[Thu May 23 23:45:12 CST 2024] <<<<<< finish DM-128 optimistic >>>>>>
[Thu May 23 23:45:13 CST 2024] <<<<<< start DM-129 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4/conf/double-source-pessimistic.yaml --remove-meta"
check diff failed 1-th time, retry later
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
check log contain failed 11-th time, retry later
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
got=3 expected=3
check diff successfully
dmctl test cmd: "stop-task test"
check diff failed 1-th time, retry later
[Thu May 23 23:45:15 CST 2024] <<<<<< finish DM-INIT_SCHEMA optimistic >>>>>>
[Thu May 23 23:45:15 CST 2024] <<<<<< start DM-DROP_COLUMN_ALL_DONE optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
check diff successfully
dmctl test cmd: "pause-relay -s mysql-replica-01"
dmctl test cmd: "resume-relay -s mysql-replica-01"
check log contain failed 12-th time, retry later
dmctl test cmd: "query-status test"
got=2 expected=2
restart dm-worker 2
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:45:17 CST 2024] <<<<<< finish DM-129 pessimistic >>>>>>
[Thu May 23 23:45:17 CST 2024] <<<<<< start DM-129 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-worker2 exit...
process dm-worker2 already exit
[Thu May 23 23:45:17 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
check log contain failed 13-th time, retry later
relay logs dm-it-e71cf6e8-3edc-4614-925a-82d4b88b90cf-w5ncw-9kcft-bin.000002
relay.meta
check diff successfully
check dump files have been cleaned
ls: cannot access /tmp/dm_test/all_mode/worker2/dumped_data.t-Ë!s`t: No such file or directory
worker2 auto removed dump files
check no password in log
dmctl test cmd: "query-status t-Ë!s`t"
got=1 expected=1
dmctl test cmd: "stop-task t-Ë!s`t"
matched
matched
[Thu May 23 23:45:18 CST 2024] <<<<<< start test_source_and_target_with_empty_gtid >>>>>>
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "query-status test"
got=2 expected=2
check log contain failed 1-th time, retry later
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:45:19 CST 2024] <<<<<< finish DM-129 optimistic >>>>>>
wait process dm-master.test exit...
process dm-master.test already exit
[Thu May 23 23:45:20 CST 2024] <<<<<< start DM-130 pessimistic >>>>>>
check log contain failed 14-th time, retry later
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4/conf/double-source-pessimistic.yaml --remove-meta"
dmctl test cmd: "query-status test"
wait process dm-worker.test exit...
got=1 expected=1
restart dm-master
wait process dm-worker.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "stop-task test"
wait process dm-master exit...
process dm-master already exit
[Thu May 23 23:45:22 CST 2024] <<<<<< finish DM-130 pessimistic >>>>>>
[Thu May 23 23:45:22 CST 2024] <<<<<< start DM-130 optimistic >>>>>>
check log contain failed 15-th time, retry later
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:45:22 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /tmp/dm_test/all_mode/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
dmctl test cmd: "query-status test"
got=2 expected=2
check log contain failed 1-th time, retry later
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:45:24 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /tmp/dm_test/all_mode/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
check log contain failed 16-th time, retry later
[Thu May 23 23:45:24 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/all_mode/source1.yaml"
got=2 expected=2
got=1 expected=1
check master alive
dmctl test cmd: "list-member"
got=1 expected=1
gtid is empty
start task and check stage
dmctl test cmd: "start-task /tmp/dm_test/all_mode/dm-task-no-gtid.yaml --remove-meta=true"
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
check log contain failed 17-th time, retry later
dmctl test cmd: "query-status test"
got=2 expected=2
got=1 expected=1
got=2 expected=2
got=1 expected=1
dmctl test cmd: "shard-ddl-lock unlock test-`shardddl`.`tb` -s mysql-replica-02 -d shardddl1 -t tb1 --action skip"
dmctl test cmd: "query-status test"
got=2 expected=2
got=1 expected=1
got=2 expected=2
got=1 expected=1
dmctl test cmd: "shard-ddl-lock unlock test-`shardddl`.`tb` -s mysql-replica-02 -d shardddl1 -t tb2 --action skip"
dmctl test cmd: "query-status test"
check diff successfully
dmctl test cmd: "stop-task test"
got=2 expected=2
dmctl test cmd: "query-status test"
got=2 expected=2
got=1 expected=1
got=2 expected=2
check data
check diff successfully
ERROR 1146 (42S02) at line 1: Table 'all_mode.t2' doesn't exist
run tidb sql failed 1-th time, retry later
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
restart dm-worker 2
[Thu May 23 23:45:27 CST 2024] <<<<<< finish DM-130 optimistic >>>>>>
wait process dm-worker2 exit...
process dm-worker2 already exit
[Thu May 23 23:45:28 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
check log contain failed 18-th time, retry later
wait process dm-master.test exit...
process dm-master.test already exit
dmctl test cmd: "query-status test"
got=1 expected=1
check log contain failed 1-th time, retry later
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "query-status test"
got=3 expected=3
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:45:29 CST 2024] <<<<<< finish DM-DROP_COLUMN_ALL_DONE optimistic >>>>>>
[Thu May 23 23:45:29 CST 2024] <<<<<< start DM-RECOVER_LOCK optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-worker.test exit...
check log contain failed 19-th time, retry later
dmctl test cmd: "query-status test"
got=2 expected=2
check log contain failed 1-th time, retry later
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:45:32 CST 2024] <<<<<< test case shardddl4 success! >>>>>>
start running case: [shardddl4_1] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:45:32 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
check log contain failed 20-th time, retry later
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:45:33 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
check log contain failed 1-th time, retry later
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/shardddl4_1/source1.yaml"
check log contain failed 21-th time, retry later
[Thu May 23 23:45:34 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
restart dm-master
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/shardddl4_1/source2.yaml"
check log contain failed 22-th time, retry later
wait process dm-master exit...
process dm-master already exit
[Thu May 23 23:45:36 CST 2024] <<<<<< start DM-TABLE_CHECKPOINT_BACKWARD optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
dmctl test cmd: "query-status test"
got=2 expected=2
[Thu May 23 23:45:38 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
check log contain failed 23-th time, retry later
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
rpc addr 127.0.0.1:8261 is alive
check log contain failed 1-th time, retry later
check log contain failed 24-th time, retry later
restart dm-master
check log contain failed 25-th time, retry later
wait process dm-master exit...
process dm-master already exit
check log contain failed 26-th time, retry later
[Thu May 23 23:45:45 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
check log contain failed 27-th time, retry later
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
rpc addr 127.0.0.1:8261 is alive
check diff failed 1-th time, retry later
check log contain failed 28-th time, retry later
check diff successfully
dmctl test cmd: "shard-ddl-lock"
got=1 expected=1
dmctl test cmd: "stop-task test"
[Thu May 23 23:45:50 CST 2024] <<<<<< finish DM-RECOVER_LOCK optimistic >>>>>>
run DM_DropAddColumn case #0
[Thu May 23 23:45:50 CST 2024] <<<<<< start DM-DropAddColumn optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
check log contain failed 29-th time, retry later
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
check log contain failed 1-th time, retry later
check log contain failed 30-th time, retry later
check log contain failed 1-th time, retry later
dmctl test cmd: "shard-ddl-lock"
got=1 expected=1
dmctl test cmd: "query-status test"
got=3 expected=3
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
check diff failed 1-th time, retry later
check diff failed 2-th time, retry later
check diff successfully
[Thu May 23 23:45:56 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/duplicate_event/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/duplicate_event/source2.yaml"
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/duplicate_event/conf/dm-task-relay.yaml --remove-meta"
use sync_diff_inspector to check increment data
check diff successfully
check diff successfully
data checked after one worker was killed
try to kill worker port 8263
wait process dm-worker2 exit...
process dm-worker2 already exit
worker2 was killed
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test2"
check diff failed 3-th time, retry later
got=2 expected=2
[Thu May 23 23:45:59 CST 2024] <<<<<< finish test_multi_task_reduce_and_restart_worker >>>>>>
3 dm-master alive
3 dm-worker alive
0 dm-syncer alive
check diff failed 1-th time, retry later
check diff successfully
dmctl test cmd: "start-relay -s mysql-replica-02 worker2"
wait process dm-master.test exit...
check diff failed 2-th time, retry later
check diff failed at last
dmctl test cmd: "binlog skip test"
got=2 expected=2
got=1 expected=1
dmctl test cmd: "pause-task test"
dmctl test cmd: "resume-task test"
got=2 expected=2
dmctl test cmd: "query-status -s mysql-replica-02"
got=1 expected=1
check diff successfully
dmctl test cmd: "stop-task test"
check diff successfully
binlog_pos: 2356 relay_log_size: 2356
============== run_with_prepared_source_config success ===================
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
dmctl test cmd: "query-status test"
got=1 expected=1
[Thu May 23 23:46:01 CST 2024] <<<<<< finish DM-DropAddColumn optimistic >>>>>>
run DM_DropAddColumn case #1
[Thu May 23 23:46:01 CST 2024] <<<<<< start DM-DropAddColumn optimistic >>>>>>
wait process dm-master.test exit...
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-master.test exit...
check diff failed 3-th time, retry later
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
restart dm-master
wait process dm-master.test exit...
wait process dm-worker.test exit...
wait process dm-master exit...
process dm-master already exit
wait process dm-master.test exit...
wait process dm-worker.test exit...
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:46:05 CST 2024] <<<<<< finish DM-TABLE_CHECKPOINT_BACKWARD optimistic >>>>>>
[Thu May 23 23:46:05 CST 2024] <<<<<< start DM-RESYNC_NOT_FLUSHED optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:46:05 CST 2024] <<<<<< test case duplicate_event success! >>>>>>
start running case: [expression_filter] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/expression_filter/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/expression_filter/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:46:05 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/expression_filter/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
restart dm-worker1
[Thu May 23 23:46:06 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:46:06 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/expression_filter/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait process dm-master.test exit...
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:46:07 CST 2024] <<<<<< test case ha_cases2 success! >>>>>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-release-7.1-pull_dm_integration_test-220/tiflow-dm already exists)
wait process worker1 exit...
process worker1 already exit
[Thu May 23 23:46:07 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/expression_filter/conf/source1.yaml"
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/expression_filter/conf/dm-task2.yaml"
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
rpc addr 127.0.0.1:8262 is alive
restart dm-worker2
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
rpc addr 127.0.0.1:8261 is alive
check log contain failed 1-th time, retry later
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "stop-task test"
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/expression_filter/conf/dm-task2.yaml "
wait process worker2 exit...
process worker2 already exit
[Thu May 23 23:46:09 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "query-status test"
got=2 expected=2
got=0 expected=1
command: query-status test "synced": true count: 0 != expected: 1, failed the 0-th time, will retry again
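
This line spells out the harness's check mechanism: run a dmctl command, count occurrences of a pattern in its output, and retry on mismatch (the "got=N expected=N" lines throughout this log are the same check succeeding). A minimal bash sketch, with the helper name and retry bound assumed:

    check_count_with_retry() {
        # $cmd is intentionally unquoted below so "query-status test" splits into args.
        local cmd="$1" pattern="$2" expected="$3" got
        for i in $(seq 0 9); do
            got=$(dmctl --master-addr 127.0.0.1:8261 $cmd | grep -c "$pattern")
            echo "got=$got expected=$expected"
            [ "$got" -eq "$expected" ] && return 0
            echo "command: $cmd $pattern count: $got != expected: $expected, failed the $i-th time, will retry again"
            sleep 2
        done
        return 1
    }

    check_count_with_retry "query-status test" '"synced": true' 1
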
dmctl test cmd: "shard-ddl-lock"
got=1 expected=1
dmctl test cmd: "query-status test"
got=3 expected=3
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
check diff failed 1-th time, retry later
got=2 expected=2
got=1 expected=1
dmctl test cmd: "stop-task test"
ls: cannot access /tmp/dm_test/expression_filter/worker1/schema-tracker*: No such file or directory
schema tracker path has been cleaned
1 dm-master alive
1 dm-worker alive
0 dm-syncer alive
check diff failed 2-th time, retry later
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-worker.test exit...
check diff failed 3-th time, retry later
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:46:17 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/expression_filter/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
check diff failed at last
dmctl test cmd: "binlog skip test"
got=2 expected=2
got=1 expected=1
dmctl test cmd: "pause-task test"
dmctl test cmd: "resume-task test"
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:46:18 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/expression_filter/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:46:18 CST 2024] <<<<<< finish DM-DropAddColumn optimistic >>>>>>
run DM_DropAddColumn case #2
[Thu May 23 23:46:18 CST 2024] <<<<<< start DM-DropAddColumn optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/expression_filter/conf/source1.yaml"
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/expression_filter/conf/dm-task.yaml "
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
restart dm-master
wait process dm-master exit...
process dm-master already exit
[Thu May 23 23:46:21 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
[Thu May 23 23:46:22 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
[Thu May 23 23:46:23 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "shard-ddl-lock"
got=1 expected=1
dmctl test cmd: "query-status test"
got=2 expected=2
got=1 expected=1
1 dm-master alive
1 dm-worker alive
0 dm-syncer alive
dmctl test cmd: "query-status test"
got=3 expected=3
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
check diff failed 1-th time, retry later
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-worker.test exit...
check diff failed 2-th time, retry later
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:46:29 CST 2024] <<<<<< test case expression_filter success! >>>>>>
start running case: [extend_column] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/extend_column/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/extend_column/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
running extend_column case with import_mode: sql
[Thu May 23 23:46:30 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/extend_column/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
check diff failed 3-th time, retry later
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:46:31 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/extend_column/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
[Thu May 23 23:46:32 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/extend_column/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
check diff failed at last
dmctl test cmd: "binlog skip test"
got=2 expected=2
got=1 expected=1
dmctl test cmd: "pause-task test"
dmctl test cmd: "resume-task test"
check diff successfully
dmctl test cmd: "stop-task test"
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/extend_column/source1.yaml"
[Thu May 23 23:46:33 CST 2024] <<<<<< finish DM-DropAddColumn optimistic >>>>>>
run DM_DropAddColumn case #3
[Thu May 23 23:46:33 CST 2024] <<<<<< start DM-DropAddColumn optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
dmctl test cmd: "operate-source create /tmp/dm_test/extend_column/source2.yaml"
dmctl test cmd: "query-status test"
got=1 expected=1
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
dmctl test cmd: "start-task /tmp/dm_test/extend_column/dm-task.yaml --remove-meta"
dmctl test cmd: "start-task /tmp/dm_test/extend_column/dm-task.yaml --remove-meta"
check log contain failed 1-th time, retry later
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
dmctl test cmd: "shard-ddl-lock"
got=1 expected=1
restart dm-master
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-master exit...
process dm-master already exit
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
running extend_column case with import_mode: loader
[Thu May 23 23:46:40 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/extend_column/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
[Thu May 23 23:46:40 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:46:42 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/extend_column/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "query-status test"
got=3 expected=3
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
check diff failed 1-th time, retry later
rpc addr 127.0.0.1:8262 is alive
[Thu May 23 23:46:43 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/extend_column/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/extend_column/source1.yaml"
dmctl test cmd: "operate-source create /tmp/dm_test/extend_column/source2.yaml"
check diff successfully
restart dm-worker1
wait process worker1 exit...
process worker1 already exit
[Thu May 23 23:46:45 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
check diff failed 2-th time, retry later
rpc addr 127.0.0.1:8262 is alive
restart dm-worker2
dmctl test cmd: "start-task /tmp/dm_test/extend_column/dm-task.yaml --remove-meta"
dmctl test cmd: "start-task /tmp/dm_test/extend_column/dm-task.yaml --remove-meta"
wait process worker2 exit...
process worker2 already exit
[Thu May 23 23:46:47 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
check diff failed 3-th time, retry later
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "stop-task test"
[Thu May 23 23:46:48 CST 2024] <<<<<< finish DM-RESYNC_NOT_FLUSHED optimistic >>>>>>
[Thu May 23 23:46:48 CST 2024] <<<<<< start DM-RESYNC_TXN_INTERRUPT optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
process dm-master.test already exit
check diff failed at last
dmctl test cmd: "binlog skip test"
got=2 expected=2
got=1 expected=1
dmctl test cmd: "pause-task test"
dmctl test cmd: "resume-task test"
check diff successfully
dmctl test cmd: "stop-task test"
dmctl test cmd: "query-status test"
got=2 expected=2
restart dm-worker1
wait process dm-worker.test exit...
[Thu May 23 23:46:50 CST 2024] <<<<<< finish DM-DropAddColumn optimistic >>>>>>
run DM_DropAddColumn case #4
[Thu May 23 23:46:50 CST 2024] <<<<<< start DM-DropAddColumn optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
wait process worker1 exit...
process worker1 already exit
[Thu May 23 23:46:50 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait process dm-worker.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
check log contain failed 1-th time, retry later
rpc addr 127.0.0.1:8262 is alive
restart dm-worker2
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:46:52 CST 2024] <<<<<< test case extend_column success! >>>>>>
start running case: [fake_rotate_event] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/fake_rotate_event/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/fake_rotate_event/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:46:52 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/fake_rotate_event/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait process worker2 exit...
process worker2 already exit
[Thu May 23 23:46:52 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:46:53 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/fake_rotate_event/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
begin;
insert into shardddl1.tb2 values (1,1);
insert into shardddl1.tb2 values (2,2);
insert into shardddl1.tb2 values (3,3);
insert into shardddl1.tb2 values (4,4);
insert into shardddl1.tb2 values (5,5);
insert into shardddl1.tb2 values (6,6);
insert into shardddl1.tb2 values (7,7);
insert into shardddl1.tb2 values (8,8);
insert into shardddl1.tb2 values (9,9);
insert into shardddl1.tb2 values (10,10);
commit;
begin;
insert into shardddl1.t_1 values (11,11);
insert into shardddl1.t_1 values (12,12);
insert into shardddl1.t_1 values (13,13);
insert into shardddl1.t_1 values (14,14);
insert into shardddl1.t_1 values (15,15);
insert into shardddl1.t_1 values (16,16);
insert into shardddl1.t_1 values (17,17);
insert into shardddl1.t_1 values (18,18);
insert into shardddl1.t_1 values (19,19);
insert into shardddl1.t_1 values (20,20);
insert into shardddl1.t_1 values (21,21);
insert into shardddl1.t_1 values (22,22);
insert into shardddl1.t_1 values (23,23);
insert into shardddl1.t_1 values (24,24);
insert into shardddl1.t_1 values (25,25);
insert into shardddl1.t_1 values (26,26);
insert into shardddl1.t_1 values (27,27);
insert into shardddl1.t_1 values (28,28);
insert into shardddl1.t_1 values (29,29);
insert into shardddl1.t_1 values (30,30);
insert into shardddl1.t_1 values (31,31);
insert into shardddl1.t_1 values (32,32);
insert into shardddl1.t_1 values (33,33);
insert into shardddl1.t_1 values (34,34);
insert into shardddl1.t_1 values (35,35);
insert into shardddl1.t_1 values (36,36);
insert into shardddl1.t_1 values (37,37);
insert into shardddl1.t_1 values (38,38);
insert into shardddl1.t_1 values (39,39);
insert into shardddl1.t_1 values (40,40);
insert into shardddl1.t_1 values (41,41);
insert into shardddl1.t_1 values (42,42);
insert into shardddl1.t_1 values (43,43);
insert into shardddl1.t_1 values (44,44);
insert into shardddl1.t_1 values (45,45);
insert into shardddl1.t_1 values (46,46);
insert into shardddl1.t_1 values (47,47);
insert into shardddl1.t_1 values (48,48);
insert into shardddl1.t_1 values (49,49);
insert into shardddl1.t_1 values (50,50);
commit;
begin;
insert into shardddl1.tb1 values (51,51);
insert into shardddl1.tb1 values (52,52);
insert into shardddl1.tb1 values (53,53);
insert into shardddl1.tb1 values (54,54);
insert into shardddl1.tb1 values (55,55);
insert into shardddl1.tb1 values (56,56);
insert into shardddl1.tb1 values (57,57);
insert into shardddl1.tb1 values (58,58);
insert into shardddl1.tb1 values (59,59);
insert into shardddl1.tb1 values (60,60);
commit;
begin;
insert into shardddl1.t_1 values (61,61);
insert into shardddl1.t_1 values (62,62);
insert into shardddl1.t_1 values (63,63);
insert into shardddl1.t_1 values (64,64);
insert into shardddl1.t_1 values (65,65);
insert into shardddl1.t_1 values (66,66);
insert into shardddl1.t_1 values (67,67);
insert into shardddl1.t_1 values (68,68);
insert into shardddl1.t_1 values (69,69);
insert into shardddl1.t_1 values (70,70);
commit;
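
The unrolled INSERT batches above are just sequential (i,i) rows wrapped in transactions to exercise resync across an interrupted transaction boundary. An equivalent generator in bash, piping into the mysql client; host and credentials are placeholders:

    gen_batch() {
        # Emit one begin/commit block of sequential (i,i) inserts for a table.
        local table="$1" from="$2" to="$3"
        echo "begin;"
        for i in $(seq "$from" "$to"); do
            echo "insert into $table values ($i,$i);"
        done
        echo "commit;"
    }

    gen_batch shardddl1.tb2 1 10  | mysql -h127.0.0.1 -P3306 -uroot   # placeholder DSN
    gen_batch shardddl1.t_1 11 50 | mysql -h127.0.0.1 -P3306 -uroot
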
dmctl test cmd: "shard-ddl-lock"
got=1 expected=1
restart dm-master
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/fake_rotate_event/source1.yaml"
check diff failed 1-th time, retry later
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/fake_rotate_event/conf/dm-task.yaml --remove-meta"
wait process dm-master exit...
process dm-master already exit
dmctl test cmd: "query-status test"
got=2 expected=2
got=1 expected=1
check diff successfully
kill dm-worker
check diff failed 2-th time, retry later
wait process dm-worker1 exit...
process dm-worker1 already exit
[Thu May 23 23:46:56 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/fake_rotate_event/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
[Thu May 23 23:46:56 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
check diff successfully
dmctl test cmd: "query-status test"
got=2 expected=2
got=1 expected=1
1 dm-master alive
1 dm-worker alive
0 dm-syncer alive
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
check diff failed 3-th time, retry later
wait process dm-master.test exit...
process dm-master.test already exit
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "query-status test"
got=3 expected=3
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
check diff failed 1-th time, retry later
wait process dm-worker.test exit...
check diff successfully
restart dm-worker1
wait process worker1 exit...
process worker1 already exit
[Thu May 23 23:47:01 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait process dm-worker.test exit...
check diff failed 2-th time, retry later
rpc addr 127.0.0.1:8262 is alive
restart dm-worker2
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:47:02 CST 2024] <<<<<< test case fake_rotate_event success! >>>>>>
start running case: [foreign_key] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/foreign_key/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/foreign_key/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:47:02 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/foreign_key/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:47:03 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/foreign_key/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait process worker2 exit...
process worker2 already exit
[Thu May 23 23:47:03 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
check diff failed 3-th time, retry later
rpc addr 127.0.0.1:8263 is alive
begin;
insert into shardddl1.tb2 values (101,101);
insert into shardddl1.tb2 values (102,102);
insert into shardddl1.tb2 values (103,103);
insert into shardddl1.tb2 values (104,104);
insert into shardddl1.tb2 values (105,105);
insert into shardddl1.tb2 values (106,106);
insert into shardddl1.tb2 values (107,107);
insert into shardddl1.tb2 values (108,108);
insert into shardddl1.tb2 values (109,109);
insert into shardddl1.tb2 values (110,110);
commit;
begin;
insert into shardddl1.tb1 values (111,111);
insert into shardddl1.tb1 values (112,112);
insert into shardddl1.tb1 values (113,113);
insert into shardddl1.tb1 values (114,114);
insert into shardddl1.tb1 values (115,115);
insert into shardddl1.tb1 values (116,116);
insert into shardddl1.tb1 values (117,117);
insert into shardddl1.tb1 values (118,118);
insert into shardddl1.tb1 values (119,119);
insert into shardddl1.tb1 values (120,120);
commit;
begin;
insert into shardddl1.tb2 values (121,121);
insert into shardddl1.tb2 values (122,122);
insert into shardddl1.tb2 values (123,123);
insert into shardddl1.tb2 values (124,124);
insert into shardddl1.tb2 values (125,125);
insert into shardddl1.tb2 values (126,126);
insert into shardddl1.tb2 values (127,127);
insert into shardddl1.tb2 values (128,128);
insert into shardddl1.tb2 values (129,129);
insert into shardddl1.tb2 values (130,130);
commit;
begin;
insert into shardddl1.t_1 values (131,131);
insert into shardddl1.t_1 values (132,132);
insert into shardddl1.t_1 values (133,133);
insert into shardddl1.t_1 values (134,134);
insert into shardddl1.t_1 values (135,135);
insert into shardddl1.t_1 values (136,136);
insert into shardddl1.t_1 values (137,137);
insert into shardddl1.t_1 values (138,138);
insert into shardddl1.t_1 values (139,139);
insert into shardddl1.t_1 values (140,140);
commit;
check diff successfully
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/foreign_key/source1.yaml"
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/foreign_key/conf/dm-task.yaml --remove-meta"
begin;
insert into shardddl1.tb2 values (201,201);
insert into shardddl1.tb2 values (202,202);
insert into shardddl1.tb2 values (203,203);
insert into shardddl1.tb2 values (204,204);
insert into shardddl1.tb2 values (205,205);
insert into shardddl1.tb2 values (206,206);
insert into shardddl1.tb2 values (207,207);
insert into shardddl1.tb2 values (208,208);
insert into shardddl1.tb2 values (209,209);
insert into shardddl1.tb2 values (210,210);
commit;
begin;
insert into shardddl1.tb1 values (211,211);
insert into shardddl1.tb1 values (212,212);
insert into shardddl1.tb1 values (213,213);
insert into shardddl1.tb1 values (214,214);
insert into shardddl1.tb1 values (215,215);
insert into shardddl1.tb1 values (216,216);
insert into shardddl1.tb1 values (217,217);
insert into shardddl1.tb1 values (218,218);
insert into shardddl1.tb1 values (219,219);
insert into shardddl1.tb1 values (220,220);
commit;
begin;
insert into shardddl1.tb2 values (221,221);
insert into shardddl1.tb2 values (222,222);
insert into shardddl1.tb2 values (223,223);
insert into shardddl1.tb2 values (224,224);
insert into shardddl1.tb2 values (225,225);
insert into shardddl1.tb2 values (226,226);
insert into shardddl1.tb2 values (227,227);
insert into shardddl1.tb2 values (228,228);
insert into shardddl1.tb2 values (229,229);
insert into shardddl1.tb2 values (230,230);
commit;
begin;
insert into shardddl1.t_1 values (231,231);
insert into shardddl1.t_1 values (232,232);
insert into shardddl1.t_1 values (233,233);
insert into shardddl1.t_1 values (234,234);
insert into shardddl1.t_1 values (235,235);
insert into shardddl1.t_1 values (236,236);
insert into shardddl1.t_1 values (237,237);
insert into shardddl1.t_1 values (238,238);
insert into shardddl1.t_1 values (239,239);
insert into shardddl1.t_1 values (240,240);
commit;
check diff failed 1-th time, retry later
dmctl test cmd: "query-status test"
got=1 expected=1
<<<<<< test_source_and_target_with_empty_gtid success! >>>>>>
1 dm-master alive
1 dm-worker alive
0 dm-syncer alive
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:47:05 CST 2024] <<<<<< test case all_mode success! >>>>>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-release-7.1-pull_dm_integration_test-220/tiflow-dm already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
dmctl test cmd: "query-status test"
got=1 expected=1
check diff successfully
dmctl test cmd: "query-status test"
got=2 expected=2
got=1 expected=1
got=2 expected=2
1 dm-master alive
[Pipeline] }
[Pipeline] // container
check diff failed at last
dmctl test cmd: "binlog skip test"
got=2 expected=2
got=1 expected=1
dmctl test cmd: "pause-task test"
dmctl test cmd: "resume-task test"
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
1 dm-worker alive
0 dm-syncer alive
[Pipeline] // node
[Pipeline] }
check diff successfully
dmctl test cmd: "stop-task test"
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Thu May 23 23:47:06 CST 2024] <<<<<< finish DM-DropAddColumn optimistic >>>>>>
run DM_DropAddColumn case #5
[Thu May 23 23:47:06 CST 2024] <<<<<< start DM-DropAddColumn optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
process dm-master.test already exit
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:47:07 CST 2024] <<<<<< finish DM-RESYNC_TXN_INTERRUPT optimistic >>>>>>
[Thu May 23 23:47:07 CST 2024] <<<<<< start DM-STRICT_OPTIMISTIC_SINGLE_SOURCE optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/single-source-strict-optimistic.yaml --remove-meta"
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
wait process dm-worker.test exit...
check log contain failed 1-th time, retry later
dmctl test cmd: "query-status test"
got=1 expected=1
dmctl test cmd: "query-status test"
got=1 expected=1
dmctl test cmd: "stop-task test"
[Thu May 23 23:47:09 CST 2024] <<<<<< finish DM-STRICT_OPTIMISTIC_SINGLE_SOURCE optimistic >>>>>>
[Thu May 23 23:47:09 CST 2024] <<<<<< start DM-STRICT_OPTIMISTIC_DOUBLE_SOURCE optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-strict-optimistic.yaml --remove-meta"
wait process dm-worker.test exit...
dmctl test cmd: "shard-ddl-lock"
got=1 expected=1
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:47:10 CST 2024] <<<<<< test case foreign_key success! >>>>>>
start running case: [full_mode] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/full_mode/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/full_mode/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:47:10 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/full_mode/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
dmctl test cmd: "query-status test"
got=3 expected=3
got=2 expected=2
restart dm-master
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
dmctl test cmd: "stop-task test"
[Thu May 23 23:47:11 CST 2024] <<<<<< finish DM-STRICT_OPTIMISTIC_DOUBLE_SOURCE optimistic >>>>>>
[Thu May 23 23:47:11 CST 2024] <<<<<< start DM-131 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:47:11 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/full_mode/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait process dm-master exit...
process dm-master already exit
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/full_mode/source1.yaml"
[Thu May 23 23:47:13 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/full_mode/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:47:13 CST 2024] <<<<<< finish DM-131 optimistic >>>>>>
[Thu May 23 23:47:13 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/full_mode/source2.yaml"
[Thu May 23 23:47:14 CST 2024] <<<<<< start DM-132 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
dmctl test cmd: "start-task /tmp/dm_test/full_mode/dm-task.yaml --remove-meta"
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
dmctl test cmd: "query-status test"
got=2 expected=2
check diff failed 1-th time, retry later
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "query-status test"
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 0-th time, will retry again
dmctl test cmd: "query-status test"
got=2 expected=2
got=1 expected=1
got=1 expected=1
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
wait process dm-master.test exit...
process dm-master.test already exit
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:47:18 CST 2024] <<<<<< finish DM-132 pessimistic >>>>>>
[Thu May 23 23:47:18 CST 2024] <<<<<< start DM-132 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 1-th time, will retry again
wait process dm-worker.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:47:20 CST 2024] <<<<<< finish DM-132 optimistic >>>>>>
wait process dm-worker.test exit...
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 2-th time, will retry again
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:47:21 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/full_mode/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
[Thu May 23 23:47:21 CST 2024] <<<<<< start DM-133 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
dmctl test cmd: "query-status test"
got=2 expected=2
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 3-th time, will retry again
check diff failed 1-th time, retry later
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:47:23 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/full_mode/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
[Thu May 23 23:47:24 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/full_mode/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:47:25 CST 2024] <<<<<< finish DM-133 pessimistic >>>>>>
[Thu May 23 23:47:25 CST 2024] <<<<<< start DM-133 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 4-th time, will retry again
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/full_mode/source1.yaml"
dmctl test cmd: "operate-source create /tmp/dm_test/full_mode/source2.yaml"
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:47:27 CST 2024] <<<<<< finish DM-133 optimistic >>>>>>
dmctl test cmd: "start-task /tmp/dm_test/full_mode/dm-task.yaml --remove-meta"
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 5-th time, will retry again
[Thu May 23 23:47:28 CST 2024] <<<<<< start DM-134 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
check diff failed 1-th time, retry later
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 6-th time, will retry again
dmctl test cmd: "query-status test"
got=2 expected=2
check diff failed 1-th time, retry later
check diff successfully
check dump files have been cleaned
ls: cannot access /tmp/dm_test/full_mode/worker1/dumped_data.test: No such file or directory
worker1 auto removed dump files
ls: cannot access /tmp/dm_test/full_mode/worker2/dumped_data.test: No such file or directory
worker2 auto removed dump files
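
The two `ls: cannot access ... No such file or directory` lines are the expected outcome of the cleanup check: once the load unit finishes, each worker deletes its dumped data directory. A minimal sketch of such a check, assuming the dump directory layout shown above (the helper name is hypothetical):

    # Succeeds only when the dump directory is gone, i.e. the worker
    # cleaned up after the load unit finished.
    check_dump_cleaned() {
        local dir=$1 name=$2
        if ls "$dir" >/dev/null 2>&1; then
            echo "$name did not remove dump files"
            exit 1
        fi
        echo "$name auto removed dump files"
    }
    check_dump_cleaned /tmp/dm_test/full_mode/worker1/dumped_data.test worker1
    check_dump_cleaned /tmp/dm_test/full_mode/worker2/dumped_data.test worker2
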
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
wait process dm-master.test exit...
process dm-master.test already exit
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 7-th time, will retry again
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:47:32 CST 2024] <<<<<< finish DM-134 pessimistic >>>>>>
[Thu May 23 23:47:32 CST 2024] <<<<<< start DM-134 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-worker.test exit...
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 8-th time, will retry again
dmctl test cmd: "query-status test"
got=2 expected=2
wait process dm-worker.test exit...
check diff successfully
dmctl test cmd: "stop-task test"
[Thu May 23 23:47:34 CST 2024] <<<<<< finish DM-134 optimistic >>>>>>
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Thu May 23 23:47:34 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/full_mode/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
[Thu May 23 23:47:35 CST 2024] <<<<<< start DM-135 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
rpc addr 127.0.0.1:8261 is alive
[Thu May 23 23:47:36 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/full_mode/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 9-th time, will retry again
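
The repeated `failed the N-th time, will retry again` lines come from polling query-status until its output contains an expected keyword count. A minimal retry loop in the same spirit (the helper name, dmctl flags, keyword, and bounds are illustrative assumptions, not the harness's actual implementation):

    # Poll a dmctl command until the keyword appears the expected number
    # of times, or give up after max_retry attempts.
    retry_until_match() {
        local keyword=$1 expected=$2 max_retry=$3
        for ((i = 0; i < max_retry; i++)); do
            got=$(dmctl --master-addr 127.0.0.1:8261 query-status test | grep -c "$keyword")
            [ "$got" -eq "$expected" ] && return 0
            echo "count: $got != expected: $expected, failed the $i-th time, will retry again"
            sleep 2
        done
        return 1
    }
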
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/full_mode/source1.yaml"
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "stop-task test"
[Thu May 23 23:47:37 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/full_mode/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
[Thu May 23 23:47:37 CST 2024] <<<<<< finish DM-135 pessimistic >>>>>>
[Thu May 23 23:47:37 CST 2024] <<<<<< start DM-135 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/full_mode/source2.yaml"
{
    "result": true,
    "msg": "",
    "sources": [
        {
            "result": true,
            "msg": "",
            "sourceStatus": {
                "source": "mysql-replica-01",
                "worker": "worker1",
                "result": null,
                "relayStatus": null
            },
            "subTaskStatus": [
                {
                    "name": "test",
                    "stage": "Paused",
                    "unit": "Sync",
                    "result": {
                        "isCanceled": false,
                        "errors": [
                            {
                                "ErrCode": 42501,
                                "ErrClass": "ha",
                                "ErrScope": "internal",
                                "ErrLevel": "high",
                                "Message": "startLocation: [position: (dm-it-b6c62a04-c016-4287-ab56-1db28212f4a0-jgkt3-2c5j9-bin.000001, 42228), gtid-set: 271e1620-191a-11ef-aecd-76249e6bc051:1-194], endLocation: [position: (dm-it-b6c62a04-c016-4287-ab56-1db28212f4a0-jgkt3-2c5j9-bin.000001, 42353), gtid-set: 271e1620-191a-11ef-aecd-76249e6bc051:1-195], origin SQL: [alter table shardddl1.tb1 add column b int after a]: fail to do etcd txn operation: txn commit failed",
                                "RawCause": "rpc error: code = Unavailable desc = error reading from server: EOF",
                                "Workaround": "Please check dm-master's node status and the network between this node and dm-master"
                            }
                        ],
                        "detail": null
                    },
                    "unresolvedDDLLockID": "",
                    "sync": {
                        "totalEvents": "12",
                        "totalTps": "0",
                        "recentTps": "0",
                        "masterBinlog": "(dm-it-b6c62a04-c016-4287-ab56-1db28212f4a0-jgkt3-2c5j9-bin.000001, 42353)",
                        "masterBinlogGtid": "271e1620-191a-11ef-aecd-76249e6bc051:1-195",
                        "syncerBinlog": "(dm-it-b6c62a04-c016-4287-ab56-1db28212f4a0-jgkt3-2c5j9-bin.000001, 42163)",
                        "syncerBinlogGtid": "271e1620-191a-11ef-aecd-76249e6bc051:1-194",
                        "blockingDDLs": [
                        ],
                        "unresolvedGroups": [
                        ],
                        "synced": false,
                        "binlogType": "remote",
                        "secondsBehindMaster": "0",
                        "blockDDLOwner": "",
                        "conflictMsg": "",
                        "totalRows": "12",
                        "totalRps": "0",
                        "recentRps": "0"
                    },
                    "validation": null
                }
            ]
        },
        {
            "result": true,
            "msg": "",
            "sourceStatus": {
                "source": "mysql-replica-02",
                "worker": "worker2",
                "result": null,
                "relayStatus": {
                    "masterBinlog": "(dm-it-b6c62a04-c016-4287-ab56-1db28212f4a0-jgkt3-2c5j9-bin.000001, 39206)",
                    "masterBinlogGtid": "278dbd7d-191a-11ef-90f7-76249e6bc051:1-167",
                    "relaySubDir": "278dbd7d-191a-11ef-90f7-76249e6bc051.000001",
                    "relayBinlog": "(dm-it-b6c62a04-c016-4287-ab56-1db28212f4a0-jgkt3-2c5j9-bin.000001, 39206)",
                    "relayBinlogGtid": "278dbd7d-191a-11ef-90f7-76249e6bc051:1-167",
                    "relayCatchUpMaster": true,
                    "stage": "Running",
                    "result": null
                }
            },
            "subTaskStatus": [
                {
                    "name": "test",
                    "stage": "Running",
                    "unit": "Sync",
                    "result": null,
                    "unresolvedDDLLockID": "",
                    "sync": {
                        "totalEvents": "6",
                        "totalTps": "0",
                        "recentTps": "0",
                        "masterBinlog": "(dm-it-b6c62a04-c016-4287-ab56-1db28212f4a0-jgkt3-2c5j9-bin.000001, 39206)",
                        "masterBinlogGtid": "278dbd7d-191a-11ef-90f7-76249e6bc051:1-167",
                        "syncerBinlog": "(dm-it-b6c62a04-c016-4287-ab56-1db28212f4a0-jgkt3-2c5j9-bin|000001.000001, 38926)",
                        "syncerBinlogGtid": "278dbd7d-191a-11ef-90f7-76249e6bc051:1-166",
                        "blockingDDLs": [
                        ],
                        "unresolvedGroups": [
                        ],
                        "synced": false,
                        "binlogType": "local",
                        "secondsBehindMaster": "0",
                        "blockDDLOwner": "",
                        "conflictMsg": "",
                        "totalRows": "6",
                        "totalRps": "0",
                        "recentRps": "0"
                    },
                    "validation": null
                }
            ]
        }
    ]
}
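
The JSON above is a raw query-status response: source mysql-replica-01 is Paused on an etcd txn commit failure (ErrCode 42501) while mysql-replica-02 is still Running. A minimal sketch of pulling the per-source stage and first error out of such a payload with jq (the file name is a placeholder):

    # Summarize subtask stage and first error message per source.
    jq -r '.sources[]
        | .sourceStatus.source + ": " + .subTaskStatus[0].stage
          + " (" + (.subTaskStatus[0].result.errors[0].Message // "no error") + ")"' status.json
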
PASS
	github.com/pingcap/tiflow/dm/checker	coverage: 0.3% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/common	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/config	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/config/dbconfig	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/config/security	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/ctl	coverage: 49.5% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/ctl/common	coverage: 45.1% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/ctl/master	coverage: 13.8% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/loader	coverage: 0.2% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/openapi	coverage: 0.3% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pb	coverage: 4.5% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/binlog	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/binlog/common	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/binlog/event	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/binlog/reader	coverage: 0.5% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/checker	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/conn	coverage: 0.3% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/context	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/cputil	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/dumpling	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/encrypt	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/etcdutil	coverage: 2.2% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/func-rollback	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/gtid	coverage: 57.5% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/ha	coverage: 0.1% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/helper	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/log	coverage: 30.4% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/parser	coverage: 3.1% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/retry	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/storage	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/terror	coverage: 3.3% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/pkg/utils	coverage: 4.3% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/syncer/dbconn	coverage: 0.5% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/syncer/metrics	coverage: 24.8% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/syncer/online-ddl-tools	coverage: 0.4% of statements in github.com/pingcap/tiflow/dm/...
	github.com/pingcap/tiflow/dm/unit	coverage: 0.0% of statements in github.com/pingcap/tiflow/dm/...
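
The per-package table above is printed as the instrumented test binaries flush their profiles; the cov.*.out files listed later under /tmp/dm_test are the corresponding raw profiles. Assuming they are standard Go coverage profiles, a single one can be inspected directly (the path here is illustrative):

    # Per-function coverage from one raw profile.
    go tool cover -func=/tmp/dm_test/cov.shardddl1.master.out

    # Or render an annotated HTML report.
    go tool cover -html=/tmp/dm_test/cov.shardddl1.master.out -o coverage.html
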
curl: (7) Failed connect to 127.0.0.1:8361; Connection refused
curl: (7) Failed connect to 127.0.0.1:8461; Connection refused
curl: (7) Failed connect to 127.0.0.1:8561; Connection refused
curl: (7) Failed connect to 127.0.0.1:8661; Connection refused
curl: (7) Failed connect to 127.0.0.1:8761; Connection refused
curl: (7) Failed connect to 127.0.0.1:8264; Connection refused
curl: (7) Failed connect to 127.0.0.1:18262; Connection refused
curl: (7) Failed connect to 127.0.0.1:18263; Connection refused
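
The connection-refused lines are expected: after the tests stop, the harness probes each known dm-master and dm-worker port to confirm nothing is still listening. A minimal version of that probe, with the ports copied from the output above (the URL path and curl options are assumptions):

    # Verify no dm-master/dm-worker process is still serving on its port.
    for port in 8361 8461 8561 8661 8761 8264 18262 18263; do
        if curl -s -o /dev/null "http://127.0.0.1:${port}/status"; then
            echo "port ${port} is unexpectedly alive"
            exit 1
        fi
    done
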
make: *** [dm_integration_test_in_group] Error 1
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
Post stage
[Pipeline] sh
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "stop-task test"
+ ls /tmp/dm_test
cov.shardddl1.dmctl.1716478722.646.out
cov.shardddl1.dmctl.1716478723.775.out
cov.shardddl1.dmctl.1716478727.1026.out
cov.shardddl1.dmctl.1716478728.1082.out
cov.shardddl1.dmctl.1716478751.4240.out
cov.shardddl1.dmctl.1716478754.4549.out
cov.shardddl1.dmctl.1716478755.4602.out
cov.shardddl1.dmctl.1716478786.5122.out
cov.shardddl1.dmctl.1716478788.5442.out
cov.shardddl1.dmctl.1716478789.5491.out
cov.shardddl1.dmctl.1716478806.6106.out
cov.shardddl1.dmctl.1716478809.6423.out
cov.shardddl1.dmctl.1716478810.6479.out
cov.shardddl1.dmctl.1716478810.6580.out
cov.shardddl1.dmctl.1716478810.6742.out
cov.shardddl1.dmctl.1716478812.6791.out
cov.shardddl1.dmctl.1716478814.6917.out
cov.shardddl1.dmctl.1716478814.7079.out
cov.shardddl1.dmctl.1716478816.7144.out
cov.shardddl1.dmctl.1716478816.7279.out
cov.shardddl1.dmctl.1716478816.7334.out
cov.shardddl1.dmctl.1716478816.7387.out
cov.shardddl1.dmctl.1716478816.7441.out
cov.shardddl1.dmctl.1716478817.7499.out
cov.shardddl1.dmctl.1716478823.7672.out
cov.shardddl1.dmctl.1716478825.7760.out
cov.shardddl1.dmctl.1716478825.7806.out
cov.shardddl1.dmctl.1716478825.7854.out
cov.shardddl1.dmctl.1716478826.8011.out
cov.shardddl1.dmctl.1716478827.8063.out
cov.shardddl1.dmctl.1716478829.8277.out
cov.shardddl1.dmctl.1716478830.8438.out
cov.shardddl1.dmctl.1716478831.8479.out
cov.shardddl1.dmctl.1716478831.8612.out
cov.shardddl1.dmctl.1716478831.8663.out
cov.shardddl1.dmctl.1716478832.8826.out
cov.shardddl1.dmctl.1716478833.8878.out
cov.shardddl1.dmctl.1716478833.8994.out
cov.shardddl1.dmctl.1716478837.9151.out
cov.shardddl1.dmctl.1716478837.9201.out
cov.shardddl1.dmctl.1716478837.9249.out
cov.shardddl1.dmctl.1716478841.9504.out
cov.shardddl1.dmctl.1716478841.9572.out
cov.shardddl1.dmctl.1716478842.9621.out
cov.shardddl1.dmctl.1716478847.9785.out
cov.shardddl1.dmctl.1716478847.9948.out
cov.shardddl1.dmctl.1716478848.10010.out
cov.shardddl1.dmctl.1716478853.10166.out
cov.shardddl1.dmctl.1716478887.10443.out
cov.shardddl1.dmctl.1716478887.10506.out
cov.shardddl1.dmctl.1716478888.10550.out
cov.shardddl1.dmctl.1716478891.10655.out
cov.shardddl1.dmctl.1716478891.10708.out
cov.shardddl1.dmctl.1716478897.10880.out
cov.shardddl1.dmctl.1716478897.10932.out
cov.shardddl1.dmctl.1716478897.10985.out
cov.shardddl1.dmctl.1716478897.11139.out
cov.shardddl1.dmctl.1716478898.11183.out
cov.shardddl1.dmctl.1716478901.11309.out
cov.shardddl1.dmctl.1716478901.11356.out
cov.shardddl1.dmctl.1716478907.11529.out
cov.shardddl1.dmctl.1716478907.11586.out
cov.shardddl1.dmctl.1716478907.11646.out
cov.shardddl1.dmctl.1716478907.11808.out
cov.shardddl1.dmctl.1716478909.11866.out
cov.shardddl1.dmctl.1716478911.12188.out
cov.shardddl1.dmctl.1716478912.12346.out
cov.shardddl1.dmctl.1716478913.12390.out
cov.shardddl1.dmctl.1716478916.12709.out
cov.shardddl1.dmctl.1716478916.12868.out
cov.shardddl1.dmctl.1716478917.12924.out
cov.shardddl1.dmctl.1716478920.13225.out
cov.shardddl1.dmctl.1716478920.13384.out
cov.shardddl1.dmctl.1716478922.13428.out
cov.shardddl1.dmctl.1716478922.13712.out
cov.shardddl1.dmctl.1716478925.14017.out
cov.shardddl1.dmctl.1716478926.14071.out
cov.shardddl1.dmctl.1716478928.14146.out
cov.shardddl1.dmctl.1716478928.14205.out
cov.shardddl1.dmctl.1716478931.14525.out
cov.shardddl1.dmctl.1716478932.14585.out
cov.shardddl1.dmctl.1716478932.14740.out
cov.shardddl1.dmctl.1716478934.15048.out
cov.shardddl1.dmctl.1716478936.15104.out
cov.shardddl1.dmctl.1716478936.15162.out
cov.shardddl1.dmctl.1716478936.15223.out
cov.shardddl1.dmctl.1716478937.15382.out
cov.shardddl1.dmctl.1716478939.15437.out
cov.shardddl1.dmctl.1716478939.15536.out
cov.shardddl1.dmctl.1716478940.15694.out
cov.shardddl1.dmctl.1716478941.15749.out
cov.shardddl1.dmctl.1716478948.15940.out
cov.shardddl1.dmctl.1716478949.16104.out
cov.shardddl1.dmctl.1716478950.16157.out
cov.shardddl1.dmctl.1716478953.16258.out
cov.shardddl1.dmctl.1716478954.16420.out
cov.shardddl1.dmctl.1716478955.16467.out
cov.shardddl1.dmctl.1716478955.16555.out
cov.shardddl1.master.out
cov.shardddl1.worker.8262.1716478721.out
cov.shardddl1.worker.8262.1716478726.out
cov.shardddl1.worker.8262.1716478752.out
cov.shardddl1.worker.8262.1716478787.out
cov.shardddl1.worker.8262.1716478808.out
cov.shardddl1.worker.8262.1716478924.out
cov.shardddl1.worker.8262.1716478929.out
cov.shardddl1.worker.8262.1716478933.out
cov.shardddl1.worker.8263.1716478722.out
cov.shardddl1.worker.8263.1716478726.out
cov.shardddl1.worker.8263.1716478752.out
cov.shardddl1.worker.8263.1716478787.out
cov.shardddl1.worker.8263.1716478808.out
cov.shardddl1.worker.8263.1716478924.out
cov.shardddl1.worker.8263.1716478929.out
cov.shardddl1.worker.8263.1716478933.out
cov.shardddl1_1.dmctl.1716478964.16996.out
cov.shardddl1_1.dmctl.1716478965.17124.out
cov.shardddl1_1.dmctl.1716478967.17222.out
cov.shardddl1_1.dmctl.1716478968.17289.out
cov.shardddl1_1.dmctl.1716478971.17656.out
cov.shardddl1_1.dmctl.1716478971.17811.out
cov.shardddl1_1.dmctl.1716478972.17858.out
cov.shardddl1_1.dmctl.1716478973.18105.out
cov.shardddl1_1.dmctl.1716478973.18265.out
cov.shardddl1_1.dmctl.1716478974.18325.out
cov.shardddl1_1.dmctl.1716478977.18454.out
cov.shardddl1_1.dmctl.1716478978.18614.out
cov.shardddl1_1.dmctl.1716478979.18661.out
cov.shardddl1_1.dmctl.1716478981.18797.out
cov.shardddl1_1.dmctl.1716478983.18957.out
cov.shardddl1_1.dmctl.1716478984.19006.out
cov.shardddl1_1.dmctl.1716478986.19137.out
cov.shardddl1_1.dmctl.1716478988.19284.out
cov.shardddl1_1.dmctl.1716478989.19337.out
cov.shardddl1_1.dmctl.1716478991.19464.out
cov.shardddl1_1.dmctl.1716478993.19619.out
cov.shardddl1_1.dmctl.1716478994.19673.out
cov.shardddl1_1.dmctl.1716478996.19821.out
cov.shardddl1_1.dmctl.1716478998.19977.out
cov.shardddl1_1.dmctl.1716478999.20028.out
cov.shardddl1_1.dmctl.1716478999.20111.out
cov.shardddl1_1.dmctl.1716479000.20272.out
cov.shardddl1_1.dmctl.1716479002.20322.out
cov.shardddl1_1.dmctl.1716479002.20405.out
cov.shardddl1_1.dmctl.1716479003.20560.out
cov.shardddl1_1.dmctl.1716479004.20602.out
cov.shardddl1_1.dmctl.1716479005.20713.out
cov.shardddl1_1.dmctl.1716479006.20870.out
cov.shardddl1_1.dmctl.1716479007.20922.out
cov.shardddl1_1.dmctl.1716479009.21016.out
cov.shardddl1_1.dmctl.1716479011.21174.out
cov.shardddl1_1.dmctl.1716479012.21217.out
cov.shardddl1_1.dmctl.1716479012.21284.out
cov.shardddl1_1.dmctl.1716479013.21436.out
cov.shardddl1_1.dmctl.1716479015.21482.out
cov.shardddl1_1.dmctl.1716479015.21547.out
cov.shardddl1_1.dmctl.1716479016.21702.out
cov.shardddl1_1.dmctl.1716479017.21755.out
cov.shardddl1_1.dmctl.1716479017.21816.out
cov.shardddl1_1.dmctl.1716479019.21968.out
cov.shardddl1_1.dmctl.1716479020.22018.out
cov.shardddl1_1.dmctl.1716479022.22088.out
cov.shardddl1_1.dmctl.1716479023.22240.out
cov.shardddl1_1.dmctl.1716479025.22287.out
cov.shardddl1_1.dmctl.1716479025.22347.out
cov.shardddl1_1.dmctl.1716479025.22501.out
cov.shardddl1_1.dmctl.1716479026.22549.out
cov.shardddl1_1.dmctl.1716479027.22608.out
cov.shardddl1_1.dmctl.1716479028.22760.out
cov.shardddl1_1.dmctl.1716479029.22810.out
cov.shardddl1_1.dmctl.1716479031.22884.out
cov.shardddl1_1.dmctl.1716479033.23047.out
cov.shardddl1_1.dmctl.1716479034.23094.out
cov.shardddl1_1.dmctl.1716479036.23159.out
cov.shardddl1_1.dmctl.1716479037.23312.out
cov.shardddl1_1.dmctl.1716479038.23354.out
cov.shardddl1_1.dmctl.1716479041.23415.out
cov.shardddl1_1.dmctl.1716479042.23572.out
cov.shardddl1_1.dmctl.1716479043.23627.out
cov.shardddl1_1.dmctl.1716479043.23687.out
cov.shardddl1_1.dmctl.1716479043.23738.out
cov.shardddl1_1.dmctl.1716479045.23896.out
cov.shardddl1_1.dmctl.1716479046.23948.out
cov.shardddl1_1.dmctl.1716479046.24057.out
cov.shardddl1_1.dmctl.1716479048.24217.out
cov.shardddl1_1.dmctl.1716479049.24261.out
cov.shardddl1_1.dmctl.1716479049.24333.out
cov.shardddl1_1.dmctl.1716479049.24384.out
cov.shardddl1_1.dmctl.1716479050.24541.out
cov.shardddl1_1.dmctl.1716479052.24589.out
cov.shardddl1_1.dmctl.1716479052.24648.out
cov.shardddl1_1.dmctl.1716479052.24702.out
cov.shardddl1_1.dmctl.1716479053.24864.out
cov.shardddl1_1.dmctl.1716479054.24904.out
cov.shardddl1_1.dmctl.1716479057.25032.out
cov.shardddl1_1.dmctl.1716479057.25188.out
cov.shardddl1_1.dmctl.1716479058.25238.out
cov.shardddl1_1.dmctl.1716479059.25340.out
cov.shardddl1_1.dmctl.1716479060.25493.out
cov.shardddl1_1.dmctl.1716479061.25542.out
cov.shardddl1_1.dmctl.1716479061.25620.out
cov.shardddl1_1.dmctl.1716479062.25770.out
cov.shardddl1_1.dmctl.1716479063.25818.out
cov.shardddl1_1.dmctl.1716479063.25896.out
cov.shardddl1_1.dmctl.1716479063.25950.out
cov.shardddl1_1.dmctl.1716479065.26110.out
cov.shardddl1_1.dmctl.1716479066.26156.out
cov.shardddl1_1.dmctl.1716479066.26295.out
cov.shardddl1_1.dmctl.1716479067.26458.out
cov.shardddl1_1.dmctl.1716479069.26503.out
cov.shardddl1_1.dmctl.1716479071.26651.out
cov.shardddl1_1.dmctl.1716479072.26816.out
cov.shardddl1_1.dmctl.1716479074.26865.out
cov.shardddl1_1.dmctl.1716479074.26952.out
cov.shardddl1_1.dmctl.1716479074.27006.out
cov.shardddl1_1.dmctl.1716479074.27160.out
cov.shardddl1_1.dmctl.1716479076.27205.out
cov.shardddl1_1.dmctl.1716479076.27286.out
cov.shardddl1_1.dmctl.1716479076.27339.out
cov.shardddl1_1.dmctl.1716479077.27504.out
cov.shardddl1_1.dmctl.1716479078.27554.out
cov.shardddl1_1.dmctl.1716479081.27675.out
cov.shardddl1_1.master.out
cov.shardddl1_1.worker.8262.1716478963.out
cov.shardddl1_1.worker.8263.1716478964.out
cov.shardddl2.dmctl.1716479088.28091.out
cov.shardddl2.dmctl.1716479090.28209.out
cov.shardddl2.dmctl.1716479091.28311.out
cov.shardddl2.dmctl.1716479092.28389.out
cov.shardddl2.dmctl.1716479096.28582.out
cov.shardddl2.dmctl.1716479102.28761.out
cov.shardddl2.dmctl.1716479105.28945.out
cov.shardddl2.dmctl.1716479105.29026.out
cov.shardddl2.dmctl.1716479105.29187.out
cov.shardddl2.dmctl.1716479106.29241.out
cov.shardddl2.dmctl.1716479114.29503.out
cov.shardddl2.dmctl.1716479115.29586.out
cov.shardddl2.dmctl.1716479115.29745.out
cov.shardddl2.dmctl.1716479116.29796.out
cov.shardddl2.dmctl.1716479121.29974.out
cov.shardddl2.dmctl.1716479127.30150.out
cov.shardddl2.dmctl.1716479129.30303.out
cov.shardddl2.dmctl.1716479129.30382.out
cov.shardddl2.dmctl.1716479129.30542.out
cov.shardddl2.dmctl.1716479131.30609.out
cov.shardddl2.dmctl.1716479150.31074.out
cov.shardddl2.dmctl.1716479150.31129.out
cov.shardddl2.dmctl.1716479150.31285.out
cov.shardddl2.dmctl.1716479151.31334.out
cov.shardddl2.dmctl.1716479154.31451.out
cov.shardddl2.dmctl.1716479154.31515.out
cov.shardddl2.dmctl.1716479154.31577.out
cov.shardddl2.dmctl.1716479160.31729.out
cov.shardddl2.dmctl.1716479160.31779.out
cov.shardddl2.dmctl.1716479161.31832.out
cov.shardddl2.dmctl.1716479161.31938.out
cov.shardddl2.dmctl.1716479161.32105.out
cov.shardddl2.dmctl.1716479162.32168.out
cov.shardddl2.dmctl.1716479170.32398.out
cov.shardddl2.dmctl.1716479171.32463.out
cov.shardddl2.dmctl.1716479171.32517.out
cov.shardddl2.dmctl.1716479177.32678.out
cov.shardddl2.dmctl.1716479177.32733.out
cov.shardddl2.dmctl.1716479177.32781.out
cov.shardddl2.dmctl.1716479178.32880.out
cov.shardddl2.dmctl.1716479178.33034.out
cov.shardddl2.dmctl.1716479179.33089.out
cov.shardddl2.dmctl.1716479185.33309.out
cov.shardddl2.dmctl.1716479186.33372.out
cov.shardddl2.dmctl.1716479186.33426.out
cov.shardddl2.dmctl.1716479192.33581.out
cov.shardddl2.dmctl.1716479192.33635.out
cov.shardddl2.dmctl.1716479192.33686.out
cov.shardddl2.dmctl.1716479193.33796.out
cov.shardddl2.dmctl.1716479193.33954.out
cov.shardddl2.dmctl.1716479194.34002.out
cov.shardddl2.dmctl.1716479196.34119.out
cov.shardddl2.dmctl.1716479202.34301.out
cov.shardddl2.dmctl.1716479202.34355.out
cov.shardddl2.dmctl.1716479209.34524.out
cov.shardddl2.dmctl.1716479209.34576.out
cov.shardddl2.dmctl.1716479209.34628.out
cov.shardddl2.dmctl.1716479209.34725.out
cov.shardddl2.dmctl.1716479210.34884.out
cov.shardddl2.dmctl.1716479211.34928.out
cov.shardddl2.dmctl.1716479213.35046.out
cov.shardddl2.dmctl.1716479219.35235.out
cov.shardddl2.dmctl.1716479219.35294.out
cov.shardddl2.dmctl.1716479226.35459.out
cov.shardddl2.dmctl.1716479226.35506.out
cov.shardddl2.dmctl.1716479226.35558.out
cov.shardddl2.dmctl.1716479226.35670.out
cov.shardddl2.dmctl.1716479226.35825.out
cov.shardddl2.dmctl.1716479228.35871.out
cov.shardddl2.dmctl.1716479230.35989.out
cov.shardddl2.dmctl.1716479230.36051.out
cov.shardddl2.dmctl.1716479236.36229.out
cov.shardddl2.master.out
cov.shardddl2.worker.8262.1716479087.out
cov.shardddl2.worker.8262.1716479093.out
cov.shardddl2.worker.8263.1716479089.out
cov.shardddl2.worker.8263.1716479117.out
downstream
goroutines
shardddl1
shardddl1_1
shardddl2
sql_res.shardddl1.txt
sql_res.shardddl1_1.txt
sql_res.shardddl2.txt
tidb.toml
++ find /tmp/dm_test/ -type f -name '*.log'
+ tar -cvzf log-G07.tar.gz /tmp/dm_test/shardddl1/dmctl.1716478810.log /tmp/dm_test/shardddl1/dmctl.1716478937.log /tmp/dm_test/shardddl1/dmctl.1716478907.log /tmp/dm_test/shardddl1/dmctl.1716478825.log /tmp/dm_test/shardddl1/dmctl.1716478817.log /tmp/dm_test/shardddl1/sync_diff_stdout.log /tmp/dm_test/shardddl1/dmctl.1716478949.log /tmp/dm_test/shardddl1/dmctl.1716478931.log /tmp/dm_test/shardddl1/dmctl.1716478953.log /tmp/dm_test/shardddl1/dmctl.1716478727.log /tmp/dm_test/shardddl1/dmctl.1716478950.log /tmp/dm_test/shardddl1/dmctl.1716478841.log /tmp/dm_test/shardddl1/dmctl.1716478816.log /tmp/dm_test/shardddl1/dmctl.1716478847.log /tmp/dm_test/shardddl1/dmctl.1716478934.log /tmp/dm_test/shardddl1/dmctl.1716478830.log /tmp/dm_test/shardddl1/dmctl.1716478898.log /tmp/dm_test/shardddl1/dmctl.1716478722.log /tmp/dm_test/shardddl1/dmctl.1716478901.log /tmp/dm_test/shardddl1/dmctl.1716478909.log /tmp/dm_test/shardddl1/dmctl.1716478723.log /tmp/dm_test/shardddl1/dmctl.1716478936.log /tmp/dm_test/shardddl1/worker1/log/dm-worker.log /tmp/dm_test/shardddl1/worker1/log/stdout.log /tmp/dm_test/shardddl1/dmctl.1716478928.log /tmp/dm_test/shardddl1/dmctl.1716478786.log /tmp/dm_test/shardddl1/dmctl.1716478812.log /tmp/dm_test/shardddl1/dmctl.1716478925.log /tmp/dm_test/shardddl1/dmctl.1716478831.log /tmp/dm_test/shardddl1/dmctl.1716478826.log /tmp/dm_test/shardddl1/dmctl.1716478823.log /tmp/dm_test/shardddl1/dmctl.1716478832.log /tmp/dm_test/shardddl1/dmctl.1716478922.log /tmp/dm_test/shardddl1/dmctl.1716478842.log /tmp/dm_test/shardddl1/dmctl.1716478853.log /tmp/dm_test/shardddl1/dmctl.1716478917.log /tmp/dm_test/shardddl1/dmctl.1716478916.log /tmp/dm_test/shardddl1/dmctl.1716478955.log /tmp/dm_test/shardddl1/dmctl.1716478806.log /tmp/dm_test/shardddl1/dmctl.1716478848.log /tmp/dm_test/shardddl1/dmctl.1716478912.log /tmp/dm_test/shardddl1/dmctl.1716478809.log /tmp/dm_test/shardddl1/dmctl.1716478920.log /tmp/dm_test/shardddl1/dmctl.1716478888.log /tmp/dm_test/shardddl1/dmctl.1716478837.log /tmp/dm_test/shardddl1/dmctl.1716478827.log /tmp/dm_test/shardddl1/master/log/stdout.log /tmp/dm_test/shardddl1/master/log/dm-master.log /tmp/dm_test/shardddl1/dmctl.1716478897.log /tmp/dm_test/shardddl1/dmctl.1716478789.log /tmp/dm_test/shardddl1/dmctl.1716478833.log /tmp/dm_test/shardddl1/dmctl.1716478911.log /tmp/dm_test/shardddl1/dmctl.1716478932.log /tmp/dm_test/shardddl1/dmctl.1716478926.log /tmp/dm_test/shardddl1/dmctl.1716478829.log /tmp/dm_test/shardddl1/dmctl.1716478887.log /tmp/dm_test/shardddl1/dmctl.1716478891.log /tmp/dm_test/shardddl1/dmctl.1716478755.log /tmp/dm_test/shardddl1/dmctl.1716478788.log /tmp/dm_test/shardddl1/dmctl.1716478948.log /tmp/dm_test/shardddl1/worker2/log/dm-worker.log /tmp/dm_test/shardddl1/worker2/log/stdout.log /tmp/dm_test/shardddl1/dmctl.1716478913.log /tmp/dm_test/shardddl1/dmctl.1716478751.log /tmp/dm_test/shardddl1/dmctl.1716478754.log /tmp/dm_test/shardddl1/dmctl.1716478954.log /tmp/dm_test/shardddl1/dmctl.1716478728.log /tmp/dm_test/shardddl1/dmctl.1716478940.log /tmp/dm_test/shardddl1/dmctl.1716478941.log /tmp/dm_test/shardddl1/dmctl.1716478939.log /tmp/dm_test/shardddl1/dmctl.1716478814.log /tmp/dm_test/downstream/tidb/log/tidb.log /tmp/dm_test/shardddl2/dmctl.1716479185.log /tmp/dm_test/shardddl2/dmctl.1716479116.log /tmp/dm_test/shardddl2/dmctl.1716479129.log /tmp/dm_test/shardddl2/dmctl.1716479092.log /tmp/dm_test/shardddl2/sync_diff_stdout.log /tmp/dm_test/shardddl2/dmctl.1716479226.log /tmp/dm_test/shardddl2/dmctl.1716479219.log 
/tmp/dm_test/shardddl2/dmctl.1716479186.log /tmp/dm_test/shardddl2/dmctl.1716479105.log /tmp/dm_test/shardddl2/dmctl.1716479177.log /tmp/dm_test/shardddl2/dmctl.1716479096.log /tmp/dm_test/shardddl2/dmctl.1716479131.log /tmp/dm_test/shardddl2/dmctl.1716479196.log /tmp/dm_test/shardddl2/dmctl.1716479160.log /tmp/dm_test/shardddl2/dmctl.1716479192.log /tmp/dm_test/shardddl2/worker1/log/dm-worker.log /tmp/dm_test/shardddl2/worker1/log/stdout.log /tmp/dm_test/shardddl2/dmctl.1716479230.log /tmp/dm_test/shardddl2/dmctl.1716479162.log /tmp/dm_test/shardddl2/dmctl.1716479154.log /tmp/dm_test/shardddl2/dmctl.1716479210.log /tmp/dm_test/shardddl2/dmctl.1716479202.log /tmp/dm_test/shardddl2/dmctl.1716479150.log /tmp/dm_test/shardddl2/dmctl.1716479106.log /tmp/dm_test/shardddl2/dmctl.1716479213.log /tmp/dm_test/shardddl2/dmctl.1716479091.log /tmp/dm_test/shardddl2/dmctl.1716479236.log /tmp/dm_test/shardddl2/dmctl.1716479193.log /tmp/dm_test/shardddl2/dmctl.1716479209.log /tmp/dm_test/shardddl2/dmctl.1716479114.log /tmp/dm_test/shardddl2/dmctl.1716479170.log /tmp/dm_test/shardddl2/dmctl.1716479090.log /tmp/dm_test/shardddl2/dmctl.1716479088.log /tmp/dm_test/shardddl2/dmctl.1716479194.log /tmp/dm_test/shardddl2/dmctl.1716479121.log /tmp/dm_test/shardddl2/master/log/stdout.log /tmp/dm_test/shardddl2/master/log/dm-master.log /tmp/dm_test/shardddl2/dmctl.1716479178.log /tmp/dm_test/shardddl2/dmctl.1716479211.log /tmp/dm_test/shardddl2/dmctl.1716479161.log /tmp/dm_test/shardddl2/dmctl.1716479179.log /tmp/dm_test/shardddl2/dmctl.1716479102.log /tmp/dm_test/shardddl2/dmctl.1716479115.log /tmp/dm_test/shardddl2/dmctl.1716479228.log /tmp/dm_test/shardddl2/worker2/log/dm-worker.log /tmp/dm_test/shardddl2/worker2/log/stdout.log /tmp/dm_test/shardddl2/dmctl.1716479171.log /tmp/dm_test/shardddl2/dmctl.1716479151.log /tmp/dm_test/shardddl2/dmctl.1716479127.log /tmp/dm_test/shardddl1_1/dmctl.1716478991.log /tmp/dm_test/shardddl1_1/dmctl.1716479062.log /tmp/dm_test/shardddl1_1/dmctl.1716479077.log /tmp/dm_test/shardddl1_1/dmctl.1716479043.log /tmp/dm_test/shardddl1_1/sync_diff_stdout.log /tmp/dm_test/shardddl1_1/dmctl.1716479028.log /tmp/dm_test/shardddl1_1/dmctl.1716479076.log /tmp/dm_test/shardddl1_1/dmctl.1716478967.log /tmp/dm_test/shardddl1_1/dmctl.1716479038.log /tmp/dm_test/shardddl1_1/dmctl.1716479057.log /tmp/dm_test/shardddl1_1/dmctl.1716479036.log /tmp/dm_test/shardddl1_1/dmctl.1716478978.log /tmp/dm_test/shardddl1_1/dmctl.1716479011.log /tmp/dm_test/shardddl1_1/dmctl.1716478999.log /tmp/dm_test/shardddl1_1/dmctl.1716479016.log /tmp/dm_test/shardddl1_1/dmctl.1716478971.log /tmp/dm_test/shardddl1_1/dmctl.1716479029.log /tmp/dm_test/shardddl1_1/dmctl.1716478989.log /tmp/dm_test/shardddl1_1/dmctl.1716479004.log /tmp/dm_test/shardddl1_1/dmctl.1716478973.log /tmp/dm_test/shardddl1_1/dmctl.1716479052.log /tmp/dm_test/shardddl1_1/dmctl.1716479078.log /tmp/dm_test/shardddl1_1/dmctl.1716479019.log /tmp/dm_test/shardddl1_1/dmctl.1716479053.log /tmp/dm_test/shardddl1_1/dmctl.1716479027.log /tmp/dm_test/shardddl1_1/dmctl.1716478984.log /tmp/dm_test/shardddl1_1/dmctl.1716479034.log /tmp/dm_test/shardddl1_1/dmctl.1716478979.log /tmp/dm_test/shardddl1_1/dmctl.1716479059.log /tmp/dm_test/shardddl1_1/dmctl.1716479023.log /tmp/dm_test/shardddl1_1/dmctl.1716479065.log /tmp/dm_test/shardddl1_1/dmctl.1716479042.log /tmp/dm_test/shardddl1_1/dmctl.1716478983.log /tmp/dm_test/shardddl1_1/worker1/log/dm-worker.log /tmp/dm_test/shardddl1_1/worker1/log/stdout.log /tmp/dm_test/shardddl1_1/dmctl.1716479005.log 
/tmp/dm_test/shardddl1_1/dmctl.1716478986.log /tmp/dm_test/shardddl1_1/dmctl.1716478981.log /tmp/dm_test/shardddl1_1/dmctl.1716479041.log /tmp/dm_test/shardddl1_1/dmctl.1716479020.log /tmp/dm_test/shardddl1_1/dmctl.1716478977.log /tmp/dm_test/shardddl1_1/dmctl.1716479009.log /tmp/dm_test/shardddl1_1/dmctl.1716479058.log /tmp/dm_test/shardddl1_1/dmctl.1716479013.log /tmp/dm_test/shardddl1_1/dmctl.1716479066.log /tmp/dm_test/shardddl1_1/dmctl.1716478996.log /tmp/dm_test/shardddl1_1/dmctl.1716479045.log /tmp/dm_test/shardddl1_1/dmctl.1716479033.log /tmp/dm_test/shardddl1_1/dmctl.1716478988.log /tmp/dm_test/shardddl1_1/dmctl.1716479072.log /tmp/dm_test/shardddl1_1/dmctl.1716479081.log /tmp/dm_test/shardddl1_1/dmctl.1716479071.log /tmp/dm_test/shardddl1_1/dmctl.1716479031.log /tmp/dm_test/shardddl1_1/dmctl.1716478998.log /tmp/dm_test/shardddl1_1/dmctl.1716478974.log /tmp/dm_test/shardddl1_1/dmctl.1716479000.log /tmp/dm_test/shardddl1_1/dmctl.1716479048.log /tmp/dm_test/shardddl1_1/dmctl.1716479026.log /tmp/dm_test/shardddl1_1/dmctl.1716478964.log /tmp/dm_test/shardddl1_1/dmctl.1716479003.log /tmp/dm_test/shardddl1_1/dmctl.1716479074.log /tmp/dm_test/shardddl1_1/dmctl.1716479060.log /tmp/dm_test/shardddl1_1/dmctl.1716479049.log /tmp/dm_test/shardddl1_1/master/log/stdout.log /tmp/dm_test/shardddl1_1/master/log/dm-master.log /tmp/dm_test/shardddl1_1/dmctl.1716478972.log /tmp/dm_test/shardddl1_1/dmctl.1716478993.log /tmp/dm_test/shardddl1_1/dmctl.1716479025.log /tmp/dm_test/shardddl1_1/dmctl.1716479015.log /tmp/dm_test/shardddl1_1/dmctl.1716479022.log /tmp/dm_test/shardddl1_1/dmctl.1716479069.log /tmp/dm_test/shardddl1_1/dmctl.1716479017.log /tmp/dm_test/shardddl1_1/dmctl.1716479006.log /tmp/dm_test/shardddl1_1/dmctl.1716479002.log /tmp/dm_test/shardddl1_1/dmctl.1716479037.log /tmp/dm_test/shardddl1_1/dmctl.1716479046.log /tmp/dm_test/shardddl1_1/worker2/log/dm-worker.log /tmp/dm_test/shardddl1_1/worker2/log/stdout.log /tmp/dm_test/shardddl1_1/dmctl.1716479054.log /tmp/dm_test/shardddl1_1/dmctl.1716479067.log /tmp/dm_test/shardddl1_1/dmctl.1716479063.log /tmp/dm_test/shardddl1_1/dmctl.1716478968.log /tmp/dm_test/shardddl1_1/dmctl.1716479050.log /tmp/dm_test/shardddl1_1/dmctl.1716479012.log /tmp/dm_test/shardddl1_1/dmctl.1716479007.log /tmp/dm_test/shardddl1_1/dmctl.1716479061.log /tmp/dm_test/shardddl1_1/dmctl.1716478994.log /tmp/dm_test/shardddl1_1/dmctl.1716478965.log /tmp/dm_test/goroutines/stack/log/master-8461.log /tmp/dm_test/goroutines/stack/log/master-8361.log /tmp/dm_test/goroutines/stack/log/master-8661.log /tmp/dm_test/goroutines/stack/log/worker-8264.log /tmp/dm_test/goroutines/stack/log/master-8761.log /tmp/dm_test/goroutines/stack/log/master-8561.log /tmp/dm_test/goroutines/stack/log/master-8261.log /tmp/dm_test/goroutines/stack/log/worker-18263.log /tmp/dm_test/goroutines/stack/log/worker-8263.log /tmp/dm_test/goroutines/stack/log/worker-8262.log /tmp/dm_test/goroutines/stack/log/worker-18262.log
tar: Removing leading `/' from member names
/tmp/dm_test/shardddl1/dmctl.1716478810.log
/tmp/dm_test/shardddl1/dmctl.1716478937.log
/tmp/dm_test/shardddl1/dmctl.1716478907.log
/tmp/dm_test/shardddl1/dmctl.1716478825.log
/tmp/dm_test/shardddl1/dmctl.1716478817.log
/tmp/dm_test/shardddl1/sync_diff_stdout.log
/tmp/dm_test/shardddl1/dmctl.1716478949.log
/tmp/dm_test/shardddl1/dmctl.1716478931.log
/tmp/dm_test/shardddl1/dmctl.1716478953.log
/tmp/dm_test/shardddl1/dmctl.1716478727.log
/tmp/dm_test/shardddl1/dmctl.1716478950.log
/tmp/dm_test/shardddl1/dmctl.1716478841.log
/tmp/dm_test/shardddl1/dmctl.1716478816.log
/tmp/dm_test/shardddl1/dmctl.1716478847.log
/tmp/dm_test/shardddl1/dmctl.1716478934.log
/tmp/dm_test/shardddl1/dmctl.1716478830.log
/tmp/dm_test/shardddl1/dmctl.1716478898.log
/tmp/dm_test/shardddl1/dmctl.1716478722.log
/tmp/dm_test/shardddl1/dmctl.1716478901.log
/tmp/dm_test/shardddl1/dmctl.1716478909.log
/tmp/dm_test/shardddl1/dmctl.1716478723.log
/tmp/dm_test/shardddl1/dmctl.1716478936.log
/tmp/dm_test/shardddl1/worker1/log/dm-worker.log
/tmp/dm_test/shardddl1/worker1/log/stdout.log
/tmp/dm_test/shardddl1/dmctl.1716478928.log
/tmp/dm_test/shardddl1/dmctl.1716478786.log
/tmp/dm_test/shardddl1/dmctl.1716478812.log
/tmp/dm_test/shardddl1/dmctl.1716478925.log
/tmp/dm_test/shardddl1/dmctl.1716478831.log
/tmp/dm_test/shardddl1/dmctl.1716478826.log
/tmp/dm_test/shardddl1/dmctl.1716478823.log
/tmp/dm_test/shardddl1/dmctl.1716478832.log
/tmp/dm_test/shardddl1/dmctl.1716478922.log
/tmp/dm_test/shardddl1/dmctl.1716478842.log
/tmp/dm_test/shardddl1/dmctl.1716478853.log
/tmp/dm_test/shardddl1/dmctl.1716478917.log
/tmp/dm_test/shardddl1/dmctl.1716478916.log
/tmp/dm_test/shardddl1/dmctl.1716478955.log
/tmp/dm_test/shardddl1/dmctl.1716478806.log
/tmp/dm_test/shardddl1/dmctl.1716478848.log
/tmp/dm_test/shardddl1/dmctl.1716478912.log
/tmp/dm_test/shardddl1/dmctl.1716478809.log
/tmp/dm_test/shardddl1/dmctl.1716478920.log
/tmp/dm_test/shardddl1/dmctl.1716478888.log
/tmp/dm_test/shardddl1/dmctl.1716478837.log
/tmp/dm_test/shardddl1/dmctl.1716478827.log
/tmp/dm_test/shardddl1/master/log/stdout.log
/tmp/dm_test/shardddl1/master/log/dm-master.log
/tmp/dm_test/shardddl1/dmctl.1716478897.log
/tmp/dm_test/shardddl1/dmctl.1716478789.log
/tmp/dm_test/shardddl1/dmctl.1716478833.log
/tmp/dm_test/shardddl1/dmctl.1716478911.log
/tmp/dm_test/shardddl1/dmctl.1716478932.log
/tmp/dm_test/shardddl1/dmctl.1716478926.log
/tmp/dm_test/shardddl1/dmctl.1716478829.log
/tmp/dm_test/shardddl1/dmctl.1716478887.log
/tmp/dm_test/shardddl1/dmctl.1716478891.log
/tmp/dm_test/shardddl1/dmctl.1716478755.log
/tmp/dm_test/shardddl1/dmctl.1716478788.log
/tmp/dm_test/shardddl1/dmctl.1716478948.log
/tmp/dm_test/shardddl1/worker2/log/dm-worker.log
/tmp/dm_test/shardddl1/worker2/log/stdout.log
/tmp/dm_test/shardddl1/dmctl.1716478913.log
/tmp/dm_test/shardddl1/dmctl.1716478751.log
/tmp/dm_test/shardddl1/dmctl.1716478754.log
/tmp/dm_test/shardddl1/dmctl.1716478954.log
/tmp/dm_test/shardddl1/dmctl.1716478728.log
/tmp/dm_test/shardddl1/dmctl.1716478940.log
/tmp/dm_test/shardddl1/dmctl.1716478941.log
/tmp/dm_test/shardddl1/dmctl.1716478939.log
/tmp/dm_test/shardddl1/dmctl.1716478814.log
/tmp/dm_test/downstream/tidb/log/tidb.log
/tmp/dm_test/shardddl2/dmctl.1716479185.log
/tmp/dm_test/shardddl2/dmctl.1716479116.log
/tmp/dm_test/shardddl2/dmctl.1716479129.log
/tmp/dm_test/shardddl2/dmctl.1716479092.log
/tmp/dm_test/shardddl2/sync_diff_stdout.log
/tmp/dm_test/shardddl2/dmctl.1716479226.log
/tmp/dm_test/shardddl2/dmctl.1716479219.log
/tmp/dm_test/shardddl2/dmctl.1716479186.log
/tmp/dm_test/shardddl2/dmctl.1716479105.log
/tmp/dm_test/shardddl2/dmctl.1716479177.log
/tmp/dm_test/shardddl2/dmctl.1716479096.log
/tmp/dm_test/shardddl2/dmctl.1716479131.log
/tmp/dm_test/shardddl2/dmctl.1716479196.log
/tmp/dm_test/shardddl2/dmctl.1716479160.log
/tmp/dm_test/shardddl2/dmctl.1716479192.log
/tmp/dm_test/shardddl2/worker1/log/dm-worker.log
/tmp/dm_test/shardddl2/worker1/log/stdout.log
/tmp/dm_test/shardddl2/dmctl.1716479230.log
/tmp/dm_test/shardddl2/dmctl.1716479162.log
/tmp/dm_test/shardddl2/dmctl.1716479154.log
/tmp/dm_test/shardddl2/dmctl.1716479210.log
/tmp/dm_test/shardddl2/dmctl.1716479202.log
/tmp/dm_test/shardddl2/dmctl.1716479150.log
/tmp/dm_test/shardddl2/dmctl.1716479106.log
/tmp/dm_test/shardddl2/dmctl.1716479213.log
/tmp/dm_test/shardddl2/dmctl.1716479091.log
/tmp/dm_test/shardddl2/dmctl.1716479236.log
/tmp/dm_test/shardddl2/dmctl.1716479193.log
/tmp/dm_test/shardddl2/dmctl.1716479209.log
/tmp/dm_test/shardddl2/dmctl.1716479114.log
/tmp/dm_test/shardddl2/dmctl.1716479170.log
/tmp/dm_test/shardddl2/dmctl.1716479090.log
/tmp/dm_test/shardddl2/dmctl.1716479088.log
/tmp/dm_test/shardddl2/dmctl.1716479194.log
/tmp/dm_test/shardddl2/dmctl.1716479121.log
/tmp/dm_test/shardddl2/master/log/stdout.log
/tmp/dm_test/shardddl2/master/log/dm-master.log
/tmp/dm_test/shardddl2/dmctl.1716479178.log
/tmp/dm_test/shardddl2/dmctl.1716479211.log
/tmp/dm_test/shardddl2/dmctl.1716479161.log
/tmp/dm_test/shardddl2/dmctl.1716479179.log
/tmp/dm_test/shardddl2/dmctl.1716479102.log
/tmp/dm_test/shardddl2/dmctl.1716479115.log
/tmp/dm_test/shardddl2/dmctl.1716479228.log
/tmp/dm_test/shardddl2/worker2/log/dm-worker.log
/tmp/dm_test/shardddl2/worker2/log/stdout.log
/tmp/dm_test/shardddl2/dmctl.1716479171.log
/tmp/dm_test/shardddl2/dmctl.1716479151.log
/tmp/dm_test/shardddl2/dmctl.1716479127.log
/tmp/dm_test/shardddl1_1/dmctl.1716478991.log
/tmp/dm_test/shardddl1_1/dmctl.1716479062.log
/tmp/dm_test/shardddl1_1/dmctl.1716479077.log
/tmp/dm_test/shardddl1_1/dmctl.1716479043.log
/tmp/dm_test/shardddl1_1/sync_diff_stdout.log
/tmp/dm_test/shardddl1_1/dmctl.1716479028.log
/tmp/dm_test/shardddl1_1/dmctl.1716479076.log
/tmp/dm_test/shardddl1_1/dmctl.1716478967.log
/tmp/dm_test/shardddl1_1/dmctl.1716479038.log
/tmp/dm_test/shardddl1_1/dmctl.1716479057.log
/tmp/dm_test/shardddl1_1/dmctl.1716479036.log
/tmp/dm_test/shardddl1_1/dmctl.1716478978.log
/tmp/dm_test/shardddl1_1/dmctl.1716479011.log
/tmp/dm_test/shardddl1_1/dmctl.1716478999.log
/tmp/dm_test/shardddl1_1/dmctl.1716479016.log
/tmp/dm_test/shardddl1_1/dmctl.1716478971.log
/tmp/dm_test/shardddl1_1/dmctl.1716479029.log
/tmp/dm_test/shardddl1_1/dmctl.1716478989.log
/tmp/dm_test/shardddl1_1/dmctl.1716479004.log
/tmp/dm_test/shardddl1_1/dmctl.1716478973.log
/tmp/dm_test/shardddl1_1/dmctl.1716479052.log
/tmp/dm_test/shardddl1_1/dmctl.1716479078.log
/tmp/dm_test/shardddl1_1/dmctl.1716479019.log
/tmp/dm_test/shardddl1_1/dmctl.1716479053.log
/tmp/dm_test/shardddl1_1/dmctl.1716479027.log
/tmp/dm_test/shardddl1_1/dmctl.1716478984.log
/tmp/dm_test/shardddl1_1/dmctl.1716479034.log
/tmp/dm_test/shardddl1_1/dmctl.1716478979.log
/tmp/dm_test/shardddl1_1/dmctl.1716479059.log
/tmp/dm_test/shardddl1_1/dmctl.1716479023.log
/tmp/dm_test/shardddl1_1/dmctl.1716479065.log
/tmp/dm_test/shardddl1_1/dmctl.1716479042.log
/tmp/dm_test/shardddl1_1/dmctl.1716478983.log
/tmp/dm_test/shardddl1_1/worker1/log/dm-worker.log
/tmp/dm_test/shardddl1_1/worker1/log/stdout.log
/tmp/dm_test/shardddl1_1/dmctl.1716479005.log
/tmp/dm_test/shardddl1_1/dmctl.1716478986.log
/tmp/dm_test/shardddl1_1/dmctl.1716478981.log
/tmp/dm_test/shardddl1_1/dmctl.1716479041.log
/tmp/dm_test/shardddl1_1/dmctl.1716479020.log
/tmp/dm_test/shardddl1_1/dmctl.1716478977.log
/tmp/dm_test/shardddl1_1/dmctl.1716479009.log
/tmp/dm_test/shardddl1_1/dmctl.1716479058.log
/tmp/dm_test/shardddl1_1/dmctl.1716479013.log
/tmp/dm_test/shardddl1_1/dmctl.1716479066.log
/tmp/dm_test/shardddl1_1/dmctl.1716478996.log
/tmp/dm_test/shardddl1_1/dmctl.1716479045.log
/tmp/dm_test/shardddl1_1/dmctl.1716479033.log
/tmp/dm_test/shardddl1_1/dmctl.1716478988.log
/tmp/dm_test/shardddl1_1/dmctl.1716479072.log
/tmp/dm_test/shardddl1_1/dmctl.1716479081.log
/tmp/dm_test/shardddl1_1/dmctl.1716479071.log
/tmp/dm_test/shardddl1_1/dmctl.1716479031.log
/tmp/dm_test/shardddl1_1/dmctl.1716478998.log
/tmp/dm_test/shardddl1_1/dmctl.1716478974.log
/tmp/dm_test/shardddl1_1/dmctl.1716479000.log
/tmp/dm_test/shardddl1_1/dmctl.1716479048.log
/tmp/dm_test/shardddl1_1/dmctl.1716479026.log
/tmp/dm_test/shardddl1_1/dmctl.1716478964.log
/tmp/dm_test/shardddl1_1/dmctl.1716479003.log
/tmp/dm_test/shardddl1_1/dmctl.1716479074.log
/tmp/dm_test/shardddl1_1/dmctl.1716479060.log
/tmp/dm_test/shardddl1_1/dmctl.1716479049.log
/tmp/dm_test/shardddl1_1/master/log/stdout.log
/tmp/dm_test/shardddl1_1/master/log/dm-master.log
/tmp/dm_test/shardddl1_1/dmctl.1716478972.log
/tmp/dm_test/shardddl1_1/dmctl.1716478993.log
/tmp/dm_test/shardddl1_1/dmctl.1716479025.log
/tmp/dm_test/shardddl1_1/dmctl.1716479015.log
/tmp/dm_test/shardddl1_1/dmctl.1716479022.log
/tmp/dm_test/shardddl1_1/dmctl.1716479069.log
/tmp/dm_test/shardddl1_1/dmctl.1716479017.log
/tmp/dm_test/shardddl1_1/dmctl.1716479006.log
/tmp/dm_test/shardddl1_1/dmctl.1716479002.log
/tmp/dm_test/shardddl1_1/dmctl.1716479037.log
/tmp/dm_test/shardddl1_1/dmctl.1716479046.log
/tmp/dm_test/shardddl1_1/worker2/log/dm-worker.log
/tmp/dm_test/shardddl1_1/worker2/log/stdout.log
/tmp/dm_test/shardddl1_1/dmctl.1716479054.log
/tmp/dm_test/shardddl1_1/dmctl.1716479067.log
/tmp/dm_test/shardddl1_1/dmctl.1716479063.log
/tmp/dm_test/shardddl1_1/dmctl.1716478968.log
/tmp/dm_test/shardddl1_1/dmctl.1716479050.log
/tmp/dm_test/shardddl1_1/dmctl.1716479012.log
/tmp/dm_test/shardddl1_1/dmctl.1716479007.log
/tmp/dm_test/shardddl1_1/dmctl.1716479061.log
/tmp/dm_test/shardddl1_1/dmctl.1716478994.log
/tmp/dm_test/shardddl1_1/dmctl.1716478965.log
/tmp/dm_test/goroutines/stack/log/master-8461.log
/tmp/dm_test/goroutines/stack/log/master-8361.log
/tmp/dm_test/goroutines/stack/log/master-8661.log
/tmp/dm_test/goroutines/stack/log/worker-8264.log
/tmp/dm_test/goroutines/stack/log/master-8761.log
/tmp/dm_test/goroutines/stack/log/master-8561.log
/tmp/dm_test/goroutines/stack/log/master-8261.log
/tmp/dm_test/goroutines/stack/log/worker-18263.log
/tmp/dm_test/goroutines/stack/log/worker-8263.log
/tmp/dm_test/goroutines/stack/log/worker-8262.log
/tmp/dm_test/goroutines/stack/log/worker-18262.log
+ ls -alh log-G07.tar.gz
-rw-r--r--. 1 jenkins jenkins 657K May 23 23:47 log-G07.tar.gz
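
The archive step above bundles every *.log under /tmp/dm_test into one artifact per test group. The collection step, reconstructed from the commands echoed at the start of this block (verbose tar output omitted; paths here contain no whitespace, so unquoted expansion is safe):

    # Collect all test logs into a single per-group tarball for archiving.
    logs=$(find /tmp/dm_test/ -type f -name '*.log')
    tar -czf log-G07.tar.gz $logs
    ls -alh log-G07.tar.gz
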
[Pipeline] archiveArtifacts
Archiving artifacts
[Thu May 23 23:47:39 CST 2024] <<<<<< finish DM-135 optimistic >>>>>>
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/full_mode/conf/dm-task.yaml --remove-meta"
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G07'
Sending interrupt signal to process
Killing processes
[Thu May 23 23:47:40 CST 2024] <<<<<< start DM-136 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.1/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
make: *** [dm_integration_test_in_group] Terminated
script returned exit code 143
make: *** [dm_integration_test_in_group] Terminated
kill finished with exit code 0
script returned exit code 143
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] // cache
[Pipeline] }
[Pipeline] }
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] }
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] }
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] }
[Pipeline] }
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] }
[Pipeline] }
[Pipeline] // container
[Pipeline] // container
[Pipeline] }
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] }
[Pipeline] // node
[Pipeline] // node
[Pipeline] }
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] }
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G08'
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G11'
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
Finished: FAILURE