Console Output

Skipping 1,117 KB..
wait for rpc addr 127.0.0.1:8264 alive the 1-th time
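The "wait for rpc addr ... alive" lines come from a loop that probes each process's RPC port until it answers. A minimal sketch of such a wait loop, assuming a hypothetical helper name and retry budget (not the harness's actual code):

wait_for_rpc_alive() {
    local host=$1 port=$2 i
    for i in $(seq 1 10); do
        echo "wait for rpc addr $host:$port alive the $i-th time"
        # bash's built-in /dev/tcp probe; succeeds once something listens
        if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
            echo "rpc addr $host:$port is alive"
            return 0
        fi
        sleep 1
    done
    return 1
}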
got=1 expected=1
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 1\/0\/0" count: 0 != expected: 1, failed the 2-th time, will retry again
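The paired "got=N expected=M" and "will retry again" lines reflect a check that reruns a dmctl command until its output contains an expected string the expected number of times. A sketch under assumed names (the helper and master address are illustrative, not the harness's exact code):

check_rows_with_retry() {
    local expect_str=$1 expect_cnt=$2 max_retry=$3 i got
    for ((i = 0; i < max_retry; i++)); do
        # count occurrences of the expected substring in query-status output
        got=$(dmctl --master-addr 127.0.0.1:8261 query-status test | grep -c "$expect_str" || true)
        echo "got=$got expected=$expect_cnt"
        [ "$got" -eq "$expect_cnt" ] && return 0
        echo "command: query-status test $expect_str count: $got != expected: $expect_cnt, failed the $i-th time, will retry again"
        sleep 2
    done
    return 1
}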
rpc addr 127.0.0.1:8264 is alive
start worker4
[Sun May  5 11:29:56 CST 2024] <<<<<< START DM-WORKER on port 18262, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker4.toml >>>>>>
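Each DM-WORKER banner names the TOML config the worker is started with. A minimal sketch of what a per-case dm-worker4.toml might contain — the keys are standard DM worker options, but the concrete values here are assumptions for this test:

# Assumed shape of a minimal dm-worker config, generated the way the
# integration scripts typically write per-case configs.
cat > dm-worker4.toml <<EOF
name = "worker4"
join = "127.0.0.1:8261"          # dm-master to register with
worker-addr = "127.0.0.1:18262"  # matches the port in the banner above
EOF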
wait for rpc addr 127.0.0.1:18262 alive the 1-th time
[Sun May  5 11:29:57 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
dmctl test cmd: "query-status test"
rpc addr 127.0.0.1:18262 is alive
dmctl test cmd: "list-member --name worker3 --name worker4"
got=1 expected=1
got=1 expected=1
got=1 expected=1
dmctl test cmd: "start-relay -s mysql-replica-01 worker3"
got=1 expected=1
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 1\/0\/0" count: 0 != expected: 1, failed the 3-th time, will retry again
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
rpc addr 127.0.0.1:8261 is alive
check diff successfully
dmctl test cmd: "shard-ddl-lock"
got=1 expected=1
dmctl test cmd: "stop-task test"
got=2 expected=2
dmctl test cmd: "start-relay -s mysql-replica-02 worker4"
got=1 expected=1
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 1\/0\/0" count: 0 != expected: 1, failed the 4-th time, will retry again
[Sun May  5 11:30:00 CST 2024] <<<<<< finish DM-RECOVER_LOCK optimistic >>>>>>
run DM_DropAddColumn case #0
[Sun May  5 11:30:01 CST 2024] <<<<<< start DM-DropAddColumn optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
got=2 expected=2
dmctl test cmd: "query-status test"
got=4 expected=4
check diff successfully
kill dm-worker3
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
check log contain failed 1-th time, retry later
error check
wait process dm-worker3 exit...
got=1 expected=1
got=1 expected=1
dmctl test cmd: "pause-task test"
wait process minio exit...
process minio already exit
/home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/s3_dumpling_lightning/run.sh: line 49: 16808 Killed                  bin/minio server --address $S3_ENDPOINT "$s3_DBPATH"
run s3 test error check success
Starting TiDB on port 4000
Verifying TiDB is started...
ERROR 2013 (HY000): Lost connection to MySQL server at 'reading initial communication packet', system error: 104
wait process dm-worker3 exit...
process dm-worker3 already exit
kill dm-worker4
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "shard-ddl-lock"
got=1 expected=1
dmctl test cmd: "query-status test"
got=3 expected=3
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
check diff failed 1-th time, retry later
wait process dm-worker4 exit...
VARIABLE_NAME           VARIABLE_VALUE  COMMENT
bootstrapped            True            Bootstrap flag. Do not delete.
tidb_server_version     196             Bootstrap version. Do not delete.
system_tz               Asia/Shanghai   TiDB Global System Timezone.
new_collation_enabled   True            If the new collations are enabled. Do not edit it.
ddl_table_version       3               DDL Table Version. Do not delete.
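The rows above are TiDB's bootstrap metadata from the mysql.tidb table; they can be reproduced against the test TiDB (port 4000, as started above) with a plain query — a sketch, assuming the default root user with no password:

# Reads the bootstrap flags printed above from TiDB's mysql.tidb table.
mysql -h 127.0.0.1 -P 4000 -u root --batch \
  -e "SELECT VARIABLE_NAME, VARIABLE_VALUE, COMMENT FROM mysql.tidb"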
1 dm-master alive
0 dm-worker alive
0 dm-syncer alive
wait process dm-worker4 exit...
process dm-worker4 already exit
dmctl test cmd: "list-member --name worker3 --name worker4"
got=2 expected=2
start worker1
[Sun May  5 11:30:05 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
dmctl test cmd: "resume-task test"
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "pause-task test"
rpc addr 127.0.0.1:8262 is alive
start worker2
[Sun May  5 11:30:07 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker2.toml >>>>>>
wait process dm-master.test exit...
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
check diff failed 2-th time, retry later
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "list-member --name worker1 --name worker2"
got=1 expected=1
got=1 expected=1
num1 1 num2 2
[Sun May  5 11:30:08 CST 2024] <<<<<< finish test_last_bound >>>>>>
[Sun May  5 11:30:08 CST 2024] <<<<<< start test_config_name >>>>>>
[Sun May  5 11:30:08 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master-join1.toml >>>>>>
rpc addr 127.0.0.1:8261 is alive
[Sun May  5 11:30:08 CST 2024] <<<<<< START DM-MASTER on port 8361, config: /tmp/dm_test/ha_cases/dm-master-join2.toml >>>>>>
dmctl test cmd: "query-status test"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 2\/0\/0" count: 0 != expected: 1, failed the 0-th time, will retry again
check diff failed 3-th time, retry later
wait process dm-master.test exit...
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Sun May  5 11:30:09 CST 2024] <<<<<< test case s3_dumpling_lightning success! >>>>>>
start running case: [sequence_sharding_optimistic] script: [/home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding_optimistic/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding_optimistic/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Sun May  5 11:30:09 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding_optimistic/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
check diff successfully
restart dm-worker1
wait process worker1 exit...
rpc addr 127.0.0.1:8261 is alive
[Sun May  5 11:30:10 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding_optimistic/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 2\/0\/0" count: 0 != expected: 1, failed the 1-th time, will retry again
wait process worker1 exit...
process worker1 already exit
[Sun May  5 11:30:10 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
check log contain failed 1-th time (file not exist), retry later
[Sun May  5 11:30:10 CST 2024] <<<<<< START DM-MASTER on port 8361, config: /tmp/dm_test/ha_cases/dm-master-join2.toml >>>>>>
rpc addr 127.0.0.1:8361 is alive
[Sun May  5 11:30:10 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker1.toml >>>>>>
rpc addr 127.0.0.1:8262 is alive
[Sun May  5 11:30:10 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /tmp/dm_test/ha_cases/dm-worker2.toml >>>>>>
check diff failed at last
dmctl test cmd: "binlog skip test"
rpc addr 127.0.0.1:8262 is alive
[Sun May  5 11:30:11 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding_optimistic/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
restart dm-worker2
got=2 expected=2
got=1 expected=1
dmctl test cmd: "pause-task test"
dmctl test cmd: "resume-task test"
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:30:12 CST 2024] <<<<<< finish DM-DropAddColumn optimistic >>>>>>
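The "binlog skip test" / pause / resume sequence just above is how these cases step over a DDL the syncer cannot replicate after "check diff failed at last". A condensed sketch of that recovery flow, using the same dmctl commands that appear in the log (master address assumed):

# Skip the binlog event the syncer is stuck on, bounce the task,
# then re-verify consistency and tear the task down.
dmctl --master-addr 127.0.0.1:8261 binlog skip test
dmctl --master-addr 127.0.0.1:8261 pause-task test
dmctl --master-addr 127.0.0.1:8261 resume-task test
# the "check diff successfully" line corresponds to a sync-diff pass here
dmctl --master-addr 127.0.0.1:8261 stop-task test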
run DM_DropAddColumn case #1
[Sun May  5 11:30:12 CST 2024] <<<<<< start DM-DropAddColumn optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
wait process worker2 exit...
[Sun May  5 11:30:12 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /tmp/dm_test/ha_cases/dm-worker2.toml >>>>>>
rpc addr 127.0.0.1:8263 is alive
[Sun May  5 11:30:12 CST 2024] <<<<<< finish test_config_name >>>>>>
[Sun May  5 11:30:12 CST 2024] <<<<<< start test_join_masters_and_worker >>>>>>
3 dm-master alive
3 dm-worker alive
0 dm-syncer alive
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/sequence_sharding_optimistic/source1.yaml"
dmctl test cmd: "operate-source create /tmp/dm_test/sequence_sharding_optimistic/source2.yaml"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 2\/0\/0" count: 0 != expected: 1, failed the 2-th time, will retry again
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
restart dm-master
wait process worker2 exit...
process worker2 already exit
[Sun May  5 11:30:13 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding_optimistic/conf/dm-task.yaml --remove-meta"
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "stop-task test"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 2\/0\/0" count: 0 != expected: 1, failed the 3-th time, will retry again
wait process dm-master.test exit...
[Sun May  5 11:30:14 CST 2024] <<<<<< finish DM-RESYNC_NOT_FLUSHED optimistic >>>>>>
[Sun May  5 11:30:14 CST 2024] <<<<<< start DM-RESYNC_TXN_INTERRUPT optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master exit...
check diff successfully
ERROR 1146 (42S02) at line 1: Table 'sharding_seq_tmp.t1' doesn't exist
run tidb sql failed 1-th time, retry later
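Downstream SQL checks are retried the same way as dmctl checks, since a table like sharding_seq_tmp.t1 may not exist until the syncer catches up. A sketch of such a retry, with an assumed helper name:

# Illustrative: retry a downstream query until the replicated table appears.
run_tidb_sql_with_retry() {
    local sql=$1 i
    for i in $(seq 1 10); do
        mysql -h 127.0.0.1 -P 4000 -u root -e "$sql" 2>/dev/null && return 0
        echo "run tidb sql failed $i-th time, retry later"
        sleep 2
    done
    return 1
}
run_tidb_sql_with_retry "select count(*) from sharding_seq_tmp.t1"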
wait process dm-master.test exit...
wait process dm-master exit...
process dm-master already exit
dmctl test cmd: "query-status test"
got=2 expected=2
restart dm-worker1
wait process dm-master.test exit...
dmctl test cmd: "pause-task sequence_sharding_optimistic"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 2\/0\/0" count: 0 != expected: 1, failed the 4-th time, will retry again
got=3 expected=3
dmctl test cmd: "query-status sequence_sharding_optimistic"
wait process worker1 exit...
wait process dm-master.test exit...
[Sun May  5 11:30:17 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
got=2 expected=2
dmctl test cmd: "resume-task sequence_sharding_optimistic"
got=3 expected=3
dmctl test cmd: "query-status sequence_sharding_optimistic"
got=3 expected=3
restart dm-worker1
wait process worker1 exit...
process worker1 already exit
[Sun May  5 11:30:18 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait process dm-master.test exit...
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 2\/0\/0" count: 0 != expected: 1, failed the 5-th time, will retry again
wait process dm-worker1 exit...
rpc addr 127.0.0.1:8262 is alive
restart dm-worker2
wait process dm-master.test exit...
rpc addr 127.0.0.1:8261 is alive
check log contain failed 1-th time, retry later
wait process worker2 exit...
wait process dm-worker1 exit...
process dm-worker1 already exit
[Sun May  5 11:30:20 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding_optimistic/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait process dm-master.test exit...
got=1 expected=1
got=1 expected=1
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "shard-ddl-lock unlock non-exist-task-`test_db`.`test_table`"
wait process worker2 exit...
process worker2 already exit
[Sun May  5 11:30:21 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
wait process dm-master.test exit...
dmctl test cmd: "query-status sequence_sharding_optimistic"
got=1 expected=1
dmctl test cmd: "resume-task sequence_sharding_optimistic"
dmctl test cmd: "query-status sequence_sharding_optimistic"
got=1 expected=1
dmctl test cmd: "resume-task sequence_sharding_optimistic"
dmctl test cmd: "query-status sequence_sharding_optimistic"
got=1 expected=1
dmctl test cmd: "resume-task sequence_sharding_optimistic"
dmctl test cmd: "query-status sequence_sharding_optimistic"
got=1 expected=1
dmctl test cmd: "resume-task sequence_sharding_optimistic"
rpc addr 127.0.0.1:8263 is alive
begin;
insert into shardddl1.tb2 values (1,1);
insert into shardddl1.tb2 values (2,2);
insert into shardddl1.tb2 values (3,3);
insert into shardddl1.tb2 values (4,4);
insert into shardddl1.tb2 values (5,5);
insert into shardddl1.tb2 values (6,6);
insert into shardddl1.tb2 values (7,7);
insert into shardddl1.tb2 values (8,8);
insert into shardddl1.tb2 values (9,9);
insert into shardddl1.tb2 values (10,10);
commit;
begin;
insert into shardddl1.t_1 values (11,11);
insert into shardddl1.t_1 values (12,12);
insert into shardddl1.t_1 values (13,13);
insert into shardddl1.t_1 values (14,14);
insert into shardddl1.t_1 values (15,15);
insert into shardddl1.t_1 values (16,16);
insert into shardddl1.t_1 values (17,17);
insert into shardddl1.t_1 values (18,18);
insert into shardddl1.t_1 values (19,19);
insert into shardddl1.t_1 values (20,20);
insert into shardddl1.t_1 values (21,21);
insert into shardddl1.t_1 values (22,22);
insert into shardddl1.t_1 values (23,23);
insert into shardddl1.t_1 values (24,24);
insert into shardddl1.t_1 values (25,25);
insert into shardddl1.t_1 values (26,26);
insert into shardddl1.t_1 values (27,27);
insert into shardddl1.t_1 values (28,28);
insert into shardddl1.t_1 values (29,29);
insert into shardddl1.t_1 values (30,30);
insert into shardddl1.t_1 values (31,31);
insert into shardddl1.t_1 values (32,32);
insert into shardddl1.t_1 values (33,33);
insert into shardddl1.t_1 values (34,34);
insert into shardddl1.t_1 values (35,35);
insert into shardddl1.t_1 values (36,36);
insert into shardddl1.t_1 values (37,37);
insert into shardddl1.t_1 values (38,38);
insert into shardddl1.t_1 values (39,39);
insert into shardddl1.t_1 values (40,40);
insert into shardddl1.t_1 values (41,41);
insert into shardddl1.t_1 values (42,42);
insert into shardddl1.t_1 values (43,43);
insert into shardddl1.t_1 values (44,44);
insert into shardddl1.t_1 values (45,45);
insert into shardddl1.t_1 values (46,46);
insert into shardddl1.t_1 values (47,47);
insert into shardddl1.t_1 values (48,48);
insert into shardddl1.t_1 values (49,49);
insert into shardddl1.t_1 values (50,50);
commit;
begin;
insert into shardddl1.tb1 values (51,51);
insert into shardddl1.tb1 values (52,52);
insert into shardddl1.tb1 values (53,53);
insert into shardddl1.tb1 values (54,54);
insert into shardddl1.tb1 values (55,55);
insert into shardddl1.tb1 values (56,56);
insert into shardddl1.tb1 values (57,57);
insert into shardddl1.tb1 values (58,58);
insert into shardddl1.tb1 values (59,59);
insert into shardddl1.tb1 values (60,60);
commit;
begin;
insert into shardddl1.t_1 values (61,61);
insert into shardddl1.t_1 values (62,62);
insert into shardddl1.t_1 values (63,63);
insert into shardddl1.t_1 values (64,64);
insert into shardddl1.t_1 values (65,65);
insert into shardddl1.t_1 values (66,66);
insert into shardddl1.t_1 values (67,67);
insert into shardddl1.t_1 values (68,68);
insert into shardddl1.t_1 values (69,69);
insert into shardddl1.t_1 values (70,70);
commit;
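These fixed-pattern transactions (one begin/commit around consecutive (i,i) rows per shard table) are the workload that DM-RESYNC_TXN_INTERRUPT interrupts mid-stream. They could equally be generated by a loop like the following sketch, with tables and ranges taken from the batches above:

# Emits the same begin/insert.../commit batch as above for a given table/range.
gen_txn() {
    local table=$1 from=$2 to=$3 i
    echo "begin;"
    for ((i = from; i <= to; i++)); do
        echo "insert into $table values ($i,$i);"
    done
    echo "commit;"
}
gen_txn shardddl1.tb2 1 10   # first batch above
gen_txn shardddl1.t_1 11 50  # second batch above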
check diff failed 1-th time, retry later
dmctl test cmd: "shard-ddl-lock"
got=1 expected=1
dmctl test cmd: "query-status test"
got=3 expected=3
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
check diff failed 1-th time, retry later
wait process dm-master.test exit...
dmctl test cmd: "query-status sequence_sharding_optimistic"
got=1 expected=1
dmctl test cmd: "resume-task sequence_sharding_optimistic"
dmctl test cmd: "resume-task test"
dmctl test cmd: "query-status test"
got=3 expected=3
check diff failed 1-th time, retry later
got=1 expected=1
got=1 expected=1
dmctl test cmd: "pause-task test"
wait process dm-master.test exit...
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 3\/0\/0" count: 0 != expected: 1, failed the 0-th time, will retry again
check diff failed 2-th time, retry later
check diff failed 2-th time, retry later
check diff successfully
dmctl test cmd: "pause-task sequence_sharding_optimistic"
got=3 expected=3
dmctl test cmd: "query-status sequence_sharding_optimistic"
got=2 expected=2
dmctl test cmd: "binlog-schema list -s mysql-replica-01,mysql-replica-02 sequence_sharding_optimistic sharding_seq_opt t2"
dmctl test cmd: "binlog-schema delete -s mysql-replica-01 sequence_sharding_optimistic sharding_seq_opt t2"
dmctl test cmd: "binlog-schema update -s mysql-replica-01 sequence_sharding_optimistic sharding_seq_opt t1 /tmp/dm_test/sequence_sharding_optimistic/schema.sql"
{
  "result": true,
  "msg": "",
  "sources": [
    {
      "result": true,
      "msg": "CREATE TABLE `t1` ( `id` bigint(20) NOT NULL, `c2` varchar(20) DEFAULT NULL, `c3` bigint(11) DEFAULT NULL, PRIMARY KEY (`id`) /*T![clustered_index] CLUSTERED */) ENGINE=InnoDB DEFAULT CHARSET=latin1 COLLATE=latin1_bin",
      "source": "mysql-replica-01",
      "worker": "worker1"
    }
  ]
}
dmctl test cmd: "resume-task sequence_sharding_optimistic"
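The JSON above is the response to the binlog-schema update; when scripting against it, fields such as result, the source name, and the returned CREATE TABLE can be pulled out with jq — a sketch, assuming the response was saved to resp.json:

# Extract the overall result plus the per-source name and CREATE TABLE
# statement from the binlog-schema response shown above.
jq -r '.result, .sources[0].source, .sources[0].msg' resp.json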
wait process dm-master.test exit...
got=3 expected=3
dmctl test cmd: "query-status sequence_sharding_optimistic"
got=3 expected=3
check diff successfully
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
wait process dm-master.test exit...
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 3\/0\/0" count: 0 != expected: 1, failed the 1-th time, will retry again
check diff failed 3-th time, retry later
check diff failed 3-th time, retry later
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
process dm-master.test already exit
check diff successfully
restart dm-worker1
check diff failed at last
dmctl test cmd: "binlog skip test"
got=2 expected=2
got=1 expected=1
dmctl test cmd: "pause-task test"
dmctl test cmd: "resume-task test"
check diff successfully
dmctl test cmd: "stop-task test"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 3\/0\/0" count: 0 != expected: 1, failed the 2-th time, will retry again
wait process dm-worker.test exit...
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=1 expected=1
<<<<<< test_source_and_target_with_empty_gtid success! >>>>>>
1 dm-master alive
1 dm-worker alive
0 dm-syncer alive
wait process dm-master.test exit...
wait process dm-master.test exit...
process dm-master.test already exit
[Sun May  5 11:30:29 CST 2024] <<<<<< finish DM-DropAddColumn optimistic >>>>>>
run DM_DropAddColumn case #2
[Sun May  5 11:30:29 CST 2024] <<<<<< start DM-DropAddColumn optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
wait process worker1 exit...
wait process dm-worker.test exit...
wait process dm-master.test exit...
wait process dm-worker.test exit...
wait process worker1 exit...
process worker1 already exit
[Sun May  5 11:30:31 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Sun May  5 11:30:31 CST 2024] <<<<<< test case sequence_sharding_optimistic success! >>>>>>
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 3\/0\/0" count: 0 != expected: 1, failed the 3-th time, will retry again
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
restart dm-master
wait process dm-master.test exit...
rpc addr 127.0.0.1:8262 is alive
restart dm-worker2
wait process dm-worker.test exit...
wait process dm-master exit...
wait process dm-master.test exit...
wait process worker2 exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Sun May  5 11:30:33 CST 2024] <<<<<< test case all_mode success! >>>>>>
wait process dm-master.test exit...
wait process dm-master exit...
process dm-master already exit
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 3\/0\/0" count: 0 != expected: 1, failed the 4-th time, will retry again
wait process worker2 exit...
process worker2 already exit
[Sun May  5 11:30:34 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
wait process dm-master.test exit...
rpc addr 127.0.0.1:8263 is alive
begin;
insert into shardddl1.tb2 values (101,101);
insert into shardddl1.tb2 values (102,102);
insert into shardddl1.tb2 values (103,103);
insert into shardddl1.tb2 values (104,104);
insert into shardddl1.tb2 values (105,105);
insert into shardddl1.tb2 values (106,106);
insert into shardddl1.tb2 values (107,107);
insert into shardddl1.tb2 values (108,108);
insert into shardddl1.tb2 values (109,109);
insert into shardddl1.tb2 values (110,110);
commit;
begin;
insert into shardddl1.tb1 values (111,111);
insert into shardddl1.tb1 values (112,112);
insert into shardddl1.tb1 values (113,113);
insert into shardddl1.tb1 values (114,114);
insert into shardddl1.tb1 values (115,115);
insert into shardddl1.tb1 values (116,116);
insert into shardddl1.tb1 values (117,117);
insert into shardddl1.tb1 values (118,118);
insert into shardddl1.tb1 values (119,119);
insert into shardddl1.tb1 values (120,120);
commit;
begin;
insert into shardddl1.tb2 values (121,121);
insert into shardddl1.tb2 values (122,122);
insert into shardddl1.tb2 values (123,123);
insert into shardddl1.tb2 values (124,124);
insert into shardddl1.tb2 values (125,125);
insert into shardddl1.tb2 values (126,126);
insert into shardddl1.tb2 values (127,127);
insert into shardddl1.tb2 values (128,128);
insert into shardddl1.tb2 values (129,129);
insert into shardddl1.tb2 values (130,130);
commit;
begin;
insert into shardddl1.t_1 values (131,131);
insert into shardddl1.t_1 values (132,132);
insert into shardddl1.t_1 values (133,133);
insert into shardddl1.t_1 values (134,134);
insert into shardddl1.t_1 values (135,135);
insert into shardddl1.t_1 values (136,136);
insert into shardddl1.t_1 values (137,137);
insert into shardddl1.t_1 values (138,138);
insert into shardddl1.t_1 values (139,139);
insert into shardddl1.t_1 values (140,140);
commit;
check diff successfully
wait process dm-master.test exit...
begin;
insert into shardddl1.tb2 values (201,201);
insert into shardddl1.tb2 values (202,202);
insert into shardddl1.tb2 values (203,203);
insert into shardddl1.tb2 values (204,204);
insert into shardddl1.tb2 values (205,205);
insert into shardddl1.tb2 values (206,206);
insert into shardddl1.tb2 values (207,207);
insert into shardddl1.tb2 values (208,208);
insert into shardddl1.tb2 values (209,209);
insert into shardddl1.tb2 values (210,210);
commit;
begin;
insert into shardddl1.tb1 values (211,211);
insert into shardddl1.tb1 values (212,212);
insert into shardddl1.tb1 values (213,213);
insert into shardddl1.tb1 values (214,214);
insert into shardddl1.tb1 values (215,215);
insert into shardddl1.tb1 values (216,216);
insert into shardddl1.tb1 values (217,217);
insert into shardddl1.tb1 values (218,218);
insert into shardddl1.tb1 values (219,219);
insert into shardddl1.tb1 values (220,220);
commit;
begin;
insert into shardddl1.tb2 values (221,221);
insert into shardddl1.tb2 values (222,222);
insert into shardddl1.tb2 values (223,223);
insert into shardddl1.tb2 values (224,224);
insert into shardddl1.tb2 values (225,225);
insert into shardddl1.tb2 values (226,226);
insert into shardddl1.tb2 values (227,227);
insert into shardddl1.tb2 values (228,228);
insert into shardddl1.tb2 values (229,229);
insert into shardddl1.tb2 values (230,230);
commit;
begin;
insert into shardddl1.t_1 values (231,231);
insert into shardddl1.t_1 values (232,232);
insert into shardddl1.t_1 values (233,233);
insert into shardddl1.t_1 values (234,234);
insert into shardddl1.t_1 values (235,235);
insert into shardddl1.t_1 values (236,236);
insert into shardddl1.t_1 values (237,237);
insert into shardddl1.t_1 values (238,238);
insert into shardddl1.t_1 values (239,239);
insert into shardddl1.t_1 values (240,240);
commit;
check diff failed 1-th time, retry later
[Sun May  5 11:30:35 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 3\/0\/0" count: 0 != expected: 1, failed the 5-th time, will retry again
wait process dm-master.test exit...
wait process dm-master.test exit...
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 3\/0\/0" count: 0 != expected: 1, failed the 6-th time, will retry again
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:30:38 CST 2024] <<<<<< finish DM-RESYNC_TXN_INTERRUPT optimistic >>>>>>
[Sun May  5 11:30:38 CST 2024] <<<<<< start DM-STRICT_OPTIMISTIC_SINGLE_SOURCE optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/single-source-strict-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "shard-ddl-lock"
got=1 expected=1
dmctl test cmd: "query-status test"
got=3 expected=3
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
check diff failed 1-th time, retry later
dmctl test cmd: "query-status test"
got=1 expected=1
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "stop-task test"
[Sun May  5 11:30:40 CST 2024] <<<<<< finish DM-STRICT_OPTIMISTIC_SINGLE_SOURCE optimistic >>>>>>
[Sun May  5 11:30:40 CST 2024] <<<<<< start DM-STRICT_OPTIMISTIC_DOUBLE_SOURCE optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-strict-optimistic.yaml --remove-meta"
got=1 expected=1
got=1 expected=1
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
wait process dm-master.test exit...
check diff failed 2-th time, retry later
dmctl test cmd: "resume-task test"
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "pause-task test"
dmctl test cmd: "query-status test"
got=1 expected=1
dmctl test cmd: "stop-task test"
[Sun May  5 11:30:42 CST 2024] <<<<<< finish DM-STRICT_OPTIMISTIC_DOUBLE_SOURCE optimistic >>>>>>
[Sun May  5 11:30:42 CST 2024] <<<<<< start DM-131 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 4\/0\/0" count: 0 != expected: 1, failed the 0-th time, will retry again
check diff failed 3-th time, retry later
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:30:44 CST 2024] <<<<<< finish DM-131 optimistic >>>>>>
wait process dm-master.test exit...
[Sun May  5 11:30:45 CST 2024] <<<<<< start DM-132 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
check diff failed at last
dmctl test cmd: "binlog skip test"
got=2 expected=2
got=1 expected=1
dmctl test cmd: "pause-task test"
dmctl test cmd: "resume-task test"
wait process dm-master.test exit...
check diff successfully
dmctl test cmd: "stop-task test"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 4\/0\/0" count: 0 != expected: 1, failed the 1-th time, will retry again
[Sun May  5 11:30:46 CST 2024] <<<<<< finish DM-DropAddColumn optimistic >>>>>>
run DM_DropAddColumn case #3
[Sun May  5 11:30:46 CST 2024] <<<<<< start DM-DropAddColumn optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
check diff failed 1-th time, retry later
dmctl test cmd: "query-status test"
got=2 expected=2
wait process dm-master.test exit...
check diff successfully
check log contain failed 1-th time, retry later
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 4\/0\/0" count: 0 != expected: 1, failed the 2-th time, will retry again
wait process dm-master.test exit...
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:30:49 CST 2024] <<<<<< finish DM-132 pessimistic >>>>>>
[Sun May  5 11:30:49 CST 2024] <<<<<< start DM-132 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
dmctl test cmd: "shard-ddl-lock"
wait process dm-master.test exit...
got=1 expected=1
restart dm-master
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 4\/0\/0" count: 0 != expected: 1, failed the 3-th time, will retry again
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
wait process dm-master exit...
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:30:51 CST 2024] <<<<<< finish DM-132 optimistic >>>>>>
wait process dm-master.test exit...
wait process dm-master exit...
process dm-master already exit
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 4\/0\/0" count: 0 != expected: 1, failed the 4-th time, will retry again
wait process dm-master.test exit...
[Sun May  5 11:30:52 CST 2024] <<<<<< start DM-133 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
check diff failed 1-th time, retry later
[Sun May  5 11:30:53 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 4\/0\/0" count: 0 != expected: 1, failed the 5-th time, will retry again
wait process dm-master.test exit...
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
wait process dm-master.test exit...
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "query-status test"
got=3 expected=3
got=2 expected=2
dmctl test cmd: "query-status test"
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:30:56 CST 2024] <<<<<< finish DM-133 pessimistic >>>>>>
[Sun May  5 11:30:56 CST 2024] <<<<<< start DM-133 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
got=1 expected=1
got=1 expected=1
check diff failed 1-th time, retry later
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 4\/0\/0" count: 0 != expected: 1, failed the 6-th time, will retry again
wait process dm-master.test exit...
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:30:58 CST 2024] <<<<<< finish DM-133 optimistic >>>>>>
wait process dm-master.test exit...
got=1 expected=1
got=1 expected=1
check diff failed 2-th time, retry later
[Sun May  5 11:30:59 CST 2024] <<<<<< start DM-134 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
wait process dm-master.test exit...
dmctl test cmd: "resume-task test"
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "pause-task test"
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
check diff failed 1-th time, retry later
check diff failed 3-th time, retry later
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
wait process dm-master.test exit...
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 5\/0\/0" count: 0 != expected: 1, failed the 0-th time, will retry again
check diff failed at last
dmctl test cmd: "binlog skip test"
got=2 expected=2
got=1 expected=1
dmctl test cmd: "pause-task test"
use sync_diff_inspector to check increment data
check diff successfully
check diff successfully
data checked after one worker was killed
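The "check diff successfully" lines come from running sync_diff_inspector against upstream and downstream with the case's diff config, retried a few times before giving up. A sketch of that check loop — the helper name and config path are assumptions, mirroring the retry messages seen throughout this log:

check_sync_diff_sketch() {
    local conf=$1 i
    for i in $(seq 1 3); do
        if sync_diff_inspector --config="$conf" >/dev/null 2>&1; then
            echo "check diff successfully"
            return 0
        fi
        echo "check diff failed $i-th time, retry later"
        sleep 2
    done
    echo "check diff failed at last"
    return 1
}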
try to kill worker port 8263
wait process dm-worker2 exit...
wait process dm-worker2 exit...
process dm-worker2 already exit
worker2 was killed
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test2"
got=2 expected=2
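"try to kill worker port 8263" suggests the harness resolves a PID from the listening port before killing it; one way to do that (lsof-based; the exact mechanism is not shown in this log):

# Kill whichever dm-worker owns port 8263, then wait for it to exit,
# matching the "wait process dm-worker2 exit..." lines above.
pid=$(lsof -ti tcp:8263) && kill "$pid"
while kill -0 "$pid" 2>/dev/null; do
    echo "wait process dm-worker2 exit..."
    sleep 1
done
echo "process dm-worker2 already exit"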
[Sun May  5 11:31:02 CST 2024] <<<<<< finish test_multi_task_reduce_and_restart_worker >>>>>>
3 dm-master alive
3 dm-worker alive
0 dm-syncer alive
dmctl test cmd: "resume-task test"
check diff successfully
dmctl test cmd: "stop-task test"
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:31:03 CST 2024] <<<<<< finish DM-134 pessimistic >>>>>>
[Sun May  5 11:31:03 CST 2024] <<<<<< start DM-134 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
[Sun May  5 11:31:03 CST 2024] <<<<<< finish DM-DropAddColumn optimistic >>>>>>
run DM_DropAddColumn case #4
[Sun May  5 11:31:03 CST 2024] <<<<<< start DM-DropAddColumn optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
check diff successfully
dmctl test cmd: "stop-task test"
check log contain failed 1-th time, retry later
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 5\/0\/0" count: 0 != expected: 1, failed the 1-th time, will retry again
[Sun May  5 11:31:05 CST 2024] <<<<<< finish DM-134 optimistic >>>>>>
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
[Sun May  5 11:31:06 CST 2024] <<<<<< start DM-135 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
dmctl test cmd: "shard-ddl-lock"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 5\/0\/0" count: 0 != expected: 1, failed the 2-th time, will retry again
got=1 expected=1
restart dm-master
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
wait process dm-master.test exit...
got=2 expected=2
dmctl test cmd: "stop-task test"
[Sun May  5 11:31:08 CST 2024] <<<<<< finish DM-135 pessimistic >>>>>>
[Sun May  5 11:31:08 CST 2024] <<<<<< start DM-135 optimistic >>>>>>
wait process dm-master exit...
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master exit...
process dm-master already exit
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 5\/0\/0" count: 0 != expected: 1, failed the 3-th time, will retry again
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "stop-task test"
[Sun May  5 11:31:10 CST 2024] <<<<<< finish DM-135 optimistic >>>>>>
wait process dm-master.test exit...
wait process dm-master.test exit...
[Sun May  5 11:31:11 CST 2024] <<<<<< start DM-136 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
process dm-master.test already exit
process dm-worker.test already exit
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 5\/0\/0" count: 0 != expected: 1, failed the 4-th time, will retry again
[Sun May  5 11:31:11 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
process dm-syncer.test already exit
[Sun May  5 11:31:11 CST 2024] <<<<<< test case ha_cases2 success! >>>>>>
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
wait process dm-master.test exit...
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:31:13 CST 2024] <<<<<< finish DM-136 optimistic >>>>>>
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 5\/0\/0" count: 0 != expected: 1, failed the 5-th time, will retry again
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
wait process dm-master.test exit...
[Sun May  5 11:31:14 CST 2024] <<<<<< start DM-137 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "query-status test"
got=3 expected=3
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
wait process dm-master.test exit...
check diff failed 1-th time, retry later
dmctl test cmd: "query-status test"
got=2 expected=2
wait process dm-master.test exit...
check diff failed 1-th time, retry later
got=1 expected=1
got=1 expected=1
check diff failed 2-th time, retry later
wait process dm-master.test exit...
dmctl test cmd: "resume-task test"
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "query-status test"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 6\/0\/0" count: 0 != expected: 1, failed the 0-th time, will retry again
wait process dm-master.test exit...
check diff failed 2-th time, retry later
wait process dm-master.test exit...
check diff failed 3-th time, retry later
wait process dm-master.test exit...
check diff successfully
dmctl test cmd: "stop-task test"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 6\/0\/0" count: 0 != expected: 1, failed the 1-th time, will retry again
[Sun May  5 11:31:20 CST 2024] <<<<<< finish DM-137 optimistic >>>>>>
wait process dm-master.test exit...
check diff failed at last
dmctl test cmd: "binlog skip test"
got=2 expected=2
got=1 expected=1
dmctl test cmd: "pause-task test"
dmctl test cmd: "resume-task test"
[Sun May  5 11:31:21 CST 2024] <<<<<< start DM-138 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:31:21 CST 2024] <<<<<< finish DM-DropAddColumn optimistic >>>>>>
run DM_DropAddColumn case #5
[Sun May  5 11:31:21 CST 2024] <<<<<< start DM-DropAddColumn optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 6\/0\/0" count: 0 != expected: 1, failed the 2-th time, will retry again
dmctl test cmd: "query-status test"
got=2 expected=2
wait process dm-master.test exit...
check diff failed 1-th time, retry later
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
check log contain failed 1-th time, retry later
wait process dm-master.test exit...
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 6\/0\/0" count: 0 != expected: 1, failed the 3-th time, will retry again
wait process dm-master.test exit...
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:31:25 CST 2024] <<<<<< finish DM-138 pessimistic >>>>>>
[Sun May  5 11:31:25 CST 2024] <<<<<< start DM-138 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
dmctl test cmd: "shard-ddl-lock"
got=1 expected=1
dmctl test cmd: "query-status test"
got=3 expected=3
got=2 expected=2
restart dm-master
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 6\/0\/0" count: 0 != expected: 1, failed the 4-th time, will retry again
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
dmctl test cmd: "stop-task test"
wait process dm-master exit...
wait process dm-master.test exit...
[Sun May  5 11:31:27 CST 2024] <<<<<< finish DM-138 optimistic >>>>>>
wait process dm-master exit...
process dm-master already exit
wait process dm-master.test exit...
[Sun May  5 11:31:28 CST 2024] <<<<<< start DM-139 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 6\/0\/0" count: 0 != expected: 1, failed the 5-th time, will retry again
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
wait process dm-master.test exit...
check diff failed 1-th time, retry later
[Sun May  5 11:31:29 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl2/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait process dm-master.test exit...
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 6\/0\/0" count: 0 != expected: 1, failed the 6-th time, will retry again
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
wait process dm-master.test exit...
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "query-status test"
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 0-th time, will retry again
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:31:32 CST 2024] <<<<<< finish DM-139 pessimistic >>>>>>
[Sun May  5 11:31:32 CST 2024] <<<<<< start DM-139 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
got=0 expected=1
command: query-status test "processedRowsStatus": "insert\/update\/delete: 6\/0\/0" count: 0 != expected: 1, failed the 7-th time, will retry again
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
wait process dm-master.test exit...
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:31:34 CST 2024] <<<<<< finish DM-139 optimistic >>>>>>
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 1-th time, will retry again
wait process dm-master.test exit...
[Sun May  5 11:31:35 CST 2024] <<<<<< start DM-142 pessimistic >>>>>>
got=1 expected=1
got=1 expected=1
--> test duplicate auto-incr pk
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
wait process dm-master.test exit...
wait process dm-master.test exit...
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 2-th time, will retry again
dmctl test cmd: "query-status test"
got=2 expected=2
check diff failed 1-th time, retry later
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-master.test exit...
wait process dm-worker.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 3-th time, will retry again
wait process dm-worker.test exit...
check diff successfully
dmctl test cmd: "stop-task test"
[Sun May  5 11:31:39 CST 2024] <<<<<< finish DM-142 pessimistic >>>>>>
wait process dm-master.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Sun May  5 11:31:39 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/validator_basic/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
[Sun May  5 11:31:40 CST 2024] <<<<<< start DM-143 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
wait process dm-master.test exit...
rpc addr 127.0.0.1:8261 is alive
[Sun May  5 11:31:40 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/validator_basic/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 4-th time, will retry again
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
dmctl test cmd: "stop-task test"
wait process dm-master.test exit...
[Sun May  5 11:31:42 CST 2024] <<<<<< finish DM-143 pessimistic >>>>>>
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 5-th time, will retry again
wait process dm-master.test exit...
[Sun May  5 11:31:43 CST 2024] <<<<<< start DM-145 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/validator_basic/conf/source1.yaml"
[Sun May  5 11:31:43 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/validator_basic/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/validator_basic/conf/source2.yaml"
wait process dm-master.test exit...
check diff failed 1-th time, retry later
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 6-th time, will retry again
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/validator_basic/conf/sharding-task.yaml --remove-meta"
wait process dm-master.test exit...
check diff successfully
dmctl test cmd: "stop-task test"
wait process dm-master.test exit...
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 7-th time, will retry again
[Sun May  5 11:31:47 CST 2024] <<<<<< finish DM-145 pessimistic >>>>>>
[Sun May  5 11:31:47 CST 2024] <<<<<< start DM-145 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
dmctl test cmd: "stop-task test"
wait process dm-master.test exit...
[Sun May  5 11:31:49 CST 2024] <<<<<< finish DM-145 optimistic >>>>>>
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 8-th time, will retry again
wait process dm-master.test exit...
[Sun May  5 11:31:50 CST 2024] <<<<<< start DM-146 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
wait process dm-master.test exit...
dmctl test cmd: "query-status test"
got=1 expected=2
command: query-status test "processedRowsStatus": "insert\/update\/delete: 3\/0\/0" count: 1 != expected: 2, failed the 0-th time, will retry again
dmctl test cmd: "query-status test"
got=0 expected=1
command: query-status test because schema conflict detected count: 0 != expected: 1, failed the 9-th time, will retry again
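The interleaved "failed the N-th time, will retry again" lines come from the harness's check-until-match loop: it reruns query-status and counts occurrences of the expected substring until the count matches or the retry budget runs out. A hedged sketch of that loop (the retry budget and exact invocation are illustrative, not the harness's actual code):
for i in $(seq 0 9); do
    out=$(dmctl --master-addr 127.0.0.1:8261 query-status test)
    cnt=$(echo "$out" | grep -c 'because schema conflict detected')
    [ "$cnt" -eq 1 ] && break
    echo "count: $cnt != expected: 1, failed the $i-th time, will retry again"
    sleep 2
done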
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
dmctl test cmd: "stop-task test"
wait process dm-master.test exit...
got=2 expected=2
got=2 expected=2
got=2 expected=2
[Sun May  5 11:31:52 CST 2024] <<<<<< finish DM-146 pessimistic >>>>>>
[Sun May  5 11:31:52 CST 2024] <<<<<< start DM-146 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
{
    "result": true,
    "msg": "",
    "sources": [
        {
            "result": true,
            "msg": "",
            "sourceStatus": {
                "source": "mysql-replica-01",
                "worker": "worker1",
                "result": null,
                "relayStatus": null
            },
            "subTaskStatus": [
                {
                    "name": "test",
                    "stage": "Paused",
                    "unit": "Sync",
                    "result": {
                        "isCanceled": false,
                        "errors": [
                            {
                                "ErrCode": 42501,
                                "ErrClass": "ha",
                                "ErrScope": "internal",
                                "ErrLevel": "high",
                                "Message": "startLocation: [position: (dm-it-82335d2a-eafa-499d-8fcb-74b04d76f0cc-x79r8-1sxfm-bin.000001, 42228), gtid-set: 1b80cda8-0a8e-11ef-b4de-1e57d4925bec:1-194], endLocation: [position: (dm-it-82335d2a-eafa-499d-8fcb-74b04d76f0cc-x79r8-1sxfm-bin.000001, 42353), gtid-set: 1b80cda8-0a8e-11ef-b4de-1e57d4925bec:1-195], origin SQL: [alter table shardddl1.tb1 add column b int after a]: fail to do etcd txn operation: txn commit failed",
                                "RawCause": "rpc error: code = Unavailable desc = error reading from server: EOF",
                                "Workaround": "Please check dm-master's node status and the network between this node and dm-master"
                            }
                        ],
                        "detail": null
                    },
                    "unresolvedDDLLockID": "",
                    "sync": {
                        "totalEvents": "12",
                        "totalTps": "0",
                        "recentTps": "0",
                        "masterBinlog": "(dm-it-82335d2a-eafa-499d-8fcb-74b04d76f0cc-x79r8-1sxfm-bin.000001, 42353)",
                        "masterBinlogGtid": "1b80cda8-0a8e-11ef-b4de-1e57d4925bec:1-195",
                        "syncerBinlog": "(dm-it-82335d2a-eafa-499d-8fcb-74b04d76f0cc-x79r8-1sxfm-bin.000001, 42163)",
                        "syncerBinlogGtid": "1b80cda8-0a8e-11ef-b4de-1e57d4925bec:1-194",
                        "blockingDDLs": [
                        ],
                        "unresolvedGroups": [
                        ],
                        "synced": false,
                        "binlogType": "remote",
                        "secondsBehindMaster": "0",
                        "blockDDLOwner": "",
                        "conflictMsg": "",
                        "totalRows": "12",
                        "totalRps": "0",
                        "recentRps": "0"
                    },
                    "validation": null
                }
            ]
        },
        {
            "result": true,
            "msg": "",
            "sourceStatus": {
                "source": "mysql-replica-02",
                "worker": "worker2",
                "result": null,
                "relayStatus": {
                    "masterBinlog": "(dm-it-82335d2a-eafa-499d-8fcb-74b04d76f0cc-x79r8-1sxfm-bin.000001, 39206)",
                    "masterBinlogGtid": "1c0688ff-0a8e-11ef-8faa-1e57d4925bec:1-167",
                    "relaySubDir": "1c0688ff-0a8e-11ef-8faa-1e57d4925bec.000001",
                    "relayBinlog": "(dm-it-82335d2a-eafa-499d-8fcb-74b04d76f0cc-x79r8-1sxfm-bin.000001, 39206)",
                    "relayBinlogGtid": "1c0688ff-0a8e-11ef-8faa-1e57d4925bec:1-167",
                    "relayCatchUpMaster": true,
                    "stage": "Running",
                    "result": null
                }
            },
            "subTaskStatus": [
                {
                    "name": "test",
                    "stage": "Running",
                    "unit": "Sync",
                    "result": null,
                    "unresolvedDDLLockID": "",
                    "sync": {
                        "totalEvents": "6",
                        "totalTps": "0",
                        "recentTps": "0",
                        "masterBinlog": "(dm-it-82335d2a-eafa-499d-8fcb-74b04d76f0cc-x79r8-1sxfm-bin.000001, 39206)",
                        "masterBinlogGtid": "1c0688ff-0a8e-11ef-8faa-1e57d4925bec:1-167",
                        "syncerBinlog": "(dm-it-82335d2a-eafa-499d-8fcb-74b04d76f0cc-x79r8-1sxfm-bin|000001.000001, 38926)",
                        "syncerBinlogGtid": "1c0688ff-0a8e-11ef-8faa-1e57d4925bec:1-166",
                        "blockingDDLs": [
                        ],
                        "unresolvedGroups": [
                        ],
                        "synced": false,
                        "binlogType": "local",
                        "secondsBehindMaster": "0",
                        "blockDDLOwner": "",
                        "conflictMsg": "",
                        "totalRows": "6",
                        "totalRps": "0",
                        "recentRps": "0"
                    },
                    "validation": null
                }
            ]
        }
    ]
}
PASS
coverage: 3.8% of statements in github.com/pingcap/tiflow/dm/...
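The query-status dump above is the interesting failure: the subtask on mysql-replica-01 is Paused with ErrCode 42501 because an etcd txn against dm-master failed mid-DDL (the RawCause "rpc error ... EOF" means the master connection dropped). Per the Workaround field, once dm-master is reachable again the task can be resumed by hand; a minimal sketch with dmctl, using the addresses from this run:
dmctl --master-addr 127.0.0.1:8261 list-member --master   # confirm the master is back
dmctl --master-addr 127.0.0.1:8261 resume-task test
dmctl --master-addr 127.0.0.1:8261 query-status test      # both subtasks should report stage "Running"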
curl: (7) Failed connect to 127.0.0.1:8361; Connection refused
curl: (7) Failed connect to 127.0.0.1:8461; Connection refused
curl: (7) Failed connect to 127.0.0.1:8561; Connection refused
curl: (7) Failed connect to 127.0.0.1:8661; Connection refused
curl: (7) Failed connect to 127.0.0.1:8761; Connection refused
curl: (7) Failed connect to 127.0.0.1:8264; Connection refused
curl: (7) Failed connect to 127.0.0.1:18262; Connection refused
curl: (7) Failed connect to 127.0.0.1:18263; Connection refused
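The curl failures above come from the post-failure diagnostics sweep, which tries to pull a goroutine dump from every dm-master and dm-worker port this job may have used; "Connection refused" simply means that component was no longer listening. A sketch of such a sweep (the /debug/pprof/goroutine path is Go's standard pprof endpoint and an assumption about what the harness hits; the output paths mirror the goroutines/stack/log files archived below):
mkdir -p /tmp/dm_test/goroutines/stack/log
for port in 8361 8461 8561 8661 8761; do
    curl -s "http://127.0.0.1:${port}/debug/pprof/goroutine?debug=2" > "/tmp/dm_test/goroutines/stack/log/master-${port}.log"
done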
make: *** [dm_integration_test_in_group] Error 1
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
[Pipeline] // dir
Post stage
[Pipeline] sh
got=1 expected=1
dmctl test cmd: "stop-task test"
wait process dm-master.test exit...
[Sun May  5 11:31:53 CST 2024] <<<<<< finish DM-146 optimistic >>>>>>
+ ls /tmp/dm_test
cov.shardddl1.dmctl.1714879298.639.out
cov.shardddl1.dmctl.1714879300.743.out
cov.shardddl1.dmctl.1714879304.1009.out
cov.shardddl1.dmctl.1714879305.1043.out
cov.shardddl1.dmctl.1714879348.4392.out
cov.shardddl1.dmctl.1714879351.4711.out
cov.shardddl1.dmctl.1714879352.4743.out
cov.shardddl1.dmctl.1714879383.5241.out
cov.shardddl1.dmctl.1714879387.5552.out
cov.shardddl1.dmctl.1714879388.5591.out
cov.shardddl1.dmctl.1714879405.6193.out
cov.shardddl1.dmctl.1714879409.6504.out
cov.shardddl1.dmctl.1714879410.6543.out
cov.shardddl1.dmctl.1714879410.6635.out
cov.shardddl1.dmctl.1714879411.6773.out
cov.shardddl1.dmctl.1714879412.6802.out
cov.shardddl1.dmctl.1714879414.6906.out
cov.shardddl1.dmctl.1714879416.7054.out
cov.shardddl1.dmctl.1714879418.7102.out
cov.shardddl1.dmctl.1714879418.7221.out
cov.shardddl1.dmctl.1714879418.7259.out
cov.shardddl1.dmctl.1714879418.7297.out
cov.shardddl1.dmctl.1714879419.7330.out
cov.shardddl1.dmctl.1714879420.7364.out
cov.shardddl1.dmctl.1714879427.7555.out
cov.shardddl1.dmctl.1714879429.7619.out
cov.shardddl1.dmctl.1714879429.7653.out
cov.shardddl1.dmctl.1714879429.7686.out
cov.shardddl1.dmctl.1714879430.7826.out
cov.shardddl1.dmctl.1714879431.7880.out
cov.shardddl1.dmctl.1714879433.8057.out
cov.shardddl1.dmctl.1714879434.8198.out
cov.shardddl1.dmctl.1714879435.8238.out
cov.shardddl1.dmctl.1714879436.8356.out
cov.shardddl1.dmctl.1714879436.8390.out
cov.shardddl1.dmctl.1714879436.8534.out
cov.shardddl1.dmctl.1714879437.8592.out
cov.shardddl1.dmctl.1714879438.8686.out
cov.shardddl1.dmctl.1714879441.8820.out
cov.shardddl1.dmctl.1714879441.8855.out
cov.shardddl1.dmctl.1714879441.8889.out
cov.shardddl1.dmctl.1714879446.9147.out
cov.shardddl1.dmctl.1714879446.9200.out
cov.shardddl1.dmctl.1714879448.9232.out
cov.shardddl1.dmctl.1714879452.9363.out
cov.shardddl1.dmctl.1714879453.9507.out
cov.shardddl1.dmctl.1714879454.9546.out
cov.shardddl1.dmctl.1714879458.9685.out
cov.shardddl1.dmctl.1714879493.10011.out
cov.shardddl1.dmctl.1714879493.9968.out
cov.shardddl1.dmctl.1714879495.10052.out
cov.shardddl1.dmctl.1714879497.10141.out
cov.shardddl1.dmctl.1714879497.10175.out
cov.shardddl1.dmctl.1714879504.10338.out
cov.shardddl1.dmctl.1714879504.10373.out
cov.shardddl1.dmctl.1714879504.10406.out
cov.shardddl1.dmctl.1714879504.10547.out
cov.shardddl1.dmctl.1714879506.10585.out
cov.shardddl1.dmctl.1714879508.10675.out
cov.shardddl1.dmctl.1714879508.10712.out
cov.shardddl1.dmctl.1714879515.10884.out
cov.shardddl1.dmctl.1714879515.10921.out
cov.shardddl1.dmctl.1714879515.10958.out
cov.shardddl1.dmctl.1714879515.11100.out
cov.shardddl1.dmctl.1714879517.11147.out
cov.shardddl1.dmctl.1714879526.11611.out
cov.shardddl1.dmctl.1714879526.11752.out
cov.shardddl1.dmctl.1714879528.11797.out
cov.shardddl1.dmctl.1714879537.12287.out
cov.shardddl1.dmctl.1714879537.12427.out
cov.shardddl1.dmctl.1714879538.12464.out
cov.shardddl1.dmctl.1714879541.12757.out
cov.shardddl1.dmctl.1714879542.12901.out
cov.shardddl1.dmctl.1714879543.12934.out
cov.shardddl1.dmctl.1714879544.13195.out
cov.shardddl1.dmctl.1714879547.13493.out
cov.shardddl1.dmctl.1714879548.13533.out
cov.shardddl1.dmctl.1714879550.13585.out
cov.shardddl1.dmctl.1714879551.13621.out
cov.shardddl1.dmctl.1714879554.13947.out
cov.shardddl1.dmctl.1714879555.13985.out
cov.shardddl1.dmctl.1714879556.14096.out
cov.shardddl1.dmctl.1714879559.14424.out
cov.shardddl1.dmctl.1714879561.14465.out
cov.shardddl1.dmctl.1714879561.14505.out
cov.shardddl1.dmctl.1714879561.14545.out
cov.shardddl1.dmctl.1714879562.14683.out
cov.shardddl1.dmctl.1714879563.14714.out
cov.shardddl1.dmctl.1714879564.14789.out
cov.shardddl1.dmctl.1714879565.14926.out
cov.shardddl1.dmctl.1714879566.14962.out
cov.shardddl1.dmctl.1714879573.15121.out
cov.shardddl1.dmctl.1714879574.15257.out
cov.shardddl1.dmctl.1714879575.15289.out
cov.shardddl1.dmctl.1714879578.15387.out
cov.shardddl1.dmctl.1714879579.15528.out
cov.shardddl1.dmctl.1714879580.15566.out
cov.shardddl1.dmctl.1714879580.15644.out
cov.shardddl1.master.out
cov.shardddl1.worker.8262.1714879297.out
cov.shardddl1.worker.8262.1714879303.out
cov.shardddl1.worker.8262.1714879350.out
cov.shardddl1.worker.8262.1714879386.out
cov.shardddl1.worker.8262.1714879408.out
cov.shardddl1.worker.8262.1714879546.out
cov.shardddl1.worker.8262.1714879553.out
cov.shardddl1.worker.8262.1714879558.out
cov.shardddl1.worker.8263.1714879299.out
cov.shardddl1.worker.8263.1714879303.out
cov.shardddl1.worker.8263.1714879350.out
cov.shardddl1.worker.8263.1714879386.out
cov.shardddl1.worker.8263.1714879408.out
cov.shardddl1.worker.8263.1714879546.out
cov.shardddl1.worker.8263.1714879553.out
cov.shardddl1.worker.8263.1714879558.out
cov.shardddl1_1.dmctl.1714879589.16060.out
cov.shardddl1_1.dmctl.1714879590.16170.out
cov.shardddl1_1.dmctl.1714879592.16255.out
cov.shardddl1_1.dmctl.1714879593.16312.out
cov.shardddl1_1.dmctl.1714879596.16645.out
cov.shardddl1_1.dmctl.1714879596.16778.out
cov.shardddl1_1.dmctl.1714879597.16806.out
cov.shardddl1_1.dmctl.1714879598.17059.out
cov.shardddl1_1.dmctl.1714879598.17200.out
cov.shardddl1_1.dmctl.1714879600.17265.out
cov.shardddl1_1.dmctl.1714879602.17374.out
cov.shardddl1_1.dmctl.1714879603.17511.out
cov.shardddl1_1.dmctl.1714879605.17546.out
cov.shardddl1_1.dmctl.1714879607.17669.out
cov.shardddl1_1.dmctl.1714879608.17807.out
cov.shardddl1_1.dmctl.1714879610.17837.out
cov.shardddl1_1.dmctl.1714879612.17941.out
cov.shardddl1_1.dmctl.1714879613.18079.out
cov.shardddl1_1.dmctl.1714879615.18117.out
cov.shardddl1_1.dmctl.1714879617.18226.out
cov.shardddl1_1.dmctl.1714879618.18367.out
cov.shardddl1_1.dmctl.1714879619.18395.out
cov.shardddl1_1.dmctl.1714879622.18513.out
cov.shardddl1_1.dmctl.1714879623.18653.out
cov.shardddl1_1.dmctl.1714879625.18690.out
cov.shardddl1_1.dmctl.1714879625.18756.out
cov.shardddl1_1.dmctl.1714879626.18896.out
cov.shardddl1_1.dmctl.1714879627.18926.out
cov.shardddl1_1.dmctl.1714879628.18993.out
cov.shardddl1_1.dmctl.1714879629.19135.out
cov.shardddl1_1.dmctl.1714879630.19165.out
cov.shardddl1_1.dmctl.1714879631.19264.out
cov.shardddl1_1.dmctl.1714879632.19405.out
cov.shardddl1_1.dmctl.1714879633.19436.out
cov.shardddl1_1.dmctl.1714879635.19516.out
cov.shardddl1_1.dmctl.1714879637.19647.out
cov.shardddl1_1.dmctl.1714879638.19681.out
cov.shardddl1_1.dmctl.1714879640.19735.out
cov.shardddl1_1.dmctl.1714879642.19870.out
cov.shardddl1_1.dmctl.1714879643.19898.out
cov.shardddl1_1.dmctl.1714879643.19945.out
cov.shardddl1_1.dmctl.1714879644.20084.out
cov.shardddl1_1.dmctl.1714879646.20122.out
cov.shardddl1_1.dmctl.1714879648.20179.out
cov.shardddl1_1.dmctl.1714879649.20313.out
cov.shardddl1_1.dmctl.1714879650.20345.out
cov.shardddl1_1.dmctl.1714879652.20397.out
cov.shardddl1_1.dmctl.1714879654.20535.out
cov.shardddl1_1.dmctl.1714879655.20569.out
cov.shardddl1_1.dmctl.1714879655.20607.out
cov.shardddl1_1.dmctl.1714879655.20744.out
cov.shardddl1_1.dmctl.1714879657.20777.out
cov.shardddl1_1.dmctl.1714879657.20817.out
cov.shardddl1_1.dmctl.1714879658.20949.out
cov.shardddl1_1.dmctl.1714879659.20980.out
cov.shardddl1_1.dmctl.1714879662.21034.out
cov.shardddl1_1.dmctl.1714879663.21174.out
cov.shardddl1_1.dmctl.1714879664.21203.out
cov.shardddl1_1.dmctl.1714879666.21251.out
cov.shardddl1_1.dmctl.1714879668.21384.out
cov.shardddl1_1.dmctl.1714879669.21415.out
cov.shardddl1_1.dmctl.1714879671.21467.out
cov.shardddl1_1.dmctl.1714879672.21607.out
cov.shardddl1_1.dmctl.1714879674.21642.out
cov.shardddl1_1.dmctl.1714879674.21678.out
cov.shardddl1_1.dmctl.1714879674.21709.out
cov.shardddl1_1.dmctl.1714879675.21851.out
cov.shardddl1_1.dmctl.1714879677.21879.out
cov.shardddl1_1.dmctl.1714879677.21977.out
cov.shardddl1_1.dmctl.1714879678.22116.out
cov.shardddl1_1.dmctl.1714879679.22144.out
cov.shardddl1_1.dmctl.1714879680.22197.out
cov.shardddl1_1.dmctl.1714879680.22230.out
cov.shardddl1_1.dmctl.1714879681.22370.out
cov.shardddl1_1.dmctl.1714879682.22396.out
cov.shardddl1_1.dmctl.1714879683.22433.out
cov.shardddl1_1.dmctl.1714879683.22469.out
cov.shardddl1_1.dmctl.1714879684.22614.out
cov.shardddl1_1.dmctl.1714879685.22643.out
cov.shardddl1_1.dmctl.1714879688.22762.out
cov.shardddl1_1.dmctl.1714879688.22899.out
cov.shardddl1_1.dmctl.1714879689.22937.out
cov.shardddl1_1.dmctl.1714879690.23016.out
cov.shardddl1_1.dmctl.1714879691.23156.out
cov.shardddl1_1.dmctl.1714879692.23185.out
cov.shardddl1_1.dmctl.1714879692.23242.out
cov.shardddl1_1.dmctl.1714879693.23379.out
cov.shardddl1_1.dmctl.1714879694.23407.out
cov.shardddl1_1.dmctl.1714879694.23465.out
cov.shardddl1_1.dmctl.1714879694.23500.out
cov.shardddl1_1.dmctl.1714879696.23643.out
cov.shardddl1_1.dmctl.1714879697.23671.out
cov.shardddl1_1.dmctl.1714879702.23848.out
cov.shardddl1_1.dmctl.1714879703.23989.out
cov.shardddl1_1.dmctl.1714879704.24020.out
cov.shardddl1_1.dmctl.1714879707.24146.out
cov.shardddl1_1.dmctl.1714879708.24290.out
cov.shardddl1_1.dmctl.1714879709.24320.out
cov.shardddl1_1.dmctl.1714879709.24383.out
cov.shardddl1_1.dmctl.1714879710.24417.out
cov.shardddl1_1.dmctl.1714879710.24563.out
cov.shardddl1_1.dmctl.1714879711.24592.out
cov.shardddl1_1.dmctl.1714879711.24654.out
cov.shardddl1_1.dmctl.1714879712.24688.out
cov.shardddl1_1.dmctl.1714879713.24831.out
cov.shardddl1_1.dmctl.1714879714.24860.out
cov.shardddl1_1.dmctl.1714879716.24966.out
cov.shardddl1_1.master.out
cov.shardddl1_1.worker.8262.1714879588.out
cov.shardddl1_1.worker.8263.1714879589.out
cov.shardddl2.dmctl.1714879725.25388.out
cov.shardddl2.dmctl.1714879726.25499.out
cov.shardddl2.dmctl.1714879728.25584.out
cov.shardddl2.dmctl.1714879729.25652.out
cov.shardddl2.dmctl.1714879734.25841.out
cov.shardddl2.dmctl.1714879741.26027.out
cov.shardddl2.dmctl.1714879745.26186.out
cov.shardddl2.dmctl.1714879747.26273.out
cov.shardddl2.dmctl.1714879747.26414.out
cov.shardddl2.dmctl.1714879748.26449.out
cov.shardddl2.dmctl.1714879758.26714.out
cov.shardddl2.dmctl.1714879758.26786.out
cov.shardddl2.dmctl.1714879758.26922.out
cov.shardddl2.dmctl.1714879760.26956.out
cov.shardddl2.dmctl.1714879765.27138.out
cov.shardddl2.dmctl.1714879772.27331.out
cov.shardddl2.dmctl.1714879776.27480.out
cov.shardddl2.dmctl.1714879780.27601.out
cov.shardddl2.dmctl.1714879780.27738.out
cov.shardddl2.dmctl.1714879782.27781.out
cov.shardddl2.dmctl.1714879800.28224.out
cov.shardddl2.dmctl.1714879800.28254.out
cov.shardddl2.dmctl.1714879801.28391.out
cov.shardddl2.dmctl.1714879802.28430.out
cov.shardddl2.dmctl.1714879804.28530.out
cov.shardddl2.dmctl.1714879804.28577.out
cov.shardddl2.dmctl.1714879804.28616.out
cov.shardddl2.dmctl.1714879811.28741.out
cov.shardddl2.dmctl.1714879811.28775.out
cov.shardddl2.dmctl.1714879811.28810.out
cov.shardddl2.dmctl.1714879811.28898.out
cov.shardddl2.dmctl.1714879812.29045.out
cov.shardddl2.dmctl.1714879813.29103.out
cov.shardddl2.dmctl.1714879822.29331.out
cov.shardddl2.dmctl.1714879822.29379.out
cov.shardddl2.dmctl.1714879822.29417.out
cov.shardddl2.dmctl.1714879829.29560.out
cov.shardddl2.dmctl.1714879829.29595.out
cov.shardddl2.dmctl.1714879829.29626.out
cov.shardddl2.dmctl.1714879829.29716.out
cov.shardddl2.dmctl.1714879829.29853.out
cov.shardddl2.dmctl.1714879831.29896.out
cov.shardddl2.dmctl.1714879838.30131.out
cov.shardddl2.dmctl.1714879838.30177.out
cov.shardddl2.dmctl.1714879838.30216.out
cov.shardddl2.dmctl.1714879845.30342.out
cov.shardddl2.dmctl.1714879845.30378.out
cov.shardddl2.dmctl.1714879845.30410.out
cov.shardddl2.dmctl.1714879845.30494.out
cov.shardddl2.dmctl.1714879846.30629.out
cov.shardddl2.dmctl.1714879847.30680.out
cov.shardddl2.dmctl.1714879849.30787.out
cov.shardddl2.dmctl.1714879856.30972.out
cov.shardddl2.dmctl.1714879856.31010.out
cov.shardddl2.dmctl.1714879862.31149.out
cov.shardddl2.dmctl.1714879862.31185.out
cov.shardddl2.dmctl.1714879863.31216.out
cov.shardddl2.dmctl.1714879863.31305.out
cov.shardddl2.dmctl.1714879863.31444.out
cov.shardddl2.dmctl.1714879864.31484.out
cov.shardddl2.dmctl.1714879867.31588.out
cov.shardddl2.dmctl.1714879874.31773.out
cov.shardddl2.dmctl.1714879874.31811.out
cov.shardddl2.dmctl.1714879880.31960.out
cov.shardddl2.dmctl.1714879880.31995.out
cov.shardddl2.dmctl.1714879881.32029.out
cov.shardddl2.dmctl.1714879881.32113.out
cov.shardddl2.dmctl.1714879881.32251.out
cov.shardddl2.dmctl.1714879882.32281.out
cov.shardddl2.dmctl.1714879885.32387.out
cov.shardddl2.dmctl.1714879885.32435.out
cov.shardddl2.dmctl.1714879891.32607.out
cov.shardddl2.master.out
cov.shardddl2.worker.8262.1714879724.out
cov.shardddl2.worker.8262.1714879731.out
cov.shardddl2.worker.8263.1714879725.out
cov.shardddl2.worker.8263.1714879762.out
downstream
goroutines
shardddl1
shardddl1_1
shardddl2
sql_res.shardddl1.txt
sql_res.shardddl1_1.txt
sql_res.shardddl2.txt
tidb.toml
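The cov.*.out files above are per-process Go coverage profiles, named for the test case, the component (dmctl/master/worker, usually with a port), and a timestamp/pid; every instrumented invocation writes one on exit, and merging them yields totals like the "coverage: 3.8% of statements" line earlier. A hedged sketch with standard tooling (gocovmerge is one common merge tool, not necessarily what this CI uses):
go install github.com/wadey/gocovmerge@latest
gocovmerge /tmp/dm_test/cov.*.out > merged.cov
go tool cover -func=merged.cov | tail -n 1   # prints the total statement coverage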
++ find /tmp/dm_test/ -type f -name '*.log'
+ tar -cvzf log-G07.tar.gz /tmp/dm_test/shardddl1_1/dmctl.1714879644.log /tmp/dm_test/shardddl1_1/master/log/dm-master.log /tmp/dm_test/shardddl1_1/master/log/stdout.log /tmp/dm_test/shardddl1_1/dmctl.1714879679.log /tmp/dm_test/shardddl1_1/dmctl.1714879602.log /tmp/dm_test/shardddl1_1/dmctl.1714879608.log /tmp/dm_test/shardddl1_1/dmctl.1714879664.log /tmp/dm_test/shardddl1_1/dmctl.1714879610.log /tmp/dm_test/shardddl1_1/dmctl.1714879626.log /tmp/dm_test/shardddl1_1/dmctl.1714879671.log /tmp/dm_test/shardddl1_1/dmctl.1714879716.log /tmp/dm_test/shardddl1_1/dmctl.1714879691.log /tmp/dm_test/shardddl1_1/dmctl.1714879593.log /tmp/dm_test/shardddl1_1/dmctl.1714879654.log /tmp/dm_test/shardddl1_1/dmctl.1714879711.log /tmp/dm_test/shardddl1_1/dmctl.1714879681.log /tmp/dm_test/shardddl1_1/dmctl.1714879694.log /tmp/dm_test/shardddl1_1/dmctl.1714879631.log /tmp/dm_test/shardddl1_1/dmctl.1714879689.log /tmp/dm_test/shardddl1_1/dmctl.1714879598.log /tmp/dm_test/shardddl1_1/dmctl.1714879652.log /tmp/dm_test/shardddl1_1/dmctl.1714879600.log /tmp/dm_test/shardddl1_1/dmctl.1714879677.log /tmp/dm_test/shardddl1_1/dmctl.1714879690.log /tmp/dm_test/shardddl1_1/dmctl.1714879648.log /tmp/dm_test/shardddl1_1/dmctl.1714879629.log /tmp/dm_test/shardddl1_1/dmctl.1714879658.log /tmp/dm_test/shardddl1_1/dmctl.1714879674.log /tmp/dm_test/shardddl1_1/dmctl.1714879650.log /tmp/dm_test/shardddl1_1/dmctl.1714879697.log /tmp/dm_test/shardddl1_1/dmctl.1714879657.log /tmp/dm_test/shardddl1_1/worker2/log/dm-worker.log /tmp/dm_test/shardddl1_1/worker2/log/stdout.log /tmp/dm_test/shardddl1_1/dmctl.1714879617.log /tmp/dm_test/shardddl1_1/dmctl.1714879702.log /tmp/dm_test/shardddl1_1/dmctl.1714879714.log /tmp/dm_test/shardddl1_1/dmctl.1714879663.log /tmp/dm_test/shardddl1_1/dmctl.1714879613.log /tmp/dm_test/shardddl1_1/dmctl.1714879672.log /tmp/dm_test/shardddl1_1/dmctl.1714879643.log /tmp/dm_test/shardddl1_1/dmctl.1714879628.log /tmp/dm_test/shardddl1_1/dmctl.1714879683.log /tmp/dm_test/shardddl1_1/dmctl.1714879632.log /tmp/dm_test/shardddl1_1/dmctl.1714879622.log /tmp/dm_test/shardddl1_1/dmctl.1714879659.log /tmp/dm_test/shardddl1_1/dmctl.1714879640.log /tmp/dm_test/shardddl1_1/dmctl.1714879675.log /tmp/dm_test/shardddl1_1/dmctl.1714879666.log /tmp/dm_test/shardddl1_1/dmctl.1714879649.log /tmp/dm_test/shardddl1_1/dmctl.1714879607.log /tmp/dm_test/shardddl1_1/dmctl.1714879597.log /tmp/dm_test/shardddl1_1/dmctl.1714879703.log /tmp/dm_test/shardddl1_1/dmctl.1714879590.log /tmp/dm_test/shardddl1_1/dmctl.1714879678.log /tmp/dm_test/shardddl1_1/dmctl.1714879618.log /tmp/dm_test/shardddl1_1/dmctl.1714879623.log /tmp/dm_test/shardddl1_1/dmctl.1714879612.log /tmp/dm_test/shardddl1_1/dmctl.1714879696.log /tmp/dm_test/shardddl1_1/dmctl.1714879682.log /tmp/dm_test/shardddl1_1/dmctl.1714879692.log /tmp/dm_test/shardddl1_1/sync_diff_stdout.log /tmp/dm_test/shardddl1_1/dmctl.1714879592.log /tmp/dm_test/shardddl1_1/dmctl.1714879615.log /tmp/dm_test/shardddl1_1/dmctl.1714879710.log /tmp/dm_test/shardddl1_1/dmctl.1714879627.log /tmp/dm_test/shardddl1_1/dmctl.1714879638.log /tmp/dm_test/shardddl1_1/dmctl.1714879684.log /tmp/dm_test/shardddl1_1/dmctl.1714879596.log /tmp/dm_test/shardddl1_1/dmctl.1714879637.log /tmp/dm_test/shardddl1_1/dmctl.1714879713.log /tmp/dm_test/shardddl1_1/worker1/log/dm-worker.log /tmp/dm_test/shardddl1_1/worker1/log/stdout.log /tmp/dm_test/shardddl1_1/dmctl.1714879635.log /tmp/dm_test/shardddl1_1/dmctl.1714879633.log /tmp/dm_test/shardddl1_1/dmctl.1714879625.log /tmp/dm_test/shardddl1_1/dmctl.1714879693.log 
/tmp/dm_test/shardddl1_1/dmctl.1714879642.log /tmp/dm_test/shardddl1_1/dmctl.1714879668.log /tmp/dm_test/shardddl1_1/dmctl.1714879619.log /tmp/dm_test/shardddl1_1/dmctl.1714879646.log /tmp/dm_test/shardddl1_1/dmctl.1714879655.log /tmp/dm_test/shardddl1_1/dmctl.1714879707.log /tmp/dm_test/shardddl1_1/dmctl.1714879669.log /tmp/dm_test/shardddl1_1/dmctl.1714879685.log /tmp/dm_test/shardddl1_1/dmctl.1714879589.log /tmp/dm_test/shardddl1_1/dmctl.1714879688.log /tmp/dm_test/shardddl1_1/dmctl.1714879708.log /tmp/dm_test/shardddl1_1/dmctl.1714879603.log /tmp/dm_test/shardddl1_1/dmctl.1714879709.log /tmp/dm_test/shardddl1_1/dmctl.1714879662.log /tmp/dm_test/shardddl1_1/dmctl.1714879605.log /tmp/dm_test/shardddl1_1/dmctl.1714879712.log /tmp/dm_test/shardddl1_1/dmctl.1714879630.log /tmp/dm_test/shardddl1_1/dmctl.1714879704.log /tmp/dm_test/shardddl1_1/dmctl.1714879680.log /tmp/dm_test/shardddl2/dmctl.1714879880.log /tmp/dm_test/shardddl2/dmctl.1714879726.log /tmp/dm_test/shardddl2/master/log/dm-master.log /tmp/dm_test/shardddl2/master/log/stdout.log /tmp/dm_test/shardddl2/dmctl.1714879845.log /tmp/dm_test/shardddl2/dmctl.1714879831.log /tmp/dm_test/shardddl2/dmctl.1714879863.log /tmp/dm_test/shardddl2/dmctl.1714879881.log /tmp/dm_test/shardddl2/dmctl.1714879867.log /tmp/dm_test/shardddl2/dmctl.1714879772.log /tmp/dm_test/shardddl2/dmctl.1714879725.log /tmp/dm_test/shardddl2/dmctl.1714879885.log /tmp/dm_test/shardddl2/worker2/log/dm-worker.log /tmp/dm_test/shardddl2/worker2/log/stdout.log /tmp/dm_test/shardddl2/dmctl.1714879782.log /tmp/dm_test/shardddl2/dmctl.1714879758.log /tmp/dm_test/shardddl2/dmctl.1714879760.log /tmp/dm_test/shardddl2/dmctl.1714879728.log /tmp/dm_test/shardddl2/dmctl.1714879745.log /tmp/dm_test/shardddl2/dmctl.1714879838.log /tmp/dm_test/shardddl2/dmctl.1714879849.log /tmp/dm_test/shardddl2/dmctl.1714879741.log /tmp/dm_test/shardddl2/dmctl.1714879822.log /tmp/dm_test/shardddl2/dmctl.1714879874.log /tmp/dm_test/shardddl2/sync_diff_stdout.log /tmp/dm_test/shardddl2/dmctl.1714879804.log /tmp/dm_test/shardddl2/dmctl.1714879801.log /tmp/dm_test/shardddl2/dmctl.1714879734.log /tmp/dm_test/shardddl2/dmctl.1714879748.log /tmp/dm_test/shardddl2/dmctl.1714879846.log /tmp/dm_test/shardddl2/worker1/log/dm-worker.log /tmp/dm_test/shardddl2/worker1/log/stdout.log /tmp/dm_test/shardddl2/dmctl.1714879776.log /tmp/dm_test/shardddl2/dmctl.1714879829.log /tmp/dm_test/shardddl2/dmctl.1714879856.log /tmp/dm_test/shardddl2/dmctl.1714879780.log /tmp/dm_test/shardddl2/dmctl.1714879812.log /tmp/dm_test/shardddl2/dmctl.1714879847.log /tmp/dm_test/shardddl2/dmctl.1714879729.log /tmp/dm_test/shardddl2/dmctl.1714879882.log /tmp/dm_test/shardddl2/dmctl.1714879811.log /tmp/dm_test/shardddl2/dmctl.1714879802.log /tmp/dm_test/shardddl2/dmctl.1714879891.log /tmp/dm_test/shardddl2/dmctl.1714879813.log /tmp/dm_test/shardddl2/dmctl.1714879747.log /tmp/dm_test/shardddl2/dmctl.1714879862.log /tmp/dm_test/shardddl2/dmctl.1714879765.log /tmp/dm_test/shardddl2/dmctl.1714879864.log /tmp/dm_test/shardddl2/dmctl.1714879800.log /tmp/dm_test/goroutines/stack/log/master-8261.log /tmp/dm_test/goroutines/stack/log/master-8361.log /tmp/dm_test/goroutines/stack/log/worker-8262.log /tmp/dm_test/goroutines/stack/log/master-8661.log /tmp/dm_test/goroutines/stack/log/worker-18263.log /tmp/dm_test/goroutines/stack/log/master-8461.log /tmp/dm_test/goroutines/stack/log/worker-18262.log /tmp/dm_test/goroutines/stack/log/master-8561.log /tmp/dm_test/goroutines/stack/log/worker-8264.log /tmp/dm_test/goroutines/stack/log/worker-8263.log 
/tmp/dm_test/goroutines/stack/log/master-8761.log /tmp/dm_test/shardddl1/dmctl.1714879437.log /tmp/dm_test/shardddl1/dmctl.1714879300.log /tmp/dm_test/shardddl1/dmctl.1714879495.log /tmp/dm_test/shardddl1/master/log/dm-master.log /tmp/dm_test/shardddl1/master/log/stdout.log /tmp/dm_test/shardddl1/dmctl.1714879559.log /tmp/dm_test/shardddl1/dmctl.1714879575.log /tmp/dm_test/shardddl1/dmctl.1714879434.log /tmp/dm_test/shardddl1/dmctl.1714879431.log /tmp/dm_test/shardddl1/dmctl.1714879564.log /tmp/dm_test/shardddl1/dmctl.1714879414.log /tmp/dm_test/shardddl1/dmctl.1714879436.log /tmp/dm_test/shardddl1/dmctl.1714879528.log /tmp/dm_test/shardddl1/dmctl.1714879556.log /tmp/dm_test/shardddl1/dmctl.1714879548.log /tmp/dm_test/shardddl1/dmctl.1714879453.log /tmp/dm_test/shardddl1/dmctl.1714879429.log /tmp/dm_test/shardddl1/dmctl.1714879416.log /tmp/dm_test/shardddl1/worker2/log/dm-worker.log /tmp/dm_test/shardddl1/worker2/log/stdout.log /tmp/dm_test/shardddl1/dmctl.1714879298.log /tmp/dm_test/shardddl1/dmctl.1714879565.log /tmp/dm_test/shardddl1/dmctl.1714879574.log /tmp/dm_test/shardddl1/dmctl.1714879448.log /tmp/dm_test/shardddl1/dmctl.1714879515.log /tmp/dm_test/shardddl1/dmctl.1714879410.log /tmp/dm_test/shardddl1/dmctl.1714879493.log /tmp/dm_test/shardddl1/dmctl.1714879551.log /tmp/dm_test/shardddl1/dmctl.1714879387.log /tmp/dm_test/shardddl1/dmctl.1714879305.log /tmp/dm_test/shardddl1/dmctl.1714879458.log /tmp/dm_test/shardddl1/dmctl.1714879433.log /tmp/dm_test/shardddl1/dmctl.1714879427.log /tmp/dm_test/shardddl1/dmctl.1714879452.log /tmp/dm_test/shardddl1/dmctl.1714879562.log /tmp/dm_test/shardddl1/dmctl.1714879538.log /tmp/dm_test/shardddl1/dmctl.1714879454.log /tmp/dm_test/shardddl1/dmctl.1714879541.log /tmp/dm_test/shardddl1/dmctl.1714879508.log /tmp/dm_test/shardddl1/dmctl.1714879566.log /tmp/dm_test/shardddl1/dmctl.1714879542.log /tmp/dm_test/shardddl1/dmctl.1714879554.log /tmp/dm_test/shardddl1/dmctl.1714879544.log /tmp/dm_test/shardddl1/dmctl.1714879578.log /tmp/dm_test/shardddl1/dmctl.1714879420.log /tmp/dm_test/shardddl1/dmctl.1714879405.log /tmp/dm_test/shardddl1/dmctl.1714879351.log /tmp/dm_test/shardddl1/sync_diff_stdout.log /tmp/dm_test/shardddl1/dmctl.1714879441.log /tmp/dm_test/shardddl1/dmctl.1714879438.log /tmp/dm_test/shardddl1/dmctl.1714879409.log /tmp/dm_test/shardddl1/dmctl.1714879419.log /tmp/dm_test/shardddl1/dmctl.1714879506.log /tmp/dm_test/shardddl1/dmctl.1714879304.log /tmp/dm_test/shardddl1/dmctl.1714879430.log /tmp/dm_test/shardddl1/worker1/log/dm-worker.log /tmp/dm_test/shardddl1/worker1/log/stdout.log /tmp/dm_test/shardddl1/dmctl.1714879504.log /tmp/dm_test/shardddl1/dmctl.1714879563.log /tmp/dm_test/shardddl1/dmctl.1714879348.log /tmp/dm_test/shardddl1/dmctl.1714879435.log /tmp/dm_test/shardddl1/dmctl.1714879550.log /tmp/dm_test/shardddl1/dmctl.1714879537.log /tmp/dm_test/shardddl1/dmctl.1714879388.log /tmp/dm_test/shardddl1/dmctl.1714879547.log /tmp/dm_test/shardddl1/dmctl.1714879411.log /tmp/dm_test/shardddl1/dmctl.1714879497.log /tmp/dm_test/shardddl1/dmctl.1714879555.log /tmp/dm_test/shardddl1/dmctl.1714879526.log /tmp/dm_test/shardddl1/dmctl.1714879573.log /tmp/dm_test/shardddl1/dmctl.1714879446.log /tmp/dm_test/shardddl1/dmctl.1714879352.log /tmp/dm_test/shardddl1/dmctl.1714879383.log /tmp/dm_test/shardddl1/dmctl.1714879517.log /tmp/dm_test/shardddl1/dmctl.1714879412.log /tmp/dm_test/shardddl1/dmctl.1714879579.log /tmp/dm_test/shardddl1/dmctl.1714879543.log /tmp/dm_test/shardddl1/dmctl.1714879580.log /tmp/dm_test/shardddl1/dmctl.1714879561.log 
/tmp/dm_test/shardddl1/dmctl.1714879418.log /tmp/dm_test/downstream/tidb/log/tidb.log
tar: Removing leading `/' from member names
+ ls -alh log-G07.tar.gz
-rw-r--r--. 1 jenkins jenkins 657K May  5 11:31 log-G07.tar.gz
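To inspect a failed group locally, download log-G07.tar.gz from the build's archived artifacts and search it; since tar stripped the leading '/', everything unpacks under ./tmp/dm_test:
tar -xzf log-G07.tar.gz
grep -rn --include='*.log' -E 'ERROR|panic' tmp/dm_test | head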
[Pipeline] archiveArtifacts
Archiving artifacts
wait process dm-master.test exit...
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
wait process dm-master.test exit...
[Pipeline] // podTemplate
[Pipeline] }
[Sun May  5 11:31:54 CST 2024] <<<<<< start DM-147 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G07'
Sending interrupt signal to process
Killing processes
wait process dm-master.test exit...
process dm-master.test already exit
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
make: *** [dm_integration_test_in_group] Terminated
script returned exit code 143
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
make: *** [dm_integration_test_in_group] Terminated
script returned exit code 143
kill finished with exit code 0
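Exit code 143 follows the usual shell convention of 128 + signal number: SIGTERM is 15, so these make targets were killed by the interrupt the pipeline sent, not by a failure of their own. A quick way to confirm the convention:
bash -c 'kill -TERM $$'
echo $?   # prints 143 (128 + 15, i.e. terminated by SIGTERM)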
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] // cache
[Pipeline] }
[Pipeline] }
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] }
[Pipeline] }
restore config
make: *** [dm_integration_test_in_group] Terminated
/home/jenkins/agent/workspace/pingcap/tiflow/pull_dm_integration_test/tiflow/dm/tests/validator_basic/run.sh: line 53: 23237 Terminated              wait_process_exit dm-worker.test
restore time_zone
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
script returned exit code 143
[Pipeline] // withCredentials
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] }
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // cache
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // dir
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // container
[Pipeline] // container
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // timeout
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // node
[Pipeline] // node
[Pipeline] // stage
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // container
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // node
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G05'
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G08'
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G11'
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
Finished: FAILURE