Console Output

Skipping 998 KB..
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
[Fri May 17 17:37:19 CST 2024] <<<<<< finish DM-068 optimistic >>>>>>
check diff successfully
dmctl test cmd: "stop-task test"
[Fri May 17 17:37:20 CST 2024] <<<<<< finish DM-142 pessimistic >>>>>>
rpc addr 127.0.0.1:8262 is alive
[Fri May 17 17:37:20 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gbk/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
[Fri May 17 17:37:21 CST 2024] <<<<<< start DM-143 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
wait process dm-master.test exit...
rpc addr 127.0.0.1:8263 is alive
start test adding UNIQUE on column with duplicate data
check cancelled error
dmctl test cmd: "query-status gbk"
got=0 expected=1
command: query-status gbk origin SQL: \[ALTER TABLE gbk.invalid_conn_test1 ADD UNIQUE(i)\]: DDL ALTER TABLE `gbk`.`invalid_conn_test1` ADD UNIQUE(`i`) executed in background and met error count: 0 != expected: 1, failed the 0-th time, will retry again
dmctl test cmd: "query-status test"
got=2 expected=2
wait process dm-master.test exit...
process dm-master.test already exit
check diff successfully
dmctl test cmd: "stop-task test"
[Fri May 17 17:37:23 CST 2024] <<<<<< finish DM-143 pessimistic >>>>>>
wait process dm-worker.test exit...
got=1 expected=1
dmctl test cmd: "resume-task gbk"
got=3 expected=3
check test adding UNIQUE on column with duplicate data successfully
[Fri May 17 17:37:24 CST 2024] <<<<<< start DM-145 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
wait process dm-worker.test exit...
wait process dm-worker.test exit...
dmctl test cmd: "query-status test"
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:37:25 CST 2024] <<<<<< test case shardddl2_1 success! >>>>>>
Cache not saved (ws/jenkins-pingcap-tiflow-release-7.5-pull_dm_integration_test-373/tiflow-dm already exists)
got=2 expected=2
check diff failed 1-th time, retry later
wait process dm-worker.test exit...
process dm-worker.test already exit
[Fri May 17 17:37:26 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gbk/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
[Fri May 17 17:37:27 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gbk/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
start test invalid connection with status running (multi-schema change)
check count 1
run tidb sql failed 1-th time, retry later
check diff successfully
dmctl test cmd: "stop-task test"
[Fri May 17 17:37:28 CST 2024] <<<<<< finish DM-145 pessimistic >>>>>>
[Fri May 17 17:37:28 CST 2024] <<<<<< start DM-145 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
dmctl test cmd: "stop-task test"
check count 2
check diff successfully
check test invalid connection with status running (multi-schema change) successfully
[Fri May 17 17:37:30 CST 2024] <<<<<< finish DM-145 optimistic >>>>>>
[Fri May 17 17:37:31 CST 2024] <<<<<< start DM-146 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
[Fri May 17 17:37:32 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gbk/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
dmctl test cmd: "stop-task test"
rpc addr 127.0.0.1:8262 is alive
[Fri May 17 17:37:33 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gbk/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
[Fri May 17 17:37:34 CST 2024] <<<<<< finish DM-146 pessimistic >>>>>>
[Fri May 17 17:37:34 CST 2024] <<<<<< start DM-146 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
dmctl test cmd: "stop-task test"
[Fri May 17 17:37:36 CST 2024] <<<<<< finish DM-146 optimistic >>>>>>
rpc addr 127.0.0.1:8263 is alive
start test invalid connection with status queueing (multi-schema change)
[Fri May 17 17:37:37 CST 2024] <<<<<< start DM-147 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
check count 1
run tidb sql failed 1-th time, retry later
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "binlog-schema update test shardddl1 tb1 /tmp/dm_test/shardddl4_1/schema.sql -s mysql-replica-01"
dmctl test cmd: "binlog replace test "alter table shardddl1.tb1 drop column b""
got=2 expected=2
got=1 expected=1
check diff successfully
dmctl test cmd: "stop-task test"
check count 2
check diff successfully
check test invalid connection with status queueing (multi-schema change) successfully
[Fri May 17 17:37:39 CST 2024] <<<<<< finish DM-147 optimistic >>>>>>
wait process dm-worker.test exit...
[Fri May 17 17:37:40 CST 2024] <<<<<< start DM-148 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
wait process dm-worker.test exit...
process dm-worker.test already exit
[Fri May 17 17:37:41 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gbk/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
dmctl test cmd: "query-status test"
got=2 expected=2
check diff failed 1-th time, retry later
rpc addr 127.0.0.1:8262 is alive
[Fri May 17 17:37:42 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gbk/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
start test invalid connection with status none (multi-schema change)
check count 1
check count 2
check diff successfully
check test invalid connection with status none (multi-schema change) successfully
check diff successfully
dmctl test cmd: "stop-task test"
wait process dm-worker.test exit...
[Fri May 17 17:37:44 CST 2024] <<<<<< finish DM-148 pessimistic >>>>>>
[Fri May 17 17:37:44 CST 2024] <<<<<< start DM-148 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-worker.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
check diff successfully
dmctl test cmd: "stop-task test"
wait process dm-worker.test exit...
process dm-worker.test already exit
[Fri May 17 17:37:46 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gbk/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
[Fri May 17 17:37:46 CST 2024] <<<<<< finish DM-148 optimistic >>>>>>
rpc addr 127.0.0.1:8262 is alive
[Fri May 17 17:37:47 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gbk/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
[Fri May 17 17:37:47 CST 2024] <<<<<< start DM-149 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
rpc addr 127.0.0.1:8263 is alive
start test inserting data after invalid connection (multi-schema change)
check count 1
run tidb sql failed 1-th time, retry later
dmctl test cmd: "query-status test"
got=2 expected=2
check diff failed 1-th time, retry later
check count 2
check diff successfully
check test inserting data after invalid connection (multi-schema change) successfully
check diff successfully
dmctl test cmd: "stop-task test"
[Fri May 17 17:37:52 CST 2024] <<<<<< finish DM-149 pessimistic >>>>>>
[Fri May 17 17:37:52 CST 2024] <<<<<< start DM-149 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-worker.test exit...
dmctl test cmd: "query-status test"
wait process dm-worker.test exit...
process dm-worker.test already exit
[Fri May 17 17:37:53 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gbk/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
got=2 expected=2
check diff successfully
dmctl test cmd: "stop-task test"
[Fri May 17 17:37:54 CST 2024] <<<<<< finish DM-149 optimistic >>>>>>
rpc addr 127.0.0.1:8262 is alive
[Fri May 17 17:37:54 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gbk/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
start test adding UNIQUE on column with duplicate data (multi-schema change)
[Fri May 17 17:37:55 CST 2024] <<<<<< start DM-150 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
check cancelled error
dmctl test cmd: "query-status gbk"
got=0 expected=1
command: query-status gbk origin SQL: \[ALTER TABLE gbk.invalid_conn_test1 ADD UNIQUE(k), ADD UNIQUE(m)\]: DDL ALTER TABLE `gbk`.`invalid_conn_test1` ADD UNIQUE(`k`) executed in background and met error count: 0 != expected: 1, failed the 0-th time, will retry again
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "show-ddl-locks"
got=1 expected=1
check diff failed 1-th time, retry later
got=1 expected=1
check test adding UNIQUE on column with duplicate data (multi-schema change) successfully
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
wait process dm-master.test exit...
check diff successfully
dmctl test cmd: "stop-task test"
[Fri May 17 17:37:59 CST 2024] <<<<<< finish DM-150 pessimistic >>>>>>
[Fri May 17 17:37:59 CST 2024] <<<<<< start DM-150 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-worker.test exit...
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
got=3 expected=3
dmctl test cmd: "stop-task test"
[Fri May 17 17:38:01 CST 2024] <<<<<< finish DM-150 optimistic >>>>>>
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:38:02 CST 2024] <<<<<< test case gbk success! >>>>>>
start running case: [gtid] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:38:02 CST 2024] <<<<<< start DM-151 pessimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-pessimistic.yaml --remove-meta"
[Fri May 17 17:38:02 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "show-ddl-locks"
got=1 expected=1
rpc addr 127.0.0.1:8261 is alive
[Fri May 17 17:38:06 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/gtid/source1.yaml"
[Fri May 17 17:38:07 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
check diff successfully
dmctl test cmd: "stop-task test"
[Fri May 17 17:38:08 CST 2024] <<<<<< finish DM-151 pessimistic >>>>>>
[Fri May 17 17:38:08 CST 2024] <<<<<< start DM-151 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/gtid/source2.yaml"
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-task.yaml --remove-meta"
got=3 expected=3
check diff successfully
dmctl test cmd: "pause-task test"
dmctl test cmd: "resume-task test"
check diff successfully
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
wait process dm-master.test exit...
check diff failed 1-th time, retry later
wait process dm-master.test exit...
process dm-master.test already exit
check diff successfully
dmctl test cmd: "stop-task test"
wait process dm-worker.test exit...
[Fri May 17 17:38:15 CST 2024] <<<<<< finish DM-151 optimistic >>>>>>
wait process dm-worker.test exit...
[Fri May 17 17:38:16 CST 2024] <<<<<< start DM-152 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:38:16 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
dmctl test cmd: "query-status test"
got=2 expected=2
check diff failed 1-th time, retry later
rpc addr 127.0.0.1:8261 is alive
[Fri May 17 17:38:17 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/gtid/source1.yaml"
[Fri May 17 17:38:19 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
check diff successfully
check diff failed 1-th time, retry later
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/gtid/source2.yaml"
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-task.yaml --remove-meta"
check diff successfully
check diff failed 1-th time, retry later
check diff successfully
new_gtid1 3ac23ed8-142f-11ef-b6cf-269687f4cb15:6 new_gtid2 3b9fe965-142f-11ef-98de-269687f4cb15:6
check diff successfully
dmctl test cmd: "stop-task test"
[Fri May 17 17:38:23 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-worker1.toml >>>>>>
[Fri May 17 17:38:23 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-task.yaml"
check diff successfully
dmctl test cmd: "stop-task test"
check diff successfully
check diff failed 1-th time, retry later
[Fri May 17 17:38:27 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-worker1.toml >>>>>>
[Fri May 17 17:38:27 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
check diff successfully
check diff failed 1-th time, retry later
rpc addr 127.0.0.1:8262 is alive
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/gtid/conf/dm-task.yaml"
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
check diff successfully
dmctl test cmd: "stop-task test"
[Fri May 17 17:38:31 CST 2024] <<<<<< finish DM-152 optimistic >>>>>>
wait process dm-master.test exit...
[Fri May 17 17:38:32 CST 2024] <<<<<< start DM-153 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-master.test exit...
process dm-master.test already exit
dmctl test cmd: "query-status test"
wait process dm-worker.test exit...
got=2 expected=2
dmctl test cmd: "query-status test"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "binlog-schema update test shardddl1 tb1 -s mysql-replica-01 --from-target"
dmctl test cmd: "binlog replace test "alter table shardddl1.tb1 drop column b""
got=2 expected=2
got=1 expected=1
check diff successfully
dmctl test cmd: "stop-task test"
wait process dm-worker.test exit...
[Fri May 17 17:38:34 CST 2024] <<<<<< finish DM-153 optimistic >>>>>>
[Fri May 17 17:38:35 CST 2024] <<<<<< start DM-154 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:38:36 CST 2024] <<<<<< test case gtid success! >>>>>>
start running case: [ha_cases] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:38:36 CST 2024] <<<<<< start test_exclusive_relay >>>>>>
start DM worker and master cluster
[Fri May 17 17:38:36 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master-standalone.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
dmctl test cmd: "query-status test"
got=2 expected=2
dmctl test cmd: "query-status test"
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
got=1 expected=1
got=1 expected=1
dmctl test cmd: "binlog-schema update test shardddl1 tb1 -s mysql-replica-01 --from-source"
dmctl test cmd: "binlog skip test"
got=2 expected=2
got=1 expected=1
check diff successfully
dmctl test cmd: "stop-task test"
[Fri May 17 17:38:38 CST 2024] <<<<<< finish DM-154 optimistic >>>>>>
rpc addr 127.0.0.1:8261 is alive
[Fri May 17 17:38:38 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
[Fri May 17 17:38:39 CST 2024] <<<<<< start DM-155 optimistic >>>>>>
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/double-source-optimistic.yaml --remove-meta"
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/ha_cases/source1.yaml"
[Fri May 17 17:38:39 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
dmctl test cmd: "query-status test"
got=2 expected=2
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "start-relay -s mysql-replica-01 worker1 worker2"
restart master
restart dm-master
got=3 expected=3
dmctl test cmd: "operate-source create /tmp/dm_test/ha_cases/source2.yaml"
dmctl test cmd: "list-member --worker"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "operate-source show -s mysql-replica-02"
got=1 expected=1
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
wait process dm-master exit...
wait process dm-master.test exit...
wait process dm-master exit...
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-master exit...
process dm-master already exit
wait process dm-worker.test exit...
wait process dm-worker.test exit...
[Fri May 17 17:38:47 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
clean source table
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
[Fri May 17 17:38:49 CST 2024] <<<<<< finish test_exclusive_relay >>>>>>
[Fri May 17 17:38:49 CST 2024] <<<<<< start test_exclusive_relay_2 >>>>>>
start DM worker and master cluster
[Fri May 17 17:38:49 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master-standalone.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
rpc addr 127.0.0.1:8261 is alive
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
restart worker2
restart dm-worker2
rpc addr 127.0.0.1:8261 is alive
[Fri May 17 17:38:51 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait process worker2 exit...
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/ha_cases/source1.yaml"
wait process worker2 exit...
wait process worker2 exit...
[Fri May 17 17:38:54 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
wait process worker2 exit...
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/ha_cases/source2.yaml"
wait process worker2 exit...
dmctl test cmd: "start-relay -s mysql-replica-01 worker1"
wait process worker2 exit...
got=2 expected=2
dmctl test cmd: "start-relay -s mysql-replica-02 worker2"
wait process worker2 exit...
got=2 expected=2
[Fri May 17 17:38:59 CST 2024] <<<<<< START DM-WORKER on port 8264, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker3.toml >>>>>>
wait for rpc addr 127.0.0.1:8264 alive the 1-th time
wait process worker2 exit...
rpc addr 127.0.0.1:8264 is alive
kill dm-worker1
wait process worker2 exit...
wait process dm-worker1 exit...
wait process worker2 exit...
process worker2 already exit
[Fri May 17 17:39:01 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
wait process dm-worker1 exit...
process dm-worker1 already exit
dmctl test cmd: "list-member --name worker3"
rpc addr 127.0.0.1:8263 is alive
got=1 expected=1
[Fri May 17 17:39:02 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
restart master
restart dm-master
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "list-member --name worker3"
got=1 expected=1
dmctl test cmd: "list-member --name worker1"
got=1 expected=1
kill dm-worker2
wait process dm-master exit...
wait process dm-worker2 exit...
wait process dm-master exit...
process dm-master already exit
wait process dm-worker2 exit...
process dm-worker2 already exit
dmctl test cmd: "operate-source show -s mysql-replica-02"
got=1 expected=1
[Fri May 17 17:39:06 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
[Fri May 17 17:39:07 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "list-member --name worker2"
got=1 expected=1
1 dm-master alive
3 dm-worker alive
0 dm-syncer alive
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
wait process dm-master.test exit...
rpc addr 127.0.0.1:8261 is alive
wait process dm-master.test exit...
process dm-master.test already exit
restart worker1
restart dm-worker1
wait process dm-worker.test exit...
wait process worker1 exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process worker1 exit...
wait process dm-worker.test exit...
wait process worker1 exit...
wait process dm-worker.test exit...
wait process worker1 exit...
wait process dm-worker.test exit...
wait process worker1 exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
clean source table
wait process worker1 exit...
wait process worker1 exit...
[Fri May 17 17:39:20 CST 2024] <<<<<< finish test_exclusive_relay_2 >>>>>>
[Fri May 17 17:39:20 CST 2024] <<<<<< start test_last_bound >>>>>>
[Fri May 17 17:39:20 CST 2024] <<<<<< start test_running >>>>>>
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
clean source table
wait process worker1 exit...
wait process worker1 exit...
process worker1 already exit
[Fri May 17 17:39:21 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
import prepare data
start DM worker and master cluster
[Fri May 17 17:39:22 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master1.toml >>>>>>
[Fri May 17 17:39:22 CST 2024] <<<<<< START DM-MASTER on port 8361, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master2.toml >>>>>>
[Fri May 17 17:39:22 CST 2024] <<<<<< START DM-MASTER on port 8461, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master3.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
restart master
restart dm-master
wait process dm-master exit...
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
rpc addr 127.0.0.1:8261 is alive
rpc addr 127.0.0.1:8361 is alive
wait process dm-master exit...
process dm-master already exit
rpc addr 127.0.0.1:8461 is alive
start worker and operate mysql config to worker
[Fri May 17 17:39:26 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/ha_cases/source1.yaml"
[Fri May 17 17:39:28 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
[Fri May 17 17:39:28 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/ha_cases/source2.yaml"
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
start DM task
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-task.yaml "
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "query-status test"
restart master
restart dm-master
got=2 expected=2
got=2 expected=2
use sync_diff_inspector to check full dump loader
check diff successfully
flush logs to force rotate binlog file
apply increment data before restart dm-worker to ensure entering increment phase
wait process dm-master exit...
wait process dm-master exit...
process dm-master already exit
use sync_diff_inspector to check increment data
check diff successfully
[Fri May 17 17:39:36 CST 2024] <<<<<< finish test_running >>>>>>
worker1bound  "mysql-replica-01"
worker2bound  "mysql-replica-02"
dmctl test cmd: "start-relay -s mysql-replica-01 worker1"
[Fri May 17 17:39:36 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
got=2 expected=2
dmctl test cmd: "start-relay -s mysql-replica-02 worker2"
got=2 expected=2
dmctl test cmd: "query-status test"
got=4 expected=4
kill dm-worker1
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
rpc addr 127.0.0.1:8261 is alive
wait process dm-worker1 exit...
restart worker2
restart dm-worker2
wait process dm-worker1 exit...
process dm-worker1 already exit
kill dm-worker2
wait process dm-worker2 exit...
wait process worker2 exit...
wait process dm-worker2 exit...
process dm-worker2 already exit
dmctl test cmd: "list-member --name worker1 --name worker2"
wait process worker2 exit...
got=2 expected=2
start worker1
[Fri May 17 17:39:43 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait process worker2 exit...
rpc addr 127.0.0.1:8262 is alive
start worker2
[Fri May 17 17:39:44 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "list-member --name worker1 --name worker2"
got=1 expected=1
got=1 expected=1
kill dm-worker1
wait process worker2 exit...
wait process worker2 exit...
wait process dm-worker1 exit...
wait process dm-worker1 exit...
process dm-worker1 already exit
kill dm-worker2
wait process worker2 exit...
wait process dm-worker2 exit...
wait process worker2 exit...
wait process dm-worker2 exit...
process dm-worker2 already exit
dmctl test cmd: "list-member --name worker1 --name worker2"
wait process worker2 exit...
got=2 expected=2
start worker2
[Fri May 17 17:39:50 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
wait process worker2 exit...
process worker2 already exit
[Fri May 17 17:39:51 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
start worker1
[Fri May 17 17:39:51 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "list-member --name worker2 --name worker1"
got=1 expected=1
got=1 expected=1
kill dm-worker1
restart worker1
restart dm-worker1
wait process dm-worker1 exit...
wait process worker1 exit...
wait process worker1 exit...
process worker1 already exit
[Fri May 17 17:39:55 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait process dm-worker1 exit...
process dm-worker1 already exit
kill dm-worker2
wait process dm-worker2 exit...
rpc addr 127.0.0.1:8262 is alive
check log contain failed 1-th time, retry later
wait process dm-worker2 exit...
process dm-worker2 already exit
dmctl test cmd: "list-member --name worker1 --name worker2"
got=2 expected=2
start worker3
[Fri May 17 17:39:57 CST 2024] <<<<<< START DM-WORKER on port 8264, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker3.toml >>>>>>
wait for rpc addr 127.0.0.1:8264 alive the 1-th time
rpc addr 127.0.0.1:8264 is alive
start worker4
[Fri May 17 17:39:58 CST 2024] <<<<<< START DM-WORKER on port 18262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker4.toml >>>>>>
wait for rpc addr 127.0.0.1:18262 alive the 1-th time
check log contain failed 1-th time, retry later
rpc addr 127.0.0.1:18262 is alive
dmctl test cmd: "list-member --name worker3 --name worker4"
got=1 expected=1
got=1 expected=1
check log contain failed 1-th time, retry later
dmctl test cmd: "start-relay -s mysql-replica-01 worker3"
got=2 expected=2
dmctl test cmd: "start-relay -s mysql-replica-02 worker4"
got=2 expected=2
dmctl test cmd: "query-status test"
restart worker2
restart dm-worker2
got=4 expected=4
check diff successfully
kill dm-worker3
wait process worker2 exit...
wait process dm-worker3 exit...
wait process worker2 exit...
process worker2 already exit
[Fri May 17 17:40:04 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
wait process dm-worker3 exit...
process dm-worker3 already exit
kill dm-worker4
rpc addr 127.0.0.1:8263 is alive
check log contain failed 1-th time, retry later
wait process dm-worker4 exit...
wait process dm-worker4 exit...
process dm-worker4 already exit
dmctl test cmd: "list-member --name worker3 --name worker4"
got=2 expected=2
start worker1
[Fri May 17 17:40:07 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
check log contain failed 1-th time, retry later
rpc addr 127.0.0.1:8262 is alive
start worker2
[Fri May 17 17:40:08 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "list-member --name worker1 --name worker2"
got=1 expected=1
got=1 expected=1
check log contain failed 1-th time, retry later
num1 1 num2 2
[Fri May 17 17:40:10 CST 2024] <<<<<< finish test_last_bound >>>>>>
[Fri May 17 17:40:10 CST 2024] <<<<<< start test_config_name >>>>>>
[Fri May 17 17:40:10 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master-join1.toml >>>>>>
rpc addr 127.0.0.1:8261 is alive
[Fri May 17 17:40:10 CST 2024] <<<<<< START DM-MASTER on port 8361, config: /tmp/dm_test/ha_cases/dm-master-join2.toml >>>>>>
restart master
restart dm-master
check log contain failed 1-th time (file not exist), retry later
[Fri May 17 17:40:12 CST 2024] <<<<<< START DM-MASTER on port 8361, config: /tmp/dm_test/ha_cases/dm-master-join2.toml >>>>>>
rpc addr 127.0.0.1:8361 is alive
[Fri May 17 17:40:12 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker1.toml >>>>>>
rpc addr 127.0.0.1:8262 is alive
[Fri May 17 17:40:12 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /tmp/dm_test/ha_cases/dm-worker2.toml >>>>>>
wait process dm-master exit...
wait process dm-master exit...
process dm-master already exit
[Fri May 17 17:40:14 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /tmp/dm_test/ha_cases/dm-worker2.toml >>>>>>
rpc addr 127.0.0.1:8263 is alive
[Fri May 17 17:40:14 CST 2024] <<<<<< finish test_config_name >>>>>>
[Fri May 17 17:40:14 CST 2024] <<<<<< start test_join_masters_and_worker >>>>>>
3 dm-master alive
3 dm-worker alive
0 dm-syncer alive
wait process dm-master.test exit...
[Fri May 17 17:40:15 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait process dm-master.test exit...
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
wait process dm-master.test exit...
rpc addr 127.0.0.1:8261 is alive
check log contain failed 1-th time, retry later
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
check log contain failed 1-th time, retry later
wait process dm-master.test exit...
check log contain failed 1-th time, retry later
wait process dm-master.test exit...
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
clean source table
restart worker1
restart dm-worker1
wait process worker1 exit...
[Fri May 17 17:40:26 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master-join1.toml >>>>>>
wait process worker1 exit...
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait process worker1 exit...
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
wait process worker1 exit...
rpc addr 127.0.0.1:8261 is alive
query-status from unique master
dmctl test cmd: "query-status"
got=1 expected=1
[Fri May 17 17:40:29 CST 2024] <<<<<< START DM-MASTER on port 8361, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master-join2.toml >>>>>>
wait for rpc addr 127.0.0.1:8361 alive the 1-th time
wait process worker1 exit...
process worker1 already exit
[Fri May 17 17:40:29 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
restart worker1
restart dm-worker1
wait process worker1 exit...
wait process worker1 exit...
process worker1 already exit
[Fri May 17 17:40:33 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/shardddl4_1/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
wait for rpc addr 127.0.0.1:8361 alive the 2-th time
rpc addr 127.0.0.1:8361 is alive
[Fri May 17 17:40:45 CST 2024] <<<<<< START DM-MASTER on port 8461, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master-join3.toml >>>>>>
wait for rpc addr 127.0.0.1:8461 alive the 1-th time
rpc addr 127.0.0.1:8461 is alive
[Fri May 17 17:40:52 CST 2024] <<<<<< START DM-MASTER on port 8561, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master-join4.toml >>>>>>
wait for rpc addr 127.0.0.1:8561 alive the 1-th time
check diff successfully
dmctl test cmd: "stop-task test"
[Fri May 17 17:40:52 CST 2024] <<<<<< finish DM-155 optimistic >>>>>>
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
rpc addr 127.0.0.1:8561 is alive
wait process dm-master.test exit...
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:40:58 CST 2024] <<<<<< test case shardddl4_1 success! >>>>>>
start running case: [sharding] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/sharding/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/sharding/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:40:58 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/sharding/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
[Fri May 17 17:40:58 CST 2024] <<<<<< START DM-MASTER on port 8661, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master-join5.toml >>>>>>
wait for rpc addr 127.0.0.1:8661 alive the 1-th time
rpc addr 127.0.0.1:8661 is alive
dmctl test cmd: "query-status"
rpc addr 127.0.0.1:8261 is alive
[Fri May 17 17:40:59 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/sharding/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
got=1 expected=1
dmctl test cmd: "query-status"
got=1 expected=1
dmctl test cmd: "query-status"
got=1 expected=1
dmctl test cmd: "query-status"
got=1 expected=1
join worker with dm-master1 endpoint
[Fri May 17 17:41:00 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker-join2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/sharding/source1.yaml"
[Fri May 17 17:41:01 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/sharding/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "list-member --worker --name=worker2"
got=1 expected=1
kill dm-master-join1
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/sharding/source2.yaml"
wait process dm-master-join1 exit...
wait process dm-master-join1 exit...
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/sharding/conf/dm-task.yaml --remove-meta"
wait process dm-master-join1 exit...
process dm-master-join1 already exit
dmctl test cmd: "list-member --worker --name=worker2"
got=1 expected=1
join worker with 5 masters endpoint
[Fri May 17 17:41:04 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker-join1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
dmctl test cmd: "query-status test"
got=2 expected=2
check sync diff for full dump and load
check diff successfully
dmctl test cmd: "query-status test"
got=1 expected=1
dmctl test cmd: "resume-task test"
dmctl test cmd: "query-status test"
wait for rpc addr 127.0.0.1:8262 alive the 2-th time
got=1 expected=1
dmctl test cmd: "resume-task test"
dmctl test cmd: "query-status test"
got=1 expected=1
dmctl test cmd: "resume-task test"
check sync diff for the first increment replication
check diff failed 1-th time, retry later
wait for rpc addr 127.0.0.1:8262 alive the 3-th time
wait for rpc addr 127.0.0.1:8262 alive the 4-th time
check diff successfully
check sync diff for the second increment replication
check diff successfully
check sync diff for the third increment replication
rpc addr 127.0.0.1:8262 is alive
query-status from master2
dmctl test cmd: "query-status"
check diff successfully
checksum before drop/truncate: checksum: 2273109362, checksum after drop/truncate: checksum: 2273109362
dmctl test cmd: "query-status test"
got=1 expected=1
dmctl test cmd: "resume-task test"
got=1 expected=1
[Fri May 17 17:41:09 CST 2024] <<<<<< finish test_join_masters_and_worker >>>>>>
[Fri May 17 17:41:09 CST 2024] <<<<<< start test_standalone_running >>>>>>
4 dm-master alive
2 dm-worker alive
0 dm-syncer alive
dmctl test cmd: "query-status test"
got=1 expected=1
dmctl test cmd: "stop-task test"
dmctl test cmd: "stop-task test"
got=1 expected=1
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-master.test exit...
wait process dm-worker.test exit...
wait process dm-master.test exit...
wait process dm-worker.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:41:16 CST 2024] <<<<<< test case sharding success! >>>>>>
start running case: [sequence_sharding] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:41:16 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait process dm-master.test exit...
rpc addr 127.0.0.1:8261 is alive
[Fri May 17 17:41:17 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait process dm-master.test exit...
rpc addr 127.0.0.1:8262 is alive
[Fri May 17 17:41:19 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
wait process dm-master.test exit...
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/sequence_sharding/source1.yaml"
wait process dm-master.test exit...
dmctl test cmd: "operate-source create /tmp/dm_test/sequence_sharding/source2.yaml"
wait process dm-master.test exit...
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/sequence_sharding/conf/dm-task.yaml "
wait process dm-master.test exit...
wait process dm-master.test exit...
check diff successfully
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
check diff successfully
dmctl test cmd: "query-status sequence_sharding"
got=0 expected=2
command: query-status sequence_sharding detect inconsistent DDL sequence count: 0 != expected: 2, failed the 0-th time, will retry again
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
got=2 expected=2
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
dmctl test cmd: "resume-task sequence_sharding"
dmctl test cmd: "query-status sequence_sharding"
got=2 expected=2
dmctl test cmd: "stop-task sequence_sharding"
dmctl test cmd: "start-task /tmp/dm_test/sequence_sharding/task.yaml"
wait process dm-master.test exit...
dmctl test cmd: "query-status sequence_sharding"
got=2 expected=2
1 dm-master alive
2 dm-worker alive
0 dm-syncer alive
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-master.test exit...
wait process dm-master.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:41:47 CST 2024] <<<<<< test case sequence_sharding success! >>>>>>
Cache not saved (ws/jenkins-pingcap-tiflow-release-7.5-pull_dm_integration_test-373/tiflow-dm already exists)
wait process dm-master.test exit... (repeated 24 times)
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
clean source table
import prepare data
start DM worker and master standalone cluster
[Fri May 17 17:42:13 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master1.toml >>>>>>
[Fri May 17 17:42:13 CST 2024] <<<<<< START DM-MASTER on port 8361, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master2.toml >>>>>>
[Fri May 17 17:42:13 CST 2024] <<<<<< START DM-MASTER on port 8461, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-master3.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
rpc addr 127.0.0.1:8261 is alive
rpc addr 127.0.0.1:8361 is alive
rpc addr 127.0.0.1:8461 is alive
[Fri May 17 17:42:17 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
operate mysql config to worker
dmctl test cmd: "operate-source create /tmp/dm_test/ha_cases/source1.yaml"
start DM task
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/standalone-task.yaml "
use sync_diff_inspector to check full dump loader
check diff successfully
flush logs to force rotate binlog file
apply increment data before restart dm-worker to ensure entering increment phase
use sync_diff_inspector to check increment data
check diff failed 1-th time, retry later
check diff successfully
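Note on the "flush logs to force rotate binlog file" step above: rotating the upstream binlog is ordinarily done with MySQL's FLUSH LOGS statement. A minimal sketch, assuming a local upstream MySQL on 127.0.0.1:3306 with root access (host, port, and credentials are assumptions, not taken from this log):

    # ask the upstream MySQL to close the current binlog and open a new one
    mysql -h 127.0.0.1 -P 3306 -u root -e "FLUSH LOGS;"
    # confirm the active binlog file advanced
    mysql -h 127.0.0.1 -P 3306 -u root -e "SHOW MASTER STATUS\G"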
dmctl test cmd: "operate-source create /tmp/dm_test/ha_cases/source2.yaml"
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/standalone-task2.yaml"
[Fri May 17 17:42:23 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/standalone-task2.yaml"
dmctl test cmd: "query-status"
got=2 expected=2
kill worker2
wait process dm-worker2 exit...
wait process dm-worker2 exit...
process dm-worker2 already exit
dmctl test cmd: "query-status"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "stop-task test2"
got=1 expected=1
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/ha_cases/conf/standalone-task2.yaml"
got=1 expected=1
dmctl test cmd: "query-status test"
got=1 expected=1
[Fri May 17 17:42:30 CST 2024] <<<<<< finish test_standalone_running >>>>>>
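The sequence above (kill worker2, observe the task losing its worker, stop-task, then start-task again) is the manual recovery flow for a standalone task whose only worker died. A hedged recap using the dmctl commands that appear in this log; the master address is assumed to be the 8261 master started earlier:

    # confirm the source is no longer bound to a live worker
    dmctl --master-addr 127.0.0.1:8261 query-status
    # stop the orphaned task, then start it again so it can be
    # scheduled onto a surviving worker
    dmctl --master-addr 127.0.0.1:8261 stop-task test2
    dmctl --master-addr 127.0.0.1:8261 start-task conf/standalone-task2.yaml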
3 dm-master alive
1 dm-worker alive
0 dm-syncer alive
wait process dm-master.test exit... (repeated 62 times)
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:43:32 CST 2024] <<<<<< test case ha_cases success! >>>>>>
start running case: [http_proxies] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/http_proxies/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/http_proxies/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
test dm grpc proxy env setting checking for http_proxy=http://127.0.0.1:8080
[Fri May 17 17:43:32 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/http_proxies/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
rpc addr 127.0.0.1:8261 is alive
tests/_utils/check_log_contains: line 15: [: proxy: integer expression expected
tests/_utils/check_log_contains: line 21: [: proxy: integer expression expected
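The "[: proxy: integer expression expected" messages above are bash complaining that the literal word "proxy" reached a numeric test inside tests/_utils/check_log_contains; the checks still proceed, so the stray operand is cosmetic. A plausible shape of the bug and a defensive rewrite (hypothetical; the real script may differ):

    # if a multi-word pattern like "proxy setting" is passed unquoted,
    # word splitting can push "proxy" into the numeric argument slot:
    [ $expected_count -eq $got ]        # breaks when $got is "proxy"

    # validating the operand first keeps the test honest:
    case "$got" in
        ''|*[!0-9]*) echo "non-numeric count: $got" >&2; exit 1 ;;
    esac
    [ "$got" -eq "$expected_count" ]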
[Fri May 17 17:43:34 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/http_proxies/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
wait for rpc addr 127.0.0.1:8262 alive the 2-th time
rpc addr 127.0.0.1:8262 is alive
./tests/_utils/check_log_contains: line 15: [: proxy: integer expression expected
./tests/_utils/check_log_contains: line 21: [: proxy: integer expression expected
dmctl test cmd: "query-status test"
wait process dm-master.test exit...
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
test dm grpc proxy env setting checking for https_proxy=https://127.0.0.1:8080
[Fri May 17 17:43:44 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/http_proxies/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
rpc addr 127.0.0.1:8261 is alive
./tests/_utils/check_log_contains: line 15: [: proxy: integer expression expected
./tests/_utils/check_log_contains: line 21: [: proxy: integer expression expected
[Fri May 17 17:43:47 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/http_proxies/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
./tests/_utils/check_log_contains: line 15: [: proxy: integer expression expected
./tests/_utils/check_log_contains: line 21: [: proxy: integer expression expected
dmctl test cmd: "query-status test"
wait process dm-master.test exit...
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
test dm grpc proxy env setting checking for no_proxy=localhost,127.0.0.1
[Fri May 17 17:43:55 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/http_proxies/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
rpc addr 127.0.0.1:8261 is alive
./tests/_utils/check_log_contains: line 15: [: proxy: integer expression expected
./tests/_utils/check_log_contains: line 21: [: proxy: integer expression expected
[Fri May 17 17:43:58 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/http_proxies/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
./tests/_utils/check_log_contains: line 15: [: proxy: integer expression expected
./tests/_utils/check_log_contains: line 21: [: proxy: integer expression expected
dmctl test cmd: "query-status test"
wait process dm-master.test exit...
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:44:07 CST 2024] <<<<<< test case http_proxies success! >>>>>>
start running case: [lightning_load_task] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
import prepare data
start DM master, workers and sources
[Fri May 17 17:44:07 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
rpc addr 127.0.0.1:8261 is alive
[Fri May 17 17:44:08 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/lightning_load_task/source1.yaml"
[Fri May 17 17:44:09 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/lightning_load_task/source2.yaml"
[Fri May 17 17:44:12 CST 2024] <<<<<< START DM-WORKER on port 8264, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/conf/dm-worker3.toml >>>>>>
wait for rpc addr 127.0.0.1:8264 alive the 1-th time
rpc addr 127.0.0.1:8264 is alive
start DM task
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/conf/dm-task.yaml --remove-meta"
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/conf/dm-task2.yaml --remove-meta"
dmctl test cmd: "query-status load_task1"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "query-status load_task2"
got=1 expected=1
got=1 expected=1
test worker restart
wait process dm-worker1 exit...
wait process dm-worker1 exit...
process dm-worker1 already exit
dmctl test cmd: "list-member -w -n worker3"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "list-member -w -n worker1"
got=1 expected=1
dmctl test cmd: "query-status load_task1"
got=1 expected=1
got=1 expected=1
[Fri May 17 17:44:18 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "list-member -w -n worker3"
got=1 expected=1
dmctl test cmd: "list-member -w -n worker1"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "query-status load_task1"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "query-status load_task2"
got=1 expected=1
got=1 expected=1
test_transfer_two_sources
wait process dm-worker2 exit...
wait process dm-worker2 exit...
process dm-worker2 already exit
dmctl test cmd: "list-member -w -n worker3"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "query-status load_task2"
got=1 expected=1
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/conf/dm-task3.yaml --remove-meta"
got=2 expected=2
dmctl test cmd: "query-status load_task3"
got=1 expected=1
[Fri May 17 17:44:23 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "list-member -w -n worker2"
got=1 expected=1
wait process dm-worker1 exit...
wait process dm-worker1 exit...
process dm-worker1 already exit
dmctl test cmd: "list-member -w -n worker2"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/conf/dm-task4.yaml --remove-meta"
got=2 expected=2
dmctl test cmd: "query-status load_task4"
got=1 expected=1
[Fri May 17 17:44:28 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "list-member -w -n worker1"
got=1 expected=1
wait process dm-worker3 exit...
wait process dm-worker3 exit...
process dm-worker3 already exit
dmctl test cmd: "list-member -w -n worker1"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "query-status load_task1"
got=1 expected=1
dmctl test cmd: "query-status load_task2"
got=1 expected=1
dmctl test cmd: "list-member -w -n worker1"
got=1 expected=1
got=0 expected=1
command: list-member -w -n worker1 "source": "mysql-replica-01" count: 0 != expected: 1, failed the 0-th time, will retry again
got=1 expected=1
got=0 expected=1
command: list-member -w -n worker1 "source": "mysql-replica-01" count: 0 != expected: 1, failed the 1-th time, will retry again
got=1 expected=1
got=0 expected=1
command: list-member -w -n worker1 "source": "mysql-replica-01" count: 0 != expected: 1, failed the 2-th time, will retry again
got=1 expected=1
got=0 expected=1
command: list-member -w -n worker1 "source": "mysql-replica-01" count: 0 != expected: 1, failed the 3-th time, will retry again
got=1 expected=1
got=0 expected=1
command: list-member -w -n worker1 "source": "mysql-replica-01" count: 0 != expected: 1, failed the 4-th time, will retry again
got=1 expected=1
got=1 expected=1
dmctl test cmd: "list-member -w -n worker2"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "query-status"
got=3 expected=3
got=1 expected=1
[Fri May 17 17:44:43 CST 2024] <<<<<< START DM-WORKER on port 8264, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_load_task/conf/dm-worker3.toml >>>>>>
wait for rpc addr 127.0.0.1:8264 alive the 1-th time
rpc addr 127.0.0.1:8264 is alive
dmctl test cmd: "list-member -w -n worker2"
got=1 expected=1
got=1 expected=1
dmctl test cmd: "query-status"
got=3 expected=4
command: query-status "taskStatus": "Running" count: 3 != expected: 4, failed the 0-th time, will retry again
got=3 expected=4
command: query-status "taskStatus": "Running" count: 3 != expected: 4, failed the 1-th time, will retry again
got=3 expected=4
command: query-status "taskStatus": "Running" count: 3 != expected: 4, failed the 2-th time, will retry again
got=3 expected=4
command: query-status "taskStatus": "Running" count: 3 != expected: 4, failed the 3-th time, will retry again
got=3 expected=4
command: query-status "taskStatus": "Running" count: 3 != expected: 4, failed the 4-th time, will retry again
got=3 expected=4
command: query-status "taskStatus": "Running" count: 3 != expected: 4, failed the 5-th time, will retry again
got=3 expected=4
command: query-status "taskStatus": "Running" count: 3 != expected: 4, failed the 6-th time, will retry again
got=4 expected=4
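The retry loops above (query-status polled until the "taskStatus": "Running" count reaches the expected value) follow a simple poll-until-count pattern. A minimal sketch of such a helper (hypothetical; the harness's own retry logic lives in its test utilities):

    # poll a dmctl command until its output contains $expected matches
    # of $pattern, retrying up to $max times with a short sleep
    retry_until_count() {
        pattern=$1; expected=$2; max=${3:-10}; i=0
        while [ "$i" -lt "$max" ]; do
            got=$(dmctl --master-addr 127.0.0.1:8261 query-status | grep -c "$pattern")
            [ "$got" -eq "$expected" ] && return 0
            i=$((i + 1)); sleep 2
        done
        return 1
    }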
check diff successfully
check diff successfully
check diff successfully
check diff successfully
1 dm-master alive
3 dm-worker alive
0 dm-syncer alive
wait process dm-master.test exit...
wait process dm-master.test exit...
process dm-master.test already exit
wait process dm-worker.test exit...
wait process dm-worker.test exit...
wait process dm-worker.test exit...
process dm-worker.test already exit
process dm-syncer.test already exit
[Fri May 17 17:45:05 CST 2024] <<<<<< test case lightning_load_task success! >>>>>>
start running case: [lightning_mode] script: [/home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_mode/run.sh]
Running test /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_mode/run.sh...
Verbose mode = false
0 dm-master alive
0 dm-worker alive
0 dm-syncer alive
process dm-master.test already exit
process dm-worker.test already exit
process dm-syncer.test already exit
Starting PD...
Release Version: v7.5.1-7-g7eb188c4f
Edition: Community
Git Commit Hash: 7eb188c4f8caba495a33eafedd4540afbc4ca6fc
Git Branch: release-7.5
UTC Build Time:  2024-05-13 04:29:07
curl: (7) Failed connect to 127.0.0.1:2379; Connection refused
2024-05-17 17:45:05.241904 W | pkg/fileutil: check file permission: directory "/tmp/dm_test/lightning_mode/pd" exist, but the permission is "drwxr-xr-x". The recommended permission is "-rwx------" to prevent possible unprivileged access to the data.
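The pd warning above is about the data directory's mode; "-rwx------" is octal 700. Tightening it, if desired, is a one-liner (the path is the one from the warning):

    # restrict the pd data directory to its owner, as the warning recommends
    chmod 700 /tmp/dm_test/lightning_mode/pd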
  "is_initialized": true,
Starting TiDB...
curl: (7) Failed connect to 127.0.0.1:10080; Connection refused
curl: (7) Failed connect to 127.0.0.1:10080; Connection refused
{"connections":0,"version":"8.0.11-TiDB-v7.5.1-70-gbe578f5db8","git_hash":"be578f5db8a899a19030344cbac6b4d3629ec872"}[Fri May 17 17:45:19 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_mode/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
rpc addr 127.0.0.1:8261 is alive
[Fri May 17 17:45:20 CST 2024] <<<<<< START DM-WORKER on port 8262, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_mode/conf/dm-worker1.toml >>>>>>
wait for rpc addr 127.0.0.1:8262 alive the 1-th time
rpc addr 127.0.0.1:8262 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/lightning_mode/source1.yaml"
[Fri May 17 17:45:21 CST 2024] <<<<<< START DM-WORKER on port 8263, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_mode/conf/dm-worker2.toml >>>>>>
wait for rpc addr 127.0.0.1:8263 alive the 1-th time
rpc addr 127.0.0.1:8263 is alive
dmctl test cmd: "operate-source create /tmp/dm_test/lightning_mode/source2.yaml"
dmctl test cmd: "check-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_mode/conf/dm-task.yaml"
wait process dm-master.test exit...
wait process dm-master.test exit...
process dm-master.test already exit
[Fri May 17 17:45:27 CST 2024] <<<<<< START DM-MASTER on port 8261, config: /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_mode/conf/dm-master.toml >>>>>>
wait for rpc addr 127.0.0.1:8261 alive the 1-th time
wait for rpc addr 127.0.0.1:8261 alive the 2-th time
rpc addr 127.0.0.1:8261 is alive
dmctl test cmd: "start-task /home/jenkins/agent/workspace/pingcap/tiflow/release-7.5/pull_dm_integration_test/tiflow/dm/tests/lightning_mode/conf/dm-task-dup.yaml --remove-meta"
dmctl test cmd: "query-status test"
got=0 expected=1
command: query-status test "stage": "Paused" count: 0 != expected: 1, failed the 0-th time, will retry again
got=0 expected=1
command: query-status test "stage": "Paused" count: 0 != expected: 1, failed the 1-th time, will retry again
got=0 expected=1
command: query-status test "stage": "Paused" count: 0 != expected: 1, failed the 2-th time, will retry again
got=1 expected=1
got=1 expected=2
command: query-status test "progress": "100.00 %" count: 1 != expected: 2, failed the 3-th time, will retry again
got=1 expected=1
got=1 expected=2
command: query-status test "progress": "100.00 %" count: 1 != expected: 2, failed the 4-th time, will retry again
got=1 expected=1
got=1 expected=2
command: query-status test "progress": "100.00 %" count: 1 != expected: 2, failed the 5-th time, will retry again
got=1 expected=1
got=1 expected=2
command: query-status test "progress": "100.00 %" count: 1 != expected: 2, failed the 6-th time, will retry again
got=1 expected=1
got=1 expected=2
command: query-status test "progress": "100.00 %" count: 1 != expected: 2, failed the 7-th time, will retry again
got=1 expected=1
got=1 expected=2
command: query-status test "progress": "100.00 %" count: 1 != expected: 2, failed the 8-th time, will retry again
got=1 expected=1
got=1 expected=2
command: query-status test "progress": "100.00 %" count: 1 != expected: 2, failed the 9-th time, will retry again
{
    "result": true,
    "msg": "",
    "sources": [
        {
            "result": true,
            "msg": "",
            "sourceStatus": {
                "source": "mysql-replica-01",
                "worker": "worker1",
                "result": null,
                "relayStatus": {
                    "masterBinlog": "(dm-it-30f51eb9-346d-46da-b651-9daf84f9ee31-529zl-fggqt-bin.000001, 2946)",
                    "masterBinlogGtid": "3ac23ed8-142f-11ef-b6cf-269687f4cb15:1-12",
                    "relaySubDir": "3ac23ed8-142f-11ef-b6cf-269687f4cb15.000001",
                    "relayBinlog": "(dm-it-30f51eb9-346d-46da-b651-9daf84f9ee31-529zl-fggqt-bin.000001, 2946)",
                    "relayBinlogGtid": "3ac23ed8-142f-11ef-b6cf-269687f4cb15:1-12",
                    "relayCatchUpMaster": true,
                    "stage": "Running",
                    "result": null
                }
            },
            "subTaskStatus": [
                {
                    "name": "test",
                    "stage": "Paused",
                    "unit": "Load",
                    "result": {
                        "isCanceled": false,
                        "errors": [
                            {
                                "ErrCode": 34019,
                                "ErrClass": "load-unit",
                                "ErrScope": "internal",
                                "ErrLevel": "high",
                                "Message": "",
                                "RawCause": "[Lightning:MetaMgr:ErrMetaMgrUnknown]unknown error occur on meta manager: check whether this task has started before failed: fetch task meta failed: Error 1146 (42S02): Table 'lightning_metadata.task_meta_v2' doesn't exist",
                                "Workaround": ""
                            }
                        ],
                        "detail": null
                    },
                    "unresolvedDDLLockID": "",
                    "load": {
                        "finishedBytes": "0",
                        "totalBytes": "416",
                        "progress": "0.00 %",
                        "metaBinlog": "(dm-it-30f51eb9-346d-46da-b651-9daf84f9ee31-529zl-fggqt-bin.000001, 2946)",
                        "metaBinlogGTID": "3ac23ed8-142f-11ef-b6cf-269687f4cb15:1-12",
                        "bps": "0"
                    },
                    "validation": null
                }
            ]
        },
        {
            "result": true,
            "msg": "",
            "sourceStatus": {
                "source": "mysql-replica-02",
                "worker": "worker2",
                "result": null,
                "relayStatus": null
            },
            "subTaskStatus": [
                {
                    "name": "test",
                    "stage": "Running",
                    "unit": "Load",
                    "result": null,
                    "unresolvedDDLLockID": "",
                    "load": {
                        "finishedBytes": "141",
                        "totalBytes": "141",
                        "progress": "100.00 %",
                        "metaBinlog": "(dm-it-30f51eb9-346d-46da-b651-9daf84f9ee31-529zl-fggqt-bin.000001, 1816)",
                        "metaBinlogGTID": "3b9fe965-142f-11ef-98de-269687f4cb15:1-7",
                        "bps": "18"
                    },
                    "validation": null
                }
            ]
        }
    ]
}
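The paused subtask's RawCause above points at a missing lightning_metadata.task_meta_v2 table downstream. A hedged way to inspect and pre-create the meta schema by hand, assuming the downstream is a TiDB listening on the default SQL port 4000 (the port and the pre-create remedy are assumptions, not taken from this log):

    # check whether the lightning meta schema exists downstream
    mysql -h 127.0.0.1 -P 4000 -u root -e "SHOW TABLES IN lightning_metadata;"
    # pre-creating the database lets the load unit create its meta tables
    mysql -h 127.0.0.1 -P 4000 -u root -e "CREATE DATABASE IF NOT EXISTS lightning_metadata;"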
PASS
coverage: 3.8% of statements in github.com/pingcap/tiflow/dm/...
curl: (7) Failed connect to 127.0.0.1:8361; Connection refused
curl: (7) Failed connect to 127.0.0.1:8461; Connection refused
curl: (7) Failed connect to 127.0.0.1:8561; Connection refused
curl: (7) Failed connect to 127.0.0.1:8661; Connection refused
curl: (7) Failed connect to 127.0.0.1:8761; Connection refused
curl: (7) Failed connect to 127.0.0.1:8264; Connection refused
curl: (7) Failed connect to 127.0.0.1:18262; Connection refused
curl: (7) Failed connect to 127.0.0.1:18263; Connection refused
make: *** [dm_integration_test_in_group] Error 1
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
Post stage
[Pipeline] sh
+ ls /tmp/dm_test
adjust_gtid
async_checkpoint_flush
binlog_parse
case_sensitive
check_task
checkpoint_transaction
cov.adjust_gtid.dmctl.1715937962.687.out
cov.adjust_gtid.dmctl.1715937964.816.out
cov.adjust_gtid.dmctl.1715937965.865.out
cov.adjust_gtid.dmctl.1715937967.1003.out
cov.adjust_gtid.dmctl.1715937977.1048.out
cov.adjust_gtid.dmctl.1715937979.1274.out
cov.adjust_gtid.master.out
cov.adjust_gtid.worker.8262.1715937961.out
cov.adjust_gtid.worker.8262.1715937977.out
cov.adjust_gtid.worker.8262.1715937982.out
cov.adjust_gtid.worker.8263.1715937963.out
cov.adjust_gtid.worker.8263.1715937978.out
cov.adjust_gtid.worker.8263.1715937983.out
cov.async_checkpoint_flush.dmctl.1715937992.1960.out
cov.async_checkpoint_flush.dmctl.1715937993.2008.out
cov.async_checkpoint_flush.master.out
cov.async_checkpoint_flush.worker.8262.1715937991.out
cov.binlog_parse.dmctl.1715938032.13843.out
cov.binlog_parse.dmctl.1715938032.13890.out
cov.binlog_parse.dmctl.1715938034.13968.out
cov.binlog_parse.dmctl.1715938034.14055.out
cov.binlog_parse.dmctl.1715938035.14107.out
cov.binlog_parse.master.out
cov.binlog_parse.worker.8262.1715938031.out
cov.case_sensitive.dmctl.1715938043.14515.out
cov.case_sensitive.dmctl.1715938045.14636.out
cov.case_sensitive.dmctl.1715938046.14677.out
cov.case_sensitive.dmctl.1715938050.14899.out
cov.case_sensitive.dmctl.1715938068.15188.out
cov.case_sensitive.dmctl.1715938068.15230.out
cov.case_sensitive.dmctl.1715938068.15274.out
cov.case_sensitive.dmctl.1715938077.15646.out
cov.case_sensitive.dmctl.1715938078.15769.out
cov.case_sensitive.dmctl.1715938079.15808.out
cov.case_sensitive.dmctl.1715938087.16030.out
cov.case_sensitive.dmctl.1715938104.16316.out
cov.case_sensitive.dmctl.1715938104.16358.out
cov.case_sensitive.dmctl.1715938104.16404.out
cov.case_sensitive.master.out
cov.case_sensitive.worker.8262.1715938042.out
cov.case_sensitive.worker.8262.1715938049.out
cov.case_sensitive.worker.8262.1715938076.out
cov.case_sensitive.worker.8262.1715938082.out
cov.case_sensitive.worker.8263.1715938044.out
cov.case_sensitive.worker.8263.1715938052.out
cov.case_sensitive.worker.8263.1715938077.out
cov.case_sensitive.worker.8263.1715938088.out
cov.check_task.dmctl.1715938186.18079.out
cov.check_task.dmctl.1715938187.18297.out
cov.check_task.dmctl.1715938187.18355.out
cov.check_task.dmctl.1715938188.18425.out
cov.check_task.dmctl.1715938190.18528.out
cov.check_task.dmctl.1715938190.18579.out
cov.check_task.dmctl.1715938191.18632.out
cov.check_task.dmctl.1715938191.18680.out
cov.check_task.dmctl.1715938191.18739.out
cov.check_task.master.out
cov.check_task.worker.8262.1715938185.out
cov.checkpoint_transaction.dmctl.1715938112.16790.out
cov.checkpoint_transaction.dmctl.1715938113.16828.out
cov.checkpoint_transaction.dmctl.1715938142.17334.out
cov.checkpoint_transaction.dmctl.1715938157.17387.out
cov.checkpoint_transaction.dmctl.1715938157.17457.out
cov.checkpoint_transaction.dmctl.1715938157.17497.out
cov.checkpoint_transaction.dmctl.1715938162.17648.out
cov.checkpoint_transaction.dmctl.1715938162.17694.out
cov.checkpoint_transaction.master.out
cov.checkpoint_transaction.worker.8262.1715938111.out
cov.checkpoint_transaction.worker.8262.1715938161.out
cov.dm_syncer.dmctl.1715938200.19205.out
cov.dm_syncer.dmctl.1715938200.19248.out
cov.dm_syncer.dmctl.1715938201.19292.out
cov.dm_syncer.dmctl.1715938203.19389.out
cov.dm_syncer.master.out
cov.dm_syncer.syncer.out
cov.dm_syncer.worker.8262.1715938198.out
cov.dm_syncer.worker.8263.1715938199.out
cov.downstream_diff_index.dmctl.1715938232.20040.out
cov.downstream_diff_index.dmctl.1715938233.20161.out
cov.downstream_diff_index.dmctl.1715938234.20207.out
cov.downstream_diff_index.master.out
cov.downstream_diff_index.worker.8262.1715938231.out
cov.downstream_diff_index.worker.8263.1715938232.out
cov.downstream_more_column.dmctl.1715938250.20650.out
cov.downstream_more_column.dmctl.1715938251.20696.out
cov.downstream_more_column.dmctl.1715938254.20768.out
cov.downstream_more_column.dmctl.1715938254.20811.out
cov.downstream_more_column.dmctl.1715938256.20866.out
cov.downstream_more_column.dmctl.1715938256.20903.out
cov.downstream_more_column.dmctl.1715938256.20942.out
cov.downstream_more_column.dmctl.1715938256.20981.out
cov.downstream_more_column.master.out
cov.downstream_more_column.worker.8262.1715938249.out
cov.drop_column_with_index.dmctl.1715938263.21310.out
cov.drop_column_with_index.dmctl.1715938264.21359.out
cov.drop_column_with_index.dmctl.1715938266.21400.out
cov.drop_column_with_index.dmctl.1715938268.21486.out
cov.drop_column_with_index.dmctl.1715938268.21525.out
cov.drop_column_with_index.dmctl.1715938268.21570.out
cov.drop_column_with_index.dmctl.1715938273.21627.out
cov.drop_column_with_index.dmctl.1715938274.21705.out
cov.drop_column_with_index.master.out
cov.drop_column_with_index.worker.8262.1715938262.out
cov.duplicate_event.dmctl.1715938285.22146.out
cov.duplicate_event.dmctl.1715938285.22189.out
cov.duplicate_event.dmctl.1715938349.22514.out
cov.duplicate_event.dmctl.1715938350.22557.out
cov.duplicate_event.dmctl.1715938351.22628.out
cov.duplicate_event.dmctl.1715938353.22667.out
cov.duplicate_event.dmctl.1715938373.23222.out
cov.duplicate_event.dmctl.1715938373.23263.out
cov.duplicate_event.dmctl.1715938439.23604.out
cov.duplicate_event.dmctl.1715938440.23653.out
cov.duplicate_event.dmctl.1715938441.23740.out
cov.duplicate_event.dmctl.1715938443.23783.out
cov.duplicate_event.master.out
cov.duplicate_event.worker.8262.1715938284.out
cov.duplicate_event.worker.8262.1715938372.out
cov.duplicate_event.worker.8263.1715938348.out
cov.duplicate_event.worker.8263.1715938438.out
cov.expression_filter.dmctl.1715938451.24184.out
cov.expression_filter.dmctl.1715938451.24230.out
cov.expression_filter.dmctl.1715938452.24274.out
cov.expression_filter.dmctl.1715938452.24318.out
cov.expression_filter.dmctl.1715938452.24356.out
cov.expression_filter.dmctl.1715938454.24401.out
cov.expression_filter.dmctl.1715938456.24571.out
cov.expression_filter.dmctl.1715938464.24865.out
cov.expression_filter.dmctl.1715938464.24914.out
cov.expression_filter.dmctl.1715938470.24971.out
cov.expression_filter.master.out
cov.expression_filter.worker.8262.1715938450.out
cov.expression_filter.worker.8262.1715938463.out
cov.extend_column.dmctl.1715938479.25426.out
cov.extend_column.dmctl.1715938479.25468.out
cov.extend_column.dmctl.1715938481.25520.out
cov.extend_column.dmctl.1715938481.25568.out
cov.extend_column.dmctl.1715938492.26211.out
cov.extend_column.dmctl.1715938493.26260.out
cov.extend_column.dmctl.1715938494.26316.out
cov.extend_column.dmctl.1715938494.26370.out
cov.extend_column.master.out
cov.extend_column.worker.8262.1715938477.out
cov.extend_column.worker.8262.1715938490.out
cov.extend_column.worker.8263.1715938478.out
cov.extend_column.worker.8263.1715938491.out
cov.fake_rotate_event.dmctl.1715938505.26951.out
cov.fake_rotate_event.dmctl.1715938507.26996.out
cov.fake_rotate_event.dmctl.1715938508.27041.out
cov.fake_rotate_event.dmctl.1715938511.27259.out
cov.fake_rotate_event.master.out
cov.fake_rotate_event.worker.8262.1715938503.out
cov.fake_rotate_event.worker.8262.1715938510.out
cov.foreign_key.dmctl.1715938519.27604.out
cov.foreign_key.dmctl.1715938519.27642.out
cov.foreign_key.dmctl.1715938521.27688.out
cov.foreign_key.dmctl.1715938521.27766.out
cov.foreign_key.master.out
cov.foreign_key.worker.8262.1715938518.out
cov.full_mode.dmctl.1715938530.28152.out
cov.full_mode.dmctl.1715938531.28275.out
cov.full_mode.dmctl.1715938532.28322.out
cov.full_mode.dmctl.1715938536.28374.out
cov.full_mode.dmctl.1715938545.28816.out
cov.full_mode.dmctl.1715938547.28859.out
cov.full_mode.dmctl.1715938548.28909.out
cov.full_mode.dmctl.1715938560.29353.out
cov.full_mode.dmctl.1715938561.29485.out
cov.full_mode.dmctl.1715938562.29527.out
cov.full_mode.dmctl.1715938565.29607.out
cov.full_mode.dmctl.1715938573.29962.out
cov.full_mode.dmctl.1715938575.30097.out
cov.full_mode.dmctl.1715938576.30134.out
cov.full_mode.dmctl.1715938578.30169.out
cov.full_mode.dmctl.1715938587.30644.out
cov.full_mode.dmctl.1715938587.30687.out
cov.full_mode.dmctl.1715938588.30733.out
cov.full_mode.dmctl.1715938590.30792.out
cov.full_mode.master.out
cov.full_mode.worker.8262.1715938528.out
cov.full_mode.worker.8262.1715938543.out
cov.full_mode.worker.8262.1715938559.out
cov.full_mode.worker.8262.1715938572.out
cov.full_mode.worker.8262.1715938585.out
cov.full_mode.worker.8263.1715938530.out
cov.full_mode.worker.8263.1715938544.out
cov.full_mode.worker.8263.1715938560.out
cov.full_mode.worker.8263.1715938574.out
cov.full_mode.worker.8263.1715938586.out
cov.gbk.dmctl.1715938599.31314.out
cov.gbk.dmctl.1715938600.31351.out
cov.gbk.dmctl.1715938602.31402.out
cov.gbk.dmctl.1715938641.32882.out
cov.gbk.dmctl.1715938644.32973.out
cov.gbk.dmctl.1715938675.34294.out
cov.gbk.master.out
cov.gbk.worker.8262.1715938597.out
cov.gbk.worker.8262.1715938617.out
cov.gbk.worker.8262.1715938623.out
cov.gbk.worker.8262.1715938629.out
cov.gbk.worker.8262.1715938635.out
cov.gbk.worker.8262.1715938639.out
cov.gbk.worker.8262.1715938646.out
cov.gbk.worker.8262.1715938652.out
cov.gbk.worker.8262.1715938661.out
cov.gbk.worker.8262.1715938666.out
cov.gbk.worker.8262.1715938673.out
cov.gbk.worker.8263.1715938598.out
cov.gbk.worker.8263.1715938619.out
cov.gbk.worker.8263.1715938625.out
cov.gbk.worker.8263.1715938630.out
cov.gbk.worker.8263.1715938636.out
cov.gbk.worker.8263.1715938640.out
cov.gbk.worker.8263.1715938647.out
cov.gbk.worker.8263.1715938653.out
cov.gbk.worker.8263.1715938662.out
cov.gbk.worker.8263.1715938667.out
cov.gbk.worker.8263.1715938674.out
cov.gtid.dmctl.1715938687.34674.out
cov.gtid.dmctl.1715938688.34792.out
cov.gtid.dmctl.1715938689.34835.out
cov.gtid.dmctl.1715938691.34957.out
cov.gtid.dmctl.1715938691.35003.out
cov.gtid.dmctl.1715938698.35370.out
cov.gtid.dmctl.1715938700.35501.out
cov.gtid.dmctl.1715938701.35542.out
cov.gtid.dmctl.1715938703.35731.out
cov.gtid.dmctl.1715938704.35932.out
cov.gtid.dmctl.1715938705.36013.out
cov.gtid.dmctl.1715938709.36244.out
cov.gtid.dmctl.1715938710.36283.out
cov.gtid.master.out
cov.gtid.worker.8262.1715938686.out
cov.gtid.worker.8262.1715938697.out
cov.gtid.worker.8262.1715938703.out
cov.gtid.worker.8262.1715938707.out
cov.gtid.worker.8263.1715938687.out
cov.gtid.worker.8263.1715938699.out
cov.gtid.worker.8263.1715938703.out
cov.gtid.worker.8263.1715938707.out
cov.ha_cases.dmctl.1715938719.36654.out
cov.ha_cases.dmctl.1715938720.36779.out
cov.ha_cases.dmctl.1715938722.36823.out
cov.ha_cases.dmctl.1715938722.36866.out
cov.ha_cases.dmctl.1715938722.36908.out
cov.ha_cases.dmctl.1715938733.37273.out
cov.ha_cases.dmctl.1715938735.37405.out
cov.ha_cases.dmctl.1715938736.37452.out
cov.ha_cases.dmctl.1715938738.37489.out
cov.ha_cases.dmctl.1715938742.37637.out
cov.ha_cases.dmctl.1715938743.37764.out
cov.ha_cases.dmctl.1715938744.37807.out
cov.ha_cases.dmctl.1715938746.37876.out
cov.ha_cases.dmctl.1715938747.38004.out
cov.ha_cases.dmctl.1715938767.38583.out
cov.ha_cases.dmctl.1715938769.38709.out
cov.ha_cases.dmctl.1715938771.38753.out
cov.ha_cases.dmctl.1715938772.38840.out
cov.ha_cases.dmctl.1715938776.39043.out
cov.ha_cases.dmctl.1715938777.39084.out
cov.ha_cases.dmctl.1715938778.39137.out
cov.ha_cases.dmctl.1715938783.39222.out
cov.ha_cases.dmctl.1715938785.39446.out
cov.ha_cases.dmctl.1715938790.39622.out
cov.ha_cases.dmctl.1715938792.39849.out
cov.ha_cases.dmctl.1715938797.40029.out
cov.ha_cases.dmctl.1715938799.40248.out
cov.ha_cases.dmctl.1715938800.40364.out
cov.ha_cases.dmctl.1715938801.40412.out
cov.ha_cases.dmctl.1715938802.40450.out
cov.ha_cases.dmctl.1715938807.40591.out
cov.ha_cases.dmctl.1715938809.40813.out
cov.ha_cases.dmctl.1715938829.41486.out
cov.ha_cases.dmctl.1715938859.41959.out
cov.ha_cases.dmctl.1715938859.42001.out
cov.ha_cases.dmctl.1715938859.42042.out
cov.ha_cases.dmctl.1715938860.42083.out
cov.ha_cases.dmctl.1715938861.42206.out
cov.ha_cases.dmctl.1715938864.42283.out
cov.ha_cases.dmctl.1715938869.42487.out
cov.ha_cases.dmctl.1715938938.43322.out
cov.ha_cases.dmctl.1715938939.43366.out
cov.ha_cases.dmctl.1715938942.43532.out
cov.ha_cases.dmctl.1715938942.43576.out
cov.ha_cases.dmctl.1715938944.43708.out
cov.ha_cases.dmctl.1715938947.43801.out
cov.ha_cases.dmctl.1715938949.43867.out
cov.ha_cases.dmctl.1715938949.43912.out
cov.ha_cases.dmctl.1715938949.43953.out
cov.ha_cases.dmctl.1715938950.43993.out
cov.ha_cases.master.out
cov.ha_cases.worker.18262.1715938798.out
cov.ha_cases.worker.8262.1715938718.out
cov.ha_cases.worker.8262.1715938731.out
cov.ha_cases.worker.8262.1715938742.out
cov.ha_cases.worker.8262.1715938766.out
cov.ha_cases.worker.8262.1715938783.out
cov.ha_cases.worker.8262.1715938791.out
cov.ha_cases.worker.8262.1715938807.out
cov.ha_cases.worker.8262.1715938864.out
cov.ha_cases.worker.8262.1715938937.out
cov.ha_cases.worker.8263.1715938719.out
cov.ha_cases.worker.8263.1715938734.out
cov.ha_cases.worker.8263.1715938746.out
cov.ha_cases.worker.8263.1715938768.out
cov.ha_cases.worker.8263.1715938784.out
cov.ha_cases.worker.8263.1715938790.out
cov.ha_cases.worker.8263.1715938808.out
cov.ha_cases.worker.8263.1715938860.out
cov.ha_cases.worker.8263.1715938943.out
cov.ha_cases.worker.8264.1715938739.out
cov.ha_cases.worker.8264.1715938797.out
cov.http_proxies.dmctl.1715939017.44749.out
cov.http_proxies.dmctl.1715939028.45078.out
cov.http_proxies.dmctl.1715939039.45409.out
cov.http_proxies.master.out
cov.http_proxies.worker.8262.1715939014.out
cov.http_proxies.worker.8262.1715939027.out
cov.http_proxies.worker.8262.1715939038.out
cov.lightning_load_task.dmctl.1715939049.45837.out
cov.lightning_load_task.dmctl.1715939050.45971.out
cov.lightning_load_task.dmctl.1715939053.46107.out
cov.lightning_load_task.dmctl.1715939054.46169.out
cov.lightning_load_task.dmctl.1715939055.46235.out
cov.lightning_load_task.dmctl.1715939055.46283.out
cov.lightning_load_task.dmctl.1715939057.46360.out
cov.lightning_load_task.dmctl.1715939058.46404.out
cov.lightning_load_task.dmctl.1715939058.46447.out
cov.lightning_load_task.dmctl.1715939059.46587.out
cov.lightning_load_task.dmctl.1715939059.46629.out
cov.lightning_load_task.dmctl.1715939059.46675.out
cov.lightning_load_task.dmctl.1715939059.46721.out
cov.lightning_load_task.dmctl.1715939061.46796.out
cov.lightning_load_task.dmctl.1715939062.46841.out
cov.lightning_load_task.dmctl.1715939062.46886.out
cov.lightning_load_task.dmctl.1715939063.46930.out
cov.lightning_load_task.dmctl.1715939064.47057.out
cov.lightning_load_task.dmctl.1715939066.47125.out
cov.lightning_load_task.dmctl.1715939066.47171.out
cov.lightning_load_task.dmctl.1715939068.47218.out
cov.lightning_load_task.dmctl.1715939069.47345.out
cov.lightning_load_task.dmctl.1715939071.47419.out
cov.lightning_load_task.dmctl.1715939071.47462.out
cov.lightning_load_task.dmctl.1715939071.47508.out
cov.lightning_load_task.dmctl.1715939071.47551.out
cov.lightning_load_task.dmctl.1715939082.47820.out
cov.lightning_load_task.dmctl.1715939083.47864.out
cov.lightning_load_task.dmctl.1715939084.47996.out
cov.lightning_load_task.dmctl.1715939084.48039.out
cov.lightning_load_task.master.out
cov.lightning_load_task.worker.8262.1715939048.out
cov.lightning_load_task.worker.8262.1715939058.out
cov.lightning_load_task.worker.8262.1715939068.out
cov.lightning_load_task.worker.8263.1715939049.out
cov.lightning_load_task.worker.8263.1715939063.out
cov.lightning_load_task.worker.8264.1715939052.out
cov.lightning_load_task.worker.8264.1715939083.out
cov.lightning_mode.dmctl.1715939121.49159.out
cov.lightning_mode.dmctl.1715939122.49285.out
cov.lightning_mode.dmctl.1715939123.49326.out
cov.lightning_mode.dmctl.1715939129.49527.out
cov.lightning_mode.dmctl.1715939131.49588.out
cov.lightning_mode.master.out
dm_syncer
downstream
downstream_diff_index
downstream_more_column
drop_column_with_index
duplicate_event
expression_filter
extend_column
fake_rotate_event
foreign_key
full_mode
gbk
goroutines
gtid
ha_cases
http_proxies
lightning_load_task
lightning_mode
sql_res.adjust_gtid.txt
sql_res.async_checkpoint_flush.txt
sql_res.binlog_parse.txt
sql_res.case_sensitive.txt
sql_res.check_task.txt
sql_res.checkpoint_transaction.txt
sql_res.dm_syncer.txt
sql_res.downstream_diff_index.txt
sql_res.downstream_more_column.txt
sql_res.drop_column_with_index.txt
sql_res.duplicate_event.txt
sql_res.expression_filter.txt
sql_res.extend_column.txt
sql_res.fake_rotate_event.txt
sql_res.foreign_key.txt
sql_res.full_mode.txt
sql_res.gbk.txt
sql_res.gtid.txt
sql_res.ha_cases.txt
sql_res.http_proxies.txt
sql_res.lightning_load_task.txt
sql_res.lightning_mode.txt
tidb.toml
++ find /tmp/dm_test/ -type f -name '*.log'
+ tar -cvzf log-G11.tar.gz /tmp/dm_test/expression_filter/worker1/log/stdout.log /tmp/dm_test/expression_filter/worker1/log/dm-worker.log /tmp/dm_test/expression_filter/dmctl.1715938470.log /tmp/dm_test/expression_filter/master/log/stdout.log /tmp/dm_test/expression_filter/master/log/dm-master.log /tmp/dm_test/expression_filter/dmctl.1715938464.log /tmp/dm_test/fake_rotate_event/worker1/log/stdout.log /tmp/dm_test/fake_rotate_event/worker1/log/dm-worker.log /tmp/dm_test/fake_rotate_event/dmctl.1715938508.log /tmp/dm_test/fake_rotate_event/dmctl.1715938511.log /tmp/dm_test/fake_rotate_event/dmctl.1715938507.log /tmp/dm_test/fake_rotate_event/dmctl.1715938505.log /tmp/dm_test/fake_rotate_event/master/log/stdout.log /tmp/dm_test/fake_rotate_event/master/log/dm-master.log /tmp/dm_test/fake_rotate_event/sync_diff_stdout.log /tmp/dm_test/adjust_gtid/worker1/log/stdout.log /tmp/dm_test/adjust_gtid/worker1/log/dm-worker.log /tmp/dm_test/adjust_gtid/dmctl.1715937979.log /tmp/dm_test/adjust_gtid/worker2/log/stdout.log /tmp/dm_test/adjust_gtid/worker2/log/dm-worker.log /tmp/dm_test/adjust_gtid/dmctl.1715937977.log /tmp/dm_test/adjust_gtid/dmctl.1715937965.log /tmp/dm_test/adjust_gtid/dmctl.1715937967.log /tmp/dm_test/adjust_gtid/dmctl.1715937964.log /tmp/dm_test/adjust_gtid/master/log/stdout.log /tmp/dm_test/adjust_gtid/master/log/dm-master.log /tmp/dm_test/adjust_gtid/sync_diff_stdout.log /tmp/dm_test/adjust_gtid/dmctl.1715937962.log /tmp/dm_test/downstream_more_column/worker1/log/stdout.log /tmp/dm_test/downstream_more_column/worker1/log/dm-worker.log /tmp/dm_test/downstream_more_column/dmctl.1715938250.log /tmp/dm_test/downstream_more_column/dmctl.1715938256.log /tmp/dm_test/downstream_more_column/dmctl.1715938254.log /tmp/dm_test/downstream_more_column/master/log/stdout.log /tmp/dm_test/downstream_more_column/master/log/dm-master.log /tmp/dm_test/downstream_more_column/dmctl.1715938251.log /tmp/dm_test/dm_syncer/worker1/log/stdout.log /tmp/dm_test/dm_syncer/worker1/log/dm-worker.log /tmp/dm_test/dm_syncer/worker2/log/stdout.log /tmp/dm_test/dm_syncer/worker2/log/dm-worker.log /tmp/dm_test/dm_syncer/syncer1/log/stdout.log /tmp/dm_test/dm_syncer/syncer1/log/dm-syncer.log /tmp/dm_test/dm_syncer/dmctl.1715938203.log /tmp/dm_test/dm_syncer/dmctl.1715938201.log /tmp/dm_test/dm_syncer/dmctl.1715938200.log /tmp/dm_test/dm_syncer/master/log/stdout.log /tmp/dm_test/dm_syncer/master/log/dm-master.log /tmp/dm_test/dm_syncer/sync_diff_stdout.log /tmp/dm_test/dm_syncer/syncer2/log/stdout.log /tmp/dm_test/dm_syncer/syncer2/log/dm-syncer.log /tmp/dm_test/ha_cases/worker1/log/stdout.log /tmp/dm_test/ha_cases/worker1/log/dm-worker.log /tmp/dm_test/ha_cases/dmctl.1715938942.log /tmp/dm_test/ha_cases/worker2/log/stdout.log /tmp/dm_test/ha_cases/worker2/log/dm-worker.log /tmp/dm_test/ha_cases/master2/log/stdout.log /tmp/dm_test/ha_cases/master2/log/dm-master.log /tmp/dm_test/ha_cases/dmctl.1715938950.log /tmp/dm_test/ha_cases/master1/log/stdout.log /tmp/dm_test/ha_cases/master1/log/dm-master.log /tmp/dm_test/ha_cases/dmctl.1715938949.log /tmp/dm_test/ha_cases/dmctl.1715938944.log /tmp/dm_test/ha_cases/dmctl.1715938939.log /tmp/dm_test/ha_cases/master3/log/stdout.log /tmp/dm_test/ha_cases/master3/log/dm-master.log /tmp/dm_test/ha_cases/sync_diff_stdout.log /tmp/dm_test/ha_cases/dmctl.1715938947.log /tmp/dm_test/ha_cases/dmctl.1715938938.log /tmp/dm_test/gbk/worker1/log/stdout.log /tmp/dm_test/gbk/worker1/log/dm-worker.log /tmp/dm_test/gbk/dmctl.1715938600.log /tmp/dm_test/gbk/worker2/log/stdout.log 
/tmp/dm_test/gbk/worker2/log/dm-worker.log /tmp/dm_test/gbk/dmctl.1715938599.log /tmp/dm_test/gbk/dmctl.1715938641.log /tmp/dm_test/gbk/dmctl.1715938675.log /tmp/dm_test/gbk/master/log/stdout.log /tmp/dm_test/gbk/master/log/dm-master.log /tmp/dm_test/gbk/sync_diff_stdout.log /tmp/dm_test/gbk/dmctl.1715938602.log /tmp/dm_test/gbk/dmctl.1715938644.log /tmp/dm_test/goroutines/stack/log/master-8361.log /tmp/dm_test/goroutines/stack/log/master-8261.log /tmp/dm_test/goroutines/stack/log/worker-8262.log /tmp/dm_test/goroutines/stack/log/worker-8263.log /tmp/dm_test/goroutines/stack/log/master-8761.log /tmp/dm_test/goroutines/stack/log/master-8661.log /tmp/dm_test/goroutines/stack/log/worker-18263.log /tmp/dm_test/goroutines/stack/log/worker-8264.log /tmp/dm_test/goroutines/stack/log/master-8561.log /tmp/dm_test/goroutines/stack/log/worker-18262.log /tmp/dm_test/goroutines/stack/log/master-8461.log /tmp/dm_test/lightning_mode/worker1/log/stdout.log /tmp/dm_test/lightning_mode/worker1/log/dm-worker.log /tmp/dm_test/lightning_mode/worker2/log/stdout.log /tmp/dm_test/lightning_mode/worker2/log/dm-worker.log /tmp/dm_test/lightning_mode/tikv.log /tmp/dm_test/lightning_mode/dmctl.1715939122.log /tmp/dm_test/lightning_mode/dmctl.1715939123.log /tmp/dm_test/lightning_mode/pd.log /tmp/dm_test/lightning_mode/pd/region-meta/000001.log /tmp/dm_test/lightning_mode/pd/hot-region/000001.log /tmp/dm_test/lightning_mode/master/log/stdout.log /tmp/dm_test/lightning_mode/master/log/dm-master.log /tmp/dm_test/lightning_mode/dmctl.1715939121.log /tmp/dm_test/lightning_mode/tikv/db/000005.log /tmp/dm_test/lightning_mode/tidb.log /tmp/dm_test/lightning_mode/dmctl.1715939131.log /tmp/dm_test/lightning_mode/dmctl.1715939129.log /tmp/dm_test/lightning_load_task/worker1/log/stdout.log /tmp/dm_test/lightning_load_task/worker1/log/dm-worker.log /tmp/dm_test/lightning_load_task/dmctl.1715939059.log /tmp/dm_test/lightning_load_task/dmctl.1715939068.log /tmp/dm_test/lightning_load_task/worker2/log/stdout.log /tmp/dm_test/lightning_load_task/worker2/log/dm-worker.log /tmp/dm_test/lightning_load_task/dmctl.1715939058.log /tmp/dm_test/lightning_load_task/dmctl.1715939054.log /tmp/dm_test/lightning_load_task/dmctl.1715939063.log /tmp/dm_test/lightning_load_task/dmctl.1715939061.log /tmp/dm_test/lightning_load_task/dmctl.1715939069.log /tmp/dm_test/lightning_load_task/dmctl.1715939082.log /tmp/dm_test/lightning_load_task/dmctl.1715939066.log /tmp/dm_test/lightning_load_task/dmctl.1715939049.log /tmp/dm_test/lightning_load_task/dmctl.1715939053.log /tmp/dm_test/lightning_load_task/dmctl.1715939050.log /tmp/dm_test/lightning_load_task/dmctl.1715939055.log /tmp/dm_test/lightning_load_task/dmctl.1715939084.log /tmp/dm_test/lightning_load_task/master/log/stdout.log /tmp/dm_test/lightning_load_task/master/log/dm-master.log /tmp/dm_test/lightning_load_task/sync_diff_stdout.log /tmp/dm_test/lightning_load_task/worker3/log/stdout.log /tmp/dm_test/lightning_load_task/worker3/log/dm-worker.log /tmp/dm_test/lightning_load_task/dmctl.1715939071.log /tmp/dm_test/lightning_load_task/dmctl.1715939064.log /tmp/dm_test/lightning_load_task/dmctl.1715939083.log /tmp/dm_test/lightning_load_task/dmctl.1715939057.log /tmp/dm_test/lightning_load_task/dmctl.1715939062.log /tmp/dm_test/binlog_parse/worker1/log/stdout.log /tmp/dm_test/binlog_parse/worker1/log/dm-worker.log /tmp/dm_test/binlog_parse/dmctl.1715938034.log /tmp/dm_test/binlog_parse/dmctl.1715938035.log /tmp/dm_test/binlog_parse/master/log/stdout.log 
/tmp/dm_test/binlog_parse/master/log/dm-master.log /tmp/dm_test/binlog_parse/sync_diff_stdout.log /tmp/dm_test/binlog_parse/dmctl.1715938032.log /tmp/dm_test/duplicate_event/worker1/log/stdout.log /tmp/dm_test/duplicate_event/worker1/log/dm-worker.log /tmp/dm_test/duplicate_event/dmctl.1715938440.log /tmp/dm_test/duplicate_event/dmctl.1715938441.log /tmp/dm_test/duplicate_event/worker2/log/stdout.log /tmp/dm_test/duplicate_event/worker2/log/dm-worker.log /tmp/dm_test/duplicate_event/dmctl.1715938439.log /tmp/dm_test/duplicate_event/dmctl.1715938373.log /tmp/dm_test/duplicate_event/dmctl.1715938443.log /tmp/dm_test/duplicate_event/master/log/stdout.log /tmp/dm_test/duplicate_event/master/log/dm-master.log /tmp/dm_test/duplicate_event/sync_diff_stdout.log /tmp/dm_test/drop_column_with_index/worker1/log/stdout.log /tmp/dm_test/drop_column_with_index/worker1/log/dm-worker.log /tmp/dm_test/drop_column_with_index/dmctl.1715938264.log /tmp/dm_test/drop_column_with_index/dmctl.1715938274.log /tmp/dm_test/drop_column_with_index/dmctl.1715938268.log /tmp/dm_test/drop_column_with_index/dmctl.1715938266.log /tmp/dm_test/drop_column_with_index/master/log/stdout.log /tmp/dm_test/drop_column_with_index/master/log/dm-master.log /tmp/dm_test/drop_column_with_index/sync_diff_stdout.log /tmp/dm_test/drop_column_with_index/dmctl.1715938273.log /tmp/dm_test/drop_column_with_index/dmctl.1715938263.log /tmp/dm_test/extend_column/worker1/log/stdout.log /tmp/dm_test/extend_column/worker1/log/dm-worker.log /tmp/dm_test/extend_column/worker2/log/stdout.log /tmp/dm_test/extend_column/worker2/log/dm-worker.log /tmp/dm_test/extend_column/dmctl.1715938493.log /tmp/dm_test/extend_column/dmctl.1715938492.log /tmp/dm_test/extend_column/master/log/stdout.log /tmp/dm_test/extend_column/master/log/dm-master.log /tmp/dm_test/extend_column/dmctl.1715938494.log /tmp/dm_test/checkpoint_transaction/worker1/log/stdout.log /tmp/dm_test/checkpoint_transaction/worker1/log/dm-worker.log /tmp/dm_test/checkpoint_transaction/dmctl.1715938157.log /tmp/dm_test/checkpoint_transaction/dmctl.1715938112.log /tmp/dm_test/checkpoint_transaction/dmctl.1715938113.log /tmp/dm_test/checkpoint_transaction/master/log/stdout.log /tmp/dm_test/checkpoint_transaction/master/log/dm-master.log /tmp/dm_test/checkpoint_transaction/dmctl.1715938142.log /tmp/dm_test/checkpoint_transaction/sync_diff_stdout.log /tmp/dm_test/checkpoint_transaction/dmctl.1715938162.log /tmp/dm_test/downstream_diff_index/worker1/log/stdout.log /tmp/dm_test/downstream_diff_index/worker1/log/dm-worker.log /tmp/dm_test/downstream_diff_index/worker2/log/stdout.log /tmp/dm_test/downstream_diff_index/worker2/log/dm-worker.log /tmp/dm_test/downstream_diff_index/dmctl.1715938232.log /tmp/dm_test/downstream_diff_index/master/log/stdout.log /tmp/dm_test/downstream_diff_index/master/log/dm-master.log /tmp/dm_test/downstream_diff_index/dmctl.1715938234.log /tmp/dm_test/downstream_diff_index/dmctl.1715938233.log /tmp/dm_test/gtid/dmctl.1715938698.log /tmp/dm_test/gtid/worker1/log/stdout.log /tmp/dm_test/gtid/worker1/log/dm-worker.log /tmp/dm_test/gtid/dmctl.1715938701.log /tmp/dm_test/gtid/dmctl.1715938709.log /tmp/dm_test/gtid/worker2/log/stdout.log /tmp/dm_test/gtid/worker2/log/dm-worker.log /tmp/dm_test/gtid/dmctl.1715938704.log /tmp/dm_test/gtid/dmctl.1715938700.log /tmp/dm_test/gtid/dmctl.1715938705.log /tmp/dm_test/gtid/dmctl.1715938710.log /tmp/dm_test/gtid/master/log/stdout.log /tmp/dm_test/gtid/master/log/dm-master.log /tmp/dm_test/gtid/sync_diff_stdout.log 
/tmp/dm_test/gtid/dmctl.1715938703.log /tmp/dm_test/async_checkpoint_flush/worker1/log/stdout.log /tmp/dm_test/async_checkpoint_flush/worker1/log/dm-worker.log /tmp/dm_test/async_checkpoint_flush/dmctl.1715937993.log /tmp/dm_test/async_checkpoint_flush/dmctl.1715937992.log /tmp/dm_test/async_checkpoint_flush/master/log/stdout.log /tmp/dm_test/async_checkpoint_flush/master/log/dm-master.log /tmp/dm_test/async_checkpoint_flush/sync_diff_stdout.log /tmp/dm_test/foreign_key/worker1/log/stdout.log /tmp/dm_test/foreign_key/worker1/log/dm-worker.log /tmp/dm_test/foreign_key/dmctl.1715938521.log /tmp/dm_test/foreign_key/dmctl.1715938519.log /tmp/dm_test/foreign_key/master/log/stdout.log /tmp/dm_test/foreign_key/master/log/dm-master.log /tmp/dm_test/foreign_key/sync_diff_stdout.log /tmp/dm_test/full_mode/worker1/log/stdout.log /tmp/dm_test/full_mode/worker1/log/dm-worker.log /tmp/dm_test/full_mode/dmctl.1715938590.log /tmp/dm_test/full_mode/worker2/log/stdout.log /tmp/dm_test/full_mode/worker2/log/dm-worker.log /tmp/dm_test/full_mode/dmctl.1715938588.log /tmp/dm_test/full_mode/master/log/stdout.log /tmp/dm_test/full_mode/master/log/dm-master.log /tmp/dm_test/full_mode/sync_diff_stdout.log /tmp/dm_test/full_mode/dmctl.1715938587.log /tmp/dm_test/downstream/tidb/log/tidb.log /tmp/dm_test/http_proxies/worker1/log/stdout.log /tmp/dm_test/http_proxies/worker1/log/dm-worker.log /tmp/dm_test/http_proxies/dmctl.1715939039.log /tmp/dm_test/http_proxies/dmctl.1715939017.log /tmp/dm_test/http_proxies/dmctl.1715939028.log /tmp/dm_test/http_proxies/master/log/stdout.log /tmp/dm_test/http_proxies/master/log/dm-master.log
tar: Removing leading `/' from member names
/tmp/dm_test/expression_filter/worker1/log/stdout.log
/tmp/dm_test/expression_filter/worker1/log/dm-worker.log
/tmp/dm_test/expression_filter/dmctl.1715938470.log
/tmp/dm_test/expression_filter/master/log/stdout.log
/tmp/dm_test/expression_filter/master/log/dm-master.log
/tmp/dm_test/expression_filter/dmctl.1715938464.log
/tmp/dm_test/fake_rotate_event/worker1/log/stdout.log
/tmp/dm_test/fake_rotate_event/worker1/log/dm-worker.log
/tmp/dm_test/fake_rotate_event/dmctl.1715938508.log
/tmp/dm_test/fake_rotate_event/dmctl.1715938511.log
/tmp/dm_test/fake_rotate_event/dmctl.1715938507.log
/tmp/dm_test/fake_rotate_event/dmctl.1715938505.log
/tmp/dm_test/fake_rotate_event/master/log/stdout.log
/tmp/dm_test/fake_rotate_event/master/log/dm-master.log
/tmp/dm_test/fake_rotate_event/sync_diff_stdout.log
/tmp/dm_test/adjust_gtid/worker1/log/stdout.log
/tmp/dm_test/adjust_gtid/worker1/log/dm-worker.log
/tmp/dm_test/adjust_gtid/dmctl.1715937979.log
/tmp/dm_test/adjust_gtid/worker2/log/stdout.log
/tmp/dm_test/adjust_gtid/worker2/log/dm-worker.log
/tmp/dm_test/adjust_gtid/dmctl.1715937977.log
/tmp/dm_test/adjust_gtid/dmctl.1715937965.log
/tmp/dm_test/adjust_gtid/dmctl.1715937967.log
/tmp/dm_test/adjust_gtid/dmctl.1715937964.log
/tmp/dm_test/adjust_gtid/master/log/stdout.log
/tmp/dm_test/adjust_gtid/master/log/dm-master.log
/tmp/dm_test/adjust_gtid/sync_diff_stdout.log
/tmp/dm_test/adjust_gtid/dmctl.1715937962.log
/tmp/dm_test/downstream_more_column/worker1/log/stdout.log
/tmp/dm_test/downstream_more_column/worker1/log/dm-worker.log
/tmp/dm_test/downstream_more_column/dmctl.1715938250.log
/tmp/dm_test/downstream_more_column/dmctl.1715938256.log
/tmp/dm_test/downstream_more_column/dmctl.1715938254.log
/tmp/dm_test/downstream_more_column/master/log/stdout.log
/tmp/dm_test/downstream_more_column/master/log/dm-master.log
/tmp/dm_test/downstream_more_column/dmctl.1715938251.log
/tmp/dm_test/dm_syncer/worker1/log/stdout.log
/tmp/dm_test/dm_syncer/worker1/log/dm-worker.log
/tmp/dm_test/dm_syncer/worker2/log/stdout.log
/tmp/dm_test/dm_syncer/worker2/log/dm-worker.log
/tmp/dm_test/dm_syncer/syncer1/log/stdout.log
/tmp/dm_test/dm_syncer/syncer1/log/dm-syncer.log
/tmp/dm_test/dm_syncer/dmctl.1715938203.log
/tmp/dm_test/dm_syncer/dmctl.1715938201.log
/tmp/dm_test/dm_syncer/dmctl.1715938200.log
/tmp/dm_test/dm_syncer/master/log/stdout.log
/tmp/dm_test/dm_syncer/master/log/dm-master.log
/tmp/dm_test/dm_syncer/sync_diff_stdout.log
/tmp/dm_test/dm_syncer/syncer2/log/stdout.log
/tmp/dm_test/dm_syncer/syncer2/log/dm-syncer.log
/tmp/dm_test/ha_cases/worker1/log/stdout.log
/tmp/dm_test/ha_cases/worker1/log/dm-worker.log
/tmp/dm_test/ha_cases/dmctl.1715938942.log
/tmp/dm_test/ha_cases/worker2/log/stdout.log
/tmp/dm_test/ha_cases/worker2/log/dm-worker.log
/tmp/dm_test/ha_cases/master2/log/stdout.log
/tmp/dm_test/ha_cases/master2/log/dm-master.log
/tmp/dm_test/ha_cases/dmctl.1715938950.log
/tmp/dm_test/ha_cases/master1/log/stdout.log
/tmp/dm_test/ha_cases/master1/log/dm-master.log
/tmp/dm_test/ha_cases/dmctl.1715938949.log
/tmp/dm_test/ha_cases/dmctl.1715938944.log
/tmp/dm_test/ha_cases/dmctl.1715938939.log
/tmp/dm_test/ha_cases/master3/log/stdout.log
/tmp/dm_test/ha_cases/master3/log/dm-master.log
/tmp/dm_test/ha_cases/sync_diff_stdout.log
/tmp/dm_test/ha_cases/dmctl.1715938947.log
/tmp/dm_test/ha_cases/dmctl.1715938938.log
/tmp/dm_test/gbk/worker1/log/stdout.log
/tmp/dm_test/gbk/worker1/log/dm-worker.log
/tmp/dm_test/gbk/dmctl.1715938600.log
/tmp/dm_test/gbk/worker2/log/stdout.log
/tmp/dm_test/gbk/worker2/log/dm-worker.log
/tmp/dm_test/gbk/dmctl.1715938599.log
/tmp/dm_test/gbk/dmctl.1715938641.log
/tmp/dm_test/gbk/dmctl.1715938675.log
/tmp/dm_test/gbk/master/log/stdout.log
/tmp/dm_test/gbk/master/log/dm-master.log
/tmp/dm_test/gbk/sync_diff_stdout.log
/tmp/dm_test/gbk/dmctl.1715938602.log
/tmp/dm_test/gbk/dmctl.1715938644.log
/tmp/dm_test/goroutines/stack/log/master-8361.log
/tmp/dm_test/goroutines/stack/log/master-8261.log
/tmp/dm_test/goroutines/stack/log/worker-8262.log
/tmp/dm_test/goroutines/stack/log/worker-8263.log
/tmp/dm_test/goroutines/stack/log/master-8761.log
/tmp/dm_test/goroutines/stack/log/master-8661.log
/tmp/dm_test/goroutines/stack/log/worker-18263.log
/tmp/dm_test/goroutines/stack/log/worker-8264.log
/tmp/dm_test/goroutines/stack/log/master-8561.log
/tmp/dm_test/goroutines/stack/log/worker-18262.log
/tmp/dm_test/goroutines/stack/log/master-8461.log
/tmp/dm_test/lightning_mode/worker1/log/stdout.log
/tmp/dm_test/lightning_mode/worker1/log/dm-worker.log
/tmp/dm_test/lightning_mode/worker2/log/stdout.log
/tmp/dm_test/lightning_mode/worker2/log/dm-worker.log
/tmp/dm_test/lightning_mode/tikv.log
/tmp/dm_test/lightning_mode/dmctl.1715939122.log
/tmp/dm_test/lightning_mode/dmctl.1715939123.log
/tmp/dm_test/lightning_mode/pd.log
/tmp/dm_test/lightning_mode/pd/region-meta/000001.log
/tmp/dm_test/lightning_mode/pd/hot-region/000001.log
/tmp/dm_test/lightning_mode/master/log/stdout.log
/tmp/dm_test/lightning_mode/master/log/dm-master.log
/tmp/dm_test/lightning_mode/dmctl.1715939121.log
/tmp/dm_test/lightning_mode/tikv/db/000005.log
tar: /tmp/dm_test/lightning_mode/tikv/db/000005.log: file changed as we read it
/tmp/dm_test/lightning_mode/tidb.log
/tmp/dm_test/lightning_mode/dmctl.1715939131.log
/tmp/dm_test/lightning_mode/dmctl.1715939129.log
/tmp/dm_test/lightning_load_task/worker1/log/stdout.log
/tmp/dm_test/lightning_load_task/worker1/log/dm-worker.log
/tmp/dm_test/lightning_load_task/dmctl.1715939059.log
/tmp/dm_test/lightning_load_task/dmctl.1715939068.log
/tmp/dm_test/lightning_load_task/worker2/log/stdout.log
/tmp/dm_test/lightning_load_task/worker2/log/dm-worker.log
/tmp/dm_test/lightning_load_task/dmctl.1715939058.log
/tmp/dm_test/lightning_load_task/dmctl.1715939054.log
/tmp/dm_test/lightning_load_task/dmctl.1715939063.log
/tmp/dm_test/lightning_load_task/dmctl.1715939061.log
/tmp/dm_test/lightning_load_task/dmctl.1715939069.log
/tmp/dm_test/lightning_load_task/dmctl.1715939082.log
/tmp/dm_test/lightning_load_task/dmctl.1715939066.log
/tmp/dm_test/lightning_load_task/dmctl.1715939049.log
/tmp/dm_test/lightning_load_task/dmctl.1715939053.log
/tmp/dm_test/lightning_load_task/dmctl.1715939050.log
/tmp/dm_test/lightning_load_task/dmctl.1715939055.log
/tmp/dm_test/lightning_load_task/dmctl.1715939084.log
/tmp/dm_test/lightning_load_task/master/log/stdout.log
/tmp/dm_test/lightning_load_task/master/log/dm-master.log
/tmp/dm_test/lightning_load_task/sync_diff_stdout.log
/tmp/dm_test/lightning_load_task/worker3/log/stdout.log
/tmp/dm_test/lightning_load_task/worker3/log/dm-worker.log
/tmp/dm_test/lightning_load_task/dmctl.1715939071.log
/tmp/dm_test/lightning_load_task/dmctl.1715939064.log
/tmp/dm_test/lightning_load_task/dmctl.1715939083.log
/tmp/dm_test/lightning_load_task/dmctl.1715939057.log
/tmp/dm_test/lightning_load_task/dmctl.1715939062.log
/tmp/dm_test/binlog_parse/worker1/log/stdout.log
/tmp/dm_test/binlog_parse/worker1/log/dm-worker.log
/tmp/dm_test/binlog_parse/dmctl.1715938034.log
/tmp/dm_test/binlog_parse/dmctl.1715938035.log
/tmp/dm_test/binlog_parse/master/log/stdout.log
/tmp/dm_test/binlog_parse/master/log/dm-master.log
/tmp/dm_test/binlog_parse/sync_diff_stdout.log
/tmp/dm_test/binlog_parse/dmctl.1715938032.log
/tmp/dm_test/duplicate_event/worker1/log/stdout.log
/tmp/dm_test/duplicate_event/worker1/log/dm-worker.log
/tmp/dm_test/duplicate_event/dmctl.1715938440.log
/tmp/dm_test/duplicate_event/dmctl.1715938441.log
/tmp/dm_test/duplicate_event/worker2/log/stdout.log
/tmp/dm_test/duplicate_event/worker2/log/dm-worker.log
/tmp/dm_test/duplicate_event/dmctl.1715938439.log
/tmp/dm_test/duplicate_event/dmctl.1715938373.log
/tmp/dm_test/duplicate_event/dmctl.1715938443.log
/tmp/dm_test/duplicate_event/master/log/stdout.log
/tmp/dm_test/duplicate_event/master/log/dm-master.log
/tmp/dm_test/duplicate_event/sync_diff_stdout.log
/tmp/dm_test/drop_column_with_index/worker1/log/stdout.log
/tmp/dm_test/drop_column_with_index/worker1/log/dm-worker.log
/tmp/dm_test/drop_column_with_index/dmctl.1715938264.log
/tmp/dm_test/drop_column_with_index/dmctl.1715938274.log
/tmp/dm_test/drop_column_with_index/dmctl.1715938268.log
/tmp/dm_test/drop_column_with_index/dmctl.1715938266.log
/tmp/dm_test/drop_column_with_index/master/log/stdout.log
/tmp/dm_test/drop_column_with_index/master/log/dm-master.log
/tmp/dm_test/drop_column_with_index/sync_diff_stdout.log
/tmp/dm_test/drop_column_with_index/dmctl.1715938273.log
/tmp/dm_test/drop_column_with_index/dmctl.1715938263.log
/tmp/dm_test/extend_column/worker1/log/stdout.log
/tmp/dm_test/extend_column/worker1/log/dm-worker.log
/tmp/dm_test/extend_column/worker2/log/stdout.log
/tmp/dm_test/extend_column/worker2/log/dm-worker.log
/tmp/dm_test/extend_column/dmctl.1715938493.log
/tmp/dm_test/extend_column/dmctl.1715938492.log
/tmp/dm_test/extend_column/master/log/stdout.log
/tmp/dm_test/extend_column/master/log/dm-master.log
/tmp/dm_test/extend_column/dmctl.1715938494.log
/tmp/dm_test/checkpoint_transaction/worker1/log/stdout.log
/tmp/dm_test/checkpoint_transaction/worker1/log/dm-worker.log
/tmp/dm_test/checkpoint_transaction/dmctl.1715938157.log
/tmp/dm_test/checkpoint_transaction/dmctl.1715938112.log
/tmp/dm_test/checkpoint_transaction/dmctl.1715938113.log
/tmp/dm_test/checkpoint_transaction/master/log/stdout.log
/tmp/dm_test/checkpoint_transaction/master/log/dm-master.log
/tmp/dm_test/checkpoint_transaction/dmctl.1715938142.log
/tmp/dm_test/checkpoint_transaction/sync_diff_stdout.log
/tmp/dm_test/checkpoint_transaction/dmctl.1715938162.log
/tmp/dm_test/downstream_diff_index/worker1/log/stdout.log
/tmp/dm_test/downstream_diff_index/worker1/log/dm-worker.log
/tmp/dm_test/downstream_diff_index/worker2/log/stdout.log
/tmp/dm_test/downstream_diff_index/worker2/log/dm-worker.log
/tmp/dm_test/downstream_diff_index/dmctl.1715938232.log
/tmp/dm_test/downstream_diff_index/master/log/stdout.log
/tmp/dm_test/downstream_diff_index/master/log/dm-master.log
/tmp/dm_test/downstream_diff_index/dmctl.1715938234.log
/tmp/dm_test/downstream_diff_index/dmctl.1715938233.log
/tmp/dm_test/gtid/dmctl.1715938698.log
/tmp/dm_test/gtid/worker1/log/stdout.log
/tmp/dm_test/gtid/worker1/log/dm-worker.log
/tmp/dm_test/gtid/dmctl.1715938701.log
/tmp/dm_test/gtid/dmctl.1715938709.log
/tmp/dm_test/gtid/worker2/log/stdout.log
/tmp/dm_test/gtid/worker2/log/dm-worker.log
/tmp/dm_test/gtid/dmctl.1715938704.log
/tmp/dm_test/gtid/dmctl.1715938700.log
/tmp/dm_test/gtid/dmctl.1715938705.log
/tmp/dm_test/gtid/dmctl.1715938710.log
/tmp/dm_test/gtid/master/log/stdout.log
/tmp/dm_test/gtid/master/log/dm-master.log
/tmp/dm_test/gtid/sync_diff_stdout.log
/tmp/dm_test/gtid/dmctl.1715938703.log
/tmp/dm_test/async_checkpoint_flush/worker1/log/stdout.log
/tmp/dm_test/async_checkpoint_flush/worker1/log/dm-worker.log
/tmp/dm_test/async_checkpoint_flush/dmctl.1715937993.log
/tmp/dm_test/async_checkpoint_flush/dmctl.1715937992.log
/tmp/dm_test/async_checkpoint_flush/master/log/stdout.log
/tmp/dm_test/async_checkpoint_flush/master/log/dm-master.log
/tmp/dm_test/async_checkpoint_flush/sync_diff_stdout.log
/tmp/dm_test/foreign_key/worker1/log/stdout.log
/tmp/dm_test/foreign_key/worker1/log/dm-worker.log
/tmp/dm_test/foreign_key/dmctl.1715938521.log
/tmp/dm_test/foreign_key/dmctl.1715938519.log
/tmp/dm_test/foreign_key/master/log/stdout.log
/tmp/dm_test/foreign_key/master/log/dm-master.log
/tmp/dm_test/foreign_key/sync_diff_stdout.log
/tmp/dm_test/full_mode/worker1/log/stdout.log
/tmp/dm_test/full_mode/worker1/log/dm-worker.log
/tmp/dm_test/full_mode/dmctl.1715938590.log
/tmp/dm_test/full_mode/worker2/log/stdout.log
/tmp/dm_test/full_mode/worker2/log/dm-worker.log
/tmp/dm_test/full_mode/dmctl.1715938588.log
/tmp/dm_test/full_mode/master/log/stdout.log
/tmp/dm_test/full_mode/master/log/dm-master.log
/tmp/dm_test/full_mode/sync_diff_stdout.log
/tmp/dm_test/full_mode/dmctl.1715938587.log
/tmp/dm_test/downstream/tidb/log/tidb.log
/tmp/dm_test/http_proxies/worker1/log/stdout.log
/tmp/dm_test/http_proxies/worker1/log/dm-worker.log
/tmp/dm_test/http_proxies/dmctl.1715939039.log
/tmp/dm_test/http_proxies/dmctl.1715939017.log
/tmp/dm_test/http_proxies/dmctl.1715939028.log
/tmp/dm_test/http_proxies/master/log/stdout.log
/tmp/dm_test/http_proxies/master/log/dm-master.log
Error when executing failure post condition:
Also:   org.jenkinsci.plugins.workflow.actions.ErrorAction$ErrorId: fdd5ee3e-dc36-4331-a196-a1cc96793e2f
hudson.AbortException: script returned exit code 1
	at org.jenkinsci.plugins.workflow.steps.durable_task.DurableTaskStep$Execution.handleExit(DurableTaskStep.java:668)
	at org.jenkinsci.plugins.workflow.steps.durable_task.DurableTaskStep$Execution.check(DurableTaskStep.java:614)
	at org.jenkinsci.plugins.workflow.steps.durable_task.DurableTaskStep$Execution.run(DurableTaskStep.java:555)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:829)

[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G11'
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
Finished: FAILURE