Console Output
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ nc -z localhost 6651
+ echo 'Waiting for pulsar namespace to be ready...'
Waiting for pulsar namespace to be ready...
+ i=0
+ /usr/local/pulsar/bin/pulsar-admin namespaces list public
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_sequence.cli.10592.out cli tso query --pd=http://127.0.0.1:2379
check_ts_not_forward ddl-only-block-related-table
check diff failed 4-th time, retry later
+ set +x
+ tso='449527882407739393
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449527882407739393 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
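
The trace above shows how the harness turns the multi-line `cli tso query` output into a start-ts: the instrumented cdc.test binary appends a PASS line and a coverage footer after the timestamp, and the unquoted echo flattens those lines so awk can take the first field. A minimal sketch of that extraction (the loop framing is assumed; the commands are taken from the trace):

# Query a TSO from PD; cdc.test appends "PASS" and a coverage footer.
tso=$(cdc.test cli tso query --pd=http://127.0.0.1:2379)
# Unquoted $tso collapses the newlines, so awk's first field is the TSO.
start_ts=$(echo $tso | awk -F ' ' '{print $1}')
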
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
try a VALID cdc server command
[Sat May  4 16:53:05 CST 2024] <<<<<< START cdc server in cdc_server_tips case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc_server_tips.62356237.out server --log-file /tmp/tidb_cdc_test/cdc_server_tips/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/cdc_server_tips/cdc_data --cluster-id default
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
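
Each cdc server start in this log is followed by the same readiness loop: poll the /debug/info endpoint with HTTP basic auth (ticdc:ticdc_secret, which base64-encodes to the `dGljZGM6dGljZGNfc2VjcmV0` Authorization header seen later in the log) up to 50 times, and break once the response contains 'etcd info'. A sketch reconstructed from the trace (the exact failure handling is an assumption):

# Poll the CDC debug endpoint until the server registers itself in etcd.
curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret'
for ((i = 0; i <= 50; i++)); do
    res=$($curl_status_cmd)
    # 'failed to get info:' marks a bad response; 'etcd info' marks success.
    echo "$res" | grep -q 'failed to get info:' && exit 1
    echo "$res" | grep -q 'etcd info' && break
    [ $i -eq 50 ] && echo 'cdc server failed to start in time' && exit 1
    sleep 3
done
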
+ set +x
+ tso='449527882625056769
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449527882625056769 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sat May  4 16:53:06 CST 2024] <<<<<< START cdc server in ddl_sequence case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_sequence.1063010632.out server --log-file /tmp/tidb_cdc_test/ddl_sequence/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/ddl_sequence/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
Tenant not found

Reason: Tenant not found
+ i=1
+ '[' 1 -gt 20 ']'
+ sleep 2
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d0b3a54000005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-pulsar-test-1523-8hr7-nt04p, pid:5010, start at 2024-05-04 16:53:05.923613481 +0800 CST m=+5.093714690	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240504-16:55:05.930 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240504-16:53:05.920 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240504-16:43:05.920 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d0b3a54000005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-pulsar-test-1523-8hr7-nt04p, pid:5010, start at 2024-05-04 16:53:05.923613481 +0800 CST m=+5.093714690	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240504-16:55:05.930 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240504-16:53:05.920 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240504-16:43:05.920 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d0b3a54b80003	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-pulsar-test-1523-8hr7-nt04p, pid:5100, start at 2024-05-04 16:53:05.968202947 +0800 CST m=+5.082844564	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240504-16:55:05.974 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240504-16:53:05.966 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240504-16:43:05.966 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/processor_etcd_worker_delay/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/processor_etcd_worker_delay/tiflash/log/error.log
arg matches is ArgMatches { args: {"log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/processor_etcd_worker_delay/tiflash/log/proxy.log"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/processor_etcd_worker_delay/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/processor_etcd_worker_delay/tiflash/db/proxy"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sat, 04 May 2024 08:53:07 GMT
< Content-Length: 816
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/79a8a6d4-bf28-430a-81a2-f5618e0b3a81
	{"id":"79a8a6d4-bf28-430a-81a2-f5618e0b3a81","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812784}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42ce4cc9d2
	79a8a6d4-bf28-430a-81a2-f5618e0b3a81

/tidb/cdc/default/default/upstream/7365064758230646965
	{"id":7365064758230646965,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/79a8a6d4-bf28-430a-81a2-f5618e0b3a81
	{"id":"79a8a6d4-bf28-430a-81a2-f5618e0b3a81","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812784}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42ce4cc9d2
	79a8a6d4-bf28-430a-81a2-f5618e0b3a81

/tidb/cdc/default/default/upstream/7365064758230646965
	{"id":7365064758230646965,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/79a8a6d4-bf28-430a-81a2-f5618e0b3a81
	{"id":"79a8a6d4-bf28-430a-81a2-f5618e0b3a81","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812784}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42ce4cc9d2
	79a8a6d4-bf28-430a-81a2-f5618e0b3a81

/tidb/cdc/default/default/upstream/7365064758230646965
	{"id":7365064758230646965,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
++ date
+ echo '[Sat May  4 16:53:07 CST 2024] <<<<<< START pulsar cluster in ddl_attributes case >>>>>>'
[Sat May  4 16:53:07 CST 2024] <<<<<< START pulsar cluster in ddl_attributes case >>>>>>
+ workdir=/tmp/tidb_cdc_test/ddl_attributes
+ cluster_type=normal
+ cd /tmp/tidb_cdc_test/ddl_attributes
+ DEFAULT_PULSAR_HOME=/usr/local/pulsar
+ pulsar_dir=/usr/local/pulsar
++ cat
+ mtls_conf='
authenticationEnabled=true
authenticationProviders=org.apache.pulsar.broker.authentication.AuthenticationProviderTls
brokerClientTlsEnabled=true
brokerClientTrustCertsFilePath=/tmp/tidb_cdc_test/ddl_attributes/ca.cert.pem
brokerClientAuthenticationPlugin=org.apache.pulsar.client.impl.auth.AuthenticationTls
brokerClientAuthenticationParameters={"tlsCertFile":"/tmp/tidb_cdc_test/ddl_attributes/broker_client.cert.pem","tlsKeyFile":"/tmp/tidb_cdc_test/ddl_attributes/broker_client.key-pk8.pem"}
brokerServicePortTls=6651
webServicePortTls=8443
tlsTrustCertsFilePath=/tmp/tidb_cdc_test/ddl_attributes/ca.cert.pem
tlsCertificateFilePath=/tmp/tidb_cdc_test/ddl_attributes/server.cert.pem
tlsKeyFilePath=/tmp/tidb_cdc_test/ddl_attributes/server.key-pk8.pem
tlsRequireTrustedClientCertOnConnect=true
tlsAllowInsecureConnection=false
tlsCertRefreshCheckDurationSec=300'
++ cat
+ normal_client_conf='
webServiceUrl=http://localhost:8080/
brokerServiceUrl=pulsar://localhost:6650/'
++ cat
+ mtls_client_conf='
webServiceUrl=https://localhost:8443/
brokerServiceUrl=pulsar+ssl://localhost:6651/
authPlugin=org.apache.pulsar.client.impl.auth.AuthenticationTls
authParams=tlsCertFile:/tmp/tidb_cdc_test/ddl_attributes/broker_client.cert.pem,tlsKeyFile:/tmp/tidb_cdc_test/ddl_attributes/broker_client.key-pk8.pem
tlsTrustCertsFilePath=/tmp/tidb_cdc_test/ddl_attributes/ca.cert.pem'
++ cat
+ oauth_client_conf='
    webServiceUrl=http://localhost:8080/
    brokerServiceUrl=pulsar://localhost:6650/
    authPlugin=org.apache.pulsar.client.impl.auth.oauth2.AuthenticationOAuth2
    authParams={"privateKey":"/tmp/tidb_cdc_test/ddl_attributes/credential.json","audience":"cdc-api-uri","issuerUrl":"http://localhost:9096"}'
++ cat
+ oauth_conf='
authenticationEnabled=true
authenticationProviders=org.apache.pulsar.broker.authentication.AuthenticationProviderToken

brokerClientAuthenticationPlugin=org.apache.pulsar.client.impl.auth.oauth2.AuthenticationOAuth2
brokerClientAuthenticationParameters={"privateKey":"file:///tmp/tidb_cdc_test/ddl_attributes/credential.json","audience":"cdc-api-uri","issuerUrl":"http://localhost:9096"}
tokenSecretKey=data:;base64,U0poWDM2X0thcFlTeWJCdEpxMzVseFhfQnJyNExSVVJTa203UW1YSkdteThwVUZXOUVJT2NWUVBzeWt6OS1qag=='
++ cat
+ credential_json='
    {
        "client_id":"1234",
        "client_secret":"e0KVlA2EiBfjoN13olyZd2kv1KL",
        "audience":"cdc-api-uri",
        "issuer_url":"http://localhost:9096",
        "type": "client_credentials"
    }'
++ cat
+ cert_server_conf='[ req ]
default_bits = 2048
prompt = no
default_md = sha256
distinguished_name = dn

[ v3_ext ]
authorityKeyIdentifier=keyid,issuer:always
basicConstraints=CA:FALSE
keyUsage=critical, digitalSignature, keyEncipherment
extendedKeyUsage=serverAuth
subjectAltName=@alt_names

[ dn ]
CN = server

[ alt_names ]
DNS.1 = localhost
IP.1 = 127.0.0.1'
+ echo '
webServiceUrl=http://localhost:8080/
brokerServiceUrl=pulsar://localhost:6650/'
+ cp /usr/local/pulsar/conf/standalone.conf /tmp/tidb_cdc_test/ddl_attributes/pulsar_standalone.conf
+ pulsar_port=6650
+ '[' normal == mtls ']'
+ '[' normal == oauth ']'
+ echo 'no cluster type specified, using default configuration.'
no cluster type specified, using default configuration.
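
The `'[' normal == mtls ']'` and `'[' normal == oauth ']'` tests above pick which of the prepared config blocks (mtls_conf, oauth_conf) is applied on top of the copied standalone.conf. A sketch of that dispatch; the trace only shows the comparisons, so appending the blocks to pulsar_standalone.conf is an assumption:

# Apply cluster-type-specific broker settings; "normal" keeps the
# stock standalone.conf copied above.
case "$cluster_type" in
mtls) echo "$mtls_conf" >>"$workdir/pulsar_standalone.conf" ;;
oauth) echo "$oauth_conf" >>"$workdir/pulsar_standalone.conf" ;;
*) echo 'no cluster type specified, using default configuration.' ;;
esac
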
++ date
+ echo '[Sat May  4 16:53:07 CST 2024] <<<<<< START pulsar cluster in normal mode in ddl_attributes case >>>>>>'
[Sat May  4 16:53:07 CST 2024] <<<<<< START pulsar cluster in normal mode in ddl_attributes case >>>>>>
+ echo 'Waiting for pulsar port to be ready...'
Waiting for pulsar port to be ready...
+ i=0
+ /usr/local/pulsar/bin/pulsar standalone --config /tmp/tidb_cdc_test/ddl_attributes/pulsar_standalone.conf -nfw --metadata-dir /tmp/tidb_cdc_test/ddl_attributes/pulsar-metadata --bookkeeper-dir /tmp/tidb_cdc_test/ddl_attributes/pulsar-bookie
+ nc -z localhost 6650
+ i=1
+ '[' 1 -gt 20 ']'
+ sleep 2
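
Pulsar standalone is launched in the background and the harness then waits twice: first for the broker port with nc, then for the default namespace with pulsar-admin. A sketch of those waits reconstructed from the trace (the 20-retry cap matches the `'[' N -gt 20 ']'` tests above; grepping the namespace listing for public/default is an assumption):

# Wait for the broker port, giving up after 20 two-second retries.
i=0
while ! nc -z localhost 6650; do
    i=$((i + 1))
    [ $i -gt 20 ] && echo 'pulsar failed to start' && exit 1
    sleep 2
done
echo 'Waiting for pulsar namespace to be ready...'
until /usr/local/pulsar/bin/pulsar-admin namespaces list public | grep -q 'public/default'; do
    sleep 2
done
echo "[$(date)] <<<<<< pulsar is ready >>>>>>"
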
check diff failed 5-th time, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sat, 04 May 2024 08:53:08 GMT
< Content-Length: 816
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/0f351274-f22c-40a0-9264-da2cde5af496
	{"id":"0f351274-f22c-40a0-9264-da2cde5af496","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812786}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42ce4cffe5
	0f351274-f22c-40a0-9264-da2cde5af496

/tidb/cdc/default/default/upstream/7365064757410155802
	{"id":7365064757410155802,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/0f351274-f22c-40a0-9264-da2cde5af496
	{"id":"0f351274-f22c-40a0-9264-da2cde5af496","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812786}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42ce4cffe5
	0f351274-f22c-40a0-9264-da2cde5af496

/tidb/cdc/default/default/upstream/7365064757410155802
	{"id":7365064757410155802,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/0f351274-f22c-40a0-9264-da2cde5af496
	{"id":"0f351274-f22c-40a0-9264-da2cde5af496","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812786}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42ce4cffe5
	0f351274-f22c-40a0-9264-da2cde5af496

/tidb/cdc/default/default/upstream/7365064757410155802
	{"id":7365064757410155802,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
++ date
+ echo '[Sat May  4 16:53:09 CST 2024] <<<<<< START pulsar cluster in processor_etcd_worker_delay case >>>>>>'
[Sat May  4 16:53:09 CST 2024] <<<<<< START pulsar cluster in processor_etcd_worker_delay case >>>>>>
+ workdir=/tmp/tidb_cdc_test/processor_etcd_worker_delay
+ cluster_type=normal
+ cd /tmp/tidb_cdc_test/processor_etcd_worker_delay
+ DEFAULT_PULSAR_HOME=/usr/local/pulsar
+ pulsar_dir=/usr/local/pulsar
++ cat
+ mtls_conf='
authenticationEnabled=true
authenticationProviders=org.apache.pulsar.broker.authentication.AuthenticationProviderTls
brokerClientTlsEnabled=true
brokerClientTrustCertsFilePath=/tmp/tidb_cdc_test/processor_etcd_worker_delay/ca.cert.pem
brokerClientAuthenticationPlugin=org.apache.pulsar.client.impl.auth.AuthenticationTls
brokerClientAuthenticationParameters={"tlsCertFile":"/tmp/tidb_cdc_test/processor_etcd_worker_delay/broker_client.cert.pem","tlsKeyFile":"/tmp/tidb_cdc_test/processor_etcd_worker_delay/broker_client.key-pk8.pem"}
brokerServicePortTls=6651
webServicePortTls=8443
tlsTrustCertsFilePath=/tmp/tidb_cdc_test/processor_etcd_worker_delay/ca.cert.pem
tlsCertificateFilePath=/tmp/tidb_cdc_test/processor_etcd_worker_delay/server.cert.pem
tlsKeyFilePath=/tmp/tidb_cdc_test/processor_etcd_worker_delay/server.key-pk8.pem
tlsRequireTrustedClientCertOnConnect=true
tlsAllowInsecureConnection=false
tlsCertRefreshCheckDurationSec=300'
++ cat
+ normal_client_conf='
webServiceUrl=http://localhost:8080/
brokerServiceUrl=pulsar://localhost:6650/'
++ cat
+ mtls_client_conf='
webServiceUrl=https://localhost:8443/
brokerServiceUrl=pulsar+ssl://localhost:6651/
authPlugin=org.apache.pulsar.client.impl.auth.AuthenticationTls
authParams=tlsCertFile:/tmp/tidb_cdc_test/processor_etcd_worker_delay/broker_client.cert.pem,tlsKeyFile:/tmp/tidb_cdc_test/processor_etcd_worker_delay/broker_client.key-pk8.pem
tlsTrustCertsFilePath=/tmp/tidb_cdc_test/processor_etcd_worker_delay/ca.cert.pem'
++ cat
+ oauth_client_conf='
    webServiceUrl=http://localhost:8080/
    brokerServiceUrl=pulsar://localhost:6650/
    authPlugin=org.apache.pulsar.client.impl.auth.oauth2.AuthenticationOAuth2
    authParams={"privateKey":"/tmp/tidb_cdc_test/processor_etcd_worker_delay/credential.json","audience":"cdc-api-uri","issuerUrl":"http://localhost:9096"}'
++ cat
+ oauth_conf='
authenticationEnabled=true
authenticationProviders=org.apache.pulsar.broker.authentication.AuthenticationProviderToken

brokerClientAuthenticationPlugin=org.apache.pulsar.client.impl.auth.oauth2.AuthenticationOAuth2
brokerClientAuthenticationParameters={"privateKey":"file:///tmp/tidb_cdc_test/processor_etcd_worker_delay/credential.json","audience":"cdc-api-uri","issuerUrl":"http://localhost:9096"}
tokenSecretKey=data:;base64,U0poWDM2X0thcFlTeWJCdEpxMzVseFhfQnJyNExSVVJTa203UW1YSkdteThwVUZXOUVJT2NWUVBzeWt6OS1qag=='
++ cat
+ credential_json='
    {
        "client_id":"1234",
        "client_secret":"e0KVlA2EiBfjoN13olyZd2kv1KL",
        "audience":"cdc-api-uri",
        "issuer_url":"http://localhost:9096",
        "type": "client_credentials"
    }'
++ cat
+ cert_server_conf='[ req ]
default_bits = 2048
prompt = no
default_md = sha256
distinguished_name = dn

[ v3_ext ]
authorityKeyIdentifier=keyid,issuer:always
basicConstraints=CA:FALSE
keyUsage=critical, digitalSignature, keyEncipherment
extendedKeyUsage=serverAuth
subjectAltName=@alt_names

[ dn ]
CN = server

[ alt_names ]
DNS.1 = localhost
IP.1 = 127.0.0.1'
+ echo '
webServiceUrl=http://localhost:8080/
brokerServiceUrl=pulsar://localhost:6650/'
+ cp /usr/local/pulsar/conf/standalone.conf /tmp/tidb_cdc_test/processor_etcd_worker_delay/pulsar_standalone.conf
+ pulsar_port=6650
+ '[' normal == mtls ']'
+ '[' normal == oauth ']'
+ echo 'no cluster type specified, using default configuration.'
no cluster type specified, using default configuration.
++ date
+ echo '[Sat May  4 16:53:09 CST 2024] <<<<<< START pulsar cluster in normal mode in processor_etcd_worker_delay case >>>>>>'
[Sat May  4 16:53:09 CST 2024] <<<<<< START pulsar cluster in normal mode in processor_etcd_worker_delay case >>>>>>
+ echo 'Waiting for pulsar port to be ready...'
Waiting for pulsar port to be ready...
+ i=0
+ nc -z localhost 6650
+ /usr/local/pulsar/bin/pulsar standalone --config /tmp/tidb_cdc_test/processor_etcd_worker_delay/pulsar_standalone.conf -nfw --metadata-dir /tmp/tidb_cdc_test/processor_etcd_worker_delay/pulsar-metadata --bookkeeper-dir /tmp/tidb_cdc_test/processor_etcd_worker_delay/pulsar-bookie
+ i=1
+ '[' 1 -gt 20 ']'
+ sleep 2
+ nc -z localhost 6650
+ i=2
+ '[' 2 -gt 20 ']'
+ sleep 2
check diff successfully
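
The interleaved 'check diff failed N-th time, retry later' lines come from another case's data-consistency check, which retries an upstream/downstream comparison until the two sides converge. A hypothetical wrapper matching those messages (the tiflow tests use sync_diff_inspector from tidb-tools; the flags and retry count here are assumptions):

# Retry the upstream/downstream comparison until it passes.
check_sync_diff() {
    local config=$1 max_try=${2:-10}
    for ((i = 1; i <= max_try; i++)); do
        if sync_diff_inspector --config="$config" >/dev/null 2>&1; then
            echo 'check diff successfully'
            return 0
        fi
        echo "check diff failed $i-th time, retry later"
        sleep 3
    done
    return 1
}
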
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sat, 04 May 2024 08:53:09 GMT
< Content-Length: 816
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/13e187ae-32b5-4715-8596-5059984a6256
	{"id":"13e187ae-32b5-4715-8596-5059984a6256","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812786}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42ce5a31d2
	13e187ae-32b5-4715-8596-5059984a6256

/tidb/cdc/default/default/upstream/7365064771925926090
	{"id":7365064771925926090,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/13e187ae-32b5-4715-8596-5059984a6256
	{"id":"13e187ae-32b5-4715-8596-5059984a6256","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812786}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42ce5a31d2
	13e187ae-32b5-4715-8596-5059984a6256

/tidb/cdc/default/default/upstream/7365064771925926090
	{"id":7365064771925926090,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/13e187ae-32b5-4715-8596-5059984a6256
	{"id":"13e187ae-32b5-4715-8596-5059984a6256","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812786}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42ce5a31d2
	13e187ae-32b5-4715-8596-5059984a6256

/tidb/cdc/default/default/upstream/7365064771925926090
	{"id":7365064771925926090,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
++ date
+ echo '[Sat May  4 16:53:09 CST 2024] <<<<<< START pulsar cluster in ddl_sequence case >>>>>>'
[Sat May  4 16:53:09 CST 2024] <<<<<< START pulsar cluster in ddl_sequence case >>>>>>
+ workdir=/tmp/tidb_cdc_test/ddl_sequence
+ cluster_type=normal
+ cd /tmp/tidb_cdc_test/ddl_sequence
+ DEFAULT_PULSAR_HOME=/usr/local/pulsar
+ pulsar_dir=/usr/local/pulsar
++ cat
+ mtls_conf='
authenticationEnabled=true
authenticationProviders=org.apache.pulsar.broker.authentication.AuthenticationProviderTls
brokerClientTlsEnabled=true
brokerClientTrustCertsFilePath=/tmp/tidb_cdc_test/ddl_sequence/ca.cert.pem
brokerClientAuthenticationPlugin=org.apache.pulsar.client.impl.auth.AuthenticationTls
brokerClientAuthenticationParameters={"tlsCertFile":"/tmp/tidb_cdc_test/ddl_sequence/broker_client.cert.pem","tlsKeyFile":"/tmp/tidb_cdc_test/ddl_sequence/broker_client.key-pk8.pem"}
brokerServicePortTls=6651
webServicePortTls=8443
tlsTrustCertsFilePath=/tmp/tidb_cdc_test/ddl_sequence/ca.cert.pem
tlsCertificateFilePath=/tmp/tidb_cdc_test/ddl_sequence/server.cert.pem
tlsKeyFilePath=/tmp/tidb_cdc_test/ddl_sequence/server.key-pk8.pem
tlsRequireTrustedClientCertOnConnect=true
tlsAllowInsecureConnection=false
tlsCertRefreshCheckDurationSec=300'
++ cat
+ normal_client_conf='
webServiceUrl=http://localhost:8080/
brokerServiceUrl=pulsar://localhost:6650/'
++ cat
+ mtls_client_conf='
webServiceUrl=https://localhost:8443/
brokerServiceUrl=pulsar+ssl://localhost:6651/
authPlugin=org.apache.pulsar.client.impl.auth.AuthenticationTls
authParams=tlsCertFile:/tmp/tidb_cdc_test/ddl_sequence/broker_client.cert.pem,tlsKeyFile:/tmp/tidb_cdc_test/ddl_sequence/broker_client.key-pk8.pem
tlsTrustCertsFilePath=/tmp/tidb_cdc_test/ddl_sequence/ca.cert.pem'
++ cat
+ oauth_client_conf='
    webServiceUrl=http://localhost:8080/
    brokerServiceUrl=pulsar://localhost:6650/
    authPlugin=org.apache.pulsar.client.impl.auth.oauth2.AuthenticationOAuth2
    authParams={"privateKey":"/tmp/tidb_cdc_test/ddl_sequence/credential.json","audience":"cdc-api-uri","issuerUrl":"http://localhost:9096"}'
++ cat
+ oauth_conf='
authenticationEnabled=true
authenticationProviders=org.apache.pulsar.broker.authentication.AuthenticationProviderToken

brokerClientAuthenticationPlugin=org.apache.pulsar.client.impl.auth.oauth2.AuthenticationOAuth2
brokerClientAuthenticationParameters={"privateKey":"file:///tmp/tidb_cdc_test/ddl_sequence/credential.json","audience":"cdc-api-uri","issuerUrl":"http://localhost:9096"}
tokenSecretKey=data:;base64,U0poWDM2X0thcFlTeWJCdEpxMzVseFhfQnJyNExSVVJTa203UW1YSkdteThwVUZXOUVJT2NWUVBzeWt6OS1qag=='
++ cat
+ credential_json='
    {
        "client_id":"1234",
        "client_secret":"e0KVlA2EiBfjoN13olyZd2kv1KL",
        "audience":"cdc-api-uri",
        "issuer_url":"http://localhost:9096",
        "type": "client_credentials"
    }'
++ cat
+ cert_server_conf='[ req ]
default_bits = 2048
prompt = no
default_md = sha256
distinguished_name = dn

[ v3_ext ]
authorityKeyIdentifier=keyid,issuer:always
basicConstraints=CA:FALSE
keyUsage=critical, digitalSignature, keyEncipherment
extendedKeyUsage=serverAuth
subjectAltName=@alt_names

[ dn ]
CN = server

[ alt_names ]
DNS.1 = localhost
IP.1 = 127.0.0.1'
+ echo '
webServiceUrl=http://localhost:8080/
brokerServiceUrl=pulsar://localhost:6650/'
+ cp /usr/local/pulsar/conf/standalone.conf /tmp/tidb_cdc_test/ddl_sequence/pulsar_standalone.conf
+ pulsar_port=6650
+ '[' normal == mtls ']'
+ '[' normal == oauth ']'
+ echo 'no cluster type specified, using default configuration.'
no cluster type specified, using default configuration.
++ date
+ echo '[Sat May  4 16:53:09 CST 2024] <<<<<< START pulsar cluster in normal mode in ddl_sequence case >>>>>>'
[Sat May  4 16:53:09 CST 2024] <<<<<< START pulsar cluster in normal mode in ddl_sequence case >>>>>>
+ echo 'Waiting for pulsar port to be ready...'
Waiting for pulsar port to be ready...
+ i=0
+ nc -z localhost 6650
+ /usr/local/pulsar/bin/pulsar standalone --config /tmp/tidb_cdc_test/ddl_sequence/pulsar_standalone.conf -nfw --metadata-dir /tmp/tidb_cdc_test/ddl_sequence/pulsar-metadata --bookkeeper-dir /tmp/tidb_cdc_test/ddl_sequence/pulsar-bookie
+ i=1
+ '[' 1 -gt 20 ']'
+ sleep 2
wait process cdc.test exit for 1-th time...
+ /usr/local/pulsar/bin/pulsar-admin namespaces list public
wait process cdc.test exit for 2-th time...
wait process cdc.test exit for 3-th time...
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sat May  4 16:53:11 CST 2024] <<<<<< run test case changefeed_pause_resume success! >>>>>>
+ nc -z localhost 6650
+ i=2
+ '[' 2 -gt 20 ']'
+ sleep 2
public/default
+ nc -z localhost 6650
+ i=2
+ '[' 2 -gt 20 ']'
+ sleep 2
++ date
+ echo '[Sat May  4 16:53:11 CST 2024] <<<<<< pulsar is ready >>>>>>'
[Sat May  4 16:53:11 CST 2024] <<<<<< pulsar is ready >>>>>>
+ nc -z localhost 6650
+ echo 'Waiting for pulsar namespace to be ready...'
Waiting for pulsar namespace to be ready...
+ i=0
+ /usr/local/pulsar/bin/pulsar-admin namespaces list public
Create changefeed successfully!
ID: cfdabb40-0337-4077-8ca2-e51ddc9077a4
Info: {"upstream_id":7365064723396403914,"namespace":"default","id":"cfdabb40-0337-4077-8ca2-e51ddc9077a4","sink_uri":"pulsar+ssl://127.0.0.1:6651/ticdc-region-merge-test-8368?protocol=canal-json\u0026enable-tidb-extension=true","create_time":"2024-05-04T16:53:12.403125863+08:00","start_ts":449527884464521218,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"pulsar_config":{"tls-trust-certs-file-path":"/tmp/tidb_cdc_test/region_merge/ca.cert.pem","connection-timeout":5,"operation-timeout":30,"batching-max-messages":1000,"batching-max-publish-delay":10,"send-timeout":30,"auth-tls-certificate-path":"/tmp/tidb_cdc_test/region_merge/broker_client.cert.pem","auth-tls-private-key-path":"/tmp/tidb_cdc_test/region_merge/broker_client.key-pk8.pem"},"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-79-gc950cce3a","resolved_ts":449527884464521218,"checkpoint_ts":449527884464521218,"checkpoint_time":"2024-05-04 16:53:11.689"}
[Sat May  4 16:53:12 CST 2024] <<<<<< START Pulsar consumer in region_merge case >>>>>>
split_and_random_merge scale: 20
+ nc -z localhost 6650
+ echo 'Waiting for pulsar namespace to be ready...'
Waiting for pulsar namespace to be ready...
+ i=0
+ /usr/local/pulsar/bin/pulsar-admin namespaces list public
+ nc -z localhost 6650
+ i=3
+ '[' 3 -gt 20 ']'
+ sleep 2
public/default
++ date
+ echo '[Sat May  4 16:53:14 CST 2024] <<<<<< pulsar is ready >>>>>>'
[Sat May  4 16:53:14 CST 2024] <<<<<< pulsar is ready >>>>>>
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_attributes.cli.6852.out cli changefeed create --start-ts=449527882105487362 '--sink-uri=pulsar://127.0.0.1:6650/ticdc-ddl-attributes-test-6998?protocol=canal-json&enable-tidb-extension=true'
public/default
+ nc -z localhost 6650
+ i=4
+ '[' 4 -gt 20 ']'
+ sleep 2
++ date
+ echo '[Sat May  4 16:53:15 CST 2024] <<<<<< pulsar is ready >>>>>>'
[Sat May  4 16:53:15 CST 2024] <<<<<< pulsar is ready >>>>>>
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_sequence.cli.11093.out cli changefeed create --start-ts=449527882625056769 '--sink-uri=pulsar://127.0.0.1:6650/ticdc-ddl-sequence-test-8038?protocol=canal-json&enable-tidb-extension=true'
Create changefeed successfully!
ID: b5cd0943-210c-41d5-a8be-8f34842051fe
Info: {"upstream_id":7365064758230646965,"namespace":"default","id":"b5cd0943-210c-41d5-a8be-8f34842051fe","sink_uri":"pulsar://127.0.0.1:6650/ticdc-ddl-attributes-test-6998?protocol=canal-json\u0026enable-tidb-extension=true","create_time":"2024-05-04T16:53:16.297224722+08:00","start_ts":449527882105487362,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"pulsar_config":{"connection-timeout":5,"operation-timeout":30,"batching-max-messages":1000,"batching-max-publish-delay":10,"send-timeout":30},"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-79-gc950cce3a","resolved_ts":449527882105487362,"checkpoint_ts":449527882105487362,"checkpoint_time":"2024-05-04 16:53:02.690"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
Create changefeed successfully!
ID: 9cf66069-1425-43e6-b843-4b0135df060d
Info: {"upstream_id":7365064771925926090,"namespace":"default","id":"9cf66069-1425-43e6-b843-4b0135df060d","sink_uri":"pulsar://127.0.0.1:6650/ticdc-ddl-sequence-test-8038?protocol=canal-json\u0026enable-tidb-extension=true","create_time":"2024-05-04T16:53:16.436811512+08:00","start_ts":449527882625056769,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"pulsar_config":{"connection-timeout":5,"operation-timeout":30,"batching-max-messages":1000,"batching-max-publish-delay":10,"send-timeout":30},"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-79-gc950cce3a","resolved_ts":449527882625056769,"checkpoint_ts":449527882625056769,"checkpoint_time":"2024-05-04 16:53:04.672"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
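
Both changefeeds above are created the same way: the CLI takes the start-ts captured from the earlier tso query plus a Pulsar sink URI that selects the canal-json protocol with the TiDB extension enabled. The ddl_sequence invocation as it appears in the trace, minus the coverage flag:

cdc.test cli changefeed create \
    --start-ts=449527882625056769 \
    '--sink-uri=pulsar://127.0.0.1:6650/ticdc-ddl-sequence-test-8038?protocol=canal-json&enable-tidb-extension=true'
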
+ nc -z localhost 6650
+ echo 'Waiting for pulsar namespace to be ready...'
Waiting for pulsar namespace to be ready...
+ i=0
+ /usr/local/pulsar/bin/pulsar-admin namespaces list public
+ set +x
[Sat May  4 16:53:17 CST 2024] <<<<<< START Pulsar consumer in ddl_attributes case >>>>>>
+ set +x
[Sat May  4 16:53:17 CST 2024] <<<<<< START Pulsar consumer in ddl_sequence case >>>>>>
*************************** 1. row ***************************
count(distinct region_id): 1
public/default
++ date
+ echo '[Sat May  4 16:53:19 CST 2024] <<<<<< pulsar is ready >>>>>>'
[Sat May  4 16:53:19 CST 2024] <<<<<< pulsar is ready >>>>>>
[Sat May  4 16:53:19 CST 2024] <<<<<< START cdc server in processor_etcd_worker_delay case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS='github.com/pingcap/tiflow/pkg/orchestrator/ProcessorEtcdDelay=return(true)'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.processor_etcd_worker_delay.69216923.out server --log-file /tmp/tidb_cdc_test/processor_etcd_worker_delay/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/processor_etcd_worker_delay/cdc_data --cluster-id default --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
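
Unlike the earlier server starts, this one injects a failpoint: the GO_FAILPOINTS environment variable (read by github.com/pingcap/failpoint, which TiFlow uses) forces ProcessorEtcdDelay to return true, which is exactly what the processor_etcd_worker_delay case exercises. The invocation from the trace, minus the coverage flag:

# Delay the processor's etcd worker via a failpoint, for this process only.
GO_FAILPOINTS='github.com/pingcap/tiflow/pkg/orchestrator/ProcessorEtcdDelay=return(true)' \
cdc.test server \
    --log-file /tmp/tidb_cdc_test/processor_etcd_worker_delay/cdc.log \
    --log-level debug \
    --data-dir /tmp/tidb_cdc_test/processor_etcd_worker_delay/cdc_data \
    --cluster-id default --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379 &
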
table ddl_sequence.finish_mark not exists for 1-th check, retry later
table ddl_sequence.finish_mark not exists for 2-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sat, 04 May 2024 08:53:22 GMT
< Content-Length: 816
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/64523370-eac9-4724-88d5-1054d6b1d383
	{"id":"64523370-eac9-4724-88d5-1054d6b1d383","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812799}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42ce690af8
	64523370-eac9-4724-88d5-1054d6b1d383

/tidb/cdc/default/default/upstream/7365064789791961121
	{"id":7365064789791961121,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/64523370-eac9-4724-88d5-1054d6b1d383
	{"id":"64523370-eac9-4724-88d5-1054d6b1d383","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812799}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42ce690af8
	64523370-eac9-4724-88d5-1054d6b1d383

/tidb/cdc/default/default/upstream/7365064789791961121
	{"id":7365064789791961121,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/64523370-eac9-4724-88d5-1054d6b1d383
	{"id":"64523370-eac9-4724-88d5-1054d6b1d383","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812799}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42ce690af8
	64523370-eac9-4724-88d5-1054d6b1d383

/tidb/cdc/default/default/upstream/7365064789791961121
	{"id":7365064789791961121,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
split_and_random_merge scale: 40
[Sat May  4 16:53:23 CST 2024] <<<<<< START Pulsar consumer in processor_etcd_worker_delay case >>>>>>
table ddl_attributes.attributes_t1_new not exists for 1-th check, retry later
table ddl_sequence.finish_mark not exists for 3-th check, retry later
table ddl_attributes.attributes_t1_new not exists for 2-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/tests/integration_tests/cli_with_auth/run.sh using Sink-Type: pulsar... <<=================
The 1 times to try to start tidb cluster...
table ddl_sequence.finish_mark not exists for 4-th check, retry later
table ddl_attributes.attributes_t1_new exists
table ddl_attributes.finish_mark not exists for 1-th check, retry later
start tidb cluster in /tmp/tidb_cdc_test/cli_with_auth
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table ddl_sequence.finish_mark not exists for 5-th check, retry later
table ddl_attributes.finish_mark not exists for 2-th check, retry later
*************************** 1. row ***************************
count(distinct region_id): 6
table ddl_sequence.finish_mark not exists for 6-th check, retry later
valid ~~~ running cdc  
Failed to start cdc, the usage tips should be printed
 1st test case cdc_server_tips success! 
try an INVALID cdc server command
[Sat May  4 16:53:28 CST 2024] <<<<<< START cdc server in cdc_server_tips case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ true != \n\o ]]
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc_server_tips.63246326.out server --log-file /tmp/tidb_cdc_test/cdc_server_tips/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/cdc_server_tips/cdc_data --cluster-id default --pd None
table processor_delay.t1 exists
table processor_delay.t2 not exists for 1-th check, retry later
table ddl_attributes.finish_mark not exists for 3-th check, retry later
table ddl_sequence.finish_mark not exists for 7-th check, retry later
table processor_delay.t2 exists
table processor_delay.t3 exists
table processor_delay.t4 not exists for 1-th check, retry later
table ddl_attributes.finish_mark not exists for 4-th check, retry later
table ddl_sequence.finish_mark exists
check diff successfully
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
wait process cdc.test exit for 1-th time...
split_and_random_merge scale: 80
cdc.test: no process found
wait process cdc.test exit for 2-th time...
process cdc.test already exit
[Sat May  4 16:53:35 CST 2024] <<<<<< run test case ddl_sequence success! >>>>>>
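
The 'table X not exists for N-th check, retry later' lines throughout this log are the harness polling the downstream for a marker table before comparing data. A hypothetical version of that poll (the helper name and the DESC probe are assumptions; the messages match the log):

check_table_exists() {
    local table=$1 host=$2 port=$3 max_check=${4:-60}
    for ((i = 1; i <= max_check; i++)); do
        if mysql -h"$host" -P"$port" -uroot -e "DESC $table" >/dev/null 2>&1; then
            echo "table $table exists"
            return 0
        fi
        echo "table $table not exists for $i-th check, retry later"
        sleep 2
    done
    return 1
}
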
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table ddl_attributes.finish_mark not exists for 5-th check, retry later
table processor_delay.t4 exists
table processor_delay.t5 not exists for 1-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table ddl_attributes.finish_mark not exists for 6-th check, retry later
table processor_delay.t5 exists
table processor_delay.t6 not exists for 1-th check, retry later
table ddl_attributes.finish_mark not exists for 7-th check, retry later
table processor_delay.t6 exists
table processor_delay.t7 not exists for 1-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table ddl_attributes.finish_mark not exists for 8-th check, retry later
table processor_delay.t7 exists
table processor_delay.t8 not exists for 1-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d0b3c72b00014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-pulsar-test-1523-3rj0-d6tg7, pid:12193, start at 2024-05-04 16:53:40.689493017 +0800 CST m=+5.307759510	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240504-16:55:40.698 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240504-16:53:40.702 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240504-16:43:40.702 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d0b3c72b00014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-pulsar-test-1523-3rj0-d6tg7, pid:12193, start at 2024-05-04 16:53:40.689493017 +0800 CST m=+5.307759510	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240504-16:55:40.698 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240504-16:53:40.702 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240504-16:43:40.702 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d0b3c71c00015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-pulsar-test-1523-3rj0-d6tg7, pid:12279, start at 2024-05-04 16:53:40.633333044 +0800 CST m=+5.199010362	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240504-16:55:40.641 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240504-16:53:40.642 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240504-16:43:40.642 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/cli_with_auth/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/cli_with_auth/tiflash/log/error.log
arg matches is ArgMatches { args: {"addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/cli_with_auth/tiflash/log/proxy.log"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/cli_with_auth/tiflash-proxy.toml"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/cli_with_auth/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table ddl_attributes.finish_mark not exists for 9-th check, retry later
table processor_delay.t8 exists
table processor_delay.t9 not exists for 1-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/tests/integration_tests/resourcecontrol/run.sh using Sink-Type: pulsar... <<=================
The 1 times to try to start tidb cluster...
table ddl_attributes.finish_mark exists
check diff successfully
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.13634.out cli tso query --pd=http://127.0.0.1:2379
table processor_delay.t9 exists
table processor_delay.t10 not exists for 1-th check, retry later
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
wait process cdc.test exit for 3-th time...
+ set +x
+ tso='449527893472837634
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449527893472837634 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sat May  4 16:53:47 CST 2024] <<<<<< run test case ddl_attributes success! >>>>>>
table processor_delay.t10 not exists for 2-th check, retry later
start tidb cluster in /tmp/tidb_cdc_test/resourcecontrol
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
[Sat May  4 16:53:48 CST 2024] <<<<<< START cdc server in cli_with_auth case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.1368713689.out server --log-file /tmp/tidb_cdc_test/cli_with_auth/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/cli_with_auth/cdc_data --cluster-id default --config /tmp/tidb_cdc_test/cli_with_auth/server.toml
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
invalid ~~~ running cdc  
Failed to start cdc, the usage tips should be printed
 2nd test case cdc_server_tips success! 
[Sat May  4 16:53:48 CST 2024] <<<<<< run all test cases cdc_server_tips success! >>>>>> 
table processor_delay.t10 exists
table processor_delay.t11 not exists for 1-th check, retry later
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Aborted by Jenkins Admin
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
check_changefeed_state http://127.0.0.1:2379 8bc23dca-59e4-4adc-9199-d12527bc332b finished null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=8bc23dca-59e4-4adc-9199-d12527bc332b
+ expected_state=finished
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c 8bc23dca-59e4-4adc-9199-d12527bc332b -s
+ info='{
  "upstream_id": 7365064474688022527,
  "namespace": "default",
  "id": "8bc23dca-59e4-4adc-9199-d12527bc332b",
  "state": "finished",
  "checkpoint_tso": 449527891842039810,
  "checkpoint_time": "2024-05-04 16:53:39.832",
  "error": null
}'
+ echo '{
  "upstream_id": 7365064474688022527,
  "namespace": "default",
  "id": "8bc23dca-59e4-4adc-9199-d12527bc332b",
  "state": "finished",
  "checkpoint_tso": 449527891842039810,
  "checkpoint_time": "2024-05-04 16:53:39.832",
  "error": null
}'
{
  "upstream_id": 7365064474688022527,
  "namespace": "default",
  "id": "8bc23dca-59e4-4adc-9199-d12527bc332b",
  "state": "finished",
  "checkpoint_tso": 449527891842039810,
  "checkpoint_time": "2024-05-04 16:53:39.832",
  "error": null
}
++ echo '{' '"upstream_id":' 7365064474688022527, '"namespace":' '"default",' '"id":' '"8bc23dca-59e4-4adc-9199-d12527bc332b",' '"state":' '"finished",' '"checkpoint_tso":' 449527891842039810, '"checkpoint_time":' '"2024-05-04' '16:53:39.832",' '"error":' null '}'
++ jq -r .state
+ state=finished
+ [[ ! finished == \f\i\n\i\s\h\e\d ]]
++ echo '{' '"upstream_id":' 7365064474688022527, '"namespace":' '"default",' '"id":' '"8bc23dca-59e4-4adc-9199-d12527bc332b",' '"state":' '"finished",' '"checkpoint_tso":' 449527891842039810, '"checkpoint_time":' '"2024-05-04' '16:53:39.832",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
run task successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
wait process cdc.test exit for 3-th time...
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sat May  4 16:53:47 CST 2024] <<<<<< run test case changefeed_finish success! >>>>>>
script returned exit code 143
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sat, 04 May 2024 08:53:51 GMT
< Content-Length: 860
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/511580a7-d619-449a-adab-2ce41fe1eb24
	{"id":"511580a7-d619-449a-adab-2ce41fe1eb24","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812828}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42cef20fa4
	511580a7-d619-449a-adab-2ce41fe1eb24

/tidb/cdc/default/default/upstream/7365064950349441515
	{"id":7365064950349441515,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2779,http://127.0.0.1:2679,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/511580a7-d619-449a-adab-2ce41fe1eb24
	{"id":"511580a7-d619-449a-adab-2ce41fe1eb24","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812828}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42cef20fa4
	511580a7-d619-449a-adab-2ce41fe1eb24

/tidb/cdc/default/default/upstream/7365064950349441515
	{"id":7365064950349441515,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2779,http://127.0.0.1:2679,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/511580a7-d619-449a-adab-2ce41fe1eb24
	{"id":"511580a7-d619-449a-adab-2ce41fe1eb24","address":"127.0.0.1:8300","version":"v8.2.0-alpha-79-gc950cce3a","git-hash":"c950cce3a9b105fd95bb2c788e1ab69ec32e0668","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/bin/cdc.test","start-timestamp":1714812828}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f42cef20fa4
	511580a7-d619-449a-adab-2ce41fe1eb24

/tidb/cdc/default/default/upstream/7365064950349441515
	{"id":7365064950349441515,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2779,http://127.0.0.1:2679,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
++ date
+ echo '[Sat May  4 16:53:51 CST 2024] <<<<<< START pulsar cluster in cli_with_auth case >>>>>>'
[Sat May  4 16:53:51 CST 2024] <<<<<< START pulsar cluster in cli_with_auth case >>>>>>
+ workdir=/tmp/tidb_cdc_test/cli_with_auth
+ cluster_type=normal
+ cd /tmp/tidb_cdc_test/cli_with_auth
+ DEFAULT_PULSAR_HOME=/usr/local/pulsar
+ pulsar_dir=/usr/local/pulsar
++ cat
+ mtls_conf='
authenticationEnabled=true
authenticationProviders=org.apache.pulsar.broker.authentication.AuthenticationProviderTls
brokerClientTlsEnabled=true
brokerClientTrustCertsFilePath=/tmp/tidb_cdc_test/cli_with_auth/ca.cert.pem
brokerClientAuthenticationPlugin=org.apache.pulsar.client.impl.auth.AuthenticationTls
brokerClientAuthenticationParameters={"tlsCertFile":"/tmp/tidb_cdc_test/cli_with_auth/broker_client.cert.pem","tlsKeyFile":"/tmp/tidb_cdc_test/cli_with_auth/broker_client.key-pk8.pem"}
brokerServicePortTls=6651
webServicePortTls=8443
tlsTrustCertsFilePath=/tmp/tidb_cdc_test/cli_with_auth/ca.cert.pem
tlsCertificateFilePath=/tmp/tidb_cdc_test/cli_with_auth/server.cert.pem
tlsKeyFilePath=/tmp/tidb_cdc_test/cli_with_auth/server.key-pk8.pem
tlsRequireTrustedClientCertOnConnect=true
tlsAllowInsecureConnection=false
tlsCertRefreshCheckDurationSec=300'
++ cat
+ normal_client_conf='
webServiceUrl=http://localhost:8080/
brokerServiceUrl=pulsar://localhost:6650/'
++ cat
+ mtls_client_conf='
webServiceUrl=https://localhost:8443/
brokerServiceUrl=pulsar+ssl://localhost:6651/
authPlugin=org.apache.pulsar.client.impl.auth.AuthenticationTls
authParams=tlsCertFile:/tmp/tidb_cdc_test/cli_with_auth/broker_client.cert.pem,tlsKeyFile:/tmp/tidb_cdc_test/cli_with_auth/broker_client.key-pk8.pem
tlsTrustCertsFilePath=/tmp/tidb_cdc_test/cli_with_auth/ca.cert.pem'
++ cat
+ oauth_client_conf='
    webServiceUrl=http://localhost:8080/
    brokerServiceUrl=pulsar://localhost:6650/
    authPlugin=org.apache.pulsar.client.impl.auth.oauth2.AuthenticationOAuth2
    authParams={"privateKey":"/tmp/tidb_cdc_test/cli_with_auth/credential.json","audience":"cdc-api-uri","issuerUrl":"http://localhost:9096"}'
++ cat
+ oauth_conf='
authenticationEnabled=true
authenticationProviders=org.apache.pulsar.broker.authentication.AuthenticationProviderToken

brokerClientAuthenticationPlugin=org.apache.pulsar.client.impl.auth.oauth2.AuthenticationOAuth2
brokerClientAuthenticationParameters={"privateKey":"file:///tmp/tidb_cdc_test/cli_with_auth/credential.json","audience":"cdc-api-uri","issuerUrl":"http://localhost:9096"}
tokenSecretKey=data:;base64,U0poWDM2X0thcFlTeWJCdEpxMzVseFhfQnJyNExSVVJTa203UW1YSkdteThwVUZXOUVJT2NWUVBzeWt6OS1qag=='
++ cat
+ credential_json='
    {
        "client_id":"1234",
        "client_secret":"e0KVlA2EiBfjoN13olyZd2kv1KL",
        "audience":"cdc-api-uri",
        "issuer_url":"http://localhost:9096",
        "type": "client_credentials"
    }'
++ cat
+ cert_server_conf='[ req ]
default_bits = 2048
prompt = no
default_md = sha256
distinguished_name = dn

[ v3_ext ]
authorityKeyIdentifier=keyid,issuer:always
basicConstraints=CA:FALSE
keyUsage=critical, digitalSignature, keyEncipherment
extendedKeyUsage=serverAuth
subjectAltName=@alt_names

[ dn ]
CN = server

[ alt_names ]
DNS.1 = localhost
IP.1 = 127.0.0.1'
+ echo '
webServiceUrl=http://localhost:8080/
brokerServiceUrl=pulsar://localhost:6650/'
+ cp /usr/local/pulsar/conf/standalone.conf /tmp/tidb_cdc_test/cli_with_auth/pulsar_standalone.conf
+ pulsar_port=6650
+ '[' normal == mtls ']'
+ '[' normal == oauth ']'
+ echo 'no cluster type specified, using default configuration.'
no cluster type specified, using default configuration.
++ date
+ echo '[Sat May  4 16:53:51 CST 2024] <<<<<< START pulsar cluster in normal mode in cli_with_auth case >>>>>>'
[Sat May  4 16:53:51 CST 2024] <<<<<< START pulsar cluster in normal mode in cli_with_auth case >>>>>>
+ echo 'Waiting for pulsar port to be ready...'
Waiting for pulsar port to be ready...
+ i=0
+ nc -z localhost 6650
+ /usr/local/pulsar/bin/pulsar standalone --config /tmp/tidb_cdc_test/cli_with_auth/pulsar_standalone.conf -nfw --metadata-dir /tmp/tidb_cdc_test/cli_with_auth/pulsar-metadata --bookkeeper-dir /tmp/tidb_cdc_test/cli_with_auth/pulsar-bookie
+ i=1
+ '[' 1 -gt 20 ']'
+ sleep 2
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
script returned exit code 143
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
{"level":"warn","ts":1714812832.4194412,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0018ee540/127.0.0.1:2779","attempt":0,"error":"rpc error: code = Unavailable desc = error reading from server: EOF"}
{"level":"warn","ts":1714812832.423079,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0018ee540/127.0.0.1:2779","attempt":1,"error":"rpc error: code = Unavailable desc = etcdserver: leader changed"}
script returned exit code 143
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
[2024/05/04 16:53:52.045 +08:00] [INFO] [pd_service_discovery.go:1016] ["[pd] switch leader"] [new-leader=http://127.0.0.1:2379] [old-leader=]
[2024/05/04 16:53:52.045 +08:00] [INFO] [pd_service_discovery.go:498] ["[pd] init cluster id"] [cluster-id=7365064474726613054]
[2024/05/04 16:53:52.045 +08:00] [INFO] [client.go:606] ["[pd] changing service mode"] [old-mode=UNKNOWN_SVC_MODE] [new-mode=PD_SVC_MODE]
[2024/05/04 16:53:52.045 +08:00] [INFO] [tso_client.go:236] ["[tso] switch dc tso global allocator serving url"] [dc-location=global] [new-url=http://127.0.0.1:2379]
[2024/05/04 16:53:52.046 +08:00] [INFO] [tso_dispatcher.go:359] ["[tso] tso dispatcher created"] [dc-location=global]
[2024/05/04 16:53:52.046 +08:00] [INFO] [client.go:612] ["[pd] service mode changed"] [old-mode=UNKNOWN_SVC_MODE] [new-mode=PD_SVC_MODE]
[2024/05/04 16:53:52.047 +08:00] [INFO] [pd_service_discovery.go:1016] ["[pd] switch leader"] [new-leader=http://127.0.0.1:2379] [old-leader=]
[2024/05/04 16:53:52.047 +08:00] [INFO] [pd_service_discovery.go:498] ["[pd] init cluster id"] [cluster-id=7365064474726613054]
[2024/05/04 16:53:52.047 +08:00] [INFO] [client.go:606] ["[pd] changing service mode"] [old-mode=UNKNOWN_SVC_MODE] [new-mode=PD_SVC_MODE]
[2024/05/04 16:53:52.047 +08:00] [INFO] [tso_client.go:236] ["[tso] switch dc tso global allocator serving url"] [dc-location=global] [new-url=http://127.0.0.1:2379]
[2024/05/04 16:53:52.048 +08:00] [INFO] [tso_dispatcher.go:359] ["[tso] tso dispatcher created"] [dc-location=global]
[2024/05/04 16:53:52.048 +08:00] [INFO] [client.go:612] ["[pd] service mode changed"] [old-mode=UNKNOWN_SVC_MODE] [new-mode=PD_SVC_MODE]
[2024/05/04 16:53:52.049 +08:00] [INFO] [tikv_driver.go:197] ["using API V1."]
[2024/05/04 16:53:52.049 +08:00] [INFO] [main.go:180] ["genLock started"]
[2024/05/04 16:53:52.052 +08:00] [INFO] [store_cache.go:477] ["change store resolve state"] [store=1] [addr=127.0.0.1:20161] [from=unresolved] [to=resolved] [liveness-state=reachable]
{"level":"warn","ts":1714812833.5827546,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0021eb340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = Unavailable desc = error reading from server: EOF"}
script returned exit code 143
+ nc -z localhost 6650
+ i=2
+ '[' 2 -gt 20 ']'
+ sleep 2
script returned exit code 143
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
script returned exit code 143
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
script returned exit code 143
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
script returned exit code 143
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] // cache
[Pipeline] // cache
[Pipeline] // cache
[Pipeline] // cache
[Pipeline] // cache
[Pipeline] // cache
[Pipeline] // cache
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // node
[Pipeline] // node
[Pipeline] // node
[Pipeline] // node
[Pipeline] // node
[Pipeline] // node
Error: [CDC:ErrOwnerNotFound]owner not found
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/tests/integration_tests/ddl_only_block_related_table/run.sh: line 1: 10122 Terminated              ensure 30 check_ts_not_forward $changefeed_id
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/tests/integration_tests/ddl_only_block_related_table/run.sh: line 1: 10716 Terminated              ensure 30 check_ts_not_forward $changefeed_id
script returned exit code 143
[Pipeline] // node
[Pipeline] // node
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // cache
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G06'
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G09'
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G10'
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G11'
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G12'
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G13'
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G14'
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G16'
[Pipeline] // timeout
[Pipeline] }
Sending interrupt signal to process
Killing processes
kill finished with exit code 0
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G04'
++ stop_tidb_cluster
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/tests/integration_tests/synced_status_with_redo/run.sh: line 1: 13747 Terminated              sleep 130
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_pulsar_test/tiflow/tests/integration_tests/synced_status_with_redo/run.sh: line 1: 13803 Terminated              sleep 130
{"level":"warn","ts":1714812837.8124843,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc001ff7340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":1714812839.844028,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc001ff7340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":1714812841.875,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc001ff7340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-04T16:54:03.187094+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000bf8380/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":"2024-05-04T16:54:03.187145+0800","logger":"etcd-client","caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":"2024-05-04T16:54:03.189189+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc00093d880/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":"2024-05-04T16:54:03.189239+0800","logger":"etcd-client","caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":"2024-05-04T16:54:03.192224+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000bf8380/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-04T16:54:03.194511+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc00093d880/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-04T16:54:03.439086+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc00110f500/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
{"level":"info","ts":"2024-05-04T16:54:03.439176+0800","logger":"etcd-client","caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":"2024-05-04T16:54:03.448752+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc00110f500/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
script returned exit code 143
[Pipeline] }
Cache not saved (inner-step execution failed)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - TEST_GROUP = 'G08'
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
org.jenkinsci.plugins.workflow.actions.ErrorAction$ErrorId: 0a166934-8e70-4228-b23a-91b6a35a2742
Failed in branch Matrix - TEST_GROUP = 'G06'
org.jenkinsci.plugins.workflow.actions.ErrorAction$ErrorId: 2d896b33-9a83-403c-8ec7-8080e839c5cc
Finished: ABORTED