Console Output

Started by user Jenkins Admin
Obtained pipelines/pingcap/tiflow/latest/pull_cdc_integration_kafka_test.groovy from git https://github.com/PingCAP-QE/ci.git
Loading library tipipeline@main
Library tipipeline@main is cached. Copying from home.
[Pipeline] Start of Pipeline
[Pipeline] readJSON
[Pipeline] readTrusted
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] node
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2
Still waiting to schedule task
‘pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2’ is not accepting tasks
‘pingcap-tiflow-pull-cdc-integration-storage-test-1801-5df-xc714’ doesn’t have label ‘pingcap_tiflow_pull_cdc_integration_kafka_test_1856-3fdwf’
‘pingcap-tiflow-pull-dm-integration-test-1902-0pzw0-cn2xr-1z8x1’ doesn’t have label ‘pingcap_tiflow_pull_cdc_integration_kafka_test_1856-3fdwf’
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2 is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-3fdwf-bqwdm
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "9ac1f043f3f713dbdc0f4c39b3bf425fb6c52bc2"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-3fdwf"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2 in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Declarative: Checkout SCM)
[Pipeline] checkout
The recommended git tool is: git
No credentials specified
Cloning the remote Git repository
Using shallow clone with depth 1
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
 > git rev-list --no-walk 03312178c534dce949face80c69812d989e55009 # timeout=10
[Pipeline] }
[Pipeline] // stage
[Pipeline] withEnv
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 1 hr 5 min
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Debug info)
[Pipeline] sh
+ printenv
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=17472320-cb3d-4429-921b-4d4624c72b01
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Debug info
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
_=/usr/bin/printenv
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
HUDSON_URL=https://do.pingcap.net/jenkins/
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=3
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-3fdwf
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-236df335481f9578f70eb859f68d5ceead3aa27f6c9385fda1ec4c08661c0305
NODE_LABELS=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2 pingcap_tiflow_pull_cdc_integration_kafka_test_1856-3fdwf
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
+ echo -------------------------
-------------------------
+ go env
GO111MODULE=''
GOARCH='amd64'
GOBIN=''
GOCACHE='/home/jenkins/.cache/go-build'
GOENV='/home/jenkins/.config/go/env'
GOEXE=''
GOEXPERIMENT=''
GOFLAGS=''
GOHOSTARCH='amd64'
GOHOSTOS='linux'
GOINSECURE=''
GOMODCACHE='/go/pkg/mod'
GONOPROXY=''
GONOSUMDB=''
GOOS='linux'
GOPATH='/go'
GOPRIVATE=''
GOPROXY='http://goproxy.apps.svc,https://proxy.golang.org,direct'
GOROOT='/usr/local/go'
GOSUMDB='sum.golang.org'
GOTMPDIR=''
GOTOOLCHAIN='auto'
GOTOOLDIR='/usr/local/go/pkg/tool/linux_amd64'
GOVCS=''
GOVERSION='go1.21.0'
GCCGO='gccgo'
GOAMD64='v1'
AR='ar'
CC='gcc'
CXX='g++'
CGO_ENABLED='1'
GOMOD='/dev/null'
GOWORK=''
CGO_CFLAGS='-O2 -g'
CGO_CPPFLAGS=''
CGO_CXXFLAGS='-O2 -g'
CGO_FFLAGS='-O2 -g'
CGO_LDFLAGS='-O2 -g'
PKG_CONFIG='pkg-config'
GOGCCFLAGS='-fPIC -m64 -pthread -Wl,--no-gc-sections -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build2937347130=/tmp/go-build -gno-record-gcc-switches'
+ echo -------------------------
-------------------------
+ echo 'debug command: kubectl -n jenkins-tiflow exec -ti pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2 bash'
debug command: kubectl -n jenkins-tiflow exec -ti pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2 bash
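
Note: the pod runs several containers (zookeeper, golang, kafka, canal-adapter, net-tool, report, mysql, connect, jnlp), so it can help to name the target container explicitly; a variant of the printed debug command (illustrative only):

  kubectl -n jenkins-tiflow exec -ti pingcap-tiflow-pull-cdc-integration-kafka-test-1856-3fdwf-560v2 -c golang -- bash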
[Pipeline] container
[Pipeline] {
[Pipeline] sh
+ dig github.com

; <<>> DiG 9.18.16 <<>> github.com
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 3994
;; flags: qr aa rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 1

;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 1232
; COOKIE: 629f61fb0b7443bf (echoed)
;; QUESTION SECTION:
;github.com.			IN	A

;; ANSWER SECTION:
github.com.		19	IN	A	20.205.243.166

;; Query time: 0 msec
;; SERVER: 169.254.25.10#53(169.254.25.10) (UDP)
;; WHEN: Sun May 05 04:47:58 UTC 2024
;; MSG SIZE  rcvd: 77
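
Note: the SERVER line shows the answer came from 169.254.25.10, most likely a node-local DNS cache; to repeat the lookup against that resolver directly (illustrative, not part of the pipeline):

  dig @169.254.25.10 github.com +short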

[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Check diff files)
[Pipeline] container
[Pipeline] {
[Pipeline] script
[Pipeline] {
[Pipeline] withCredentials
Masking supported pattern matches of $token
[Pipeline] {
[Pipeline] httpRequest
Warning: A secret was passed to "httpRequest" using Groovy String interpolation, which is insecure.
		 Affected argument(s) used the following variable(s): [token]
		 See https://jenkins.io/redirect/groovy-string-interpolation for details.
HttpMethod: GET
URL: https://api.github.com/repos/pingcap/tiflow/pulls/10919/files?page=1&per_page=100
Content-Type: application/json
Authorization: *****
Sending request to url: https://api.github.com/repos/pingcap/tiflow/pulls/10919/files?page=1&per_page=100
Response Code: HTTP/1.1 200 OK
Success: Status code 200 is in the accepted range: 100:399
[Pipeline] httpRequest
Warning: A secret was passed to "httpRequest" using Groovy String interpolation, which is insecure.
		 Affected argument(s) used the following variable(s): [token]
		 See https://jenkins.io/redirect/groovy-string-interpolation for details.
HttpMethod: GET
URL: https://api.github.com/repos/pingcap/tiflow/pulls/10919/files?page=2&per_page=100
Content-Type: application/json
Authorization: *****
Sending request to url: https://api.github.com/repos/pingcap/tiflow/pulls/10919/files?page=2&per_page=100
Response Code: HTTP/1.1 200 OK
Success: Status code 200 is in the accepted range: 100:399
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] echo
pr_diff_files: [cdc/model/kv.go, cdc/model/sink.go, cdc/model/sink_test.go, cdc/processor/processor.go, cdc/processor/sinkmanager/manager.go, cdc/processor/sourcemanager/manager.go, cdc/redo/reader/reader.go, cdc/sink/dmlsink/factory/factory.go, cdc/sink/dmlsink/txn/mysql/mysql.go, cdc/sink/dmlsink/txn/mysql/mysql_test.go, cmd/kafka-consumer/main.go, cmd/pulsar-consumer/main.go, cmd/storage-consumer/main.go, errors.toml, pkg/applier/redo.go, pkg/applier/redo_test.go, pkg/errors/cdc_errors.go, pkg/errors/helper.go, pkg/sink/codec/open/open_protocol_decoder.go, tests/integration_tests/changefeed_dup_error_restart/conf/diff_config.toml, tests/integration_tests/changefeed_dup_error_restart/conf/workload, tests/integration_tests/changefeed_dup_error_restart/run.sh, tests/integration_tests/force_replicate_table/run.sh, tests/integration_tests/run_group.sh]
[Pipeline] echo
diff file not matched: cdc/model/kv.go
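
Note: this stage lists the PR's changed files through the GitHub REST API, 100 entries per page, and checks them against the pipeline's path filters (here cdc/model/kv.go did not match a skip pattern, so the full suite runs). The interpolation warnings above come from building the request with a double-quoted Groovy string; per the linked Jenkins guidance, secrets should not be interpolated into Groovy strings. An equivalent query by hand, with GITHUB_TOKEN standing in for the masked credential, might look like:

  curl -sf -H "Authorization: token ${GITHUB_TOKEN}" \
    "https://api.github.com/repos/pingcap/tiflow/pulls/10919/files?page=1&per_page=100" |
    jq -r '.[].filename'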
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Checkout)
[Pipeline] timeout
Timeout set to expire in 10 min
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] cache
Cache restored successfully (git/pingcap/tiflow/rev-be15534-0de8dc3)
203830272 bytes in 2.36 secs (86230875 bytes/sec)
[Pipeline] {
[Pipeline] retry
[Pipeline] {
[Pipeline] script
[Pipeline] {
[Pipeline] sh
git version 2.36.6
Reinitialized existing Git repository in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/.git/
.git
HEAD is now at 0de8dc3e4 fix test again
POST git-upload-pack (656 bytes)
From https://github.com/pingcap/tiflow
 = [up to date]          master               -> origin/master
 = [up to date]          refs/pull/10919/head -> origin/pr/10919/head
Previous HEAD position was 0de8dc3e4 fix test again
HEAD is now at be1553484 codec(ticdc): avro simplify the unit test (#11010)
🚧 Checking out to base SHA: be1553484fe4c03594eabb8d7435c694e5fd7224...
HEAD is now at be1553484 codec(ticdc): avro simplify the unit test (#11010)
✅ Checked. 🎉
🧾 HEAD info:
be1553484fe4c03594eabb8d7435c694e5fd7224
be1553484 codec(ticdc): avro simplify the unit test (#11010)
2a7a65c6f Support Sequences (#10203)
36e9e1bf6 cli(ticdc): allow client authentication to be enabled without tls (#11005)
🚧 Pre-merge heads of pull requests to base SHA: be1553484fe4c03594eabb8d7435c694e5fd7224 ...
Updating be1553484..0de8dc3e4
Fast-forward
 cdc/model/kv.go                                    |   5 +
 cdc/model/sink.go                                  |  35 ++-
 cdc/model/sink_test.go                             |   9 +-
 cdc/processor/processor.go                         |  21 +-
 cdc/processor/sinkmanager/manager.go               |   5 +
 cdc/processor/sourcemanager/manager.go             |  66 +++-
 cdc/redo/reader/reader.go                          |  21 +-
 cdc/sink/dmlsink/factory/factory.go                |   8 +-
 cdc/sink/dmlsink/txn/mysql/mysql.go                |  89 +++---
 cdc/sink/dmlsink/txn/mysql/mysql_test.go           |   2 +-
 cmd/kafka-consumer/main.go                         |   4 +-
 cmd/pulsar-consumer/main.go                        |  17 +-
 cmd/storage-consumer/main.go                       |   4 +-
 errors.toml                                        |   5 +
 pkg/applier/redo.go                                | 303 +++++++++++++++++-
 pkg/applier/redo_test.go                           | 347 ++++++++++++++++++++-
 pkg/errors/cdc_errors.go                           |   4 +
 pkg/errors/helper.go                               |  19 ++
 pkg/sink/codec/open/open_protocol_decoder.go       |   2 +
 .../conf/diff_config.toml                          |  29 ++
 .../changefeed_dup_error_restart/conf/workload     |  13 +
 .../changefeed_dup_error_restart/run.sh            |  54 ++++
 .../integration_tests/force_replicate_table/run.sh |   4 +-
 tests/integration_tests/run_group.sh               |   2 +-
 24 files changed, 970 insertions(+), 98 deletions(-)
 create mode 100644 tests/integration_tests/changefeed_dup_error_restart/conf/diff_config.toml
 create mode 100644 tests/integration_tests/changefeed_dup_error_restart/conf/workload
 create mode 100755 tests/integration_tests/changefeed_dup_error_restart/run.sh
🧾 Pre-merged result:
0de8dc3e43ec741eba58047155ce7f3dba8eb4f7
0de8dc3e4 fix test again
6a342866d fix bit test
0dd104704 fix
✅ Pre merged 🎉
✅ ~~~~~All done.~~~~~~
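
Note: in short, the cached checkout is reset to the PR head, the base SHA be1553484 is checked out, and the PR head 0de8dc3e4 is merged on top (a fast-forward here, since the base SHA is an ancestor of the PR head). A rough equivalent of the pre-merge, using the SHAs from the log (the actual script may use different flags):

  git fetch origin master refs/pull/10919/head
  git checkout be1553484fe4c03594eabb8d7435c694e5fd7224    # base SHA
  git merge 0de8dc3e43ec741eba58047155ce7f3dba8eb4f7        # PR head; fast-forwards as shown above
  git log --oneline -3                                      # should match the "Pre-merged result"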
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // retry
[Pipeline] }
Cache not saved (git/pingcap/tiflow/rev-be15534-0de8dc3 already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (prepare)
[Pipeline] timeout
Timeout set to expire in 20 min
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/third_party_download
[Pipeline] {
[Pipeline] retry
[Pipeline] {
[Pipeline] sh
+ cd ../tiflow
+ ./scripts/download-integration-test-binaries.sh master
Download binaries...
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    41  100    41    0     0    172      0 --:--:-- --:--:-- --:--:--   172
100    41  100    41    0     0    172      0 --:--:-- --:--:-- --:--:--   172
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    41  100    41    0     0    171      0 --:--:-- --:--:-- --:--:--   171
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    41  100    41    0     0   2493      0 --:--:-- --:--:-- --:--:--  2562
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    41  100    41    0     0   1024      0 --:--:-- --:--:-- --:--:--  1051
>>>
download tidb-server.tar.gz from http://fileserver.pingcap.net/download/builds/pingcap/tidb/600b2ed4bf0aa38224a1c4c4c68831820735515c/centos7/tidb-server.tar.gz
2024-05-05 12:48:32 URL:http://fileserver.pingcap.net/download/builds/pingcap/tidb/600b2ed4bf0aa38224a1c4c4c68831820735515c/centos7/tidb-server.tar.gz [536570515/536570515] -> "tmp/tidb-server.tar.gz" [1]
>>>
download pd-server.tar.gz from http://fileserver.pingcap.net/download/builds/pingcap/pd/1679dbca25b3483d1375c7e747da27e99ad77360/centos7/pd-server.tar.gz
2024-05-05 12:48:50 URL:http://fileserver.pingcap.net/download/builds/pingcap/pd/1679dbca25b3483d1375c7e747da27e99ad77360/centos7/pd-server.tar.gz [187372022/187372022] -> "tmp/pd-server.tar.gz" [1]
>>>
download tikv-server.tar.gz from http://fileserver.pingcap.net/download/builds/pingcap/tikv/72a0fd5b00235a7c56014b77ddd933e2a0d33c88/centos7/tikv-server.tar.gz
2024-05-05 12:49:34 URL:http://fileserver.pingcap.net/download/builds/pingcap/tikv/72a0fd5b00235a7c56014b77ddd933e2a0d33c88/centos7/tikv-server.tar.gz [919098782/919098782] -> "tmp/tikv-server.tar.gz" [1]
>>>
download tiflash.tar.gz from http://fileserver.pingcap.net/download/builds/pingcap/tiflash/master/8e170090fad91c94bef8d908e21c195c1d145b02/centos7/tiflash.tar.gz
2024-05-05 12:50:02 URL:http://fileserver.pingcap.net/download/builds/pingcap/tiflash/master/8e170090fad91c94bef8d908e21c195c1d145b02/centos7/tiflash.tar.gz [456057803/456057803] -> "tmp/tiflash.tar.gz" [1]
>>>
download minio.tar.gz from http://fileserver.pingcap.net/download/minio.tar.gz
2024-05-05 12:50:07 URL:http://fileserver.pingcap.net/download/minio.tar.gz [17718777/17718777] -> "tmp/minio.tar.gz" [1]
>>>
download go-ycsb from http://fileserver.pingcap.net/download/builds/pingcap/go-ycsb/test-br/go-ycsb
2024-05-05 12:50:10 URL:http://fileserver.pingcap.net/download/builds/pingcap/go-ycsb/test-br/go-ycsb [45975512/45975512] -> "third_bin/go-ycsb" [1]
>>>
download jq from http://fileserver.pingcap.net/download/builds/pingcap/test/jq-1.6/jq-linux64
2024-05-05 12:50:10 URL:http://fileserver.pingcap.net/download/builds/pingcap/test/jq-1.6/jq-linux64 [3953824/3953824] -> "third_bin/jq" [1]
>>>
download etcd.tar.gz from http://fileserver.pingcap.net/download/builds/pingcap/cdc/etcd-v3.4.7-linux-amd64.tar.gz
2024-05-05 12:50:11 URL:http://fileserver.pingcap.net/download/builds/pingcap/cdc/etcd-v3.4.7-linux-amd64.tar.gz [17310840/17310840] -> "tmp/etcd.tar.gz" [1]
>>>
download sync_diff_inspector.tar.gz from http://fileserver.pingcap.net/download/builds/pingcap/cdc/sync_diff_inspector_hash-d671b084_linux-amd64.tar.gz
2024-05-05 12:50:16 URL:http://fileserver.pingcap.net/download/builds/pingcap/cdc/sync_diff_inspector_hash-d671b084_linux-amd64.tar.gz [79877126/79877126] -> "tmp/sync_diff_inspector.tar.gz" [1]
>>>
download schema-registry.tar.gz from http://fileserver.pingcap.net/download/builds/pingcap/cdc/schema-registry.tar.gz
2024-05-05 12:50:33 URL:http://fileserver.pingcap.net/download/builds/pingcap/cdc/schema-registry.tar.gz [278386006/278386006] -> "tmp/schema-registry.tar.gz" [1]
Download SUCCESS
+ ls -alh ./bin
total 1.9G
drwxr-sr-x.  6 jenkins jenkins  4.0K May  5 12:50 .
drwxr-sr-x. 19 jenkins jenkins  4.0K May  5 12:50 ..
drwxr-sr-x.  2 jenkins jenkins  4.0K May 19  2023 bin
drwxr-sr-x.  4 jenkins jenkins  4.0K May 10  2023 etc
-rwxr-xr-x.  1 jenkins jenkins   17M Apr  2  2020 etcdctl
-rwxr-xr-x.  1 jenkins jenkins   44M May  5 12:50 go-ycsb
-rwxr-xr-x.  1 jenkins jenkins  3.8M May  5 12:50 jq
drwxr-sr-x.  3 jenkins jenkins  4.0K May 10  2023 lib
lrwxrwxrwx.  1 jenkins jenkins    13 Apr 30 11:15 libc++.so.1 -> libc++.so.1.0
-rwxr-xr-x.  1 jenkins jenkins 1016K Nov  7 01:00 libc++.so.1.0
lrwxrwxrwx.  1 jenkins jenkins    16 Apr 30 11:15 libc++abi.so.1 -> libc++abi.so.1.0
-rwxr-xr-x.  1 jenkins jenkins  358K Nov  7 01:00 libc++abi.so.1.0
lrwxrwxrwx.  1 jenkins jenkins    13 Apr 30 11:15 libgmssl.so -> libgmssl.so.3
lrwxrwxrwx.  1 jenkins jenkins    15 Apr 30 11:15 libgmssl.so.3 -> libgmssl.so.3.0
-rwxr-xr-x.  1 jenkins jenkins  2.6M Apr 30 10:34 libgmssl.so.3.0
-rwxr-xr-x.  1 jenkins jenkins  272M Apr 30 11:16 libtiflash_proxy.so
-rwxr-xr-x.  1 jenkins jenkins   50M Jul 29  2020 minio
-rwxr-xr-x.  1 jenkins jenkins   37M Apr 30 16:11 pd-api-bench
-rwxr-xr-x.  1 jenkins jenkins   44M Apr 30 16:10 pd-ctl
-rwxr-xr-x.  1 jenkins jenkins   36M Apr 30 16:10 pd-heartbeat-bench
-rwxr-xr-x.  1 jenkins jenkins   32M Apr 30 16:10 pd-recover
-rwxr-xr-x.  1 jenkins jenkins  106M Apr 30 16:10 pd-server
-rwxr-xr-x.  1 jenkins jenkins   26M Apr 30 16:10 pd-tso-bench
-rwxr-xr-x.  1 jenkins jenkins  3.0M Apr 30 16:11 pd-ut
-rwxr-xr-x.  1 jenkins jenkins   32M Apr 30 16:10 regions-dump
drwxr-sr-x.  4 jenkins jenkins  4.0K May 10  2023 share
-rwxr-xr-x.  1 jenkins jenkins   32M Apr 30 16:11 stores-dump
-rwxr-xr-x.  1 jenkins jenkins  192M Sep 22  2023 sync_diff_inspector
-rwxr-xr-x.  1 jenkins jenkins  208M May  1 10:57 tidb-server
-rwxr-xr-x.  1 jenkins jenkins  380M Apr 30 11:15 tiflash
-rwxr-xr-x.  1 jenkins jenkins  418M Apr 30 11:29 tikv-server
-rwxr-xr-x.  1 jenkins jenkins  2.0M Apr 30 16:11 xprog
+ make check_third_party_binary
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/tidb-server
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/tikv-server
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/pd-server
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/tiflash
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/pd-ctl
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/sync_diff_inspector
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/go-ycsb
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/etcdctl
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/jq
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/minio
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/bin/schema-registry-start
+ cd -
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/third_party_download
+ mkdir -p bin
+ mv ../tiflow/bin/bin ../tiflow/bin/etc ../tiflow/bin/etcdctl ../tiflow/bin/go-ycsb ../tiflow/bin/jq ../tiflow/bin/lib ../tiflow/bin/libc++.so.1 ../tiflow/bin/libc++.so.1.0 ../tiflow/bin/libc++abi.so.1 ../tiflow/bin/libc++abi.so.1.0 ../tiflow/bin/libgmssl.so ../tiflow/bin/libgmssl.so.3 ../tiflow/bin/libgmssl.so.3.0 ../tiflow/bin/libtiflash_proxy.so ../tiflow/bin/minio ../tiflow/bin/pd-api-bench ../tiflow/bin/pd-ctl ../tiflow/bin/pd-heartbeat-bench ../tiflow/bin/pd-recover ../tiflow/bin/pd-server ../tiflow/bin/pd-tso-bench ../tiflow/bin/pd-ut ../tiflow/bin/regions-dump ../tiflow/bin/share ../tiflow/bin/stores-dump ../tiflow/bin/sync_diff_inspector ../tiflow/bin/tidb-server ../tiflow/bin/tiflash ../tiflow/bin/tikv-server ../tiflow/bin/xprog ./bin/
+ ls -alh ./bin
total 1.9G
drwxr-sr-x. 6 jenkins jenkins  4.0K May  5 12:50 .
drwxr-sr-x. 3 jenkins jenkins  4.0K May  5 12:50 ..
drwxr-sr-x. 2 jenkins jenkins  4.0K May 19  2023 bin
drwxr-sr-x. 4 jenkins jenkins  4.0K May 10  2023 etc
-rwxr-xr-x. 1 jenkins jenkins   17M Apr  2  2020 etcdctl
-rwxr-xr-x. 1 jenkins jenkins   44M May  5 12:50 go-ycsb
-rwxr-xr-x. 1 jenkins jenkins  3.8M May  5 12:50 jq
drwxr-sr-x. 3 jenkins jenkins  4.0K May 10  2023 lib
lrwxrwxrwx. 1 jenkins jenkins    13 Apr 30 11:15 libc++.so.1 -> libc++.so.1.0
-rwxr-xr-x. 1 jenkins jenkins 1016K Nov  7 01:00 libc++.so.1.0
lrwxrwxrwx. 1 jenkins jenkins    16 Apr 30 11:15 libc++abi.so.1 -> libc++abi.so.1.0
-rwxr-xr-x. 1 jenkins jenkins  358K Nov  7 01:00 libc++abi.so.1.0
lrwxrwxrwx. 1 jenkins jenkins    13 Apr 30 11:15 libgmssl.so -> libgmssl.so.3
lrwxrwxrwx. 1 jenkins jenkins    15 Apr 30 11:15 libgmssl.so.3 -> libgmssl.so.3.0
-rwxr-xr-x. 1 jenkins jenkins  2.6M Apr 30 10:34 libgmssl.so.3.0
-rwxr-xr-x. 1 jenkins jenkins  272M Apr 30 11:16 libtiflash_proxy.so
-rwxr-xr-x. 1 jenkins jenkins   50M Jul 29  2020 minio
-rwxr-xr-x. 1 jenkins jenkins   37M Apr 30 16:11 pd-api-bench
-rwxr-xr-x. 1 jenkins jenkins   44M Apr 30 16:10 pd-ctl
-rwxr-xr-x. 1 jenkins jenkins   36M Apr 30 16:10 pd-heartbeat-bench
-rwxr-xr-x. 1 jenkins jenkins   32M Apr 30 16:10 pd-recover
-rwxr-xr-x. 1 jenkins jenkins  106M Apr 30 16:10 pd-server
-rwxr-xr-x. 1 jenkins jenkins   26M Apr 30 16:10 pd-tso-bench
-rwxr-xr-x. 1 jenkins jenkins  3.0M Apr 30 16:11 pd-ut
-rwxr-xr-x. 1 jenkins jenkins   32M Apr 30 16:10 regions-dump
drwxr-sr-x. 4 jenkins jenkins  4.0K May 10  2023 share
-rwxr-xr-x. 1 jenkins jenkins   32M Apr 30 16:11 stores-dump
-rwxr-xr-x. 1 jenkins jenkins  192M Sep 22  2023 sync_diff_inspector
-rwxr-xr-x. 1 jenkins jenkins  208M May  1 10:57 tidb-server
-rwxr-xr-x. 1 jenkins jenkins  380M Apr 30 11:15 tiflash
-rwxr-xr-x. 1 jenkins jenkins  418M Apr 30 11:29 tikv-server
-rwxr-xr-x. 1 jenkins jenkins  2.0M Apr 30 16:11 xprog
+ ./bin/tidb-server -V
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
+ ./bin/pd-server -V
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
+ ./bin/tikv-server -V
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
+ ./bin/tiflash --version
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
+ ./bin/sync_diff_inspector --version
App Name: sync_diff_inspector v2.0
Release Version: v7.4.0
Git Commit Hash: d671b0840063bc2532941f02e02e12627402844c
Git Branch: heads/refs/tags/v7.4.0
UTC Build Time: 2023-09-22 03:51:56
Go Version: go1.21.1
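
Note: the prepare stage boils down to the script invocations shown above; reproducing it by hand inside the golang container would be roughly:

  cd tiflow
  ./scripts/download-integration-test-binaries.sh master   # fetch tidb/pd/tikv/tiflash and tools into ./bin
  make check_third_party_binary                            # assert every expected binary is present
  ./bin/tidb-server -V && ./bin/pd-server -V && ./bin/tikv-server -V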
[Pipeline] }
[Pipeline] // retry
[Pipeline] }
[Pipeline] // dir
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] cache
Cache restored successfully (binary/pingcap/tiflow/cdc-integration-test/rev-be15534-0de8dc3)
1191700480 bytes in 2.12 secs (561624076 bytes/sec)
[Pipeline] {
[Pipeline] sh
+ ls -alh ./bin
total 1.2G
drwxr-sr-x.  2 jenkins jenkins 4.0K May  5 12:50 .
drwxr-sr-x. 19 jenkins jenkins 4.0K May  5 12:50 ..
-rwxr-xr-x.  1 jenkins jenkins 220M May  4 22:57 cdc
-rwxr-xr-x.  1 jenkins jenkins 359M May  4 22:57 cdc.test
-rwxr-xr-x.  1 jenkins jenkins 183M May  4 22:53 cdc_kafka_consumer
-rwxr-xr-x.  1 jenkins jenkins 183M May  4 22:53 cdc_pulsar_consumer
-rwxr-xr-x.  1 jenkins jenkins 182M May  4 22:52 cdc_storage_consumer
-rwxr-xr-x.  1 jenkins jenkins  12M May  4 22:53 oauth2-server
+ '[' -f ./bin/cdc ']'
+ '[' -f ./bin/cdc_kafka_consumer ']'
+ '[' -f ./bin/cdc_storage_consumer ']'
+ '[' -f ./bin/cdc.test ']'
+ ls -alh ./bin
total 1.2G
drwxr-sr-x.  2 jenkins jenkins 4.0K May  5 12:50 .
drwxr-sr-x. 19 jenkins jenkins 4.0K May  5 12:50 ..
-rwxr-xr-x.  1 jenkins jenkins 220M May  4 22:57 cdc
-rwxr-xr-x.  1 jenkins jenkins 359M May  4 22:57 cdc.test
-rwxr-xr-x.  1 jenkins jenkins 183M May  4 22:53 cdc_kafka_consumer
-rwxr-xr-x.  1 jenkins jenkins 183M May  4 22:53 cdc_pulsar_consumer
-rwxr-xr-x.  1 jenkins jenkins 182M May  4 22:52 cdc_storage_consumer
-rwxr-xr-x.  1 jenkins jenkins  12M May  4 22:53 oauth2-server
+ ./bin/cdc version
Release Version: v8.2.0-alpha-53-g0de8dc3e4
Git Commit Hash: 0de8dc3e43ec741eba58047155ce7f3dba8eb4f7
Git Branch: HEAD
UTC Build Time: 2024-05-04 14:52:44
Go Version: go version go1.21.0 linux/amd64
Failpoint Build: true
[Pipeline] }
Cache not saved (binary/pingcap/tiflow/cdc-integration-test/rev-be15534-0de8dc3 already exists)
[Pipeline] // cache
[Pipeline] cache
Cache not restored (no such key found)
[Pipeline] {
[Pipeline] sh
+ cp -r ../third_party_download/bin/bin ../third_party_download/bin/etc ../third_party_download/bin/etcdctl ../third_party_download/bin/go-ycsb ../third_party_download/bin/jq ../third_party_download/bin/lib ../third_party_download/bin/libc++.so.1 ../third_party_download/bin/libc++.so.1.0 ../third_party_download/bin/libc++abi.so.1 ../third_party_download/bin/libc++abi.so.1.0 ../third_party_download/bin/libgmssl.so ../third_party_download/bin/libgmssl.so.3 ../third_party_download/bin/libgmssl.so.3.0 ../third_party_download/bin/libtiflash_proxy.so ../third_party_download/bin/minio ../third_party_download/bin/pd-api-bench ../third_party_download/bin/pd-ctl ../third_party_download/bin/pd-heartbeat-bench ../third_party_download/bin/pd-recover ../third_party_download/bin/pd-server ../third_party_download/bin/pd-tso-bench ../third_party_download/bin/pd-ut ../third_party_download/bin/regions-dump ../third_party_download/bin/share ../third_party_download/bin/stores-dump ../third_party_download/bin/sync_diff_inspector ../third_party_download/bin/tidb-server ../third_party_download/bin/tiflash ../third_party_download/bin/tikv-server ../third_party_download/bin/xprog ./bin/
+ ls -alh ./bin
total 3.0G
drwxr-sr-x.  6 jenkins jenkins  4.0K May  5 12:50 .
drwxr-sr-x. 19 jenkins jenkins  4.0K May  5 12:50 ..
drwxr-sr-x.  2 jenkins jenkins  4.0K May  5 12:50 bin
-rwxr-xr-x.  1 jenkins jenkins  220M May  4 22:57 cdc
-rwxr-xr-x.  1 jenkins jenkins  359M May  4 22:57 cdc.test
-rwxr-xr-x.  1 jenkins jenkins  183M May  4 22:53 cdc_kafka_consumer
-rwxr-xr-x.  1 jenkins jenkins  183M May  4 22:53 cdc_pulsar_consumer
-rwxr-xr-x.  1 jenkins jenkins  182M May  4 22:52 cdc_storage_consumer
drwxr-sr-x.  4 jenkins jenkins  4.0K May  5 12:50 etc
-rwxr-xr-x.  1 jenkins jenkins   17M May  5 12:50 etcdctl
-rwxr-xr-x.  1 jenkins jenkins   44M May  5 12:50 go-ycsb
-rwxr-xr-x.  1 jenkins jenkins  3.8M May  5 12:50 jq
drwxr-sr-x.  3 jenkins jenkins  4.0K May  5 12:50 lib
lrwxrwxrwx.  1 jenkins jenkins    13 May  5 12:50 libc++.so.1 -> libc++.so.1.0
-rwxr-xr-x.  1 jenkins jenkins 1016K May  5 12:50 libc++.so.1.0
lrwxrwxrwx.  1 jenkins jenkins    16 May  5 12:50 libc++abi.so.1 -> libc++abi.so.1.0
-rwxr-xr-x.  1 jenkins jenkins  358K May  5 12:50 libc++abi.so.1.0
lrwxrwxrwx.  1 jenkins jenkins    13 May  5 12:50 libgmssl.so -> libgmssl.so.3
lrwxrwxrwx.  1 jenkins jenkins    15 May  5 12:50 libgmssl.so.3 -> libgmssl.so.3.0
-rwxr-xr-x.  1 jenkins jenkins  2.6M May  5 12:50 libgmssl.so.3.0
-rwxr-xr-x.  1 jenkins jenkins  272M May  5 12:50 libtiflash_proxy.so
-rwxr-xr-x.  1 jenkins jenkins   50M May  5 12:50 minio
-rwxr-xr-x.  1 jenkins jenkins   12M May  4 22:53 oauth2-server
-rwxr-xr-x.  1 jenkins jenkins   37M May  5 12:50 pd-api-bench
-rwxr-xr-x.  1 jenkins jenkins   44M May  5 12:50 pd-ctl
-rwxr-xr-x.  1 jenkins jenkins   36M May  5 12:50 pd-heartbeat-bench
-rwxr-xr-x.  1 jenkins jenkins   32M May  5 12:50 pd-recover
-rwxr-xr-x.  1 jenkins jenkins  106M May  5 12:50 pd-server
-rwxr-xr-x.  1 jenkins jenkins   26M May  5 12:50 pd-tso-bench
-rwxr-xr-x.  1 jenkins jenkins  3.0M May  5 12:50 pd-ut
-rwxr-xr-x.  1 jenkins jenkins   32M May  5 12:50 regions-dump
drwxr-sr-x.  4 jenkins jenkins  4.0K May  5 12:50 share
-rwxr-xr-x.  1 jenkins jenkins   32M May  5 12:50 stores-dump
-rwxr-xr-x.  1 jenkins jenkins  192M May  5 12:50 sync_diff_inspector
-rwxr-xr-x.  1 jenkins jenkins  208M May  5 12:50 tidb-server
-rwxr-xr-x.  1 jenkins jenkins  380M May  5 12:50 tiflash
-rwxr-xr-x.  1 jenkins jenkins  418M May  5 12:50 tikv-server
-rwxr-xr-x.  1 jenkins jenkins  2.0M May  5 12:50 xprog
[Pipeline] }
Cache saved successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 100.93 secs (36891611 bytes/sec)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Tests)
[Pipeline] parallel
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G00')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G01')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G02')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G03')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G04')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G05')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G06')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G07')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G08')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G09')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G10')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G11')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G12')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G13')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G14')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G15')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G16')
[Pipeline] { (Branch: Matrix - TEST_GROUP = 'G17')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G00')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G01')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G02')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G03')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G04')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G05')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G06')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G07')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G08')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G09')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G10')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G11')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G12')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G13')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G14')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G15')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G16')
[Pipeline] stage
[Pipeline] { (Matrix - TEST_GROUP = 'G17')
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
[Pipeline] readTrusted
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] node
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] node
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] node
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] node
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
[Pipeline] {
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg
[Pipeline] node
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] node
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] node
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-mwn3b-m3vz1
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "1b9755866ba8e62dafd792c49975546021c852db"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-mwn3b"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
[Pipeline] {
[Pipeline] checkout
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
The recommended git tool is: git
[Pipeline] {
[Pipeline] node
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-nm9bc-rt7xs
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "c3f5f53c3135e83033a4a251b9cd30d3da0ca39f"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-nm9bc"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
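Note: the pod definition printed above comes from the pod-pull_cdc_integration_kafka_test.yaml file named in the preceding "Obtained ..." line. A minimal sketch of how the Kubernetes plugin's podTemplate step typically consumes such a file follows; the closure body and step arguments are illustrative assumptions, not the actual pipeline source.

    // Sketch only: load the pod YAML obtained above and run steps inside its "golang" container.
    podTemplate(yamlFile: 'pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml') {
        node(POD_LABEL) {                 // POD_LABEL is provided by the Kubernetes plugin inside podTemplate
            container('golang') {
                sh 'go version'           // placeholder step; the real job runs the CDC Kafka integration tests
            }
        }
    }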
Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
[Pipeline] {
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv
[Pipeline] {
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@6639df12; decorates RemoteLauncher[hudson.remoting.Channel@463349eb:JNLP4-connect connection from 10.233.67.125/10.233.67.125:55706] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
[Pipeline] node
[Pipeline] checkout
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
The recommended git tool is: git
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] node
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-c0214-l6jxr
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "7d327f7ab6c6267b4198f5b64e3bdfb48e6391c5"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-c0214"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
[Pipeline] {
[Pipeline] checkout
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@65a7bccc; decorates RemoteLauncher[hudson.remoting.Channel@ef6ef65:JNLP4-connect connection from 10.233.68.143/10.233.68.143:44342] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
The recommended git tool is: git
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] node
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@743d1b5e; decorates RemoteLauncher[hudson.remoting.Channel@2e54348a:JNLP4-connect connection from 10.233.70.211/10.233.70.211:47576] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] node
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k
[Pipeline] withEnv
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] cache
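Note: the step markers above ([Pipeline] stage / timeout / withCredentials / dir / cache) imply a nesting roughly like the sketch below; the credential IDs and shell command are assumptions for illustration only, and the cache step's parameters are omitted because they are not visible in the log.

    // Sketch only: approximate structure of the Test stage as suggested by the step markers.
    stage('Test') {
        timeout(time: 45, unit: 'MINUTES') {
            withCredentials([string(credentialsId: 'ticdc-coveralls-token', variable: 'TICDC_COVERALLS_TOKEN'),
                             string(credentialsId: 'ticdc-codecov-token', variable: 'TICDC_CODECOV_TOKEN')]) {
                dir('tiflow') {
                    // a cache { ... } block (Job Cacher plugin) wraps the build/test steps here
                    sh 'make integration_test_kafka'   // placeholder command
                }
            }
        }
    }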
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-h55pm-r64qd
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-xnvpx-hldq8
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "e02345d62b92337e9ba1db7ade38fbc577901089"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-h55pm"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "00c2fe0ba55cd929fc0e36d9f6de4f5a661471a9"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-xnvpx"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-n0psn-dhkfd
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "1453ac006ec52bebc447006b480c9b522c05b6b9"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-n0psn"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-g1835-c6d67
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "b0dc8030b4524201c3f9cfb818771328a390bf5b"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-g1835"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
Obtained pipelines/pingcap/tiflow/latest/pod-pull_cdc_integration_kafka_test.yaml from git https://github.com/PingCAP-QE/ci.git
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-bxr1t-4vv64
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "cee18b999d36e2c393aaa00dedcef9d0105b6ec6"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-bxr1t"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-vv6pz-hld8v
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "a8db7dec29a9e210660e83576bed5a307ac829b4"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-vv6pz"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7 is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-jpkvb-xk4nj
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "f76b968d4c5736dc31f125f17e4e444d4e44c25a"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-jpkvb"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7 in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-l25q9-4wd1k
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "42129436f5c5838756667ae1067c19199fdc9187"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-l25q9"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Still waiting to schedule task
‘pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85’ is offline
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85 is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-l0fvq-9522h
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "4ff9bc843750250cb9a0eeb7fe6b43996b1d77ff"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-l0fvq"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85 in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 16.48 secs (225893948 bytes/sec)
[Pipeline] {
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] node
[Pipeline] node
[Pipeline] node
[Pipeline] node
[Pipeline] node
[Pipeline] node
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] sh
[Pipeline] {
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] checkout
[Pipeline] checkout
The recommended git tool is: git
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
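
The shell trace above is the broker readiness probe that runs before the test stages start: it checks the ZooKeeper and Kafka ports with nc, then uses ZooKeeper's "dump" four-letter command to confirm that broker id 1 (KAFKA_BROKER_ID in the pod spec above) has registered. A minimal sketch of the same checks follows; the retry loops and sleeps are assumptions, since the console trace only shows the final successful probes.

    #!/usr/bin/env bash
    # Sketch of the readiness probe traced above (retry loops are assumed, not shown in the log).
    set -eu

    echo "Waiting for zookeeper to be ready..."
    until nc -z localhost 2181; do sleep 1; done

    echo "Waiting for kafka to be ready..."
    until nc -z localhost 9092; do sleep 1; done

    echo "Waiting for kafka-broker to be ready..."
    # ZooKeeper's "dump" command lists registered brokers; /brokers/ids/1 matches KAFKA_BROKER_ID=1.
    until echo dump | nc localhost 2181 | grep brokers | awk '{$1=$1;print}' | grep -F -w /brokers/ids/1; do
      sleep 1
    done
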
[Pipeline] checkout
The recommended git tool is: git
[Pipeline] checkout
The recommended git tool is: git
[Pipeline] checkout
The recommended git tool is: git
[Pipeline] checkout
The recommended git tool is: git
[Pipeline] checkout
[Pipeline] checkout
[Pipeline] checkout
The recommended git tool is: git
The recommended git tool is: git
The recommended git tool is: git
[Pipeline] stage
[Pipeline] { (Test)
The recommended git tool is: git
[Pipeline] stage
[Pipeline] { (Test)
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj
[Pipeline] }
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l2kvf-qx8r0
[Pipeline] timeout
Timeout set to expire in 45 min
Created Pod: kubernetes jenkins-tiflow/pingcap-tiflow-pull-cdc-integration-kafka-test-1856-z64vj-z69rl
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] // timeout
[Pipeline] }
[Pipeline] {
[Pipeline] {
[Pipeline] // container
[Pipeline] sh
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@7387ce7b; decorates RemoteLauncher[hudson.remoting.Channel@39c717e3:JNLP4-connect connection from 10.233.108.98/10.233.108.98:38104] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@399e5dc7; decorates RemoteLauncher[hudson.remoting.Channel@13c52f6f:JNLP4-connect connection from 10.233.105.146/10.233.105.146:35836] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@c0b56ce; decorates RemoteLauncher[hudson.remoting.Channel@4184ccce:JNLP4-connect connection from 10.233.107.65/10.233.107.65:57918] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
Cloning repository https://github.com/PingCAP-QE/ci.git
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@163ebf4f; decorates RemoteLauncher[hudson.remoting.Channel@578d2009:JNLP4-connect connection from 10.233.88.206/10.233.88.206:55122] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
[Pipeline] cache
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@8e6678a; decorates RemoteLauncher[hudson.remoting.Channel@574f71e9:JNLP4-connect connection from 10.233.72.69/10.233.72.69:40562] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@4c939c74; decorates RemoteLauncher[hudson.remoting.Channel@7f592e:JNLP4-connect connection from 10.233.106.206/10.233.106.206:40608] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
Cloning repository https://github.com/PingCAP-QE/ci.git
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G00
Run cases: bdr_mode capture_suicide_while_balance_table syncpoint hang_sink_suicide server_config_compatibility changefeed_dup_error_restart kafka_big_messages kafka_compression kafka_messages kafka_sink_error_resume mq_sink_lost_callback mq_sink_dispatcher kafka_column_selector kafka_column_selector_avro debezium lossy_ddl storage_csv_update
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=8be4c628-6a21-4fef-9a1a-e6b6dce05a96
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G00
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-mwn3b
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-mwn3b pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/bdr_mode/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:53:00 CST 2024] <<<<<< run test case bdr_mode success! >>>>>>
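
The case group above is driven by run_group.sh, which receives the sink type and the group name and then runs each listed case's run.sh in turn. A hedged sketch of the invocation pattern visible in the log follows; only the command line itself appears in the console output, so the script's internal grouping logic is an assumption.

    # Sketch only: reproduces the invocation printed earlier in this log.
    cd tiflow
    chmod +x ./tests/integration_tests/run_group.sh
    ./tests/integration_tests/run_group.sh kafka G00   # <sink-type> <case-group>
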
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@73329b75; decorates RemoteLauncher[hudson.remoting.Channel@51e268d5:JNLP4-connect connection from 10.233.123.100/10.233.123.100:56942] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@67932f9; decorates RemoteLauncher[hudson.remoting.Channel@24929784:JNLP4-connect connection from 10.233.66.73/10.233.66.73:60994] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@b6f8817; decorates RemoteLauncher[hudson.remoting.Channel@3acd35eb:JNLP4-connect connection from 10.233.71.248/10.233.71.248:57044] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/capture_suicide_while_balance_table/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:53:03 CST 2024] <<<<<< run test case capture_suicide_while_balance_table success! >>>>>>
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
Avoid second fetch
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-z64vj-z69rl is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-z64vj-9bzjv
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "8aba178bd3f5abb16a083409d7cbcead962c9817"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-z64vj"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-z64vj-z69rl"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-z64vj-z69rl"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-z64vj-z69rl"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-z64vj-z69rl in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-wrhxv-vdd3c
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "71b1051f299a6119ed8aea3640d4335e5e316c73"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-wrhxv"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

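Note: the kafka container above advertises both a PLAINTEXT listener on 127.0.0.1:9092 and an SSL listener on 127.0.0.1:9093, with the JKS keystore and truststore fetched into /tmp by RACK_COMMAND before the broker starts. A minimal check of both listeners from inside the pod might look like the sketch below (hypothetical helper, not part of this pipeline; ports and paths are taken from the pod spec above).

#!/usr/bin/env bash
# Hypothetical listener check; ports and keystore paths come from the pod spec above.
set -euo pipefail

# Plaintext listener: a plain TCP connect is enough.
nc -z 127.0.0.1 9092 && echo "PLAINTEXT listener on 9092 is up"

# SSL listener: complete a TLS handshake against the broker certificate.
echo | openssl s_client -connect 127.0.0.1:9093 >/dev/null 2>&1 \
  && echo "SSL listener on 9093 accepts TLS handshakes"

# RACK_COMMAND should have downloaded the keystore and truststore before Kafka started.
ls -l /tmp/kafka.server.keystore.jks /tmp/kafka.server.truststore.jks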
Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/syncpoint/run.sh using Sink-Type: kafka... <<=================
kafka downstream doesn't support syncpoint record
[Sun May  5 12:53:06 CST 2024] <<<<<< run test case syncpoint success! >>>>>>
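Note: under the kafka sink the syncpoint case is skipped; it prints the message above and exits successfully without starting a cluster. A guard of roughly this shape at the top of the test script would produce that behavior (the SINK_TYPE variable name is an assumption, not read from the script):

# Hypothetical sink-type guard mirroring the skip message above.
if [ "$SINK_TYPE" = "kafka" ]; then
    echo "kafka downstream doesn't support syncpoint record"
    exit 0
fi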
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-11vs6-1tz1l
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "7e6ae3ec97236b910bce1b3e95ebf8e1bc010381"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-11vs6"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l2kvf-qx8r0 is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-l2kvf-pv1x8
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "2701b86f72a1092cb8bda1e2e1ce0c98fb2e2ad8"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-l2kvf"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l2kvf-qx8r0"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l2kvf-qx8r0"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l2kvf-qx8r0"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l2kvf-qx8r0 in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4 is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-rgkc6-k52d8
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "bf8f84f6bdc8d26da4e2d353d86baf31624e636b"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-rgkc6"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4 in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/hang_sink_suicide/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:53:09 CST 2024] <<<<<< run test case hang_sink_suicide success! >>>>>>
Agent pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5 is provisioned from template pingcap_tiflow_pull_cdc_integration_kafka_test_1856-ww4ds-nkwfq
---
apiVersion: "v1"
kind: "Pod"
metadata:
  annotations:
    buildUrl: "http://jenkins.apps.svc.cluster.local:8080/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
    runUrl: "job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/"
  labels:
    jenkins/jenkins-jenkins-agent: "true"
    jenkins/label-digest: "68c3419de50e6ffc2f66656c3f707c5a957f075b"
    jenkins/label: "pingcap_tiflow_pull_cdc_integration_kafka_test_1856-ww4ds"
  name: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5"
  namespace: "jenkins-tiflow"
spec:
  affinity:
    nodeAffinity:
      requiredDuringSchedulingIgnoredDuringExecution:
        nodeSelectorTerms:
        - matchExpressions:
          - key: "kubernetes.io/arch"
            operator: "In"
            values:
            - "amd64"
  containers:
  - image: "wurstmeister/zookeeper"
    imagePullPolicy: "IfNotPresent"
    name: "zookeeper"
    resources:
      limits:
        cpu: "2000m"
        memory: "4Gi"
      requests:
        cpu: "2000m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - args:
    - "cat"
    image: "hub.pingcap.net/jenkins/golang-tini:1.21"
    imagePullPolicy: "Always"
    name: "golang"
    resources:
      limits:
        cpu: "12"
        memory: "32Gi"
      requests:
        cpu: "12"
        memory: "32Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_CREATE_TOPICS"
      value: "big-message-test:1:1"
    - name: "KAFKA_BROKER_ID"
      value: "1"
    - name: "KAFKA_SSL_KEYSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_ZOOKEEPER_CONNECT"
      value: "localhost:2181"
    - name: "KAFKA_MESSAGE_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_REPLICA_FETCH_MAX_BYTES"
      value: "11534336"
    - name: "KAFKA_ADVERTISED_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "ZK"
      value: "zk"
    - name: "KAFKA_SSL_KEYSTORE_LOCATION"
      value: "/tmp/kafka.server.keystore.jks"
    - name: "KAFKA_SSL_KEY_PASSWORD"
      value: "test1234"
    - name: "KAFKA_SSL_TRUSTSTORE_PASSWORD"
      value: "test1234"
    - name: "KAFKA_LISTENERS"
      value: "SSL://127.0.0.1:9093,PLAINTEXT://127.0.0.1:9092"
    - name: "KAFKA_SSL_TRUSTSTORE_LOCATION"
      value: "/tmp/kafka.server.truststore.jks"
    - name: "RACK_COMMAND"
      value: "curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.keystore.jks\
        \ -o /tmp/kafka.server.keystore.jks && curl -sfL https://github.com/pingcap/tiflow/raw/6e62afcfecc4e3965d8818784327d4bf2600d9fa/tests/_certificates/kafka.server.truststore.jks\
        \ -o /tmp/kafka.server.truststore.jks"
    image: "wurstmeister/kafka:2.12-2.4.1"
    imagePullPolicy: "IfNotPresent"
    name: "kafka"
    resources:
      limits:
        cpu: "4000m"
        memory: "6Gi"
      requests:
        cpu: "4000m"
        memory: "6Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "KAFKA_SERVER"
      value: "127.0.0.1:9092"
    - name: "ZOOKEEPER_SERVER"
      value: "127.0.0.1:2181"
    - name: "DOWNSTREAM_DB_HOST"
      value: "127.0.0.1"
    - name: "USE_FLAT_MESSAGE"
      value: "true"
    - name: "DOWNSTREAM_DB_PORT"
      value: "3306"
    - name: "DB_NAME"
      value: "test"
    image: "rustinliu/ticdc-canal-json-adapter:latest"
    imagePullPolicy: "IfNotPresent"
    name: "canal-adapter"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/tmp"
      name: "volume-0"
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/network-multitool"
    name: "net-tool"
    resources:
      limits:
        memory: "128Mi"
        cpu: "100m"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - image: "hub.pingcap.net/jenkins/python3-requests:latest"
    name: "report"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    tty: true
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "MYSQL_ROOT_PASSWORD"
      value: ""
    - name: "MYSQL_USER"
      value: "mysqluser"
    - name: "MYSQL_PASSWORD"
      value: "mysqlpw"
    - name: "MYSQL_ALLOW_EMPTY_PASSWORD"
      value: "yes"
    - name: "MYSQL_TCP_PORT"
      value: "3310"
    image: "quay.io/debezium/example-mysql:2.4"
    imagePullPolicy: "IfNotPresent"
    name: "mysql"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "BOOTSTRAP_SERVERS"
      value: "127.0.0.1:9092"
    - name: "GROUP_ID"
      value: "1"
    - name: "CONFIG_STORAGE_TOPIC"
      value: "my_connect_configs"
    - name: "OFFSET_STORAGE_TOPIC"
      value: "my_connect_offsets"
    - name: "STATUS_STORAGE_TOPIC"
      value: "my_connect_statuses"
    image: "quay.io/debezium/connect:2.4"
    name: "connect"
    resources:
      requests:
        cpu: "200m"
        memory: "4Gi"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  - env:
    - name: "JENKINS_SECRET"
      value: "********"
    - name: "JENKINS_TUNNEL"
      value: "jenkins-agent.apps.svc.cluster.local:50000"
    - name: "JENKINS_AGENT_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5"
    - name: "JENKINS_NAME"
      value: "pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5"
    - name: "JENKINS_AGENT_WORKDIR"
      value: "/home/jenkins/agent"
    - name: "JENKINS_URL"
      value: "http://jenkins.apps.svc.cluster.local:8080/jenkins/"
    image: "jenkins/inbound-agent:3206.vb_15dcf73f6a_9-2"
    name: "jnlp"
    resources:
      requests:
        memory: "256Mi"
        cpu: "100m"
    volumeMounts:
    - mountPath: "/home/jenkins/agent"
      name: "workspace-volume"
      readOnly: false
  restartPolicy: "Never"
  securityContext:
    fsGroup: 1000
  volumes:
  - emptyDir: {}
    name: "volume-0"
  - emptyDir:
      medium: ""
    name: "workspace-volume"

Running on pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5 in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 8.76 secs (425025191 bytes/sec)
[Pipeline] {
[Pipeline] cache
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/server_config_compatibility/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:53:12 CST 2024] <<<<<< run test case server_config_compatibility success! >>>>>>
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/changefeed_dup_error_restart/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:53:16 CST 2024] <<<<<< run test case changefeed_dup_error_restart success! >>>>>>
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_big_messages/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
start tidb cluster in /tmp/tidb_cdc_test/kafka_big_messages
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 16.76 secs (222218480 bytes/sec)
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] sh
[Pipeline] sh
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
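Note: the traced commands above form a three-step readiness probe: a TCP check on ZooKeeper (2181), a TCP check on Kafka (9092), and a ZooKeeper dump command whose output must contain broker id 1. The same checks, wrapped in a retry loop purely for illustration (attempt count and sleep interval are assumptions):

#!/usr/bin/env bash
# Hypothetical retry wrapper around the readiness checks traced above.
wait_for() {
  local desc="$1"; shift
  for _ in $(seq 1 60); do
    if "$@" >/dev/null 2>&1; then
      echo "$desc is ready"
      return 0
    fi
    sleep 1
  done
  echo "timed out waiting for $desc" >&2
  return 1
}

wait_for zookeeper    nc -z localhost 2181
wait_for kafka        nc -z localhost 9092
# Broker registration: ZooKeeper's dump output must list /brokers/ids/1.
wait_for kafka-broker sh -c 'echo dump | nc localhost 2181 | grep -F -w /brokers/ids/1'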
[Pipeline] withEnv
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] }
[Pipeline] checkout
The recommended git tool is: git
[Pipeline] checkout
The recommended git tool is: git
[Pipeline] checkout
[Pipeline] checkout
The recommended git tool is: git
[Pipeline] checkout
The recommended git tool is: git
[Pipeline] checkout
The recommended git tool is: git
The recommended git tool is: git
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c66888c0004	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:1848, start at 2024-05-05 12:53:33.351091656 +0800 CST m=+5.134255306	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:55:33.358 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:53:33.347 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:43:33.347 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c66888c0004	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:1848, start at 2024-05-05 12:53:33.351091656 +0800 CST m=+5.134255306	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:55:33.358 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:53:33.347 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:43:33.347 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c66893c0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:1924, start at 2024-05-05 12:53:33.417002304 +0800 CST m=+5.148543364	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:55:33.425 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:53:33.391 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:43:33.391 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
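Note: the tables above are read from TiDB's mysql.tidb system table while the script waits for the upstream and downstream servers to finish bootstrapping (the earlier ERROR 2003 lines are the retries before the server accepts connections). A query of roughly this shape reproduces such a table (host and port are illustrative; 4000 is TiDB's default port, not read from this log):

# Hypothetical bootstrap check against a local TiDB instance.
mysql -h 127.0.0.1 -P 4000 -u root -e 'SELECT VARIABLE_NAME, VARIABLE_VALUE, COMMENT FROM mysql.tidb;'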
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kafka_big_messages/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kafka_big_messages/tiflash/log/error.log
arg matches is ArgMatches { args: {"config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kafka_big_messages/tiflash-proxy.toml"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kafka_big_messages/tiflash/log/proxy.log"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kafka_big_messages/tiflash/db/proxy"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[Pipeline] }
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@66faa8a3; decorates RemoteLauncher[hudson.remoting.Channel@28071d28:JNLP4-connect connection from 10.233.84.182/10.233.84.182:45522] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
[Pipeline] // timeout
[Pipeline] }
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@3941bfc6; decorates RemoteLauncher[hudson.remoting.Channel@7d3c668e:JNLP4-connect connection from 10.233.97.28/10.233.97.28:54692] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@2a7b1323; decorates RemoteLauncher[hudson.remoting.Channel@2b24016b:JNLP4-connect connection from 10.233.93.28/10.233.93.28:53538] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
Cloning the remote Git repository
Using shallow clone with depth 1
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@24f12cd7; decorates RemoteLauncher[hudson.remoting.Channel@2e3e5dd3:JNLP4-connect connection from 10.233.100.1/10.233.100.1:42098] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
[Pipeline] // timeout
[Pipeline] }
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@5bc5fb1c; decorates RemoteLauncher[hudson.remoting.Channel@70d018cd:JNLP4-connect connection from 10.233.90.127/10.233.90.127:34090] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
[Pipeline] // container
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
No credentials specified
Warning: JENKINS-30600: special launcher org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1@75eb8cbe; decorates RemoteLauncher[hudson.remoting.Channel@3bfcd57e:JNLP4-connect connection from 10.233.86.159/10.233.86.159:51416] will be ignored (a typical symptom is the Git executable not being run inside a designated container)
Cloning the remote Git repository
Using shallow clone with depth 1
[Pipeline] sh
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
Cloning repository https://github.com/PingCAP-QE/ci.git
 > git init /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test # timeout=10
Fetching upstream changes from https://github.com/PingCAP-QE/ci.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.2'
 > git fetch --tags --force --progress --depth=1 -- https://github.com/PingCAP-QE/ci.git +refs/heads/*:refs/remotes/origin/* # timeout=5
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G01
Run cases: http_api http_api_tls api_v2 http_api_tls_with_user_auth cli_tls_with_auth kafka_simple_basic kafka_simple_basic_avro kafka_simple_handle_key_only kafka_simple_handle_key_only_avro kafka_simple_claim_check kafka_simple_claim_check_avro canal_json_adapter_compatibility canal_json_basic canal_json_content_compatible multi_topics avro_basic canal_json_handle_key_only open_protocol_handle_key_only canal_json_claim_check open_protocol_claim_check canal_json_storage_basic canal_json_storage_partition_table multi_tables_ddl
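Note: run_group.sh appears to expand the group name (G01 here) into the case list printed above and then runs each case against the kafka sink. A minimal sketch of such a dispatch loop (variable names and the run.sh argument are assumptions, not read from run_group.sh):

#!/usr/bin/env bash
# Hypothetical group dispatcher; only the case paths and the sink type come from this log.
set -euo pipefail

sink_type="kafka"
cases="http_api http_api_tls api_v2"   # first few cases of the G01 list above, for illustration

for c in $cases; do
  script="tests/integration_tests/$c/run.sh"
  echo "=================>> Running test $script using Sink-Type: $sink_type... <<================="
  bash "$script" "$sink_type"   # exactly how the sink type is passed is an assumption
done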
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=5d45a3b5-d204-499e-94ba-9fca5293b21f
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G01
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-nm9bc
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-nm9bc pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/http_api/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:53:35 CST 2024] <<<<<< run test case http_api success! >>>>>>
[Pipeline] // container
[Pipeline] sh
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G02
Run cases: consistent_replicate_ddl consistent_replicate_gbk consistent_replicate_nfs consistent_replicate_storage_file consistent_replicate_storage_file_large_value consistent_replicate_storage_s3 consistent_partition_table kafka_big_messages_v2 multi_tables_ddl_v2 multi_topics_v2 storage_cleanup csv_storage_basic csv_storage_multi_tables_ddl csv_storage_partition_table
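The trace above wipes /tmp/tidb_cdc_test and hands one named case group to run_group.sh, which maps the group label (G02 here) to the case list printed on the "Run cases:" line. A minimal sketch of that dispatch pattern, assuming a hypothetical group table; the real mapping and helpers live in tests/integration_tests/run_group.sh:

  #!/usr/bin/env bash
  # Sketch only: the real group lists and error handling live in run_group.sh.
  set -euo pipefail
  sink_type="$1"   # e.g. kafka
  group="$2"       # e.g. G02
  # Hypothetical mapping for illustration; the actual lists differ.
  declare -A groups=(
    [G01]="http_api http_api_tls"
    [G02]="consistent_replicate_ddl kafka_big_messages_v2 multi_topics_v2"
  )
  rm -rf /tmp/tidb_cdc_test && mkdir -p /tmp/tidb_cdc_test
  echo "Run cases: ${groups[$group]}"
  for tc in ${groups[$group]}; do
    echo "=================>> Running test $tc using Sink-Type: $sink_type... <<================="
    bash "tests/integration_tests/$tc/run.sh" "$sink_type"
  done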
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=ac126590-5d64-42ff-be08-30a68cc0dfd3
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G02
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-c0214
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj pingcap_tiflow_pull_cdc_integration_kafka_test_1856-c0214
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/consistent_replicate_ddl/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:53:36 CST 2024] <<<<<< run test case consistent_replicate_ddl success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)

[Pipeline] cache
[Sun May  5 12:53:36 CST 2024] <<<<<< START cdc server in kafka_big_messages case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_big_messages.33333335.out server --log-file /tmp/tidb_cdc_test/kafka_big_messages/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kafka_big_messages/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/http_api_tls/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:53:39 CST 2024] <<<<<< run test case http_api_tls success! >>>>>>
Avoid second fetch
Checking out Revision 03312178c534dce949face80c69812d989e55009 (origin/main)
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:53:39 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/aa7a1556-7ce7-4089-8586-2e377e4264a4
	{"id":"aa7a1556-7ce7-4089-8586-2e377e4264a4","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884817}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47197821cb
	aa7a1556-7ce7-4089-8586-2e377e4264a4

/tidb/cdc/default/default/upstream/7365374147806527640
	{"id":7365374147806527640,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/aa7a1556-7ce7-4089-8586-2e377e4264a4
	{"id":"aa7a1556-7ce7-4089-8586-2e377e4264a4","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884817}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47197821cb
	aa7a1556-7ce7-4089-8586-2e377e4264a4

/tidb/cdc/default/default/upstream/7365374147806527640
	{"id":7365374147806527640,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/aa7a1556-7ce7-4089-8586-2e377e4264a4
	{"id":"aa7a1556-7ce7-4089-8586-2e377e4264a4","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884817}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47197821cb
	aa7a1556-7ce7-4089-8586-2e377e4264a4

/tidb/cdc/default/default/upstream/7365374147806527640
	{"id":7365374147806527640,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
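The "+ curl ... /debug/info" trace above is the CDC server readiness probe: it polls the status endpoint up to 50 times, three seconds apart, and breaks out once the response contains "etcd info", i.e. once the capture has registered itself in etcd. A hedged reconstruction of that loop, simplified from the trace (failure handling is elided where the trace does not show it):

  # Reconstruction of the readiness loop shown in the set -x trace above.
  for ((i = 0; i <= 50; i++)); do
    res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
    echo "$res" | grep -q 'failed to get info:' && { echo "cdc server reported an error"; exit 1; }
    echo "$res" | grep -q 'etcd info' && break      # capture key is visible in etcd: server is ready
    [ "$i" -eq 50 ] && { echo "cdc server failed to start in time"; exit 1; }
    sleep 3
  done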
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/consistent_replicate_gbk/run.sh using Sink-Type: kafka... <<=================
* About to connect() to 127.0.0.1 port 24927 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:24927; Connection refused
* Closing connection 0
Commit message: "fix(br): use failpoint tidb-server instead (#2951)"
Create changefeed successfully!
ID: bc669e8c-e487-4e38-82c9-240e7ebb4629
Info: {"upstream_id":7365374147806527640,"namespace":"default","id":"bc669e8c-e487-4e38-82c9-240e7ebb4629","sink_uri":"kafka://127.0.0.1:9092/big-message-test?protocol=open-protocol\u0026partition-num=1\u0026kafka-version=2.4.1\u0026max-message-bytes=12582912","create_time":"2024-05-05T12:53:39.918615072+08:00","start_ts":449546765362003969,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546765362003969,"checkpoint_ts":449546765362003969,"checkpoint_time":"2024-05-05 12:53:36.597"}
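The sink_uri recorded in the changefeed info above carries the Kafka sink configuration as query parameters: protocol=open-protocol selects the encoder, partition-num=1 and kafka-version=2.4.1 configure the producer, and max-message-bytes=12582912 caps a single Kafka message at 12 MiB. A hedged example of creating a similar changefeed by hand; the exact cdc cli flags vary by TiCDC version, and --server/--changefeed-id here are illustrative, not what the test actually passed:

  # Illustrative only; the URI values are copied from the log above, not a recommendation.
  cdc cli changefeed create \
    --server=http://127.0.0.1:8300 \
    --changefeed-id="big-message-test" \
    --sink-uri="kafka://127.0.0.1:9092/big-message-test?protocol=open-protocol&partition-num=1&kafka-version=2.4.1&max-message-bytes=12582912"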
[Sun May  5 12:53:39 CST 2024] <<<<<< START kafka consumer in kafka_big_messages case >>>>>>
Starting to generate kafka big messages...
go: downloading github.com/pingcap/errors v0.11.5-0.20240318064555-6bd07397691f
go: downloading go.uber.org/atomic v1.11.0
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10
 > git config remote.origin.url https://github.com/PingCAP-QE/ci.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git rev-parse origin/main^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 03312178c534dce949face80c69812d989e55009 # timeout=10

 You are running an older version of MinIO released 3 years ago 
 Update: Run `mc admin update` 


Attempting encryption of all config, IAM users and policies on MinIO backend
Endpoint:  http://127.0.0.1:24927

Object API (Amazon S3 compatible):
   Go:         https://docs.min.io/docs/golang-client-quickstart-guide
   Java:       https://docs.min.io/docs/java-client-quickstart-guide
   Python:     https://docs.min.io/docs/python-client-quickstart-guide
   JavaScript: https://docs.min.io/docs/javascript-client-quickstart-guide
   .NET:       https://docs.min.io/docs/dotnet-client-quickstart-guide
* About to connect() to 127.0.0.1 port 24927 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 24927 (#0)
> GET / HTTP/1.1
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:24927
> Accept: */*
> 
< HTTP/1.1 403 Forbidden
< Accept-Ranges: bytes
< Content-Length: 226
< Content-Security-Policy: block-all-mixed-content
< Content-Type: application/xml
< Server: MinIO/RELEASE.2020-07-27T18-37-02Z
< Vary: Origin
< X-Amz-Request-Id: 17CC7EA8D043DA98
< X-Xss-Protection: 1; mode=block
< Date: Sun, 05 May 2024 04:53:41 GMT
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
Bucket 's3://logbucket/' created
[Sun May  5 12:53:41 CST 2024] <<<<<< run test case consistent_replicate_gbk success! >>>>>>
Exiting on signal: INTERRUPT
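The MinIO banner, the 403 probe and the bucket creation above come from the redo/consistent-replication cases, which stand up a throwaway local S3 endpoint before running. A rough, hedged sketch of that preparation; only the endpoint address is taken from the log, while the credentials, data directory and the use of s3cmd are illustrative assumptions (the real helpers live under tests/integration_tests/_utils):

  # Hypothetical credentials and paths; only the address matches the log above.
  export MINIO_ACCESS_KEY=minio MINIO_SECRET_KEY=minio123
  minio server --address 127.0.0.1:24927 /tmp/minio-data &
  # A 403 response still proves the endpoint is up; only "connection refused" keeps the loop going.
  until curl -s -o /dev/null http://127.0.0.1:24927; do sleep 1; done
  s3cmd --access_key=minio --secret_key=minio123 \
        --host=127.0.0.1:24927 --host-bucket=127.0.0.1:24927 --no-ssl \
        mb s3://logbucket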
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/api_v2/run.sh using Sink-Type: kafka... <<=================
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/consistent_replicate_nfs/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:53:44 CST 2024] <<<<<< run test case consistent_replicate_nfs success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)

table kafka_big_messages.test exists
check diff failed 1-th time, retry later
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/http_api_tls_with_user_auth/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:53:45 CST 2024] <<<<<< run test case http_api_tls_with_user_auth success! >>>>>>
check diff failed 2-th time, retry later
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/consistent_replicate_storage_file/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:53:47 CST 2024] <<<<<< run test case consistent_replicate_storage_file success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)

find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/cli_tls_with_auth/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
start tidb cluster in /tmp/tidb_cdc_test/cli_tls_with_auth
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:53:51 CST 2024] <<<<<< run test case kafka_big_messages success! >>>>>>
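The "wait process cdc.test exit ..." lines above are the per-case teardown, which repeatedly asks the cdc.test process to stop and polls until it has actually exited ("cdc.test: no process found" is killall reporting that nothing is left). A hedged reconstruction; the real helper name and retry limits in tests/integration_tests/_utils may differ:

  # Sketch of the teardown loop; the limit and helper structure are assumptions.
  for i in $(seq 1 60); do
    echo "wait process cdc.test exit for $i-th time..."
    killall cdc.test || true          # prints "cdc.test: no process found" once nothing is left
    if ! pgrep -x cdc.test >/dev/null; then
      echo "process cdc.test already exit"
      break
    fi
    sleep 1
  done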
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/consistent_replicate_storage_file_large_value/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:53:51 CST 2024] <<<<<< run test case consistent_replicate_storage_file_large_value success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)

Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 15.11 secs (246404859 bytes/sec)
[Pipeline] {
[Pipeline] cache
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/consistent_replicate_storage_s3/run.sh using Sink-Type: kafka... <<=================
* About to connect() to 127.0.0.1 port 24927 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:24927; Connection refused
* Closing connection 0

 You are running an older version of MinIO released 3 years ago 
 Update: Run `mc admin update` 


Attempting encryption of all config, IAM users and policies on MinIO backend
Endpoint:  http://127.0.0.1:24927

Object API (Amazon S3 compatible):
   Go:         https://docs.min.io/docs/golang-client-quickstart-guide
   Java:       https://docs.min.io/docs/java-client-quickstart-guide
   Python:     https://docs.min.io/docs/python-client-quickstart-guide
   JavaScript: https://docs.min.io/docs/javascript-client-quickstart-guide
   .NET:       https://docs.min.io/docs/dotnet-client-quickstart-guide
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
* About to connect() to 127.0.0.1 port 24927 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 24927 (#0)
> GET / HTTP/1.1
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:24927
> Accept: */*
> 
< HTTP/1.1 403 Forbidden
< Accept-Ranges: bytes
< Content-Length: 226
< Content-Security-Policy: block-all-mixed-content
< Content-Type: application/xml
< Server: MinIO/RELEASE.2020-07-27T18-37-02Z
< Vary: Origin
< X-Amz-Request-Id: 17CC7EAC32539F81
< X-Xss-Protection: 1; mode=block
< Date: Sun, 05 May 2024 04:53:56 GMT
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
Bucket 's3://logbucket/' created
[Sun May  5 12:53:56 CST 2024] <<<<<< run test case consistent_replicate_storage_s3 success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)

Exiting on signal: INTERRUPT
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/consistent_partition_table/run.sh using Sink-Type: kafka... <<=================
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[Sun May  5 12:53:59 CST 2024] <<<<<< run test case consistent_partition_table success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)

ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_big_messages_v2/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_compression/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6856040003	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:1776, start at 2024-05-05 12:54:02.882616116 +0800 CST m=+5.143878713	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:02.889 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:02.881 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:02.881 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6856040003	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:1776, start at 2024-05-05 12:54:02.882616116 +0800 CST m=+5.143878713	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:02.889 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:02.881 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:02.881 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6855d40015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:1865, start at 2024-05-05 12:54:02.901215235 +0800 CST m=+5.110927372	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:02.908 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:02.869 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:02.869 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
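The VARIABLE_NAME / VARIABLE_VALUE / COMMENT tables above are dumps of the mysql.tidb system table; the harness keeps re-querying each TiDB instance until the query succeeds and the bootstrapped flag plus the tikv_gc_* rows show that the server and its GC worker are up (the ERROR 2003 lines are earlier rounds of the same loop, issued before the tidb-server port was accepting connections). A hedged example of such a check, assuming the default 4000 port and an empty root password; the harness uses its own per-test ports:

  # Assumed port/credentials; the dump above corresponds to a SELECT over mysql.tidb.
  while ! mysql -h 127.0.0.1 -P 4000 -u root -e \
      "SELECT VARIABLE_VALUE FROM mysql.tidb WHERE VARIABLE_NAME = 'bootstrapped'" 2>/dev/null \
      | grep -q True; do
    sleep 1
  done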
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/cli_tls_with_auth/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/cli_tls_with_auth/tiflash/log/error.log
arg matches is ArgMatches { args: {"data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/cli_tls_with_auth/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/cli_tls_with_auth/tiflash-proxy.toml"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/cli_tls_with_auth/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
The 1st attempt to start the TLS tidb cluster...
start tidb cluster in /tmp/tidb_cdc_test/cli_tls_with_auth
Starting TLS PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
start tidb cluster in /tmp/tidb_cdc_test/kafka_compression
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
start tidb cluster in /tmp/tidb_cdc_test/kafka_big_messages_v2
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Starting TLS TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting TLS TiDB...
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying TLS TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c690c180005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:3568, start at 2024-05-05 12:54:14.539455179 +0800 CST m=+5.150721382	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:14.546 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:14.534 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:14.534 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
+ pd_host=127.0.0.1
+ pd_port=2579
+ is_tls=true
+ '[' true == true ']'
++ run_cdc_cli tso query --pd=https://127.0.0.1:2579
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.cli.3640.out cli tso query --pd=https://127.0.0.1:2579
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c690a300017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj, pid:2171, start at 2024-05-05 12:54:14.453952845 +0800 CST m=+5.094175622	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:14.461 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:14.462 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:14.462 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c690a300017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj, pid:2171, start at 2024-05-05 12:54:14.453952845 +0800 CST m=+5.094175622	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:14.461 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:14.462 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:14.462 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c690c64000f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj, pid:2251, start at 2024-05-05 12:54:14.56781204 +0800 CST m=+5.154189964	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:14.576 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:14.553 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:14.553 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kafka_big_messages_v2/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kafka_big_messages_v2/tiflash/log/error.log
arg matches is ArgMatches { args: {"advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kafka_big_messages_v2/tiflash/db/proxy"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kafka_big_messages_v2/tiflash-proxy.toml"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kafka_big_messages_v2/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 19.17 secs (194230021 bytes/sec)
[Pipeline] {
[Pipeline] cache
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
+ tso='449546775660855297
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546775660855297 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
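The cli tso query above returns the current TSO followed by the Go test binary's PASS/coverage footer, so the script keeps only the first whitespace-separated field as the changefeed start-ts. A hedged restatement of the traced pipeline:

  # Reconstruction of the trace above; the coverprofile path is shortened.
  tso=$(cdc.test -test.coverprofile=/tmp/cov.out cli tso query --pd=https://127.0.0.1:2579)
  start_ts=$(echo $tso | awk -F ' ' '{print $1}')   # unquoted on purpose: flattens the PASS/coverage lines
  echo "start-ts: $start_ts"                        # e.g. 449546775660855297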
[Sun May  5 12:54:17 CST 2024] <<<<<< START cdc server in cli_tls_with_auth case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates ']'
+ curl_status_cmd='curl --cacert /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem --cert /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client.pem --key /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client-key.pem --user ticdc:ticdc_secret -vsL --max-time 20 https://127.0.0.1:8300/debug/info'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.36883690.out server --log-file /tmp/tidb_cdc_test/cli_tls_with_auth/cdc_cli_tls_with_auth_tls1.log --log-level debug --data-dir /tmp/tidb_cdc_test/cli_tls_with_auth/cdc_data_cli_tls_with_auth_tls1 --cluster-id default --config /tmp/tidb_cdc_test/cli_tls_with_auth/server.toml --ca /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem --cert /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/server.pem --key /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/server-key.pem --cert-allowed-cn client --addr 127.0.0.1:8300 --pd https://127.0.0.1:2579
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl --cacert /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem --cert /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client.pem --key /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client-key.pem --user ticdc:ticdc_secret -vsL --max-time 20 https://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
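Because this cluster is TLS-enabled (--pd https://127.0.0.1:2579 plus the --ca/--cert/--key flags on the server command above), the readiness probe has to present a client certificate as well as the ticdc basic-auth credential. The same probe, restated with the long workspace prefix factored into a variable:

  # Same probe as in the trace above, with the certificate directory factored out.
  CERT_DIR=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates
  curl --cacert "$CERT_DIR/ca.pem" \
       --cert   "$CERT_DIR/client.pem" \
       --key    "$CERT_DIR/client-key.pem" \
       --user   ticdc:ticdc_secret \
       -vsL --max-time 20 https://127.0.0.1:8300/debug/info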
[Sun May  5 12:54:17 CST 2024] <<<<<< START cdc server in kafka_big_messages_v2 case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_big_messages_v2.36303632.out server --log-file /tmp/tidb_cdc_test/kafka_big_messages_v2/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kafka_big_messages_v2/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c692300001a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:5755, start at 2024-05-05 12:54:16.049926366 +0800 CST m=+5.171081567	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:16.057 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:16.049 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:16.049 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c692300001a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:5755, start at 2024-05-05 12:54:16.049926366 +0800 CST m=+5.171081567	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:16.057 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:16.049 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:16.049 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6923c80006	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:5832, start at 2024-05-05 12:54:16.056672531 +0800 CST m=+5.132806706	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:16.063 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:16.050 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:16.050 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kafka_compression/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kafka_compression/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kafka_compression/tiflash/db/proxy"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kafka_compression/tiflash/log/proxy.log"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kafka_compression/tiflash-proxy.toml"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ (( i++ ))
+ (( i <= 50 ))
++ curl --cacert /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem --cert /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client.pem --key /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client-key.pem --user ticdc:ticdc_secret -vsL --max-time 20 https://127.0.0.1:8300/debug/info
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
*   CAfile: /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem
  CApath: none
* NSS: client certificate from file
* 	subject: CN=client
* 	start date: Feb 18 07:48:00 2020 GMT
* 	expire date: Jan 25 07:48:00 2120 GMT
* 	common name: client
* 	issuer: CN=My own CA,O=PingCAP,L=Beijing,ST=Beijing,C=CN
* SSL connection using TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
* Server certificate:
* 	subject: CN=tidb-server
* 	start date: Feb 18 09:11:00 2020 GMT
* 	expire date: Jan 25 09:11:00 2120 GMT
* 	common name: tidb-server
* 	issuer: CN=My own CA,O=PingCAP,L=Beijing,ST=Beijing,C=CN
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:54:20 GMT
< Content-Length: 1233
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/495ff2f7-70fa-4a30-a528-faa8bc5277dc
	{"id":"495ff2f7-70fa-4a30-a528-faa8bc5277dc","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884857}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/1f5d8f471a222983
	495ff2f7-70fa-4a30-a528-faa8bc5277dc

/tidb/cdc/default/default/upstream/7365374333842603812
	{"id":7365374333842603812,"pd-endpoints":"https://127.0.0.1:2579,https://127.0.0.1:2579","key-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/server-key.pem","cert-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/server.pem","ca-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem","cert-allowed-cn":["client","tidb-server"]}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/495ff2f7-70fa-4a30-a528-faa8bc5277dc
	{"id":"495ff2f7-70fa-4a30-a528-faa8bc5277dc","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884857}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/1f5d8f471a222983
	495ff2f7-70fa-4a30-a528-faa8bc5277dc

/tidb/cdc/default/default/upstream/7365374333842603812
	{"id":7365374333842603812,"pd-endpoints":"https://127.0.0.1:2579,https://127.0.0.1:2579","key-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/server-key.pem","cert-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/server.pem","ca-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem","cert-allowed-cn":["client","tidb-server"]}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/495ff2f7-70fa-4a30-a528-faa8bc5277dc
	{"id":"495ff2f7-70fa-4a30-a528-faa8bc5277dc","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884857}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/1f5d8f471a222983
	495ff2f7-70fa-4a30-a528-faa8bc5277dc

/tidb/cdc/default/default/upstream/7365374333842603812
	{"id":7365374333842603812,"pd-endpoints":"https://127.0.0.1:2579,https://127.0.0.1:2579","key-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/server-key.pem","cert-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/server.pem","ca-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem","cert-allowed-cn":["client","tidb-server"]}'
+ grep -q 'etcd info'
+ break
+ set +x
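
The trace just above is the test helper polling the capture's /debug/info endpoint until the response carries the "etcd info" section (and bailing out early if it ever reports "failed to get info:"). A minimal sketch of that wait loop, assuming the same endpoint, basic-auth credentials and marker strings seen in the trace; the real helper in the tiflow test scripts may differ in detail, and the TLS variant additionally passes the --cacert/--cert/--key flags shown earlier:

# Poll the CDC debug endpoint until the capture has registered itself in etcd.
i=0
while (( i <= 50 )); do
    res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
    # "failed to get info:" means the server answered but is unhealthy.
    if echo "$res" | grep -q 'failed to get info:'; then
        echo "debug/info reported an error"; exit 1
    fi
    # The "etcd info" section only appears once the capture key exists, so we are ready.
    if echo "$res" | grep -q 'etcd info'; then
        break
    fi
    if (( i == 50 )); then
        echo "CDC server failed to become ready"; exit 1
    fi
    sleep 3
    i=$((i + 1))
done
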
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:54:20 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/81b8268f-ba0e-479d-85d6-5134048ab5c5
	{"id":"81b8268f-ba0e-479d-85d6-5134048ab5c5","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884858}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471a20bdc7
	81b8268f-ba0e-479d-85d6-5134048ab5c5

/tidb/cdc/default/default/upstream/7365374333539404996
	{"id":7365374333539404996,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/81b8268f-ba0e-479d-85d6-5134048ab5c5
	{"id":"81b8268f-ba0e-479d-85d6-5134048ab5c5","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884858}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471a20bdc7
	81b8268f-ba0e-479d-85d6-5134048ab5c5

/tidb/cdc/default/default/upstream/7365374333539404996
	{"id":7365374333539404996,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/81b8268f-ba0e-479d-85d6-5134048ab5c5
	{"id":"81b8268f-ba0e-479d-85d6-5134048ab5c5","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884858}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471a20bdc7
	81b8268f-ba0e-479d-85d6-5134048ab5c5

/tidb/cdc/default/default/upstream/7365374333539404996
	{"id":7365374333539404996,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
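
The /tidb/cdc/... keys echoed above are CDC's metadata, stored in the etcd embedded in PD (the upstream PD for this capture is http://127.0.0.1:2379, per the dumped upstream info). The test only reads them through /debug/info, but the same keys can be browsed with a plain etcd v3 client; a sketch assuming an etcdctl binary is available on the node:

# List the CDC metadata keys under the default cluster/namespace directly from PD's embedded etcd.
ETCDCTL_API=3 etcdctl --endpoints=http://127.0.0.1:2379 \
    get /tidb/cdc/default/ --prefix --keys-only
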
Create changefeed successfully!
ID: 58e4e564-6a91-4580-8582-ca6d9cfcd472
Info: {"upstream_id":7365374333539404996,"namespace":"default","id":"58e4e564-6a91-4580-8582-ca6d9cfcd472","sink_uri":"kafka://127.0.0.1:9092/big-message-test?protocol=open-protocol\u0026partition-num=1\u0026kafka-version=2.4.1\u0026max-message-bytes=12582912","create_time":"2024-05-05T12:54:21.035551893+08:00","start_ts":449546776140054529,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":true,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546776140054529,"checkpoint_ts":449546776140054529,"checkpoint_time":"2024-05-05 12:54:17.712"}
[Sun May  5 12:54:21 CST 2024] <<<<<< START kafka consumer in kafka_big_messages_v2 case >>>>>>
Starting generate kafka big messages...
go: downloading github.com/pingcap/errors v0.11.5-0.20240318064555-6bd07397691f
go: downloading go.uber.org/atomic v1.11.0
[Sun May  5 12:54:21 CST 2024] <<<<<< START cdc server in kafka_compression case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.72327234.out server --log-file /tmp/tidb_cdc_test/kafka_compression/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kafka_compression/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.cli.3753.out cli changefeed create --start-ts=449546775660855297 '--sink-uri=kafka://127.0.0.1:9092/ticdc-cli-test-22421?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' --tz=Asia/Shanghai -c=custom-changefeed-name
[WARN] --tz is deprecated in changefeed settings.
Create changefeed successfully!
ID: custom-changefeed-name
Info: {"upstream_id":7365374333842603812,"namespace":"default","id":"custom-changefeed-name","sink_uri":"kafka://127.0.0.1:9092/ticdc-cli-test-22421?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:54:23.421261072+08:00","start_ts":449546775660855297,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546775660855297,"checkpoint_ts":449546775660855297,"checkpoint_time":"2024-05-05 12:54:15.884"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
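
The create command above packs every Kafka sink option into the --sink-uri query string. A small sketch that assembles the same URI piecewise, purely so the individual parameters (topic, protocol, partition count, broker protocol version, message size cap) are easier to read; the values are the ones from this trace and the variable names are illustrative only:

# Assemble the sink URI used for custom-changefeed-name above.
topic="ticdc-cli-test-22421"
params="protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760"
sink_uri="kafka://127.0.0.1:9092/${topic}?${params}"
cdc cli changefeed create --start-ts=449546775660855297 \
    "--sink-uri=${sink_uri}" -c=custom-changefeed-name
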
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:54:24 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/6db82da7-1cb4-4c1f-8dc5-db619fb09cac
	{"id":"6db82da7-1cb4-4c1f-8dc5-db619fb09cac","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884861}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471a22a6cd
	6db82da7-1cb4-4c1f-8dc5-db619fb09cac

/tidb/cdc/default/default/upstream/7365374330422267995
	{"id":7365374330422267995,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/6db82da7-1cb4-4c1f-8dc5-db619fb09cac
	{"id":"6db82da7-1cb4-4c1f-8dc5-db619fb09cac","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884861}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471a22a6cd
	6db82da7-1cb4-4c1f-8dc5-db619fb09cac

/tidb/cdc/default/default/upstream/7365374330422267995
	{"id":7365374330422267995,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/6db82da7-1cb4-4c1f-8dc5-db619fb09cac
	{"id":"6db82da7-1cb4-4c1f-8dc5-db619fb09cac","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884861}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471a22a6cd
	6db82da7-1cb4-4c1f-8dc5-db619fb09cac

/tidb/cdc/default/default/upstream/7365374330422267995
	{"id":7365374330422267995,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.7290.out cli tso query --pd=http://127.0.0.1:2379
+ set +x
[Sun May  5 12:54:24 CST 2024] <<<<<< START kafka consumer in cli_tls_with_auth case >>>>>>
table test.simple not exists for 1-th check, retry later
+ set +x
+ tso='449546777945702403
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546777945702403 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
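
The block above captures the `cdc cli tso query` output into a variable; because the test binary is coverage-instrumented, the output also contains "PASS" and a coverage line, so the script echoes it unquoted and keeps only the first field with awk. A minimal sketch of the same extraction, assuming that output shape:

# Query the current TSO from PD and strip the coverage footer printed by the instrumented binary.
tso_output=$(cdc cli tso query --pd=http://127.0.0.1:2379)
# Unquoted expansion collapses the newlines; awk then keeps only the leading TSO value.
start_ts=$(echo $tso_output | awk -F ' ' '{print $1}')
echo "start-ts: $start_ts"
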
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.7325.out cli changefeed create --start-ts=449546777945702403 '--sink-uri=kafka://127.0.0.1:9092/ticdc-kafka-compression-gzip-test?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1&compression=gzip' -c gzip
Create changefeed successfully!
ID: gzip
Info: {"upstream_id":7365374330422267995,"namespace":"default","id":"gzip","sink_uri":"kafka://127.0.0.1:9092/ticdc-kafka-compression-gzip-test?protocol=canal-json\u0026enable-tidb-extension=true\u0026kafka-version=2.4.1\u0026compression=gzip","create_time":"2024-05-05T12:54:26.539522268+08:00","start_ts":449546777945702403,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546777945702403,"checkpoint_ts":449546777945702403,"checkpoint_time":"2024-05-05 12:54:24.600"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
table test.simple not exists for 2-th check, retry later
table kafka_big_messages.test exists
check diff failed 1-th time, retry later
+ set +x
[Sun May  5 12:54:27 CST 2024] <<<<<< START kafka consumer in kafka_compression case >>>>>>
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_compression/run.sh: line 22: [[: [2024/05/05 12:54:26.494 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses gzip compression algorithm"]
[2024/05/05 12:54:26.534 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses gzip compression algorithm"]
[2024/05/05 12:54:26.655 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses gzip compression algorithm"]
[2024/05/05 12:54:26.665 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses gzip compression algorithm"]
[2024/05/05 12:54:27.630 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses gzip compression algorithm"]
[2024/05/05 12:54:27.640 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses gzip compression algorithm"]: syntax error: operand expected (error token is "[2024/05/05 12:54:26.494 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses gzip compression algorithm"]
[2024/05/05 12:54:26.534 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses gzip compression algorithm"]
[2024/05/05 12:54:26.655 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses gzip compression algorithm"]
[2024/05/05 12:54:26.665 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses gzip compression algorithm"]
[2024/05/05 12:54:27.630 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses gzip compression algorithm"]
[2024/05/05 12:54:27.640 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses gzip compression algorithm"]")
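
The "[[: ... syntax error: operand expected" above points at kafka_compression/run.sh line 22: the captured sarama log lines appear to be used directly as a numeric operand inside [[ ... ]], which bash cannot evaluate arithmetically (the case still passes, since the gzip changefeed and the finish-mark check below succeed). A guess at a more robust version of that check, counting matches instead of comparing raw text; this is a sketch, not the actual tiflow script:

# Verify the producer really switched to the requested compression algorithm by counting log hits.
algorithm="gzip"
cdc_log="/tmp/tidb_cdc_test/kafka_compression/cdc.log"
hits=$(grep -c "Kafka producer uses $algorithm compression algorithm" "$cdc_log" || true)
if [[ "$hits" -lt 1 ]]; then
    echo "compression algorithm $algorithm was not used"
    exit 1
fi
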
table test.gzip_finish_mark not exists for 1-th check, retry later
table test.simple exists
table test.`simple-dash` exists
+ endpoints=https://127.0.0.1:2579
+ changefeed_id=custom-changefeed-name
+ expected_state=normal
+ error_msg=null
+ tls_dir=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates
+ [[ https://127.0.0.1:2579 =~ https ]]
++ cdc cli changefeed query --ca=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem --cert=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client.pem --key=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client-key.pem --pd=https://127.0.0.1:2579 -c custom-changefeed-name -s
+ info='{
  "upstream_id": 7365374333842603812,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546776067178522,
  "checkpoint_time": "2024-05-05 12:54:17.434",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374333842603812,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546776067178522,
  "checkpoint_time": "2024-05-05 12:54:17.434",
  "error": null
}'
{
  "upstream_id": 7365374333842603812,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546776067178522,
  "checkpoint_time": "2024-05-05 12:54:17.434",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374333842603812, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"normal",' '"checkpoint_tso":' 449546776067178522, '"checkpoint_time":' '"2024-05-05' '12:54:17.434",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365374333842603812, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"normal",' '"checkpoint_tso":' 449546776067178522, '"checkpoint_time":' '"2024-05-05' '12:54:17.434",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
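
The lines above are the changefeed status assertion: query the changefeed through the TLS-enabled PD, pull .state and .error.message out with jq, and fail if they do not match the expected values. A compact sketch of that assertion with the expectations from this run (the --ca/--cert/--key flags from the trace are omitted here for brevity):

# Assert the changefeed is in the expected state and carries no error message.
expected_state="normal"
expected_error="null"
info=$(cdc cli changefeed query --pd=https://127.0.0.1:2579 -c custom-changefeed-name -s)
state=$(echo "$info" | jq -r '.state')
message=$(echo "$info" | jq -r '.error.message')
if [[ ! "$state" == "$expected_state" ]]; then
    echo "expected state $expected_state, got $state"; exit 1
fi
if [[ ! "$message" =~ $expected_error ]]; then
    echo "expected error matching $expected_error, got $message"; exit 1
fi
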
changefeed count 1 check pass, pd_addr: https://127.0.0.1:2579
check diff failed 2-th time, retry later
Error: [CDC:ErrChangefeedUpdateRefused]changefeed update error: can only update changefeed config when it is stopped or failed
update changefeed config should fail when changefeed is running, got Diff of changefeed config:
{Type:update Path:[Config CaseSensitive] From:false To:true}
{Type:update Path:[Config SyncPointInterval] From:<nil> To:0xc0001df8f0}
{Type:update Path:[Config SyncPointRetention] From:<nil> To:0xc0001df8f8}
{Type:update Path:[Config Consistent] From:<nil> To:0xc00129a3f0}
{Type:update Path:[Config Scheduler EnableTableAcrossNodes] From:false To:true}
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.cli.4015.out cli changefeed --changefeed-id custom-changefeed-name pause
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
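
The ErrChangefeedUpdateRefused error above is the expected outcome: a running changefeed refuses config updates, so the test pauses it first. The accepted sequence, taken from the commands that follow later in this trace, is pause, then update with the new TOML, then resume:

# pause -> update -> resume, as exercised by cli_tls_with_auth (commands copied from this trace).
cdc cli changefeed --changefeed-id custom-changefeed-name pause
cdc cli changefeed update --pd=https://127.0.0.1:2579 \
    --config=/tmp/tidb_cdc_test/cli_tls_with_auth/changefeed.toml \
    --no-confirm --changefeed-id custom-changefeed-name
cdc cli changefeed --changefeed-id custom-changefeed-name resume
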
table test.gzip_finish_mark not exists for 2-th check, retry later
check diff successfully
+ set +x
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
table test.gzip_finish_mark exists
check diff successfully
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.7463.out cli changefeed pause -c gzip
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:54:33 CST 2024] <<<<<< run test case kafka_big_messages_v2 success! >>>>>>
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.7494.out cli changefeed remove -c gzip
+ endpoints=https://127.0.0.1:2579
+ changefeed_id=custom-changefeed-name
+ expected_state=stopped
+ error_msg=null
+ tls_dir=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates
+ [[ https://127.0.0.1:2579 =~ https ]]
++ cdc cli changefeed query --ca=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem --cert=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client.pem --key=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client-key.pem --pd=https://127.0.0.1:2579 -c custom-changefeed-name -s
+ info='{
  "upstream_id": 7365374333842603812,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "stopped",
  "checkpoint_tso": 449546779487895553,
  "checkpoint_time": "2024-05-05 12:54:30.483",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374333842603812,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "stopped",
  "checkpoint_tso": 449546779487895553,
  "checkpoint_time": "2024-05-05 12:54:30.483",
  "error": null
}'
{
  "upstream_id": 7365374333842603812,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "stopped",
  "checkpoint_tso": 449546779487895553,
  "checkpoint_time": "2024-05-05 12:54:30.483",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374333842603812, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"stopped",' '"checkpoint_tso":' 449546779487895553, '"checkpoint_time":' '"2024-05-05' '12:54:30.483",' '"error":' null '}'
++ jq -r .state
+ state=stopped
+ [[ ! stopped == \s\t\o\p\p\e\d ]]
++ echo '{' '"upstream_id":' 7365374333842603812, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"stopped",' '"checkpoint_tso":' 449546779487895553, '"checkpoint_time":' '"2024-05-05' '12:54:30.483",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.cli.4112.out cli changefeed update --pd=https://127.0.0.1:2579 --config=/tmp/tidb_cdc_test/cli_tls_with_auth/changefeed.toml --no-confirm --changefeed-id custom-changefeed-name
Changefeed remove successfully.
ID: gzip
CheckpointTs: 449546780134604801
SinkURI: kafka://127.0.0.1:9092/ticdc-kafka-compression-gzip-test?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1&compression=gzip
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
Diff of changefeed config:
{Type:update Path:[Config CaseSensitive] From:false To:true}
{Type:update Path:[Config SyncPointInterval] From:<nil> To:0xc00174e6e8}
{Type:update Path:[Config SyncPointRetention] From:<nil> To:0xc00174e6f8}
{Type:update Path:[Config Consistent] From:<nil> To:0xc001491570}
{Type:update Path:[Config Scheduler EnableTableAcrossNodes] From:false To:true}
Update changefeed config successfully! 
ID: custom-changefeed-name
Info: {"upstream_id":7365374333842603812,"namespace":"default","id":"custom-changefeed-name","sink_uri":"kafka://127.0.0.1:9092/ticdc-cli-test-22421?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:54:23.421261072+08:00","start_ts":449546775660855297,"admin_job_type":1,"config":{"memory_quota":1073741824,"case_sensitive":true,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_table_monitor":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","encoder_concurrency":32,"terminator":"\r\n","enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":true,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"stopped","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":0,"checkpoint_ts":449546779487895553,"checkpoint_time":"2024-05-05 12:54:30.483"}
PASS
coverage: 2.8% of statements in github.com/pingcap/tiflow/...
+ set +x
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.7529.out cli tso query --pd=http://127.0.0.1:2379
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.cli.4149.out cli changefeed --changefeed-id custom-changefeed-name resume
PASS
coverage: 2.1% of statements in github.com/pingcap/tiflow/...
+ set +x
+ tso='449546781327360002
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546781327360002 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.7563.out cli changefeed create --start-ts=449546781327360002 '--sink-uri=kafka://127.0.0.1:9092/ticdc-kafka-compression-snappy-test?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1&compression=snappy' -c snappy
Create changefeed successfully!
ID: snappy
Info: {"upstream_id":7365374330422267995,"namespace":"default","id":"snappy","sink_uri":"kafka://127.0.0.1:9092/ticdc-kafka-compression-snappy-test?protocol=canal-json\u0026enable-tidb-extension=true\u0026kafka-version=2.4.1\u0026compression=snappy","create_time":"2024-05-05T12:54:39.380026615+08:00","start_ts":449546781327360002,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546781327360002,"checkpoint_ts":449546781327360002,"checkpoint_time":"2024-05-05 12:54:37.500"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ set +x
+ set +x
[Sun May  5 12:54:40 CST 2024] <<<<<< START kafka consumer in kafka_compression case >>>>>>
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 22.97 secs (162095868 bytes/sec)
[Pipeline] {
[Pipeline] cache
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_compression/run.sh: line 22: [[: [2024/05/05 12:54:39.344 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses snappy compression algorithm"]
[2024/05/05 12:54:39.375 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses snappy compression algorithm"]
[2024/05/05 12:54:39.480 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses snappy compression algorithm"]
[2024/05/05 12:54:39.489 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses snappy compression algorithm"]
[2024/05/05 12:54:40.480 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses snappy compression algorithm"]
[2024/05/05 12:54:40.488 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses snappy compression algorithm"]: syntax error: operand expected (error token is "[2024/05/05 12:54:39.344 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses snappy compression algorithm"]
[2024/05/05 12:54:39.375 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses snappy compression algorithm"]
[2024/05/05 12:54:39.480 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses snappy compression algorithm"]
[2024/05/05 12:54:39.489 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses snappy compression algorithm"]
[2024/05/05 12:54:40.480 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses snappy compression algorithm"]
[2024/05/05 12:54:40.488 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses snappy compression algorithm"]")
table test.snappy_finish_mark not exists for 1-th check, retry later
+ endpoints=https://127.0.0.1:2579
+ changefeed_id=custom-changefeed-name
+ expected_state=normal
+ error_msg=null
+ tls_dir=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates
+ [[ https://127.0.0.1:2579 =~ https ]]
++ cdc cli changefeed query --ca=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem --cert=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client.pem --key=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client-key.pem --pd=https://127.0.0.1:2579 -c custom-changefeed-name -s
+ info='{
  "upstream_id": 7365374333842603812,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546782633885699,
  "checkpoint_time": "2024-05-05 12:54:42.484",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374333842603812,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546782633885699,
  "checkpoint_time": "2024-05-05 12:54:42.484",
  "error": null
}'
{
  "upstream_id": 7365374333842603812,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546782633885699,
  "checkpoint_time": "2024-05-05 12:54:42.484",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374333842603812, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"normal",' '"checkpoint_tso":' 449546782633885699, '"checkpoint_time":' '"2024-05-05' '12:54:42.484",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365374333842603812, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"normal",' '"checkpoint_tso":' 449546782633885699, '"checkpoint_time":' '"2024-05-05' '12:54:42.484",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.cli.4235.out cli changefeed --changefeed-id custom-changefeed-name remove
table test.snappy_finish_mark not exists for 2-th check, retry later
Changefeed remove successfully.
ID: custom-changefeed-name
CheckpointTs: 449546782896029699
SinkURI: kafka://127.0.0.1:9092/ticdc-cli-test-22421?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/multi_tables_ddl_v2/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
table test.snappy_finish_mark exists
check diff successfully
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.7691.out cli changefeed pause -c snappy
+ set +x
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.7727.out cli changefeed remove -c snappy
start tidb cluster in /tmp/tidb_cdc_test/multi_tables_ddl_v2
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Changefeed remove successfully.
ID: snappy
CheckpointTs: 449546782336614431
SinkURI: kafka://127.0.0.1:9092/ticdc-kafka-compression-snappy-test?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1&compression=snappy
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
changefeed count 0 check pass, pd_addr: https://127.0.0.1:2579
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.cli.4322.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-cli-test-22421?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' --tz=Asia/Shanghai -c=custom-changefeed-name
[WARN] --tz is deprecated in changefeed settings.
Create changefeed successfully!
ID: custom-changefeed-name
Info: {"upstream_id":7365374333842603812,"namespace":"default","id":"custom-changefeed-name","sink_uri":"kafka://127.0.0.1:9092/ticdc-cli-test-22421?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:54:49.62671813+08:00","start_ts":449546784455786498,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546784455786498,"checkpoint_ts":449546784455786498,"checkpoint_time":"2024-05-05 12:54:49.434"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ set +x
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.7768.out cli tso query --pd=http://127.0.0.1:2379
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 5.82 secs (639673259 bytes/sec)
[Pipeline] {
[Pipeline] cache
+ set +x
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
+ set +x
+ tso='449546784695910405
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546784695910405 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.7807.out cli changefeed create --start-ts=449546784695910405 '--sink-uri=kafka://127.0.0.1:9092/ticdc-kafka-compression-lz4-test?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1&compression=lz4' -c lz4
Create changefeed successfully!
ID: lz4
Info: {"upstream_id":7365374330422267995,"namespace":"default","id":"lz4","sink_uri":"kafka://127.0.0.1:9092/ticdc-kafka-compression-lz4-test?protocol=canal-json\u0026enable-tidb-extension=true\u0026kafka-version=2.4.1\u0026compression=lz4","create_time":"2024-05-05T12:54:52.230913378+08:00","start_ts":449546784695910405,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546784695910405,"checkpoint_ts":449546784695910405,"checkpoint_time":"2024-05-05 12:54:50.350"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
[Sun May  5 12:54:53 CST 2024] <<<<<< START kafka consumer in kafka_compression case >>>>>>
+ endpoints=https://127.0.0.1:2579
+ changefeed_id=custom-changefeed-name
+ expected_state=normal
+ error_msg=null
+ tls_dir=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates
+ [[ https://127.0.0.1:2579 =~ https ]]
++ cdc cli changefeed query --ca=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem --cert=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client.pem --key=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client-key.pem --pd=https://127.0.0.1:2579 -c custom-changefeed-name -s
+ info='{
  "upstream_id": 7365374333842603812,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546785517469700,
  "checkpoint_time": "2024-05-05 12:54:53.484",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374333842603812,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546785517469700,
  "checkpoint_time": "2024-05-05 12:54:53.484",
  "error": null
}'
{
  "upstream_id": 7365374333842603812,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546785517469700,
  "checkpoint_time": "2024-05-05 12:54:53.484",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374333842603812, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"normal",' '"checkpoint_tso":' 449546785517469700, '"checkpoint_time":' '"2024-05-05' '12:54:53.484",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365374333842603812, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"normal",' '"checkpoint_tso":' 449546785517469700, '"checkpoint_time":' '"2024-05-05' '12:54:53.484",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_compression/run.sh: line 22: [[: [2024/05/05 12:54:52.195 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses lz4 compression algorithm"]
[2024/05/05 12:54:52.226 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses lz4 compression algorithm"]
[2024/05/05 12:54:52.332 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses lz4 compression algorithm"]
[2024/05/05 12:54:52.340 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses lz4 compression algorithm"]
[2024/05/05 12:54:53.332 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses lz4 compression algorithm"]
[2024/05/05 12:54:53.340 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses lz4 compression algorithm"]: syntax error: operand expected (error token is "[2024/05/05 12:54:52.195 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses lz4 compression algorithm"]
[2024/05/05 12:54:52.226 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses lz4 compression algorithm"]
[2024/05/05 12:54:52.332 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses lz4 compression algorithm"]
[2024/05/05 12:54:52.340 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses lz4 compression algorithm"]
[2024/05/05 12:54:53.332 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses lz4 compression algorithm"]
[2024/05/05 12:54:53.340 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses lz4 compression algorithm"]")
table test.lz4_finish_mark not exists for 1-th check, retry later
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.cli.4435.out cli changefeed create --start-ts=449546775660855297 '--sink-uri=kafka://127.0.0.1:9093/ticdc-cli-test-ssl-6502?protocol=open-protocol&ca=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem&cert=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client.pem&key=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client-key.pem&kafka-version=2.4.1&max-message-bytes=10485760&insecure-skip-verify=true' --tz=Asia/Shanghai
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[WARN] --tz is deprecated in changefeed settings.
Create changefeed successfully!
ID: cfc1f3a2-e284-4c90-b691-e3e5aa5c1df8
Info: {"upstream_id":7365374333842603812,"namespace":"default","id":"cfc1f3a2-e284-4c90-b691-e3e5aa5c1df8","sink_uri":"kafka://127.0.0.1:9093/ticdc-cli-test-ssl-6502?protocol=open-protocol\u0026ca=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem\u0026cert=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client.pem\u0026key=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client-key.pem\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760\u0026insecure-skip-verify=true","create_time":"2024-05-05T12:54:55.707201748+08:00","start_ts":449546775660855297,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546775660855297,"checkpoint_ts":449546775660855297,"checkpoint_time":"2024-05-05 12:54:15.884"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
table test.lz4_finish_mark not exists for 2-th check, retry later
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.cli.4481.out cli unsafe delete-service-gc-safepoint
Confirm that you know what this command will do and use it at your own risk [Y/N]
CDC service GC safepoint truncated in PD!
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.lz4_finish_mark exists
check diff successfully
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.7955.out cli changefeed pause -c lz4
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.cli.4515.out cli unsafe reset --no-confirm --pd=https://127.0.0.1:2579
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
reset and all metadata truncated in PD!
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6bb8fc0017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj, pid:6083, start at 2024-05-05 12:54:58.407888382 +0800 CST m=+5.146689739	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:58.416 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:58.417 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:58.417 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6bb8fc0017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj, pid:6083, start at 2024-05-05 12:54:58.407888382 +0800 CST m=+5.146689739	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:58.416 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:58.417 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:58.417 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6bba700009	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj, pid:6166, start at 2024-05-05 12:54:58.468609515 +0800 CST m=+5.154626761	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:56:58.475 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:54:58.460 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:44:58.460 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/multi_tables_ddl_v2/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/multi_tables_ddl_v2/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/multi_tables_ddl_v2/tiflash/log/proxy.log"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/multi_tables_ddl_v2/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/multi_tables_ddl_v2/tiflash/db/proxy"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.7990.out cli changefeed remove -c lz4
+ set +x
Changefeed remove successfully.
ID: lz4
CheckpointTs: 449546785705164820
SinkURI: kafka://127.0.0.1:9092/ticdc-kafka-compression-lz4-test?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1&compression=lz4
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
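In the kafka_compression case each algorithm goes through the same lifecycle visible in this log: create a changefeed whose sink URI carries compression=<algo>, wait for the <algo>_finish_mark table and run check diff, then pause and remove the changefeed before the next algorithm. A condensed, illustrative loop over the two algorithms that appear in this part of the log (topic naming and flags are copied from the lz4 and zstd commands above; $start_ts is assumed to come from a prior cdc cli tso query, as traced below):
for algo in lz4 zstd; do
    cdc cli changefeed create --start-ts="$start_ts" \
        --sink-uri="kafka://127.0.0.1:9092/ticdc-kafka-compression-${algo}-test?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1&compression=${algo}" \
        -c "$algo"
    # ... workload runs, the test waits for test.${algo}_finish_mark and checks diff ...
    cdc cli changefeed pause -c "$algo"
    cdc cli changefeed remove -c "$algo"
done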
[Sun May  5 12:55:01 CST 2024] <<<<<< START cdc server in multi_tables_ddl_v2 case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_tables_ddl_v2.75387540.out server --log-file /tmp/tidb_cdc_test/multi_tables_ddl_v2/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/multi_tables_ddl_v2/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ set +x
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.8022.out cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.cli.4626.out cli unsafe resolve-lock --region=118
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
+ set +x
+ tso='449546788064198658
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546788064198658 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.8058.out cli changefeed create --start-ts=449546788064198658 '--sink-uri=kafka://127.0.0.1:9092/ticdc-kafka-compression-zstd-test?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1&compression=zstd' -c zstd
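Because the coverage-instrumented cdc.test binary appends PASS and coverage lines to stdout, the tso captured above is a three-line string; the trace keeps only the first whitespace-separated token before passing it to changefeed create. A sketch of that extraction step, mirroring the awk pipeline in the trace:
# tso query output looks like "449546788064198658", "PASS", "coverage: ..." on
# separate lines; word-splitting via an unquoted echo plus awk keeps the timestamp.
tso=$(cdc cli tso query --pd=http://127.0.0.1:2379)
start_ts=$(echo $tso | awk -F ' ' '{print $1}')
echo "using start-ts ${start_ts}"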
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:55:04 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/8409f966-4327-4954-9c11-16a71bb3390d
	{"id":"8409f966-4327-4954-9c11-16a71bb3390d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884902}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471ac662c5
	8409f966-4327-4954-9c11-16a71bb3390d

/tidb/cdc/default/default/upstream/7365374520020508771
	{"id":7365374520020508771,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/8409f966-4327-4954-9c11-16a71bb3390d
	{"id":"8409f966-4327-4954-9c11-16a71bb3390d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884902}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471ac662c5
	8409f966-4327-4954-9c11-16a71bb3390d

/tidb/cdc/default/default/upstream/7365374520020508771
	{"id":7365374520020508771,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/8409f966-4327-4954-9c11-16a71bb3390d
	{"id":"8409f966-4327-4954-9c11-16a71bb3390d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884902}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471ac662c5
	8409f966-4327-4954-9c11-16a71bb3390d

/tidb/cdc/default/default/upstream/7365374520020508771
	{"id":7365374520020508771,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
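The block above is the server readiness probe used throughout these cases: curl the CDC debug endpoint with HTTP basic auth and retry, up to 50 attempts three seconds apart, until the response body contains "etcd info". A self-contained sketch of that loop (the function name is illustrative; URL, credentials and retry budget are taken from the trace):
wait_cdc_ready() {
    local i res
    for ((i = 0; i <= 50; i++)); do
        # -v output goes to stderr, so only the response body is captured here
        res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
        if echo "$res" | grep -q 'etcd info'; then
            return 0            # owner/processor/etcd sections are being served
        fi
        if [ "$i" -eq 50 ]; then
            echo "cdc server never became ready" && return 1
        fi
        sleep 3
    done
}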
Create changefeed successfully!
ID: test-normal
Info: {"upstream_id":7365374520020508771,"namespace":"default","id":"test-normal","sink_uri":"kafka://127.0.0.1:9092/ticdc-multi-tables-ddl-test-normal-26261?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:55:04.974999469+08:00","start_ts":449546787649748993,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["multi_tables_ddl_test.t1","multi_tables_ddl_test.t2","multi_tables_ddl_test.t3","multi_tables_ddl_test.t4","multi_tables_ddl_test.t1_7","multi_tables_ddl_test.t2_7","multi_tables_ddl_test.finish_mark"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":true,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546787649748993,"checkpoint_ts":449546787649748993,"checkpoint_time":"2024-05-05 12:55:01.618"}
Create changefeed successfully!
ID: zstd
Info: {"upstream_id":7365374330422267995,"namespace":"default","id":"zstd","sink_uri":"kafka://127.0.0.1:9092/ticdc-kafka-compression-zstd-test?protocol=canal-json\u0026enable-tidb-extension=true\u0026kafka-version=2.4.1\u0026compression=zstd","create_time":"2024-05-05T12:55:05.081671333+08:00","start_ts":449546788064198658,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546788064198658,"checkpoint_ts":449546788064198658,"checkpoint_time":"2024-05-05 12:55:03.199"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
Create changefeed successfully!
ID: test-error-1
Info: {"upstream_id":7365374520020508771,"namespace":"default","id":"test-error-1","sink_uri":"kafka://127.0.0.1:9092/ticdc-multi-tables-ddl-test-error-1-22637?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:55:05.190200262+08:00","start_ts":449546787649748993,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["multi_tables_ddl_test.t5","multi_tables_ddl_test.t6","multi_tables_ddl_test.t7","multi_tables_ddl_test.t8"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":true,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546787649748993,"checkpoint_ts":449546787649748993,"checkpoint_time":"2024-05-05 12:55:01.618"}
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 12.55 secs (296637110 bytes/sec)
[Pipeline] {
[Pipeline] cache
Create changefeed successfully!
ID: test-error-2
Info: {"upstream_id":7365374520020508771,"namespace":"default","id":"test-error-2","sink_uri":"kafka://127.0.0.1:9092/ticdc-multi-tables-ddl-test-error-2-169?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:55:05.373164612+08:00","start_ts":449546787649748993,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["multi_tables_ddl_test.t9","multi_tables_ddl_test.t10"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":true,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546787649748993,"checkpoint_ts":449546787649748993,"checkpoint_time":"2024-05-05 12:55:01.618"}
[Sun May  5 12:55:05 CST 2024] <<<<<< START kafka consumer in multi_tables_ddl_v2 case >>>>>>
[Sun May  5 12:55:05 CST 2024] <<<<<< START kafka consumer in multi_tables_ddl_v2 case >>>>>>
[Sun May  5 12:55:05 CST 2024] <<<<<< START kafka consumer in multi_tables_ddl_v2 case >>>>>>
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_tls_with_auth.cli.4660.out cli unsafe resolve-lock --region=118 --ts=449546787483549700
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
+ set +x
[Sun May  5 12:55:06 CST 2024] <<<<<< START kafka consumer in kafka_compression case >>>>>>
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_compression/run.sh: line 22: [[: [2024/05/05 12:55:05.047 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses zstd compression algorithm"]
[2024/05/05 12:55:05.077 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses zstd compression algorithm"]
[2024/05/05 12:55:05.184 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses zstd compression algorithm"]
[2024/05/05 12:55:05.192 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses zstd compression algorithm"]
[2024/05/05 12:55:06.183 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses zstd compression algorithm"]
[2024/05/05 12:55:06.191 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses zstd compression algorithm"]: syntax error: operand expected (error token is "[2024/05/05 12:55:05.047 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses zstd compression algorithm"]
[2024/05/05 12:55:05.077 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses zstd compression algorithm"]
[2024/05/05 12:55:05.184 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses zstd compression algorithm"]
[2024/05/05 12:55:05.192 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses zstd compression algorithm"]
[2024/05/05 12:55:06.183 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses zstd compression algorithm"]
[2024/05/05 12:55:06.191 +08:00] [INFO] [sarama.go:96] ["Kafka producer uses zstd compression algorithm"]")
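The "[[: ... syntax error: operand expected" reported by kafka_compression/run.sh line 22 is a shell quirk rather than a CDC failure: a command substitution that was expected to produce a number instead returned six whole log lines, and feeding that multi-line string to an arithmetic [[ test yields exactly this error. The actual line 22 is not shown in the log; the following is a hypothetical reconstruction of the failure mode plus a defensive variant ($cdc_log is a placeholder path):
# grep without -c prints the matching lines themselves, so $found holds
# several log lines and a test like "[[ $found -gt 0 ]]" dies with "operand expected".
found=$(grep "Kafka producer uses zstd compression algorithm" "$cdc_log")
# Counting matches keeps the operand numeric and the test well-formed:
count=$(grep -c "Kafka producer uses zstd compression algorithm" "$cdc_log")
if [[ "$count" -gt 0 ]]; then
    echo "producer is using zstd compression"
fi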
table test.zstd_finish_mark not exists for 1-th check, retry later
+ set +x
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   218  100   212  100     6   1614     45 --:--:-- --:--:-- --:--:--  1618
{
    "error_msg": "[CDC:ErrAPIInvalidParam]invalid log level: json: cannot unmarshal string into Go value of type struct { Level string \"json:\\\"log_level\\\"\" }",
    "error_code": "CDC:ErrAPIInvalidParam"
table multi_tables_ddl_test.t55 not exists for 1-th check, retry later
table test.zstd_finish_mark not exists for 2-th check, retry later
table multi_tables_ddl_test.t55 not exists for 2-th check, retry later
}  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   181  100   181    0     0   1389      0 --:--:-- --:--:-- --:--:--  1392
{
 "version": "v8.2.0-alpha-53-g0de8dc3e4",
 "git_hash": "0de8dc3e43ec741eba58047155ce7f3dba8eb4f7",
 "id": "5aa79c4b-05fc-462e-8774-dae09e632454",
 "pid": 3693,
 "is_owner": true
}wait process cdc.test exit for 1-th time...
table test.zstd_finish_mark exists
check diff successfully
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.8178.out cli changefeed pause -c zstd
wait process cdc.test exit for 2-th time...
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:55:12 CST 2024] <<<<<< run test case cli_tls_with_auth success! >>>>>>
table multi_tables_ddl_test.t55 not exists for 3-th check, retry later
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_compression.cli.8216.out cli changefeed remove -c zstd
Changefeed remove successfully.
ID: zstd
CheckpointTs: 449546789073715234
SinkURI: kafka://127.0.0.1:9092/ticdc-kafka-compression-zstd-test?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1&compression=zstd
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
table multi_tables_ddl_test.t55 not exists for 4-th check, retry later
+ set +x
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
table multi_tables_ddl_test.t55 exists
table multi_tables_ddl_test.t66 exists
table multi_tables_ddl_test.t7 exists
table multi_tables_ddl_test.t88 exists
table multi_tables_ddl_test.finish_mark not exists for 1-th check, retry later
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:55:17 CST 2024] <<<<<< run test case kafka_compression success! >>>>>>
table multi_tables_ddl_test.finish_mark not exists for 2-th check, retry later
table multi_tables_ddl_test.finish_mark exists
check table exists success
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=test-normal
+ expected_state=normal
+ error_msg=null
+ tls_dir=
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c test-normal -s
+ info='{
  "upstream_id": 7365374520020508771,
  "namespace": "default",
  "id": "test-normal",
  "state": "normal",
  "checkpoint_tso": 449546789432328198,
  "checkpoint_time": "2024-05-05 12:55:08.418",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374520020508771,
  "namespace": "default",
  "id": "test-normal",
  "state": "normal",
  "checkpoint_tso": 449546789432328198,
  "checkpoint_time": "2024-05-05 12:55:08.418",
  "error": null
}'
{
  "upstream_id": 7365374520020508771,
  "namespace": "default",
  "id": "test-normal",
  "state": "normal",
  "checkpoint_tso": 449546789432328198,
  "checkpoint_time": "2024-05-05 12:55:08.418",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374520020508771, '"namespace":' '"default",' '"id":' '"test-normal",' '"state":' '"normal",' '"checkpoint_tso":' 449546789432328198, '"checkpoint_time":' '"2024-05-05' '12:55:08.418",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365374520020508771, '"namespace":' '"default",' '"id":' '"test-normal",' '"state":' '"normal",' '"checkpoint_tso":' 449546789432328198, '"checkpoint_time":' '"2024-05-05' '12:55:08.418",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=test-error-1
+ expected_state=normal
+ error_msg=null
+ tls_dir=
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c test-error-1 -s
+ info='{
  "upstream_id": 7365374520020508771,
  "namespace": "default",
  "id": "test-error-1",
  "state": "normal",
  "checkpoint_tso": 449546792551841797,
  "checkpoint_time": "2024-05-05 12:55:20.318",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374520020508771,
  "namespace": "default",
  "id": "test-error-1",
  "state": "normal",
  "checkpoint_tso": 449546792551841797,
  "checkpoint_time": "2024-05-05 12:55:20.318",
  "error": null
}'
{
  "upstream_id": 7365374520020508771,
  "namespace": "default",
  "id": "test-error-1",
  "state": "normal",
  "checkpoint_tso": 449546792551841797,
  "checkpoint_time": "2024-05-05 12:55:20.318",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374520020508771, '"namespace":' '"default",' '"id":' '"test-error-1",' '"state":' '"normal",' '"checkpoint_tso":' 449546792551841797, '"checkpoint_time":' '"2024-05-05' '12:55:20.318",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365374520020508771, '"namespace":' '"default",' '"id":' '"test-error-1",' '"state":' '"normal",' '"checkpoint_tso":' 449546792551841797, '"checkpoint_time":' '"2024-05-05' '12:55:20.318",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=test-error-2
+ expected_state=failed
+ error_msg=ErrSyncRenameTableFailed
+ tls_dir=
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c test-error-2 -s
+ info='{
  "upstream_id": 7365374520020508771,
  "namespace": "default",
  "id": "test-error-2",
  "state": "failed",
  "checkpoint_tso": 449546788947099685,
  "checkpoint_time": "2024-05-05 12:55:06.567",
  "error": {
    "time": "2024-05-05T12:55:09.360041375+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrSyncRenameTableFailed",
    "message": "[CDC:ErrSyncRenameTableFailed]table'\''s old name is not in filter rule, and its new name in filter rule table id '\''130'\'', ddl query: [rename table t11 to t9], it'\''s an unexpected behavior, if you want to replicate this table, please add its old name to filter rule."
  }
}'
+ echo '{
  "upstream_id": 7365374520020508771,
  "namespace": "default",
  "id": "test-error-2",
  "state": "failed",
  "checkpoint_tso": 449546788947099685,
  "checkpoint_time": "2024-05-05 12:55:06.567",
  "error": {
    "time": "2024-05-05T12:55:09.360041375+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrSyncRenameTableFailed",
    "message": "[CDC:ErrSyncRenameTableFailed]table'\''s old name is not in filter rule, and its new name in filter rule table id '\''130'\'', ddl query: [rename table t11 to t9], it'\''s an unexpected behavior, if you want to replicate this table, please add its old name to filter rule."
  }
}'
{
  "upstream_id": 7365374520020508771,
  "namespace": "default",
  "id": "test-error-2",
  "state": "failed",
  "checkpoint_tso": 449546788947099685,
  "checkpoint_time": "2024-05-05 12:55:06.567",
  "error": {
    "time": "2024-05-05T12:55:09.360041375+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrSyncRenameTableFailed",
    "message": "[CDC:ErrSyncRenameTableFailed]table's old name is not in filter rule, and its new name in filter rule table id '130', ddl query: [rename table t11 to t9], it's an unexpected behavior, if you want to replicate this table, please add its old name to filter rule."
  }
}
++ jq -r .state
++ echo '{' '"upstream_id":' 7365374520020508771, '"namespace":' '"default",' '"id":' '"test-error-2",' '"state":' '"failed",' '"checkpoint_tso":' 449546788947099685, '"checkpoint_time":' '"2024-05-05' '12:55:06.567",' '"error":' '{' '"time":' '"2024-05-05T12:55:09.360041375+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrSyncRenameTableFailed",' '"message":' '"[CDC:ErrSyncRenameTableFailed]table'\''s' old name is not in filter rule, and its new name in filter rule table id ''\''130'\'',' ddl query: '[rename' table t11 to 't9],' 'it'\''s' an unexpected behavior, if you want to replicate this table, please add its old name to filter 'rule."' '}' '}'
+ state=failed
+ [[ ! failed == \f\a\i\l\e\d ]]
++ jq -r .error.message
++ echo '{' '"upstream_id":' 7365374520020508771, '"namespace":' '"default",' '"id":' '"test-error-2",' '"state":' '"failed",' '"checkpoint_tso":' 449546788947099685, '"checkpoint_time":' '"2024-05-05' '12:55:06.567",' '"error":' '{' '"time":' '"2024-05-05T12:55:09.360041375+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrSyncRenameTableFailed",' '"message":' '"[CDC:ErrSyncRenameTableFailed]table'\''s' old name is not in filter rule, and its new name in filter rule table id ''\''130'\'',' ddl query: '[rename' table t11 to 't9],' 'it'\''s' an unexpected behavior, if you want to replicate this table, please add its old name to filter 'rule."' '}' '}'
+ message='[CDC:ErrSyncRenameTableFailed]table'\''s old name is not in filter rule, and its new name in filter rule table id '\''130'\'', ddl query: [rename table t11 to t9], it'\''s an unexpected behavior, if you want to replicate this table, please add its old name to filter rule.'
+ [[ ! [CDC:ErrSyncRenameTableFailed]table's old name is not in filter rule, and its new name in filter rule table id '130', ddl query: [rename table t11 to t9], it's an unexpected behavior, if you want to replicate this table, please add its old name to filter rule. =~ ErrSyncRenameTableFailed ]]
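The three blocks above repeat one verification pattern: query the changefeed with -s (simple output), then use jq to compare .state against the expected state and .error.message against the expected error substring. A sketch of that check as a reusable function (the name is illustrative; variable names and flags mirror the trace):
check_changefeed_state() {
    local endpoints=$1 changefeed_id=$2 expected_state=$3 error_msg=$4
    local info state message
    info=$(cdc cli changefeed query --pd="$endpoints" -c "$changefeed_id" -s)
    state=$(echo "$info" | jq -r .state)
    if [[ ! "$state" == "$expected_state" ]]; then
        echo "changefeed $changefeed_id state is $state, expected $expected_state" && exit 1
    fi
    message=$(echo "$info" | jq -r .error.message)
    if [[ ! "$message" =~ $error_msg ]]; then
        echo "changefeed $changefeed_id error is '$message', expected to match '$error_msg'" && exit 1
    fi
}
# example matching the last block above
check_changefeed_state http://127.0.0.1:2379 test-error-2 failed ErrSyncRenameTableFailed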
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:55:23 CST 2024] <<<<<< run test case multi_tables_ddl_v2 success! >>>>>>
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_basic/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
start tidb cluster in /tmp/tidb_cdc_test/kafka_simple_basic
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_messages/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:55:28 CST 2024] <<<<<< run test case kafka_messages success! >>>>>>
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 19.76 secs (188437599 bytes/sec)
[Pipeline] {
[Pipeline] cache
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_sink_error_resume/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
start tidb cluster in /tmp/tidb_cdc_test/kafka_sink_error_resume
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/multi_topics_v2/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6e0118000e	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:5752, start at 2024-05-05 12:55:35.763802295 +0800 CST m=+5.074133531	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:57:35.770 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:55:35.750 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:45:35.750 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6e0118000e	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:5752, start at 2024-05-05 12:55:35.763802295 +0800 CST m=+5.074133531	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:57:35.770 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:55:35.750 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:45:35.750 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6e02700017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:5828, start at 2024-05-05 12:55:35.871165608 +0800 CST m=+5.133192411	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:57:35.878 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:55:35.836 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:45:35.836 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kafka_simple_basic/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kafka_simple_basic/tiflash/log/error.log
arg matches is ArgMatches { args: {"pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kafka_simple_basic/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kafka_simple_basic/tiflash/db/proxy"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kafka_simple_basic/tiflash/log/proxy.log"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
start tidb cluster in /tmp/tidb_cdc_test/multi_topics_v2
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
[Sun May  5 12:55:38 CST 2024] <<<<<< START cdc server in kafka_simple_basic case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ '[' -z '' ']'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_basic.72087210.out server --log-file /tmp/tidb_cdc_test/kafka_simple_basic/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kafka_simple_basic/cdc_data --cluster-id default
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 7.05 secs (528499259 bytes/sec)
[Pipeline] {
[Pipeline] cache
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:55:42 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/f4fba014-61dd-4145-8b16-560eb3d266e8
	{"id":"f4fba014-61dd-4145-8b16-560eb3d266e8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884939}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471b5e77c7
	f4fba014-61dd-4145-8b16-560eb3d266e8

/tidb/cdc/default/default/upstream/7365374678700519148
	{"id":7365374678700519148,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/f4fba014-61dd-4145-8b16-560eb3d266e8
	{"id":"f4fba014-61dd-4145-8b16-560eb3d266e8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884939}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471b5e77c7
	f4fba014-61dd-4145-8b16-560eb3d266e8

/tidb/cdc/default/default/upstream/7365374678700519148
	{"id":7365374678700519148,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/f4fba014-61dd-4145-8b16-560eb3d266e8
	{"id":"f4fba014-61dd-4145-8b16-560eb3d266e8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884939}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471b5e77c7
	f4fba014-61dd-4145-8b16-560eb3d266e8

/tidb/cdc/default/default/upstream/7365374678700519148
	{"id":7365374678700519148,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_basic.cli.7264.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-simple-basic-31761?protocol=simple' --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_basic/conf/changefeed.toml -c simple-basic
Create changefeed successfully!
ID: simple-basic
Info: {"upstream_id":7365374678700519148,"namespace":"default","id":"simple-basic","sink_uri":"kafka://127.0.0.1:9092/ticdc-simple-basic-31761?protocol=simple","create_time":"2024-05-05T12:55:42.553554938+08:00","start_ts":449546798340505604,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"simple","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":5,"send_bootstrap_in_msg_count":100,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"correctness","corruption_handle_level":"error"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546798340505604,"checkpoint_ts":449546798340505604,"checkpoint_time":"2024-05-05 12:55:42.400"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6e9ba00014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:9396, start at 2024-05-05 12:55:45.668418182 +0800 CST m=+5.096849262	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:57:45.678 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:55:45.640 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:45:45.640 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6e9ba00014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:9396, start at 2024-05-05 12:55:45.668418182 +0800 CST m=+5.096849262	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:57:45.678 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:55:45.640 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:45:45.640 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6e9d000014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:9480, start at 2024-05-05 12:55:45.764166081 +0800 CST m=+5.144377378	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:57:45.772 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:55:45.728 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:45:45.728 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kafka_sink_error_resume/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kafka_sink_error_resume/tiflash/log/error.log
arg matches is ArgMatches { args: {"data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kafka_sink_error_resume/tiflash/db/proxy"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kafka_sink_error_resume/tiflash-proxy.toml"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kafka_sink_error_resume/tiflash/log/proxy.log"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6ea95c0015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj, pid:9158, start at 2024-05-05 12:55:46.537320781 +0800 CST m=+5.130977661	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:57:46.543 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:55:46.519 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:45:46.519 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6ea95c0015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj, pid:9158, start at 2024-05-05 12:55:46.537320781 +0800 CST m=+5.130977661	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:57:46.543 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:55:46.519 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:45:46.519 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c6eaad80013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-c0214-fpstj, pid:9239, start at 2024-05-05 12:55:46.635877822 +0800 CST m=+5.178177705	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:57:46.644 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:55:46.614 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:45:46.614 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/multi_topics_v2/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/multi_topics_v2/tiflash/log/error.log
arg matches is ArgMatches { args: {"addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/multi_topics_v2/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/multi_topics_v2/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/multi_topics_v2/tiflash/db/proxy"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[Sun May  5 12:55:48 CST 2024] <<<<<< START cdc server in kafka_sink_error_resume case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/sink/dmlsink/mq/dmlproducer/KafkaSinkAsyncSendError=1*return(true)'
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_sink_error_resume.1084310845.out server --log-file /tmp/tidb_cdc_test/kafka_sink_error_resume/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kafka_sink_error_resume/cdc_data --cluster-id default --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
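The trace above is the readiness probe for the cdc server in the kafka_sink_error_resume case: the server is launched with GO_FAILPOINTS enabling the KafkaSinkAsyncSendError failpoint (fire once, return true), and a curl loop polls the /debug/info endpoint until the body contains "etcd info", retrying up to 50 times with a 3-second sleep. A minimal sketch of that pattern (flags trimmed to the ones visible in the trace, not the exact helper script):

# sketch of the readiness loop traced above; names and structure are illustrative
export GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/sink/dmlsink/mq/dmlproducer/KafkaSinkAsyncSendError=1*return(true)'
cdc.test server --log-level debug --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379 &
for i in $(seq 1 50); do
    res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
    echo "$res" | grep -q 'etcd info' && break            # owner/processor/etcd dump means the server is up
    [ "$i" -eq 50 ] && { echo 'cdc server did not come up'; exit 1; }
    sleep 3
done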
[Sun May  5 12:55:49 CST 2024] <<<<<< START kafka consumer in kafka_simple_basic case >>>>>>
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_topics_v2.cli.10616.out cli tso query --pd=http://127.0.0.1:2379
+ set +x
+ tso='449546800337780737
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546800337780737 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
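Just above, the multi_topics_v2 case captures a start timestamp before creating its changefeed: `cdc cli tso query --pd=...` prints the TSO followed by the coverage summary, and the first whitespace-separated field is kept as the start-ts. A hedged sketch of that extraction (the unquoted echo flattens the multi-line output, exactly as in the trace):

# illustrative only: query a TSO from PD and keep just the first field
tso_output=$(cdc cli tso query --pd=http://127.0.0.1:2379)
start_ts=$(echo $tso_output | awk -F ' ' '{print $1}')
echo "using start-ts=${start_ts}"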
[Sun May  5 12:55:51 CST 2024] <<<<<< START cdc server in multi_topics_v2 case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_topics_v2.1066510667.out server --log-file /tmp/tidb_cdc_test/multi_topics_v2/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/multi_topics_v2/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:55:51 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c1eade76-7055-4faf-a91e-9f5e968ff9a3
	{"id":"c1eade76-7055-4faf-a91e-9f5e968ff9a3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884949}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471b7d48c7
	c1eade76-7055-4faf-a91e-9f5e968ff9a3

/tidb/cdc/default/default/upstream/7365374722846030955
	{"id":7365374722846030955,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c1eade76-7055-4faf-a91e-9f5e968ff9a3
	{"id":"c1eade76-7055-4faf-a91e-9f5e968ff9a3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884949}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471b7d48c7
	c1eade76-7055-4faf-a91e-9f5e968ff9a3

/tidb/cdc/default/default/upstream/7365374722846030955
	{"id":7365374722846030955,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c1eade76-7055-4faf-a91e-9f5e968ff9a3
	{"id":"c1eade76-7055-4faf-a91e-9f5e968ff9a3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884949}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471b7d48c7
	c1eade76-7055-4faf-a91e-9f5e968ff9a3

/tidb/cdc/default/default/upstream/7365374722846030955
	{"id":7365374722846030955,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
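The response body above is the /debug/info dump: one capture registered under /tidb/cdc/default/__cdc_meta__/capture/..., the metadata schema version, the current owner election key, and the upstream PD registration. These keys live in PD's embedded etcd, so, assuming an etcd v3 client were available on the host (it is not part of this job), the same records could be inspected directly, for example:

# hypothetical direct inspection of the keys shown in the dump; the test itself only reads them via /debug/info
ETCDCTL_API=3 etcdctl --endpoints=http://127.0.0.1:2379 get --prefix /tidb/cdc/default/__cdc_meta__/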
[Sun May  5 12:55:52 CST 2024] <<<<<< START kafka consumer in kafka_sink_error_resume case >>>>>>
check_changefeed_status 127.0.0.1:8300 f6f4f6ae-2037-46f2-b305-34f68355da11 warning last_warning kafka sink injected error
+ endpoint=127.0.0.1:8300
+ changefeed_id=f6f4f6ae-2037-46f2-b305-34f68355da11
+ expected_state=warning
+ field=last_warning
+ error_pattern=kafka
++ curl 127.0.0.1:8300/api/v2/changefeeds/f6f4f6ae-2037-46f2-b305-34f68355da11/status
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    86  100    86    0     0    617      0 --:--:-- --:--:-- --:--:--   618
+ info='{"state":"normal","resolved_ts":449546800867573763,"checkpoint_ts":449546800867573763}'
+ echo '{"state":"normal","resolved_ts":449546800867573763,"checkpoint_ts":449546800867573763}'
{"state":"normal","resolved_ts":449546800867573763,"checkpoint_ts":449546800867573763}
++ echo '{"state":"normal","resolved_ts":449546800867573763,"checkpoint_ts":449546800867573763}'
++ jq -r .state
table test.finish_mark_for_ddl not exists for 1-th check, retry later
+ state=normal
+ [[ ! normal == \w\a\r\n\i\n\g ]]
+ echo 'changefeed state normal does not equal to warning'
changefeed state normal does not equal to warning
+ exit 1
run task failed 1-th time, retry later
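The block above is one iteration of check_changefeed_status: it curls the v2 changefeed status API, extracts .state with jq, and exits 1 (to be retried by the caller) because the changefeed is still "normal" rather than the expected "warning"; once the state matches, the same helper also checks .last_warning.message against the expected "kafka" pattern. A minimal sketch of that check, with the endpoint and changefeed id taken from the trace:

# illustrative sketch of the status check performed above
endpoint=127.0.0.1:8300
changefeed_id=f6f4f6ae-2037-46f2-b305-34f68355da11
info=$(curl -s "http://${endpoint}/api/v2/changefeeds/${changefeed_id}/status")
state=$(echo "$info" | jq -r .state)
if [[ "$state" != "warning" ]]; then
    echo "changefeed state ${state} does not equal to warning"
    exit 1                                      # the caller retries the whole check
fi
error_msg=$(echo "$info" | jq -r .last_warning.message)
[[ "$error_msg" =~ kafka ]] || exit 1           # expected pattern from the injected sink error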
table test.finish_mark_for_ddl not exists for 2-th check, retry later
check_changefeed_status 127.0.0.1:8300 f6f4f6ae-2037-46f2-b305-34f68355da11 warning last_warning kafka sink injected error
+ endpoint=127.0.0.1:8300
+ changefeed_id=f6f4f6ae-2037-46f2-b305-34f68355da11
+ expected_state=warning
+ field=last_warning
+ error_pattern=kafka
++ curl 127.0.0.1:8300/api/v2/changefeeds/f6f4f6ae-2037-46f2-b305-34f68355da11/status
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    86  100    86    0     0    781      0 --:--:-- --:--:-- --:--:--   788
+ info='{"state":"normal","resolved_ts":449546800933109793,"checkpoint_ts":449546800933109793}'
+ echo '{"state":"normal","resolved_ts":449546800933109793,"checkpoint_ts":449546800933109793}'
{"state":"normal","resolved_ts":449546800933109793,"checkpoint_ts":449546800933109793}
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:55:54 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/8ac614d1-10f7-4cac-8547-18ffc284c3b1
	{"id":"8ac614d1-10f7-4cac-8547-18ffc284c3b1","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884951}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471b8857d0
	8ac614d1-10f7-4cac-8547-18ffc284c3b1

/tidb/cdc/default/default/upstream/7365374727742339792
	{"id":7365374727742339792,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/8ac614d1-10f7-4cac-8547-18ffc284c3b1
	{"id":"8ac614d1-10f7-4cac-8547-18ffc284c3b1","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884951}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471b8857d0
	8ac614d1-10f7-4cac-8547-18ffc284c3b1

/tidb/cdc/default/default/upstream/7365374727742339792
	{"id":7365374727742339792,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/8ac614d1-10f7-4cac-8547-18ffc284c3b1
	{"id":"8ac614d1-10f7-4cac-8547-18ffc284c3b1","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714884951}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471b8857d0
	8ac614d1-10f7-4cac-8547-18ffc284c3b1

/tidb/cdc/default/default/upstream/7365374727742339792
	{"id":7365374727742339792,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_topics_v2.cli.10719.out cli changefeed create --start-ts=449546800337780737 '--sink-uri=kafka://127.0.0.1:9092/multi_topics?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1' --config /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/multi_topics_v2/conf/changefeed.toml
++ echo '{"state":"normal","resolved_ts":449546800933109793,"checkpoint_ts":449546800933109793}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \w\a\r\n\i\n\g ]]
+ echo 'changefeed state normal does not equal to warning'
changefeed state normal does not equal to warning
+ exit 1
run task failed 2-th time, retry later
Create changefeed successfully!
ID: 8bd4ce73-8560-4de1-8077-b12e1ea3b094
Info: {"upstream_id":7365374727742339792,"namespace":"default","id":"8bd4ce73-8560-4de1-8077-b12e1ea3b094","sink_uri":"kafka://127.0.0.1:9092/multi_topics?protocol=canal-json\u0026enable-tidb-extension=true\u0026kafka-version=2.4.1","create_time":"2024-05-05T12:55:55.039425712+08:00","start_ts":449546800337780737,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"dispatchers":[{"matcher":["test.*"],"topic":"{schema}_{table}"}],"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":true,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546800337780737,"checkpoint_ts":449546800337780737,"checkpoint_time":"2024-05-05 12:55:50.019"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
+ set +x
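The changefeed created above for multi_topics_v2 combines a canal-json Kafka sink URI with a config file whose dispatcher rule (visible in the echoed Info JSON) routes every test.* table to a per-table topic named {schema}_{table}. A hedged sketch of creating such a changefeed, with the config written inline purely for illustration (the real changefeed.toml lives under tests/integration_tests/multi_topics_v2/conf/):

# sketch only: dispatcher rule reconstructed from the Info JSON above
cat > /tmp/changefeed.toml <<'EOF'
[sink]
dispatchers = [
    {matcher = ["test.*"], topic = "{schema}_{table}"},
]
EOF
cdc cli changefeed create \
    --start-ts=449546800337780737 \
    --sink-uri='kafka://127.0.0.1:9092/multi_topics?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1' \
    --config /tmp/changefeed.toml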
table test.finish_mark_for_ddl not exists for 3-th check, retry later
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 15.51 secs (240151455 bytes/sec)
[Pipeline] {
[Pipeline] cache
table test.finish_mark_for_ddl not exists for 4-th check, retry later
check_changefeed_status 127.0.0.1:8300 f6f4f6ae-2037-46f2-b305-34f68355da11 warning last_warning kafka sink injected error
+ endpoint=127.0.0.1:8300
+ changefeed_id=f6f4f6ae-2037-46f2-b305-34f68355da11
+ expected_state=warning
+ field=last_warning
+ error_pattern=kafka
++ curl 127.0.0.1:8300/api/v2/changefeeds/f6f4f6ae-2037-46f2-b305-34f68355da11/status
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   244  100   244    0     0   2199      0 --:--:-- --:--:-- --:--:--  2218
+ info='{"state":"warning","resolved_ts":449546802597724165,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka sink injected error"}}'
+ echo '{"state":"warning","resolved_ts":449546802597724165,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka sink injected error"}}'
{"state":"warning","resolved_ts":449546802597724165,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka sink injected error"}}
++ echo '{"state":"warning","resolved_ts":449546802597724165,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka' sink injected 'error"}}'
++ jq -r .state
+ state=warning
+ [[ ! warning == \w\a\r\n\i\n\g ]]
+ [[ -z last_warning ]]
++ echo '{"state":"warning","resolved_ts":449546802597724165,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka' sink injected 'error"}}'
++ jq -r .last_warning.message
+ error_msg='kafka sink injected error'
+ [[ ! kafka sink injected error =~ kafka ]]
run task successfully
check_changefeed_status 127.0.0.1:8300 f6f4f6ae-2037-46f2-b305-34f68355da11 normal
+ endpoint=127.0.0.1:8300
+ changefeed_id=f6f4f6ae-2037-46f2-b305-34f68355da11
+ expected_state=normal
+ field=
+ error_pattern=
++ curl 127.0.0.1:8300/api/v2/changefeeds/f6f4f6ae-2037-46f2-b305-34f68355da11/status
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   244  100   244    0     0   1823      0 --:--:-- --:--:-- --:--:--  1834
+ info='{"state":"warning","resolved_ts":449546802597724165,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka sink injected error"}}'
+ echo '{"state":"warning","resolved_ts":449546802597724165,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka sink injected error"}}'
{"state":"warning","resolved_ts":449546802597724165,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka sink injected error"}}
++ echo '{"state":"warning","resolved_ts":449546802597724165,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka' sink injected 'error"}}'
++ jq -r .state
+ state=warning
+ [[ ! warning == \n\o\r\m\a\l ]]
+ echo 'changefeed state warning does not equal to normal'
changefeed state warning does not equal to normal
+ exit 1
run task failed 1-th time, retry later
table test.finish_mark_for_ddl exists
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_basic.cli.7389.out cli changefeed pause -c simple-basic
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
check_changefeed_status 127.0.0.1:8300 f6f4f6ae-2037-46f2-b305-34f68355da11 normal
+ endpoint=127.0.0.1:8300
+ changefeed_id=f6f4f6ae-2037-46f2-b305-34f68355da11
+ expected_state=normal
+ field=
+ error_pattern=
++ curl 127.0.0.1:8300/api/v2/changefeeds/f6f4f6ae-2037-46f2-b305-34f68355da11/status
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   244  100   244    0     0   3801      0 --:--:-- --:--:-- --:--:--  3812
+ info='{"state":"warning","resolved_ts":449546803122012168,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka sink injected error"}}'
+ echo '{"state":"warning","resolved_ts":449546803122012168,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka sink injected error"}}'
{"state":"warning","resolved_ts":449546803122012168,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka sink injected error"}}
++ echo '{"state":"warning","resolved_ts":449546803122012168,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka' sink injected 'error"}}'
++ jq -r .state
+ state=warning
+ [[ ! warning == \n\o\r\m\a\l ]]
+ echo 'changefeed state warning does not equal to normal'
changefeed state warning does not equal to normal
+ exit 1
run task failed 2-th time, retry later
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_basic.cli.7421.out cli changefeed resume -c simple-basic
PASS
coverage: 2.1% of statements in github.com/pingcap/tiflow/...
+ set +x
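Interleaved with the status retries above, the kafka_simple_basic case pauses and then resumes its changefeed by id; both steps go through the same cdc cli binary. A condensed sketch of that pair:

# sketch of the pause/resume pair traced above
cdc cli changefeed pause  -c simple-basic
cdc cli changefeed resume -c simple-basic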
check_changefeed_status 127.0.0.1:8300 f6f4f6ae-2037-46f2-b305-34f68355da11 normal
+ endpoint=127.0.0.1:8300
+ changefeed_id=f6f4f6ae-2037-46f2-b305-34f68355da11
+ expected_state=normal
+ field=
+ error_pattern=
++ curl 127.0.0.1:8300/api/v2/changefeeds/f6f4f6ae-2037-46f2-b305-34f68355da11/status
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   244  100   244    0     0   1529      0 --:--:-- --:--:-- --:--:--  1534
+ info='{"state":"warning","resolved_ts":449546804432994312,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka sink injected error"}}'
+ echo '{"state":"warning","resolved_ts":449546804432994312,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka sink injected error"}}'
{"state":"warning","resolved_ts":449546804432994312,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka sink injected error"}}
++ echo '{"state":"warning","resolved_ts":449546804432994312,"checkpoint_ts":449546800959324190,"last_warning":{"time":"2024-05-05T12:55:56.300129243+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrProcessorUnknown","message":"kafka' sink injected 'error"}}'
++ jq -r .state
+ state=warning
+ [[ ! warning == \n\o\r\m\a\l ]]
+ echo 'changefeed state warning does not equal to normal'
changefeed state warning does not equal to normal
+ exit 1
run task failed 3-th time, retry later
check_changefeed_status 127.0.0.1:8300 f6f4f6ae-2037-46f2-b305-34f68355da11 normal
+ endpoint=127.0.0.1:8300
+ changefeed_id=f6f4f6ae-2037-46f2-b305-34f68355da11
+ expected_state=normal
+ field=
+ error_pattern=
++ curl 127.0.0.1:8300/api/v2/changefeeds/f6f4f6ae-2037-46f2-b305-34f68355da11/status
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    86  100    86    0     0   1338      0 --:--:-- --:--:-- --:--:--  1343
+ info='{"state":"normal","resolved_ts":449546806005596169,"checkpoint_ts":449546805743452169}'
+ echo '{"state":"normal","resolved_ts":449546806005596169,"checkpoint_ts":449546805743452169}'
{"state":"normal","resolved_ts":449546806005596169,"checkpoint_ts":449546805743452169}
++ echo '{"state":"normal","resolved_ts":449546806005596169,"checkpoint_ts":449546805743452169}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
+ [[ -z '' ]]
++ echo '{"state":"normal","resolved_ts":449546806005596169,"checkpoint_ts":449546805743452169}'
++ jq -r .last_error
+ error_msg=null
+ [[ ! null == \n\u\l\l ]]
++ echo '{"state":"normal","resolved_ts":449546806005596169,"checkpoint_ts":449546805743452169}'
++ jq -r .last_warning
+ error_msg=null
+ [[ ! null == \n\u\l\l ]]
+ exit 0
run task successfully
table kafka_sink_error_resume.t1 exists
table kafka_sink_error_resume.t2 exists
check diff successfully
check diff failed 1-th time, retry later
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:56:15 CST 2024] <<<<<< run test case kafka_sink_error_resume success! >>>>>>
table test.finish_mark not exists for 1-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 20.65 secs (180361932 bytes/sec)
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] sh
[Pipeline] sh
[Pipeline] sh
table test.finish_mark not exists for 3-th check, retry later
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
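Each of the [Pipeline] sh blocks in this stretch runs the same readiness gate before tests start: nc -z confirms ZooKeeper (2181) and Kafka (9092) are listening, then the four-letter "dump" command is piped to ZooKeeper and the output is filtered for the /brokers/ids/1 znode, which proves the broker has registered itself. A condensed sketch of that gate:

# illustrative readiness gate, matching the probes traced above
nc -z localhost 2181                                   # ZooKeeper accepting connections
nc -z localhost 9092                                   # Kafka listener up
echo dump | nc localhost 2181 | grep brokers \
    | awk '{$1=$1;print}' | grep -F -w /brokers/ids/1  # broker 1 registered in ZooKeeper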
[Pipeline] sh
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
[Pipeline] sh
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
/brokers/ids/1
[Pipeline] sh
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
[Pipeline] sh
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
[Pipeline] sh
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
[Pipeline] sh
table test.finish_mark not exists for 4-th check, retry later
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
table test.finish_mark not exists for 5-th check, retry later
table test.finish_mark not exists for 6-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/mq_sink_lost_callback/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:56:27 CST 2024] <<<<<< run test case mq_sink_lost_callback success! >>>>>>
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
table test.finish_mark not exists for 7-th check, retry later
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/mq_sink_dispatcher/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] stage
[Pipeline] { (Test)
table test.finish_mark not exists for 8-th check, retry later
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 45 min
[Pipeline] {
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] withCredentials
Masking supported pattern matches of $TICDC_COVERALLS_TOKEN or $TICDC_CODECOV_TOKEN
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] {
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
start tidb cluster in /tmp/tidb_cdc_test/mq_sink_dispatcher
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
[Pipeline] // container
[Pipeline] sh
table test.finish_mark not exists for 9-th check, retry later
[Pipeline] sh
[Pipeline] sh
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G03
Run cases: row_format drop_many_tables processor_stop_delay partition_table
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=b3148747-7651-4002-b9f1-8caf0e075932
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G03
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-xnvpx
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg pingcap_tiflow_pull_cdc_integration_kafka_test_1856-xnvpx
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/row_format/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
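The blocks above and below show the per-pod test entry point: each agent wipes /tmp/tidb_cdc_test, invokes run_group.sh with the sink type and its assigned group (G03, G07, G04, G09 here), prints the "Run cases:" list for that group, dumps its environment, and then starts the first case; the find error is only the artifact collector noticing nothing has been written yet. A sketch of reproducing one group the way these pods do:

# sketch: run one of the CI groups locally
rm -rf /tmp/tidb_cdc_test && mkdir -p /tmp/tidb_cdc_test
chmod +x ./tests/integration_tests/run_group.sh
./tests/integration_tests/run_group.sh kafka G03   # G03 covers row_format, drop_many_tables, processor_stop_delay, partition_table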
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G07
Run cases: kv_client_stream_reconnect cdc split_region
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=071f1c93-c256-43e9-af52-98d0ec761618
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G07
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-h55pm
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv pingcap_tiflow_pull_cdc_integration_kafka_test_1856-h55pm
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kv_client_stream_reconnect/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
Verifying downstream PD is started...
[Pipeline] sh
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G04
Run cases: foreign_key ddl_puller_lag ddl_only_block_related_table changefeed_auto_stop
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=8b67646b-5caa-44c2-85cd-db035d4b1745
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G04
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-n0psn
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km pingcap_tiflow_pull_cdc_integration_kafka_test_1856-n0psn
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/foreign_key/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
[Pipeline] sh
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G09
Run cases: gc_safepoint changefeed_pause_resume cli_with_auth savepoint synced_status
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=d1b5445e-4893-478f-9efb-82e4da405832
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G09
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-g1835
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-g1835 pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
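Among the variables dumped above, JOB_SPEC carries the Prow presubmit definition as JSON (org, repo, base SHA, and the pull request under test). Assuming jq is available on the agent, the PR number and head SHA can be pulled out like this (illustrative, not part of the job script):

    echo "$JOB_SPEC" | jq -r '.refs.pulls[0] | "PR #\(.number) at \(.sha)"'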
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/gc_safepoint/run.sh using Sink-Type: kafka... <<=================
Attempt 1 to start the tidb cluster...
[Pipeline] sh
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G10
Run cases: default_value simple cdc_server_tips event_filter sql_mode
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=5ec73c1b-e8fe-4361-8a30-de6c891b8a74
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G10
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-bxr1t
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-bxr1t pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/default_value/run.sh using Sink-Type: kafka... <<=================
[Pipeline] sh
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G06
Run cases: sink_retry changefeed_error ddl_sequence resourcecontrol
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=935741d1-ced2-4927-b9f4-8bc5ac47d4d0
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT=tcp://10.233.0.1:443
KUBERNETES_PORT_443_TCP_PORT=443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G06
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-vv6pz
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q pingcap_tiflow_pull_cdc_integration_kafka_test_1856-vv6pz
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/sink_retry/run.sh using Sink-Type: kafka... <<=================
Attempt 1 to start the tidb cluster...
table test.finish_mark exists
check diff successfully
[Pipeline] sh
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G11
[Pipeline] // container
[Pipeline] sh
Run cases: resolve_lock move_table autorandom generate_column
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=f38c11ad-b38b-47de-868a-0c38a6ebd11a
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G11
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-jpkvb
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-jpkvb pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/resolve_lock/run.sh using Sink-Type: kafka... <<=================
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G08
Run cases: processor_err_chan changefeed_reconstruct multi_capture synced_status_with_redo
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=e7d515c5-63b5-44d2-8445-1071fd709a1f
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G08
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-l25q9
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-l25q9 pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/processor_err_chan/run.sh using Sink-Type: kafka... <<=================
Attempt 1 to start the tidb cluster...
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
[Pipeline] {
[Pipeline] cache
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G05
Run cases: charset_gbk ddl_manager multi_source
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=ff01e308-cf9f-4867-90e6-e44da63f163d
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT=tcp://10.233.0.1:443
KUBERNETES_PORT_443_TCP_PORT=443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G05
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-l0fvq
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-l0fvq pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/charset_gbk/run.sh using Sink-Type: kafka... <<=================
Attempt 1 to start the tidb cluster...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
start tidb cluster in /tmp/tidb_cdc_test/gc_safepoint
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
start tidb cluster in /tmp/tidb_cdc_test/kv_client_stream_reconnect
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
start tidb cluster in /tmp/tidb_cdc_test/foreign_key
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Attempt 1 to start the tidb cluster...
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
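The ERROR 2003 lines are expected noise while the cluster bootstraps: the harness polls the TiDB port until a connection succeeds, so each failed probe is printed and then retried. A hedged sketch of such a wait loop, assuming the mysql client is on PATH and TiDB listens on 127.0.0.1:4000:

    # Illustrative only; the repo's own helper may differ.
    for i in $(seq 1 60); do
      if mysql -h 127.0.0.1 -P 4000 -u root -e 'SELECT 1' >/dev/null 2>&1; then
        echo "TiDB is up after ${i} probes"
        break
      fi
      sleep 1
    done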
start tidb cluster in /tmp/tidb_cdc_test/row_format
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
start tidb cluster in /tmp/tidb_cdc_test/sink_retry
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
start tidb cluster in /tmp/tidb_cdc_test/charset_gbk
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Attempt 1 to start the tidb cluster...
start tidb cluster in /tmp/tidb_cdc_test/processor_err_chan
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
start tidb cluster in /tmp/tidb_cdc_test/default_value
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
start tidb cluster in /tmp/tidb_cdc_test/resolve_lock
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 7.05 secs (528290285 bytes/sec)
[Pipeline] {
[Pipeline] cache
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c723d480017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:12331, start at 2024-05-05 12:56:45.185595787 +0800 CST m=+5.190539359	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:45.192 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:45.189 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:45.189 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
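The table above is the harness echoing the mysql.tidb system table, where TiDB records bootstrap state and GC bookkeeping (leader, lease, run interval, life time, safe point); it is printed to confirm the GC worker is up before the test proceeds. The same view can be reproduced with a single query (host, port, and credentials below are assumptions):

    mysql -h 127.0.0.1 -P 4000 -u root \
      -e 'SELECT VARIABLE_NAME, VARIABLE_VALUE, COMMENT FROM mysql.tidb;'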
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c723d480017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:12331, start at 2024-05-05 12:56:45.185595787 +0800 CST m=+5.190539359	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:45.192 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:45.189 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:45.189 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c723f640003	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:12412, start at 2024-05-05 12:56:45.274496562 +0800 CST m=+5.230005333	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:45.281 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:45.273 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:45.273 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/mq_sink_dispatcher/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/mq_sink_dispatcher/tiflash/log/error.log
arg matches is ArgMatches { args: {"advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/mq_sink_dispatcher/tiflash/log/proxy.log"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/mq_sink_dispatcher/tiflash-proxy.toml"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/mq_sink_dispatcher/tiflash/db/proxy"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table test.finish_mark not exists for 1-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 2-th check, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[Sun May  5 12:56:50 CST 2024] <<<<<< START cdc server in mq_sink_dispatcher case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.mq_sink_dispatcher.1385713859.out server --log-file /tmp/tidb_cdc_test/mq_sink_dispatcher/cdc.log --log-level info --data-dir /tmp/tidb_cdc_test/mq_sink_dispatcher/cdc_data --cluster-id default
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
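
The shell trace above is one pass of the cdc-server readiness probe. Condensed into a standalone loop it looks roughly like the following; this is a sketch reconstructed from the trace (endpoint, credentials, retry bound, and grep patterns are taken from it), not the harness's own helper:

# Poll the TiCDC debug endpoint until it reports etcd info, giving up after 50 attempts.
i=0
while (( i <= 50 )); do
    res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
    if echo "$res" | grep -q 'failed to get info:'; then
        echo "cdc server returned an error, retrying"
    elif echo "$res" | grep -q 'etcd info'; then
        break                      # server is up and registered in etcd
    fi
    if (( i == 50 )); then
        echo "cdc server failed to start in time"
        exit 1
    fi
    sleep 3
    (( i++ ))
done
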
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c728620000f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv, pid:1302, start at 2024-05-05 12:56:49.81714764 +0800 CST m=+5.183501083	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:49.823 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:49.800 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:49.800 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7284a80018	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:1353, start at 2024-05-05 12:56:49.756036735 +0800 CST m=+5.107419513	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:49.762 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:49.756 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:49.756 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c72857c0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:1442, start at 2024-05-05 12:56:49.790945832 +0800 CST m=+5.084081243	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:49.800 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:49.759 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:49.759 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/gc_safepoint/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/gc_safepoint/tiflash/log/error.log
arg matches is ArgMatches { args: {"log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/gc_safepoint/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/gc_safepoint/tiflash-proxy.toml"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/gc_safepoint/tiflash/db/proxy"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7286e40013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q, pid:1291, start at 2024-05-05 12:56:49.870825609 +0800 CST m=+5.076707835	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:49.877 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:49.849 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:49.849 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7287b00014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q, pid:1366, start at 2024-05-05 12:56:49.926174796 +0800 CST m=+5.077464510	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:49.932 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:49.900 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:49.900 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/sink_retry/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/sink_retry/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/sink_retry/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/sink_retry/tiflash-proxy.toml"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/sink_retry/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table test.finish_mark not exists for 3-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c72973c001d	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg, pid:1347, start at 2024-05-05 12:56:50.944403803 +0800 CST m=+5.159991762	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:50.951 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:50.944 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:50.944 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7298cc0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg, pid:1422, start at 2024-05-05 12:56:51.021845095 +0800 CST m=+5.182146850	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:51.027 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:50.995 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:50.995 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c729f480010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85, pid:1347, start at 2024-05-05 12:56:51.42627705 +0800 CST m=+5.329636286	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:51.434 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:51.410 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:51.410 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/row_format/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/row_format/tiflash/log/error.log
arg matches is ArgMatches { args: {"log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/row_format/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/row_format/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/row_format/tiflash-proxy.toml"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c728a0c0018	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km, pid:1409, start at 2024-05-05 12:56:50.072791461 +0800 CST m=+5.161417052	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:50.081 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:50.051 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:50.051 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c728934000d	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km, pid:1483, start at 2024-05-05 12:56:50.01072127 +0800 CST m=+5.044391398	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:50.017 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:49.997 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:49.997 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.sink_retry.cli.2812.out cli tso query --pd=http://127.0.0.1:2379
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c728620000f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv, pid:1302, start at 2024-05-05 12:56:49.81714764 +0800 CST m=+5.183501083	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:49.823 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:49.800 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:49.800 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7286d00016	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv, pid:1384, start at 2024-05-05 12:56:49.893737137 +0800 CST m=+5.207956225	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:49.901 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:49.894 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:49.894 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kv_client_stream_reconnect/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kv_client_stream_reconnect/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kv_client_stream_reconnect/tiflash/db/proxy"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kv_client_stream_reconnect/tiflash/log/proxy.log"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kv_client_stream_reconnect/tiflash-proxy.toml"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/foreign_key/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/foreign_key/tiflash/log/error.log
arg matches is ArgMatches { args: {"pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/foreign_key/tiflash/db/proxy"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/foreign_key/tiflash/log/proxy.log"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/foreign_key/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[Sun May  5 12:56:52 CST 2024] <<<<<< START cdc server in gc_safepoint case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS='github.com/pingcap/tiflow/pkg/txnutil/gc/InjectGcSafepointUpdateInterval=return(500)'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.gc_safepoint.28252827.out server --log-file /tmp/tidb_cdc_test/gc_safepoint/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/gc_safepoint/cdc_data --cluster-id default --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:56:53 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/9f1342b6-6e2d-40a5-8859-6f685cba12a9
	{"id":"9f1342b6-6e2d-40a5-8859-6f685cba12a9","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885010}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c635cd0
	9f1342b6-6e2d-40a5-8859-6f685cba12a9

/tidb/cdc/default/default/upstream/7365374968019435740
	{"id":7365374968019435740,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/9f1342b6-6e2d-40a5-8859-6f685cba12a9
	{"id":"9f1342b6-6e2d-40a5-8859-6f685cba12a9","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885010}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c635cd0
	9f1342b6-6e2d-40a5-8859-6f685cba12a9

/tidb/cdc/default/default/upstream/7365374968019435740
	{"id":7365374968019435740,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/9f1342b6-6e2d-40a5-8859-6f685cba12a9
	{"id":"9f1342b6-6e2d-40a5-8859-6f685cba12a9","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885010}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c635cd0
	9f1342b6-6e2d-40a5-8859-6f685cba12a9

/tidb/cdc/default/default/upstream/7365374968019435740
	{"id":7365374968019435740,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
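
The server in this case was started with a failpoint injected through the GO_FAILPOINTS environment variable, as the trace above shows. The pattern, reduced to its essentials (failpoint value and flags are copied from the trace; the coverage-profile name is simplified here), is:

# Inject a failpoint (standard failpoint "return(...)" syntax) before launching the server,
# then start cdc in the background so the readiness probe can poll it.
export GO_FAILPOINTS='github.com/pingcap/tiflow/pkg/txnutil/gc/InjectGcSafepointUpdateInterval=return(500)'
cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.gc_safepoint.out server \
    --log-file /tmp/tidb_cdc_test/gc_safepoint/cdc.log --log-level debug \
    --data-dir /tmp/tidb_cdc_test/gc_safepoint/cdc_data \
    --cluster-id default --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379 &
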
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.mq_sink_dispatcher.cli.13935.out cli tso query --pd=http://127.0.0.1:2379
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c72ace00018	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:1420, start at 2024-05-05 12:56:52.329798133 +0800 CST m=+5.055755155	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:52.336 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:52.330 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:52.330 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c72afe00003	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:1506, start at 2024-05-05 12:56:52.473002048 +0800 CST m=+5.143388019	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:52.480 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:52.472 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:52.472 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/default_value/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/default_value/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/default_value/tiflash/log/proxy.log"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/default_value/tiflash-proxy.toml"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/default_value/tiflash/db/proxy"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table test.finish_mark not exists for 4-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c729f480010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85, pid:1347, start at 2024-05-05 12:56:51.42627705 +0800 CST m=+5.329636286	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:51.434 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:51.410 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:51.410 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c72a0140014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85, pid:1427, start at 2024-05-05 12:56:51.494721215 +0800 CST m=+5.341458901	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:51.501 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:51.461 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:51.461 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/charset_gbk/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/charset_gbk/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/charset_gbk/tiflash/db/proxy"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/charset_gbk/tiflash/log/proxy.log"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/charset_gbk/tiflash-proxy.toml"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 6.64 secs (560576457 bytes/sec)
[Pipeline] {
[Pipeline] cache
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c72a2580013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:1349, start at 2024-05-05 12:56:51.628419057 +0800 CST m=+5.186767640	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:51.635 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:51.606 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:51.606 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c72a3040015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:1431, start at 2024-05-05 12:56:51.693650699 +0800 CST m=+5.201801341	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:51.702 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:51.699 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:51.699 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/processor_err_chan/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/processor_err_chan/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/processor_err_chan/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/processor_err_chan/tiflash-proxy.toml"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/processor_err_chan/tiflash/log/proxy.log"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ set +x
+ tso='449546816965836801
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546816965836801 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
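The tso query / awk pair traced above collapses into a small helper; a minimal sketch of that pattern, assuming the cdc.test binary from this log is on PATH (the helper name and default PD address are illustrative, not taken from the test scripts):

    # Query a TSO through the cdc CLI and keep only the timestamp, dropping the
    # trailing "PASS" / coverage lines that cdc.test appends on exit.
    get_start_ts() {
        local pd_addr=${1:-http://127.0.0.1:2379}
        cdc.test cli tso query --pd="${pd_addr}" | head -n1 | awk -F ' ' '{print $1}'
    }

    start_ts=$(get_start_ts)   # e.g. 449546816965836801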
***************** properties *****************
"mysql.port"="4000"
"mysql.user"="root"
"insertproportion"="0"
"threadcount"="2"
"requestdistribution"="uniform"
"operationcount"="0"
"mysql.db"="sink_retry"
"dotransactions"="false"
"mysql.host"="127.0.0.1"
"recordcount"="10"
"workload"="core"
"updateproportion"="0"
"scanproportion"="0"
"readallfields"="true"
"readproportion"="0"
**********************************************
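The property dump above is the workload definition fed to the YCSB-style loader that seeds the sink_retry database; a hypothetical equivalent invocation, assuming the go-ycsb CLI with -p key=value properties (the binary name and flags are assumptions, they do not appear in this log):

    # Load 10 rows into sink_retry over the MySQL protocol with 2 threads,
    # mirroring the properties printed above.
    go-ycsb load mysql \
        -p mysql.host=127.0.0.1 -p mysql.port=4000 -p mysql.user=root -p mysql.db=sink_retry \
        -p workload=core -p recordcount=10 -p operationcount=0 -p threadcount=2 \
        -p readproportion=0 -p updateproportion=0 -p scanproportion=0 -p insertproportion=0 \
        -p requestdistribution=uniform -p dotransactions=false -p readallfields=true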
[Sun May  5 12:56:54 CST 2024] <<<<<< START cdc server in kv_client_stream_reconnect case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/kv/kvClientForceReconnect=return(true)'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kv_client_stream_reconnect.28312833.out server --log-file /tmp/tidb_cdc_test/kv_client_stream_reconnect/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kv_client_stream_reconnect/cdc_data --cluster-id default --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
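The interleaved curl / grep / sleep lines above and below are single iterations of the cdc server readiness poll; reconstructed from this trace as one loop (the function wrapper is added here for readability, the commands follow the trace):

    # Poll the cdc status endpoint until "etcd info" appears, at most 50 attempts.
    wait_cdc_ready() {
        local i res
        for (( i = 0; i <= 50; i++ )); do
            res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info \
                       --user ticdc:ticdc_secret 2>&1) || true
            echo "$res" | grep -q 'failed to get info:' && { echo "cdc server reported an error"; return 1; }
            echo "$res" | grep -q 'etcd info' && return 0   # capture is registered in etcd
            [ "$i" -eq 50 ] && { echo "cdc server failed to start in time"; return 1; }
            sleep 3
        done
    }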
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.foreign_key.cli.2812.out cli tso query --pd=http://127.0.0.1:2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.row_format.cli.2810.out cli tso query --pd=http://127.0.0.1:2379
Run finished, takes 8.705396ms
INSERT - Takes(s): 0.0, Count: 10, OPS: 2088.9, Avg(us): 1644, Min(us): 959, Max(us): 3839, 95th(us): 4000, 99th(us): 4000
[Sun May  5 12:56:55 CST 2024] <<<<<< START cdc server in sink_retry case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/sink/dmlsink/txn/mysql/MySQLSinkTxnRandomError=25%return(true)'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.sink_retry.28692871.out server --log-file /tmp/tidb_cdc_test/sink_retry/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/sink_retry/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
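Both server starts above enable failpoints through the GO_FAILPOINTS environment variable (pingcap/failpoint term syntax, visible verbatim in the trace); a reduced sketch of the two activations, with the server flags trimmed to the ones shown earlier in this log and the coverage/log options omitted:

    # kv_client_stream_reconnect case: force the kv client to drop and re-open its streams.
    GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/kv/kvClientForceReconnect=return(true)' \
        cdc.test server --pd http://127.0.0.1:2379 --addr 127.0.0.1:8300 &

    # sink_retry case: make roughly 25% of MySQL sink transactions fail so retries are exercised.
    GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/sink/dmlsink/txn/mysql/MySQLSinkTxnRandomError=25%return(true)' \
        cdc.test server --pd http://127.0.0.1:2379 --addr 127.0.0.1:8300 &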
+ set +x
+ tso='449546817094025219
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546817094025219 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.mq_sink_dispatcher.cli.13968.out cli changefeed create --start-ts=449546817094025219 '--sink-uri=kafka://127.0.0.1:9092/dispatcher-test?protocol=canal-json&enable-tidb-extension=true' -c test --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/mq_sink_dispatcher/conf/changefeed.toml
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.default_value.cli.2894.out cli tso query --pd=http://127.0.0.1:2379
table test.finish_mark not exists for 5-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c72d2280005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7, pid:1482, start at 2024-05-05 12:56:54.670933723 +0800 CST m=+5.067210826	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:54.677 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:54.666 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:54.666 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c72d2280005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7, pid:1482, start at 2024-05-05 12:56:54.670933723 +0800 CST m=+5.067210826	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:54.677 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:54.666 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:54.666 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c72d2dc0015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7, pid:1564, start at 2024-05-05 12:56:54.73498002 +0800 CST m=+5.083077559	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:58:54.741 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:56:54.711 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:46:54.711 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/resolve_lock/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/resolve_lock/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/resolve_lock/tiflash-proxy.toml"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/resolve_lock/tiflash/log/proxy.log"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/resolve_lock/tiflash/db/proxy"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Create changefeed successfully!
ID: test
Info: {"upstream_id":7365374968019435740,"namespace":"default","id":"test","sink_uri":"kafka://127.0.0.1:9092/dispatcher-test?protocol=canal-json\u0026enable-tidb-extension=true","create_time":"2024-05-05T12:56:55.835235718+08:00","start_ts":449546817094025219,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"dispatchers":[{"matcher":["verify.t"],"partition":"index-value"},{"matcher":["dispatcher.index"],"partition":"index-value","index":"idx_a"}],"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546817094025219,"checkpoint_ts":449546817094025219,"checkpoint_time":"2024-05-05 12:56:53.939"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
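The dispatchers array in the Info JSON above mirrors the changefeed.toml passed with --config; a reconstruction of the relevant sink section, inferred from that JSON rather than copied from the repository file (the output path is illustrative, the real file lives under tests/integration_tests/mq_sink_dispatcher/conf/changefeed.toml):

    # Reproduce the dispatcher rules shown in the changefeed info as a config file.
    {
        echo '[sink]'
        echo 'dispatchers = ['
        echo '    { matcher = ["verify.t"], partition = "index-value" },'
        echo '    { matcher = ["dispatcher.index"], partition = "index-value", index = "idx_a" },'
        echo ']'
    } > /tmp/changefeed.toml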
+ set +x
+ tso='449546817397063682
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546817397063682 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 12:56:56 CST 2024] <<<<<< START cdc server in row_format case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ '[' -z '' ']'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.row_format.28502852.out server --log-file /tmp/tidb_cdc_test/row_format/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/row_format/cdc_data --cluster-id default
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:56:56 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/f02ebb9c-825f-4552-a4f3-d9bd3b01a4f2
	{"id":"f02ebb9c-825f-4552-a4f3-d9bd3b01a4f2","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885013}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c778cc4
	f02ebb9c-825f-4552-a4f3-d9bd3b01a4f2

/tidb/cdc/default/default/upstream/7365374996549856445
	{"id":7365374996549856445,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/f02ebb9c-825f-4552-a4f3-d9bd3b01a4f2
	{"id":"f02ebb9c-825f-4552-a4f3-d9bd3b01a4f2","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885013}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c778cc4
	f02ebb9c-825f-4552-a4f3-d9bd3b01a4f2

/tidb/cdc/default/default/upstream/7365374996549856445
	{"id":7365374996549856445,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/f02ebb9c-825f-4552-a4f3-d9bd3b01a4f2
	{"id":"f02ebb9c-825f-4552-a4f3-d9bd3b01a4f2","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885013}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c778cc4
	f02ebb9c-825f-4552-a4f3-d9bd3b01a4f2

/tidb/cdc/default/default/upstream/7365374996549856445
	{"id":7365374996549856445,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
[Sun May  5 12:56:56 CST 2024] <<<<<< START kafka consumer in gc_safepoint case >>>>>>
0
check diff failed 1-th time, retry later
[Sun May  5 12:56:56 CST 2024] <<<<<< START cdc server in charset_gbk case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.charset_gbk.28572859.out server --log-file /tmp/tidb_cdc_test/charset_gbk/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/charset_gbk/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ set +x
+ tso='449546817516339201
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546817516339201 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 12:56:57 CST 2024] <<<<<< START cdc server in foreign_key case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.foreign_key.28502852.out server --log-file /tmp/tidb_cdc_test/foreign_key/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/foreign_key/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ set +x
check_changefeed_state http://127.0.0.1:2379 test normal null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=test
+ expected_state=normal
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c test -s
+ info='{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "normal",
  "checkpoint_tso": 449546817094025219,
  "checkpoint_time": "2024-05-05 12:56:53.939",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "normal",
  "checkpoint_tso": 449546817094025219,
  "checkpoint_time": "2024-05-05 12:56:53.939",
  "error": null
}'
{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "normal",
  "checkpoint_tso": 449546817094025219,
  "checkpoint_time": "2024-05-05 12:56:53.939",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374968019435740, '"namespace":' '"default",' '"id":' '"test",' '"state":' '"normal",' '"checkpoint_tso":' 449546817094025219, '"checkpoint_time":' '"2024-05-05' '12:56:53.939",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365374968019435740, '"namespace":' '"default",' '"id":' '"test",' '"state":' '"normal",' '"checkpoint_tso":' 449546817094025219, '"checkpoint_time":' '"2024-05-05' '12:56:53.939",' '"error":' null '}'
++ jq -r .error.message
+ set +x
+ tso='449546817615953922
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546817615953922 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 12:56:57 CST 2024] <<<<<< START cdc server in default_value case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.default_value.29282930.out server --log-file /tmp/tidb_cdc_test/default_value/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/default_value/cdc_data --cluster-id default
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table test.finish_mark exists
check diff successfully
+ message=null
+ [[ ! null =~ null ]]
run task successfully
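The trace above is one pass of the changefeed state check; the same logic as a compact sketch, reconstructed from the commands visible in this log (the helper in the repository may differ in details such as the error-message wording):

    # Query a changefeed and assert on its state and error message.
    check_changefeed_state() {
        local endpoints=$1 changefeed_id=$2 expected_state=$3 error_msg=$4
        local info state message
        info=$(cdc cli changefeed query --pd="${endpoints}" -c "${changefeed_id}" -s)
        state=$(echo "${info}" | jq -r .state)
        if [[ ! "${state}" == "${expected_state}" ]]; then
            echo "changefeed state ${state} does not equal to ${expected_state}"
            exit 1    # the check runs in a subshell, so this only fails the current attempt
        fi
        message=$(echo "${info}" | jq -r .error.message)
        if [[ ! "${message}" =~ ${error_msg} ]]; then
            echo "changefeed error message ${message} does not match ${error_msg}"
            exit 1
        fi
    }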
check_changefeed_state http://127.0.0.1:2379 test failed ErrDispatcherFailed
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=test
+ expected_state=failed
+ error_msg=ErrDispatcherFailed
+ tls_dir=ErrDispatcherFailed
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c test -s
+ info='{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "normal",
  "checkpoint_tso": 449546817094025219,
  "checkpoint_time": "2024-05-05 12:56:53.939",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "normal",
  "checkpoint_tso": 449546817094025219,
  "checkpoint_time": "2024-05-05 12:56:53.939",
  "error": null
}'
{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "normal",
  "checkpoint_tso": 449546817094025219,
  "checkpoint_time": "2024-05-05 12:56:53.939",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374968019435740, '"namespace":' '"default",' '"id":' '"test",' '"state":' '"normal",' '"checkpoint_tso":' 449546817094025219, '"checkpoint_time":' '"2024-05-05' '12:56:53.939",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \f\a\i\l\e\d ]]
+ echo 'changefeed state normal does not equal to failed'
changefeed state normal does not equal to failed
+ exit 1
run task failed 1-th time, retry later
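The run-task lines above and below come from a retry wrapper around checks such as check_changefeed_state; a hypothetical version of that wrapper (the name, attempt count, and delay are assumptions; only the success/failure messages are taken from this log):

    # Re-run a check in a subshell until it succeeds, up to max_retry attempts.
    retry_task() {
        local max_retry=$1; shift
        local i
        for (( i = 1; i <= max_retry; i++ )); do
            if ( "$@" ); then            # subshell, so an inner 'exit 1' only fails this attempt
                echo "run task successfully"
                return 0
            fi
            echo "run task failed ${i}-th time, retry later"
            sleep 2
        done
        return 1
    }

    # e.g. retry_task 10 check_changefeed_state http://127.0.0.1:2379 test failed ErrDispatcherFailed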
wait process cdc.test exit for 1-th time...
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.resolve_lock.cli.2887.out cli tso query --pd=http://127.0.0.1:2379
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:56:58 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/5a35c9e1-12ff-4726-8887-2653a0993b07
	{"id":"5a35c9e1-12ff-4726-8887-2653a0993b07","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885015}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7591c9
	5a35c9e1-12ff-4726-8887-2653a0993b07

/tidb/cdc/default/default/upstream/7365374998242431469
	{"id":7365374998242431469,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/5a35c9e1-12ff-4726-8887-2653a0993b07
	{"id":"5a35c9e1-12ff-4726-8887-2653a0993b07","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885015}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7591c9
	5a35c9e1-12ff-4726-8887-2653a0993b07

/tidb/cdc/default/default/upstream/7365374998242431469
	{"id":7365374998242431469,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/5a35c9e1-12ff-4726-8887-2653a0993b07
	{"id":"5a35c9e1-12ff-4726-8887-2653a0993b07","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885015}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7591c9
	5a35c9e1-12ff-4726-8887-2653a0993b07

/tidb/cdc/default/default/upstream/7365374998242431469
	{"id":7365374998242431469,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
[Sun May  5 12:56:58 CST 2024] <<<<<< START kafka consumer in kv_client_stream_reconnect case >>>>>>
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:56:58 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/1863a8e7-b588-4584-86f4-a3d7092a5835
	{"id":"1863a8e7-b588-4584-86f4-a3d7092a5835","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885015}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7a0fe6
	1863a8e7-b588-4584-86f4-a3d7092a5835

/tidb/cdc/default/default/upstream/7365374996658467769
	{"id":7365374996658467769,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/1863a8e7-b588-4584-86f4-a3d7092a5835
	{"id":"1863a8e7-b588-4584-86f4-a3d7092a5835","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885015}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7a0fe6
	1863a8e7-b588-4584-86f4-a3d7092a5835

/tidb/cdc/default/default/upstream/7365374996658467769
	{"id":7365374996658467769,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/1863a8e7-b588-4584-86f4-a3d7092a5835
	{"id":"1863a8e7-b588-4584-86f4-a3d7092a5835","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885015}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7a0fe6
	1863a8e7-b588-4584-86f4-a3d7092a5835

/tidb/cdc/default/default/upstream/7365374996658467769
	{"id":7365374996658467769,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.sink_retry.cli.2930.out cli changefeed create --start-ts=449546816965836801 '--sink-uri=kafka://127.0.0.1:9092/ticdc-sink-retry-test-28077?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
wait process cdc.test exit for 2-th time...
Create changefeed successfully!
ID: 684f53d4-3942-4968-a633-55f86d26d502
Info: {"upstream_id":7365374996658467769,"namespace":"default","id":"684f53d4-3942-4968-a633-55f86d26d502","sink_uri":"kafka://127.0.0.1:9092/ticdc-sink-retry-test-28077?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:56:58.675870768+08:00","start_ts":449546816965836801,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546816965836801,"checkpoint_ts":449546816965836801,"checkpoint_time":"2024-05-05 12:56:53.450"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:56:59 CST 2024] <<<<<< run test case kafka_simple_basic success! >>>>>>
check diff failed 2-th time, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:56:59 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/dd298b26-204f-4a22-8b52-8e2d3df30dbc
	{"id":"dd298b26-204f-4a22-8b52-8e2d3df30dbc","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885016}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7d38cc
	dd298b26-204f-4a22-8b52-8e2d3df30dbc

/tidb/cdc/default/default/upstream/7365375008015752437
	{"id":7365375008015752437,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/dd298b26-204f-4a22-8b52-8e2d3df30dbc
	{"id":"dd298b26-204f-4a22-8b52-8e2d3df30dbc","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885016}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7d38cc
	dd298b26-204f-4a22-8b52-8e2d3df30dbc

/tidb/cdc/default/default/upstream/7365375008015752437
	{"id":7365375008015752437,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/dd298b26-204f-4a22-8b52-8e2d3df30dbc
	{"id":"dd298b26-204f-4a22-8b52-8e2d3df30dbc","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885016}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7d38cc
	dd298b26-204f-4a22-8b52-8e2d3df30dbc

/tidb/cdc/default/default/upstream/7365375008015752437
	{"id":7365375008015752437,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ set +x
+ tso='449546818215215105
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546818215215105 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 12:56:59 CST 2024] <<<<<< START cdc server in resolve_lock case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.resolve_lock.29232925.out server --log-file /tmp/tidb_cdc_test/resolve_lock/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/resolve_lock/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:56:59 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/6c73936a-bb2f-4c91-a83f-279dc7cc69b8
	{"id":"6c73936a-bb2f-4c91-a83f-279dc7cc69b8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885016}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7a09d1
	6c73936a-bb2f-4c91-a83f-279dc7cc69b8

/tidb/cdc/default/default/upstream/7365375002654228359
	{"id":7365375002654228359,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/6c73936a-bb2f-4c91-a83f-279dc7cc69b8
	{"id":"6c73936a-bb2f-4c91-a83f-279dc7cc69b8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885016}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7a09d1
	6c73936a-bb2f-4c91-a83f-279dc7cc69b8

/tidb/cdc/default/default/upstream/7365375002654228359
	{"id":7365375002654228359,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/6c73936a-bb2f-4c91-a83f-279dc7cc69b8
	{"id":"6c73936a-bb2f-4c91-a83f-279dc7cc69b8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885016}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7a09d1
	6c73936a-bb2f-4c91-a83f-279dc7cc69b8

/tidb/cdc/default/default/upstream/7365375002654228359
	{"id":7365375002654228359,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.row_format.cli.2903.out cli changefeed create --start-ts=449546817397063682 '--sink-uri=kafka://127.0.0.1:9092/ticdc-row-format-test-2910?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
Create changefeed successfully!
ID: 03f2dbb8-f905-4bf5-9fc6-c77e743740e4
Info: {"upstream_id":7365375008015752437,"namespace":"default","id":"03f2dbb8-f905-4bf5-9fc6-c77e743740e4","sink_uri":"mysql://normal:xxxxx@127.0.0.1:3306/","create_time":"2024-05-05T12:56:59.809886099+08:00","start_ts":449546817754890243,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546817754890243,"checkpoint_ts":449546817754890243,"checkpoint_time":"2024-05-05 12:56:56.460"}
[Sun May  5 12:56:59 CST 2024] <<<<<< START kafka consumer in charset_gbk case >>>>>>
+ set +x
[Sun May  5 12:57:00 CST 2024] <<<<<< START kafka consumer in sink_retry case >>>>>>
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:57:00 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/269f71b1-455a-41f0-90ea-10970a7b6093
	{"id":"269f71b1-455a-41f0-90ea-10970a7b6093","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885017}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c768cd4
	269f71b1-455a-41f0-90ea-10970a7b6093

/tidb/cdc/default/default/upstream/7365374998499002207
	{"id":7365374998499002207,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/269f71b1-455a-41f0-90ea-10970a7b6093
	{"id":"269f71b1-455a-41f0-90ea-10970a7b6093","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885017}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c768cd4
	269f71b1-455a-41f0-90ea-10970a7b6093

/tidb/cdc/default/default/upstream/7365374998499002207
	{"id":7365374998499002207,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/269f71b1-455a-41f0-90ea-10970a7b6093
	{"id":"269f71b1-455a-41f0-90ea-10970a7b6093","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885017}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c768cd4
	269f71b1-455a-41f0-90ea-10970a7b6093

/tidb/cdc/default/default/upstream/7365374998499002207
	{"id":7365374998499002207,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.foreign_key.cli.2908.out cli changefeed create --start-ts=449546817516339201 '--sink-uri=kafka://127.0.0.1:9092/ticdc-foreign-key-test-23817?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
Create changefeed successfully!
ID: d381c350-ba80-47fb-840b-f16893c80db7
Info: {"upstream_id":7365375002654228359,"namespace":"default","id":"d381c350-ba80-47fb-840b-f16893c80db7","sink_uri":"kafka://127.0.0.1:9092/ticdc-row-format-test-2910?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:57:00.226611241+08:00","start_ts":449546817397063682,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546817397063682,"checkpoint_ts":449546817397063682,"checkpoint_time":"2024-05-05 12:56:55.095"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
check_changefeed_state http://127.0.0.1:2379 test failed ErrDispatcherFailed
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=test
+ expected_state=failed
+ error_msg=ErrDispatcherFailed
+ tls_dir=ErrDispatcherFailed
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c test -s
+ info='{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "normal",
  "checkpoint_tso": 449546818063958044,
  "checkpoint_time": "2024-05-05 12:56:57.639",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "normal",
  "checkpoint_tso": 449546818063958044,
  "checkpoint_time": "2024-05-05 12:56:57.639",
  "error": null
}'
{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "normal",
  "checkpoint_tso": 449546818063958044,
  "checkpoint_time": "2024-05-05 12:56:57.639",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374968019435740, '"namespace":' '"default",' '"id":' '"test",' '"state":' '"normal",' '"checkpoint_tso":' 449546818063958044, '"checkpoint_time":' '"2024-05-05' '12:56:57.639",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \f\a\i\l\e\d ]]
+ echo 'changefeed state normal does not equal to failed'
changefeed state normal does not equal to failed
+ exit 1
run task failed 2-th time, retry later
Create changefeed successfully!
ID: e084f1c6-7208-4e0f-9b1e-94ab4d115627
Info: {"upstream_id":7365374998499002207,"namespace":"default","id":"e084f1c6-7208-4e0f-9b1e-94ab4d115627","sink_uri":"kafka://127.0.0.1:9092/ticdc-foreign-key-test-23817?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:57:00.569614493+08:00","start_ts":449546817516339201,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546817516339201,"checkpoint_ts":449546817516339201,"checkpoint_time":"2024-05-05 12:56:55.550"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:57:00 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c0403ba4-6842-47e9-bffe-d3b8b7c41106
	{"id":"c0403ba4-6842-47e9-bffe-d3b8b7c41106","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885017}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c85b4cf
	c0403ba4-6842-47e9-bffe-d3b8b7c41106

/tidb/cdc/default/default/upstream/7365375012014775215
	{"id":7365375012014775215,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c0403ba4-6842-47e9-bffe-d3b8b7c41106
	{"id":"c0403ba4-6842-47e9-bffe-d3b8b7c41106","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885017}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c85b4cf
	c0403ba4-6842-47e9-bffe-d3b8b7c41106

/tidb/cdc/default/default/upstream/7365375012014775215
	{"id":7365375012014775215,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ grep -q 'etcd info'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c0403ba4-6842-47e9-bffe-d3b8b7c41106
	{"id":"c0403ba4-6842-47e9-bffe-d3b8b7c41106","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885017}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c85b4cf
	c0403ba4-6842-47e9-bffe-d3b8b7c41106

/tidb/cdc/default/default/upstream/7365375012014775215
	{"id":7365375012014775215,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ break
+ set +x
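(Annotation: the trace above is the harness polling the CDC server's /debug/info endpoint until the dump contains etcd metadata. A minimal sketch of that loop, reconstructed from the traced commands; the function name is illustrative, the URL, credentials, retry count and 3s sleep are the ones visible in this log.)

    # Poll the CDC HTTP API until the debug dump reports etcd info,
    # mirroring the traced loop: up to 50 attempts, 3 seconds apart.
    wait_for_cdc_ready() {
        local addr="127.0.0.1:8300"
        for i in $(seq 1 50); do
            res=$(curl -vsL --max-time 20 "http://${addr}/debug/info" --user ticdc:ticdc_secret)
            if echo "$res" | grep -q 'failed to get info:'; then
                echo "cdc server returned an error, retrying"
            elif echo "$res" | grep -q 'etcd info'; then
                return 0    # server is up and registered in etcd
            fi
            sleep 3
        done
        return 1            # gave up after 50 attempts
    }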
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.default_value.cli.2995.out cli changefeed create --start-ts=449546817615953922 '--sink-uri=kafka://127.0.0.1:9092/ticdc-default-value-test-16885?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
check diff failed 3-th time, retry later
Create changefeed successfully!
ID: bbaf700e-54bf-4c1f-88bd-685fa7718511
Info: {"upstream_id":7365375012014775215,"namespace":"default","id":"bbaf700e-54bf-4c1f-88bd-685fa7718511","sink_uri":"kafka://127.0.0.1:9092/ticdc-default-value-test-16885?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:57:00.958836714+08:00","start_ts":449546817615953922,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546817615953922,"checkpoint_ts":449546817615953922,"checkpoint_time":"2024-05-05 12:56:55.930"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ set +x
[Sun May  5 12:57:01 CST 2024] <<<<<< START kafka consumer in row_format case >>>>>>
+ set +x
[Sun May  5 12:57:02 CST 2024] <<<<<< START kafka consumer in foreign_key case >>>>>>
+ set +x
[Sun May  5 12:57:02 CST 2024] <<<<<< START kafka consumer in default_value case >>>>>>
go: downloading github.com/google/uuid v1.6.0
go: downloading go.uber.org/zap v1.27.0
go: downloading github.com/pingcap/errors v0.11.5-0.20240318064555-6bd07397691f
go: downloading github.com/pingcap/log v1.1.1-0.20240314023424-862ccc32f18d
go: downloading golang.org/x/time v0.5.0
go: downloading github.com/pingcap/failpoint v0.0.0-20220801062533-2eaa32854a6c
go: downloading golang.org/x/sync v0.7.0
go: downloading github.com/BurntSushi/toml v1.3.2
go: downloading github.com/pingcap/tidb v1.1.0-beta.0.20240415145106-cd9c676e9ba4
go: downloading github.com/pingcap/tidb-tools v0.0.0-20240305021104-9f9bea84490b
go: downloading google.golang.org/grpc v1.62.1
go: downloading github.com/go-sql-driver/mysql v1.7.1
go: downloading gopkg.in/natefinch/lumberjack.v2 v2.2.1
go: downloading go.uber.org/atomic v1.11.0
go: downloading go.uber.org/multierr v1.11.0
go: downloading github.com/pingcap/tidb/pkg/parser v0.0.0-20240410110152-5fc42c9be2f5
go: downloading github.com/coreos/go-semver v0.3.1
go: downloading golang.org/x/sys v0.19.0
go: downloading google.golang.org/genproto/googleapis/rpc v0.0.0-20240401170217-c3f982113cda
go: downloading google.golang.org/protobuf v1.33.0
go: downloading golang.org/x/net v0.24.0
go: downloading github.com/golang/protobuf v1.5.4
go: downloading google.golang.org/genproto v0.0.0-20240401170217-c3f982113cda
go: downloading golang.org/x/text v0.14.0
check diff successfully
check_safepoint_forward http://127.0.0.1:2379 7365374996549856445 449546819130097665 449546817675198468
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:57:02 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/603f137c-4bec-4713-8958-3b827c9030fd
	{"id":"603f137c-4bec-4713-8958-3b827c9030fd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885020}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c8aeccd
	603f137c-4bec-4713-8958-3b827c9030fd

/tidb/cdc/default/default/upstream/7365375018838600413
	{"id":7365375018838600413,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/603f137c-4bec-4713-8958-3b827c9030fd
	{"id":"603f137c-4bec-4713-8958-3b827c9030fd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885020}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c8aeccd
	603f137c-4bec-4713-8958-3b827c9030fd

/tidb/cdc/default/default/upstream/7365375018838600413
	{"id":7365375018838600413,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/603f137c-4bec-4713-8958-3b827c9030fd
	{"id":"603f137c-4bec-4713-8958-3b827c9030fd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885020}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c8aeccd
	603f137c-4bec-4713-8958-3b827c9030fd

/tidb/cdc/default/default/upstream/7365375018838600413
	{"id":7365375018838600413,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.resolve_lock.cli.2980.out cli changefeed create --start-ts=449546818215215105 '--sink-uri=kafka://127.0.0.1:9092/ticdc-resolve-lock-test-32386?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
Create changefeed successfully!
ID: 97714657-5721-4b3d-9315-75a363b084a8
Info: {"upstream_id":7365375018838600413,"namespace":"default","id":"97714657-5721-4b3d-9315-75a363b084a8","sink_uri":"kafka://127.0.0.1:9092/ticdc-resolve-lock-test-32386?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:57:03.206703864+08:00","start_ts":449546818215215105,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546818215215105,"checkpoint_ts":449546818215215105,"checkpoint_time":"2024-05-05 12:56:58.216"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
run task successfully
check_changefeed_state http://127.0.0.1:2379 ecb9af18-3e60-45b0-a052-3f7dd963394c stopped null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=ecb9af18-3e60-45b0-a052-3f7dd963394c
+ expected_state=stopped
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c ecb9af18-3e60-45b0-a052-3f7dd963394c -s
+ info='{
  "upstream_id": 7365374996549856445,
  "namespace": "default",
  "id": "ecb9af18-3e60-45b0-a052-3f7dd963394c",
  "state": "stopped",
  "checkpoint_tso": 449546819654647810,
  "checkpoint_time": "2024-05-05 12:57:03.707",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374996549856445,
  "namespace": "default",
  "id": "ecb9af18-3e60-45b0-a052-3f7dd963394c",
  "state": "stopped",
  "checkpoint_tso": 449546819654647810,
  "checkpoint_time": "2024-05-05 12:57:03.707",
  "error": null
}'
{
  "upstream_id": 7365374996549856445,
  "namespace": "default",
  "id": "ecb9af18-3e60-45b0-a052-3f7dd963394c",
  "state": "stopped",
  "checkpoint_tso": 449546819654647810,
  "checkpoint_time": "2024-05-05 12:57:03.707",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374996549856445, '"namespace":' '"default",' '"id":' '"ecb9af18-3e60-45b0-a052-3f7dd963394c",' '"state":' '"stopped",' '"checkpoint_tso":' 449546819654647810, '"checkpoint_time":' '"2024-05-05' '12:57:03.707",' '"error":' null '}'
++ jq -r .state
+ state=stopped
+ [[ ! stopped == \s\t\o\p\p\e\d ]]
++ echo '{' '"upstream_id":' 7365374996549856445, '"namespace":' '"default",' '"id":' '"ecb9af18-3e60-45b0-a052-3f7dd963394c",' '"state":' '"stopped",' '"checkpoint_tso":' 449546819654647810, '"checkpoint_time":' '"2024-05-05' '12:57:03.707",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
run task successfully
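(Annotation: check_changefeed_state itself is not printed, but the commands it runs are all in the trace above: a `cdc cli changefeed query ... -s`, then jq extracts of .state and .error.message compared against the expected values. A sketch of the same check, assuming `cdc` and `jq` are on PATH:)

    # Verify a changefeed's state and error message via the CDC CLI,
    # as the traced helper does with `changefeed query -s` + jq.
    check_changefeed_state() {
        local pd=$1 id=$2 expected_state=$3 expected_err=$4
        local info state message
        info=$(cdc cli changefeed query --pd="$pd" -c "$id" -s)
        state=$(echo "$info" | jq -r .state)
        [[ "$state" == "$expected_state" ]] || { echo "state mismatch: $state"; return 1; }
        message=$(echo "$info" | jq -r .error.message)
        [[ "$message" =~ $expected_err ]] || { echo "error mismatch: $message"; return 1; }
        echo "run task successfully"
    }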
check_safepoint_equal http://127.0.0.1:2379 7365374996549856445
+ set +x
[Sun May  5 12:57:04 CST 2024] <<<<<< START kafka consumer in resolve_lock case >>>>>>
go: downloading github.com/pingcap/tidb v1.1.0-beta.0.20240415145106-cd9c676e9ba4
go: downloading github.com/tikv/client-go/v2 v2.0.8-0.20240409022718-714958ccd4d5
go: downloading github.com/pingcap/tidb/pkg/parser v0.0.0-20240410110152-5fc42c9be2f5
go: downloading github.com/pingcap/kvproto v0.0.0-20240227073058-929ab83f9754
go: downloading github.com/pingcap/log v1.1.1-0.20240314023424-862ccc32f18d
go: downloading github.com/pingcap/errors v0.11.5-0.20240318064555-6bd07397691f
go: downloading github.com/tikv/pd/client v0.0.0-20240322051414-fb9e2d561b6e
go: downloading github.com/BurntSushi/toml v1.3.2
go: downloading github.com/pingcap/tidb-tools v0.0.0-20240305021104-9f9bea84490b
go: downloading go.uber.org/zap v1.27.0
go: downloading gopkg.in/natefinch/lumberjack.v2 v2.2.1
[Sun May  5 12:57:03 CST 2024] <<<<<< START cdc server in processor_err_chan case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/processor/ProcessorAddTableError=1*return(true)'
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.processor_err_chan.29512953.out server --log-file /tmp/tidb_cdc_test/processor_err_chan/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/processor_err_chan/cdc_data --cluster-id default --config /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/processor_err_chan/conf/server.toml --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
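(Annotation: a few lines up, the processor_err_chan case starts its CDC server with a failpoint injected through GO_FAILPOINTS, then falls into the same readiness loop; the first probe is refused because the server is still starting. A trimmed sketch of that launch, with the failpoint path, flags and ports copied from the trace; the coverage file name and backgrounding are illustrative, and the --config flag from the real command is omitted here:)

    # Inject a one-shot failpoint into the processor, then start the CDC server.
    WORK_DIR=/tmp/tidb_cdc_test/processor_err_chan
    export GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/processor/ProcessorAddTableError=1*return(true)'
    cdc.test -test.coverprofile="$WORK_DIR/cov.out" server \
        --log-file "$WORK_DIR/cdc.log" --log-level debug \
        --data-dir "$WORK_DIR/cdc_data" --cluster-id default \
        --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379 &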
check_changefeed_state http://127.0.0.1:2379 test failed ErrDispatcherFailed
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=test
+ expected_state=failed
+ error_msg=ErrDispatcherFailed
+ tls_dir=ErrDispatcherFailed
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c test -s
+ info='{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "failed",
  "checkpoint_tso": 449546818063958044,
  "checkpoint_time": "2024-05-05 12:56:57.639",
  "error": {
    "time": "2024-05-05T12:57:01.378970013+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrDispatcherFailed",
    "message": "[CDC:ErrDispatcherFailed]index not found when dispatch event, table: index, index: idx_a"
  }
}'
+ echo '{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "failed",
  "checkpoint_tso": 449546818063958044,
  "checkpoint_time": "2024-05-05 12:56:57.639",
  "error": {
    "time": "2024-05-05T12:57:01.378970013+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrDispatcherFailed",
    "message": "[CDC:ErrDispatcherFailed]index not found when dispatch event, table: index, index: idx_a"
  }
}'
{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "failed",
  "checkpoint_tso": 449546818063958044,
  "checkpoint_time": "2024-05-05 12:56:57.639",
  "error": {
    "time": "2024-05-05T12:57:01.378970013+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrDispatcherFailed",
    "message": "[CDC:ErrDispatcherFailed]index not found when dispatch event, table: index, index: idx_a"
  }
}
++ jq -r .state
++ echo '{' '"upstream_id":' 7365374968019435740, '"namespace":' '"default",' '"id":' '"test",' '"state":' '"failed",' '"checkpoint_tso":' 449546818063958044, '"checkpoint_time":' '"2024-05-05' '12:56:57.639",' '"error":' '{' '"time":' '"2024-05-05T12:57:01.378970013+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrDispatcherFailed",' '"message":' '"[CDC:ErrDispatcherFailed]index' not found when dispatch event, table: index, index: 'idx_a"' '}' '}'
+ state=failed
+ [[ ! failed == \f\a\i\l\e\d ]]
++ jq -r .error.message
++ echo '{' '"upstream_id":' 7365374968019435740, '"namespace":' '"default",' '"id":' '"test",' '"state":' '"failed",' '"checkpoint_tso":' 449546818063958044, '"checkpoint_time":' '"2024-05-05' '12:56:57.639",' '"error":' '{' '"time":' '"2024-05-05T12:57:01.378970013+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrDispatcherFailed",' '"message":' '"[CDC:ErrDispatcherFailed]index' not found when dispatch event, table: index, index: 'idx_a"' '}' '}'
+ message='[CDC:ErrDispatcherFailed]index not found when dispatch event, table: index, index: idx_a'
+ [[ ! [CDC:ErrDispatcherFailed]index not found when dispatch event, table: index, index: idx_a =~ ErrDispatcherFailed ]]
run task successfully
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.mq_sink_dispatcher.cli.14233.out cli changefeed update -c test '--sink-uri=kafka://127.0.0.1:9092/dispatcher-test?protocol=canal-json&enable-tidb-extension=true' --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/mq_sink_dispatcher/conf/new_changefeed.toml --no-confirm
go: downloading github.com/cznic/mathutil v0.0.0-20181122101859-297441e03548
go: downloading golang.org/x/exp v0.0.0-20240409090435-93d18d7e34b8
go: downloading github.com/tikv/client-go/v2 v2.0.8-0.20240409022718-714958ccd4d5
go: downloading go.etcd.io/etcd/client/v3 v3.5.12
go: downloading github.com/influxdata/tdigest v0.0.1
go: downloading github.com/tiancaiamao/gp v0.0.0-20221230034425-4025bc8a4d4a
go: downloading github.com/pingcap/kvproto v0.0.0-20240227073058-929ab83f9754
go: downloading github.com/docker/go-units v0.5.0
go: downloading github.com/prometheus/client_model v0.6.1
go: downloading github.com/pingcap/tipb v0.0.0-20240318032315-55a7867ddd50
go: downloading github.com/pingcap/sysutil v1.0.1-0.20240311050922-ae81ee01f3a5
go: downloading github.com/prometheus/client_golang v1.19.0
go: downloading github.com/opentracing/opentracing-go v1.2.0
go: downloading github.com/ngaut/pools v0.0.0-20180318154953-b7bc8c42aac7
go: downloading github.com/shirou/gopsutil/v3 v3.24.2
go: downloading github.com/spf13/pflag v1.0.5
go: downloading github.com/cockroachdb/errors v1.11.1
go: downloading github.com/grpc-ecosystem/go-grpc-middleware v1.4.0
go: downloading github.com/coocood/freecache v1.2.1
go: downloading github.com/danjacques/gofslock v0.0.0-20240212154529-d899e02bfe22
go: downloading github.com/jellydator/ttlcache/v3 v3.0.1
go: downloading gopkg.in/yaml.v2 v2.4.0
go: downloading github.com/tikv/pd/client v0.0.0-20240322051414-fb9e2d561b6e
go: downloading github.com/uber/jaeger-client-go v2.30.0+incompatible
go: downloading github.com/stretchr/testify v1.9.0
go: downloading github.com/gorilla/mux v1.8.0
go: downloading github.com/scalalang2/golang-fifo v0.1.5
go: downloading cloud.google.com/go/storage v1.39.1
go: downloading github.com/tidwall/btree v1.7.0
go: downloading github.com/Azure/azure-sdk-for-go/sdk/azcore v1.9.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/azidentity v1.5.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/storage/azblob v1.0.0
go: downloading github.com/twmb/murmur3 v1.1.6
go: downloading github.com/opentracing/basictracer-go v1.1.0
go: downloading go.etcd.io/etcd/api/v3 v3.5.12
go: downloading github.com/aliyun/alibaba-cloud-sdk-go v1.61.1581
go: downloading github.com/aws/aws-sdk-go v1.50.0
go: downloading github.com/tikv/pd v1.1.0-beta.0.20240407022249-7179657d129b
go: downloading github.com/google/btree v1.1.2
go: downloading github.com/gogo/protobuf v1.3.2
go: downloading github.com/go-resty/resty/v2 v2.11.0
go: downloading github.com/klauspost/compress v1.17.8
go: downloading golang.org/x/tools v0.20.0
go: downloading github.com/ks3sdklib/aws-sdk-go v1.2.9
go: downloading cloud.google.com/go v0.112.2
go: downloading golang.org/x/oauth2 v0.18.0
go: downloading google.golang.org/api v0.170.0
go: downloading go.uber.org/mock v0.4.0
go: downloading github.com/yangkeao/ldap/v3 v3.4.5-0.20230421065457-369a3bab1117
go: downloading github.com/ngaut/sync2 v0.0.0-20141008032647-7a24ed77b2ef
go: downloading github.com/cockroachdb/pebble v1.1.0
go: downloading github.com/jfcg/sorty/v2 v2.1.0
go: downloading github.com/cespare/xxhash/v2 v2.3.0
go: downloading github.com/joho/sqltocsv v0.0.0-20210428211105-a6d6801d59df
go: downloading github.com/dgraph-io/ristretto v0.1.1
go: downloading github.com/carlmjohnson/flagext v0.21.0
go: downloading github.com/jedib0t/go-pretty/v6 v6.2.2
go: downloading github.com/dolthub/swiss v0.2.1
go: downloading github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec
go: downloading github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc
go: downloading github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2
go: downloading gopkg.in/yaml.v3 v3.0.1
go: downloading github.com/golang/snappy v0.0.4
go: downloading github.com/lestrrat-go/jwx/v2 v2.0.21
go: downloading go.etcd.io/etcd/client/pkg/v3 v3.5.12
go: downloading github.com/Azure/azure-sdk-for-go/sdk/internal v1.5.1
go: downloading github.com/AzureAD/microsoft-authentication-library-for-go v1.2.1
go: downloading golang.org/x/crypto v0.22.0
go: downloading github.com/beorn7/perks v1.0.1
go: downloading github.com/prometheus/common v0.52.2
go: downloading github.com/prometheus/procfs v0.13.0
go: downloading github.com/cockroachdb/logtags v0.0.0-20230118201751-21c54148d20b
go: downloading github.com/cockroachdb/redact v1.1.5
go: downloading github.com/getsentry/sentry-go v0.27.0
go: downloading github.com/pkg/errors v0.9.1
go: downloading github.com/uber/jaeger-lib v2.4.1+incompatible
go: downloading github.com/cloudfoundry/gosigar v1.3.6
go: downloading github.com/dgryski/go-farm v0.0.0-20200201041132-a6ae2369ad13
go: downloading github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2
go: downloading github.com/otiai10/copy v1.2.0
go: downloading github.com/spkg/bom v1.0.0
go: downloading github.com/xitongsys/parquet-go v1.6.0
go: downloading go.uber.org/atomic v1.11.0
go: downloading github.com/tikv/pd v1.1.0-beta.0.20240407022249-7179657d129b
go: downloading go.uber.org/multierr v1.11.0
go: downloading github.com/pingcap/failpoint v0.0.0-20220801062533-2eaa32854a6c
go: downloading google.golang.org/grpc v1.62.1
go: downloading github.com/coreos/go-semver v0.3.1
go: downloading github.com/go-sql-driver/mysql v1.7.1
go: downloading github.com/google/uuid v1.6.0
go: downloading github.com/opentracing/opentracing-go v1.2.0
go: downloading github.com/tiancaiamao/gp v0.0.0-20221230034425-4025bc8a4d4a
go: downloading github.com/pkg/errors v0.9.1
go: downloading go.etcd.io/etcd/client/v3 v3.5.12
go: downloading go.etcd.io/etcd/api/v3 v3.5.12
go: downloading github.com/grpc-ecosystem/go-grpc-middleware v1.4.0
go: downloading github.com/prometheus/client_golang v1.19.0
go: downloading golang.org/x/sync v0.7.0
go: downloading github.com/cznic/mathutil v0.0.0-20181122101859-297441e03548
go: downloading github.com/twmb/murmur3 v1.1.6
go: downloading github.com/dgryski/go-farm v0.0.0-20200201041132-a6ae2369ad13
go: downloading github.com/docker/go-units v0.5.0
go: downloading github.com/golang/protobuf v1.5.4
go: downloading github.com/prometheus/client_model v0.6.1
go: downloading github.com/gogo/protobuf v1.3.2
go: downloading github.com/google/btree v1.1.2
go: downloading google.golang.org/protobuf v1.33.0
go: downloading github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec
go: downloading go.etcd.io/etcd/client/pkg/v3 v3.5.12
go: downloading google.golang.org/genproto/googleapis/api v0.0.0-20240401170217-c3f982113cda
go: downloading google.golang.org/genproto v0.0.0-20240401170217-c3f982113cda
Diff of changefeed config:
{Type:update Path:[Config SyncPointInterval] From:<nil> To:0xc00165fa08}
{Type:update Path:[Config SyncPointRetention] From:<nil> To:0xc00165fa18}
{Type:update Path:[Config Sink DispatchRules 0 Matcher 0] From:verify.t To:dispatcher.index}
{Type:delete Path:[Config Sink DispatchRules 1 Matcher 0] From:dispatcher.index To:<nil>}
{Type:delete Path:[Config Sink DispatchRules 1 PartitionRule] From:index-value To:<nil>}
{Type:delete Path:[Config Sink DispatchRules 1 IndexName] From:idx_a To:<nil>}
{Type:update Path:[Config Consistent] From:<nil> To:0xc001427340}
go: downloading github.com/jfcg/sixb v1.3.8
go: downloading github.com/Azure/go-ntlmssp v0.0.0-20221128193559-754e69321358
go: downloading github.com/go-asn1-ber/asn1-ber v1.5.4
go: downloading github.com/tklauser/go-sysconf v0.3.12
go: downloading github.com/cheggaaa/pb/v3 v3.0.8
go: downloading github.com/google/pprof v0.0.0-20240117000934-35fc243c5815
go: downloading github.com/wangjohn/quickselect v0.0.0-20161129230411-ed8402a42d5f
go: downloading google.golang.org/genproto/googleapis/api v0.0.0-20240401170217-c3f982113cda
go: downloading github.com/dolthub/maphash v0.1.0
go: downloading github.com/robfig/cron/v3 v3.0.1
go: downloading cloud.google.com/go/compute/metadata v0.2.3
go: downloading cloud.google.com/go/iam v1.1.7
go: downloading github.com/googleapis/gax-go/v2 v2.12.3
go: downloading cloud.google.com/go/compute v1.25.1
go: downloading github.com/robfig/cron v1.2.0
go: downloading github.com/kr/pretty v0.3.1
go: downloading github.com/coreos/go-systemd/v22 v22.5.0
go: downloading github.com/pingcap/goleveldb v0.0.0-20191226122134-f82aafb29989
go: downloading github.com/pingcap/badger v1.5.1-0.20230103063557-828f39b09b6d
go: downloading github.com/mattn/go-runewidth v0.0.15
go: downloading github.com/kylelemons/godebug v1.1.0
go: downloading github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c
go: downloading github.com/VividCortex/ewma v1.2.0
go: downloading github.com/fatih/color v1.16.0
go: downloading github.com/mattn/go-colorable v0.1.13
go: downloading github.com/mattn/go-isatty v0.0.20
go: downloading go.opencensus.io v0.23.1-0.20220331163232-052120675fac
go: downloading go.opentelemetry.io/otel v1.24.0
go: downloading go.opentelemetry.io/otel/trace v1.24.0
go: downloading github.com/apache/thrift v0.16.0
go: downloading github.com/tklauser/numcpus v0.6.1
go: downloading github.com/kr/text v0.2.0
go: downloading github.com/rogpeppe/go-internal v1.12.0
go: downloading github.com/lestrrat-go/blackmagic v1.0.2
go: downloading github.com/lestrrat-go/httprc v1.0.5
go: downloading github.com/lestrrat-go/iter v1.0.2
go: downloading github.com/lestrrat-go/option v1.0.1
go: downloading github.com/dustin/go-humanize v1.0.1
go: downloading github.com/golang/glog v1.2.0
go: downloading github.com/golang-jwt/jwt/v5 v5.2.0
go: downloading github.com/rivo/uniseg v0.4.7
go: downloading github.com/lestrrat-go/httpcc v1.0.1
go: downloading github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da
go: downloading github.com/ncw/directio v1.0.5
go: downloading github.com/coocood/bbloom v0.0.0-20190830030839-58deb6228d64
go: downloading github.com/coocood/rtutil v0.0.0-20190304133409-c84515f646f2
go: downloading github.com/klauspost/cpuid v1.3.1
go: downloading github.com/golang-jwt/jwt v3.2.2+incompatible
go: downloading github.com/beorn7/perks v1.0.1
go: downloading github.com/cespare/xxhash/v2 v2.3.0
go: downloading github.com/prometheus/common v0.52.2
go: downloading github.com/prometheus/procfs v0.13.0
go: downloading github.com/coreos/go-systemd/v22 v22.5.0
go: downloading github.com/cloudfoundry/gosigar v1.3.6
go: downloading golang.org/x/exp v0.0.0-20240409090435-93d18d7e34b8
go: downloading golang.org/x/net v0.24.0
go: downloading google.golang.org/genproto/googleapis/rpc v0.0.0-20240401170217-c3f982113cda
go: downloading golang.org/x/sys v0.19.0
Update changefeed config successfully! 
ID: test
Info: {"upstream_id":7365374968019435740,"namespace":"default","id":"test","sink_uri":"kafka://127.0.0.1:9092/dispatcher-test?protocol=canal-json\u0026enable-tidb-extension=true","create_time":"2024-05-05T12:56:55.835235718+08:00","start_ts":449546817094025219,"admin_job_type":1,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_table_monitor":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","dispatchers":[{"matcher":["dispatcher.index"],"partition":"index-value"}],"encoder_concurrency":32,"terminator":"\r\n","enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"failed","error":{"addr":"127.0.0.1:8300","code":"CDC:ErrDispatcherFailed","message":"[CDC:ErrDispatcherFailed]index not found when dispatch event, table: index, index: idx_a"},"creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":0,"checkpoint_ts":449546818063958044,"checkpoint_time":"2024-05-05 12:56:57.639"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
go: downloading github.com/go-logr/logr v1.4.1
go: downloading go.opentelemetry.io/otel/metric v1.24.0
go: downloading github.com/go-logr/stdr v1.2.2
go: downloading golang.org/x/text v0.14.0
go: downloading github.com/DataDog/zstd v1.5.5
go: downloading github.com/cockroachdb/tokenbucket v0.0.0-20230807174530-cc333fc44b06
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.mq_sink_dispatcher.cli.14283.out cli changefeed resume -c test
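(Annotation: the mq_sink_dispatcher case above walks a recovery path: the changefeed fails with ErrDispatcherFailed, its dispatcher rules are replaced from new_changefeed.toml, and the feed is resumed. The two CLI steps, condensed from the trace; the test actually runs them through cdc.test with coverage profiling, plain `cdc cli` is shown here for brevity:)

    # Replace the bad dispatcher rule of the failed changefeed, then resume it.
    CONF=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/mq_sink_dispatcher/conf/new_changefeed.toml
    cdc cli changefeed update -c test \
        '--sink-uri=kafka://127.0.0.1:9092/dispatcher-test?protocol=canal-json&enable-tidb-extension=true' \
        --config="$CONF" --no-confirm
    cdc cli changefeed resume -c test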
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:57:06 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4b87e1dc-662a-4067-85a3-0cc9b25ff559
	{"id":"4b87e1dc-662a-4067-85a3-0cc9b25ff559","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885024}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7ca080
	4b87e1dc-662a-4067-85a3-0cc9b25ff559

/tidb/cdc/default/default/upstream/7365375007752655348
	{"id":7365375007752655348,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4b87e1dc-662a-4067-85a3-0cc9b25ff559
	{"id":"4b87e1dc-662a-4067-85a3-0cc9b25ff559","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885024}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7ca080
	4b87e1dc-662a-4067-85a3-0cc9b25ff559

/tidb/cdc/default/default/upstream/7365375007752655348
	{"id":7365375007752655348,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4b87e1dc-662a-4067-85a3-0cc9b25ff559
	{"id":"4b87e1dc-662a-4067-85a3-0cc9b25ff559","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885024}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471c7ca080
	4b87e1dc-662a-4067-85a3-0cc9b25ff559

/tidb/cdc/default/default/upstream/7365375007752655348
	{"id":7365375007752655348,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
[Sun May  5 12:57:07 CST 2024] <<<<<< START kafka consumer in processor_err_chan case >>>>>>
check_changefeed_state http://127.0.0.1:2379 e35394f5-610d-40d4-89ac-3c647772eb67 normal null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=e35394f5-610d-40d4-89ac-3c647772eb67
+ expected_state=normal
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c e35394f5-610d-40d4-89ac-3c647772eb67 -s
+ info='{
  "upstream_id": 7365375007752655348,
  "namespace": "default",
  "id": "e35394f5-610d-40d4-89ac-3c647772eb67",
  "state": "normal",
  "checkpoint_tso": 449546820480139267,
  "checkpoint_time": "2024-05-05 12:57:06.856",
  "error": null
}'
+ echo '{
  "upstream_id": 7365375007752655348,
  "namespace": "default",
  "id": "e35394f5-610d-40d4-89ac-3c647772eb67",
  "state": "normal",
  "checkpoint_tso": 449546820480139267,
  "checkpoint_time": "2024-05-05 12:57:06.856",
  "error": null
}'
{
  "upstream_id": 7365375007752655348,
  "namespace": "default",
  "id": "e35394f5-610d-40d4-89ac-3c647772eb67",
  "state": "normal",
  "checkpoint_tso": 449546820480139267,
  "checkpoint_time": "2024-05-05 12:57:06.856",
  "error": null
}
++ echo '{' '"upstream_id":' 7365375007752655348, '"namespace":' '"default",' '"id":' '"e35394f5-610d-40d4-89ac-3c647772eb67",' '"state":' '"normal",' '"checkpoint_tso":' 449546820480139267, '"checkpoint_time":' '"2024-05-05' '12:57:06.856",' '"error":' null '}'
++ jq -r .state
PASS
go: downloading github.com/pingcap/tipb v0.0.0-20240318032315-55a7867ddd50
go: downloading github.com/coocood/freecache v1.2.1
go: downloading github.com/opentracing/basictracer-go v1.1.0
go: downloading github.com/pingcap/sysutil v1.0.1-0.20240311050922-ae81ee01f3a5
go: downloading github.com/shirou/gopsutil/v3 v3.24.2
go: downloading github.com/danjacques/gofslock v0.0.0-20240212154529-d899e02bfe22
go: downloading github.com/cockroachdb/errors v1.11.1
go: downloading github.com/uber/jaeger-client-go v2.30.0+incompatible
go: downloading github.com/spf13/pflag v1.0.5
go: downloading github.com/dgraph-io/ristretto v0.1.1
go: downloading github.com/jellydator/ttlcache/v3 v3.0.1
go: downloading gopkg.in/yaml.v2 v2.4.0
go: downloading github.com/gorilla/mux v1.8.0
go: downloading github.com/ngaut/pools v0.0.0-20180318154953-b7bc8c42aac7
go: downloading github.com/influxdata/tdigest v0.0.1
go: downloading github.com/yangkeao/ldap/v3 v3.4.5-0.20230421065457-369a3bab1117
go: downloading github.com/dolthub/swiss v0.2.1
go: downloading cloud.google.com/go/storage v1.39.1
go: downloading github.com/golang/snappy v0.0.4
go: downloading golang.org/x/tools v0.20.0
go: downloading github.com/scalalang2/golang-fifo v0.1.5
go: downloading github.com/aws/aws-sdk-go v1.50.0
go: downloading github.com/Azure/azure-sdk-for-go/sdk/azidentity v1.5.1
go: downloading github.com/otiai10/copy v1.2.0
go: downloading github.com/tidwall/btree v1.7.0
go: downloading go.uber.org/mock v0.4.0
go: downloading github.com/Azure/azure-sdk-for-go/sdk/azcore v1.9.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/storage/azblob v1.0.0
go: downloading github.com/stretchr/testify v1.9.0
go: downloading github.com/cockroachdb/pebble v1.1.0
go: downloading github.com/joho/sqltocsv v0.0.0-20210428211105-a6d6801d59df
go: downloading github.com/aliyun/alibaba-cloud-sdk-go v1.61.1581
go: downloading github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2
go: downloading github.com/klauspost/compress v1.17.8
go: downloading github.com/carlmjohnson/flagext v0.21.0
go: downloading github.com/jedib0t/go-pretty/v6 v6.2.2
go: downloading github.com/spkg/bom v1.0.0
go: downloading golang.org/x/time v0.5.0
go: downloading github.com/go-resty/resty/v2 v2.11.0
go: downloading github.com/ks3sdklib/aws-sdk-go v1.2.9
go: downloading golang.org/x/oauth2 v0.18.0
go: downloading google.golang.org/api v0.170.0
go: downloading github.com/lestrrat-go/jwx/v2 v2.0.21
go: downloading github.com/jfcg/sorty/v2 v2.1.0
go: downloading github.com/xitongsys/parquet-go v1.6.0
go: downloading github.com/cheggaaa/pb/v3 v3.0.8
go: downloading github.com/pingcap/badger v1.5.1-0.20230103063557-828f39b09b6d
go: downloading github.com/sourcegraph/appdash v0.0.0-20190731080439-ebfcffb1b5c0
go: downloading github.com/fatih/color v1.16.0
go: downloading github.com/vbauerster/mpb/v7 v7.5.3
go: downloading golang.org/x/term v0.19.0
go: downloading github.com/pingcap/goleveldb v0.0.0-20191226122134-f82aafb29989
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365375007752655348, '"namespace":' '"default",' '"id":' '"e35394f5-610d-40d4-89ac-3c647772eb67",' '"state":' '"normal",' '"checkpoint_tso":' 449546820480139267, '"checkpoint_time":' '"2024-05-05' '12:57:06.856",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
run task successfully
coverage: 2.1% of statements in github.com/pingcap/tiflow/...
go: downloading github.com/ngaut/sync2 v0.0.0-20141008032647-7a24ed77b2ef
go: downloading github.com/spf13/cobra v1.8.0
go: downloading github.com/dolthub/maphash v0.1.0
go: downloading github.com/Azure/go-ntlmssp v0.0.0-20221128193559-754e69321358
go: downloading github.com/go-asn1-ber/asn1-ber v1.5.4
go: downloading cloud.google.com/go v0.112.2
go: downloading github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc
go: downloading github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2
go: downloading gopkg.in/yaml.v3 v3.0.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/internal v1.5.1
go: downloading github.com/AzureAD/microsoft-authentication-library-for-go v1.2.1
go: downloading golang.org/x/crypto v0.22.0
go: downloading github.com/jfcg/sixb v1.3.8
go: downloading github.com/VividCortex/ewma v1.2.0
go: downloading github.com/mattn/go-colorable v0.1.13
go: downloading github.com/mattn/go-isatty v0.0.20
go: downloading github.com/mattn/go-runewidth v0.0.15
go: downloading github.com/google/pprof v0.0.0-20240117000934-35fc243c5815
go: downloading github.com/wangjohn/quickselect v0.0.0-20161129230411-ed8402a42d5f
go: downloading github.com/robfig/cron/v3 v3.0.1
go: downloading github.com/json-iterator/go v1.1.12
go: downloading cloud.google.com/go/compute/metadata v0.2.3
go: downloading github.com/Masterminds/semver v1.5.0
go: downloading k8s.io/api v0.28.6
go: downloading github.com/emirpasic/gods v1.18.1
go: downloading github.com/apache/thrift v0.16.0
go: downloading cloud.google.com/go/compute v1.25.1
go: downloading github.com/robfig/cron v1.2.0
go: downloading github.com/acarl005/stripansi v0.0.0-20180116102854-5a71ef0e047d
go: downloading github.com/uber/jaeger-lib v2.4.1+incompatible
go: downloading github.com/cockroachdb/logtags v0.0.0-20230118201751-21c54148d20b
go: downloading github.com/cockroachdb/redact v1.1.5
go: downloading github.com/getsentry/sentry-go v0.27.0
go: downloading github.com/tklauser/go-sysconf v0.3.12
go: downloading github.com/dustin/go-humanize v1.0.1
go: downloading github.com/golang/glog v1.2.0
go: downloading github.com/rivo/uniseg v0.4.7
go: downloading github.com/lestrrat-go/blackmagic v1.0.2
go: downloading github.com/lestrrat-go/httprc v1.0.5
go: downloading github.com/lestrrat-go/iter v1.0.2
go: downloading github.com/lestrrat-go/option v1.0.1
go: downloading github.com/kr/pretty v0.3.1
go: downloading github.com/lestrrat-go/httpcc v1.0.1
go: downloading github.com/kylelemons/godebug v1.1.0
go: downloading github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c
go: downloading github.com/rogpeppe/go-internal v1.12.0
go: downloading github.com/kr/text v0.2.0
go: downloading github.com/golang-jwt/jwt/v5 v5.2.0
go: downloading github.com/tklauser/numcpus v0.6.1
go: downloading github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd
go: downloading github.com/modern-go/reflect2 v1.0.2
go: downloading cloud.google.com/go/iam v1.1.7
go: downloading github.com/googleapis/gax-go/v2 v2.12.3
go: downloading go.opencensus.io v0.23.1-0.20220331163232-052120675fac
go: downloading go.opentelemetry.io/otel v1.24.0
go: downloading github.com/golang-jwt/jwt v3.2.2+incompatible
go: downloading go.opentelemetry.io/otel/trace v1.24.0
go: downloading github.com/ncw/directio v1.0.5
go: downloading github.com/coocood/rtutil v0.0.0-20190304133409-c84515f646f2
go: downloading github.com/coocood/bbloom v0.0.0-20190830030839-58deb6228d64
go: downloading github.com/klauspost/cpuid v1.3.1
run task successfully
check diff failed 1-th time, retry later
go: downloading github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da
go: downloading github.com/go-logr/logr v1.4.1
go: downloading go.opentelemetry.io/otel/metric v1.24.0
go: downloading github.com/go-logr/stdr v1.2.2
check_changefeed_state http://127.0.0.1:2379 ecb9af18-3e60-45b0-a052-3f7dd963394c normal null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=ecb9af18-3e60-45b0-a052-3f7dd963394c
+ expected_state=normal
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c ecb9af18-3e60-45b0-a052-3f7dd963394c -s
go: downloading github.com/DataDog/zstd v1.5.5
go: downloading github.com/cockroachdb/tokenbucket v0.0.0-20230807174530-cc333fc44b06
go: downloading k8s.io/apimachinery v0.28.6
table charset_gbk_test0.t0 exists
table charset_gbk_test0.t1 exists
table charset_gbk_test1.t0 not exists for 1-th check, retry later
+ info='{
  "upstream_id": 7365374996549856445,
  "namespace": "default",
  "id": "ecb9af18-3e60-45b0-a052-3f7dd963394c",
  "state": "normal",
  "checkpoint_tso": 449546819654647810,
  "checkpoint_time": "2024-05-05 12:57:03.707",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374996549856445,
  "namespace": "default",
  "id": "ecb9af18-3e60-45b0-a052-3f7dd963394c",
  "state": "normal",
  "checkpoint_tso": 449546819654647810,
  "checkpoint_time": "2024-05-05 12:57:03.707",
  "error": null
}'
{
  "upstream_id": 7365374996549856445,
  "namespace": "default",
  "id": "ecb9af18-3e60-45b0-a052-3f7dd963394c",
  "state": "normal",
  "checkpoint_tso": 449546819654647810,
  "checkpoint_time": "2024-05-05 12:57:03.707",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374996549856445, '"namespace":' '"default",' '"id":' '"ecb9af18-3e60-45b0-a052-3f7dd963394c",' '"state":' '"normal",' '"checkpoint_tso":' 449546819654647810, '"checkpoint_time":' '"2024-05-05' '12:57:03.707",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365374996549856445, '"namespace":' '"default",' '"id":' '"ecb9af18-3e60-45b0-a052-3f7dd963394c",' '"state":' '"normal",' '"checkpoint_tso":' 449546819654647810, '"checkpoint_time":' '"2024-05-05' '12:57:03.707",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
run task successfully
check_safepoint_forward http://127.0.0.1:2379 7365374996549856445 449546819654647809 449546819654647810
go: downloading gopkg.in/inf.v0 v0.9.1
go: downloading k8s.io/klog/v2 v2.120.1
go: downloading github.com/google/gofuzz v1.2.0
go: downloading sigs.k8s.io/structured-merge-diff/v4 v4.4.1
go: downloading sigs.k8s.io/json v0.0.0-20221116044647-bc3834ca7abd
go: downloading k8s.io/utils v0.0.0-20230726121419-3b25d923346b
+ set +x
check_changefeed_state http://127.0.0.1:2379 test normal null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=test
+ expected_state=normal
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c test -s
+ info='{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "normal",
  "checkpoint_tso": 449546818063958044,
  "checkpoint_time": "2024-05-05 12:56:57.639",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "normal",
  "checkpoint_tso": 449546818063958044,
  "checkpoint_time": "2024-05-05 12:56:57.639",
  "error": null
}'
{
  "upstream_id": 7365374968019435740,
  "namespace": "default",
  "id": "test",
  "state": "normal",
  "checkpoint_tso": 449546818063958044,
  "checkpoint_time": "2024-05-05 12:56:57.639",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374968019435740, '"namespace":' '"default",' '"id":' '"test",' '"state":' '"normal",' '"checkpoint_tso":' 449546818063958044, '"checkpoint_time":' '"2024-05-05' '12:56:57.639",' '"error":' null '}'
++ jq -r .state
go: downloading go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.49.0
go: downloading github.com/googleapis/enterprise-certificate-proxy v0.3.2
go: downloading github.com/google/s2a-go v0.1.7
go: downloading go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc v0.49.0
go: downloading github.com/felixge/httpsnoop v1.0.4
go: downloading github.com/jmespath/go-jmespath v0.4.0
table charset_gbk_test1.t0 exists
table test.finish_mark not exists for 1-th check, retry later
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365374968019435740, '"namespace":' '"default",' '"id":' '"test",' '"state":' '"normal",' '"checkpoint_tso":' 449546818063958044, '"checkpoint_time":' '"2024-05-05' '12:56:57.639",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
run task successfully
run task successfully
table test.finish_mark not exists for 1-th check, retry later
check_changefeed_state http://127.0.0.1:2379 ecb9af18-3e60-45b0-a052-3f7dd963394c stopped null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=ecb9af18-3e60-45b0-a052-3f7dd963394c
+ expected_state=stopped
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c ecb9af18-3e60-45b0-a052-3f7dd963394c -s
go: downloading github.com/modern-go/reflect2 v1.0.2
go: downloading github.com/json-iterator/go v1.1.12
go: downloading github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd
+ info='{
  "upstream_id": 7365374996549856445,
  "namespace": "default",
  "id": "ecb9af18-3e60-45b0-a052-3f7dd963394c",
  "state": "stopped",
  "checkpoint_tso": 449546819654647810,
  "checkpoint_time": "2024-05-05 12:57:03.707",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374996549856445,
  "namespace": "default",
  "id": "ecb9af18-3e60-45b0-a052-3f7dd963394c",
  "state": "stopped",
  "checkpoint_tso": 449546819654647810,
  "checkpoint_time": "2024-05-05 12:57:03.707",
  "error": null
}'
{
  "upstream_id": 7365374996549856445,
  "namespace": "default",
  "id": "ecb9af18-3e60-45b0-a052-3f7dd963394c",
  "state": "stopped",
  "checkpoint_tso": 449546819654647810,
  "checkpoint_time": "2024-05-05 12:57:03.707",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374996549856445, '"namespace":' '"default",' '"id":' '"ecb9af18-3e60-45b0-a052-3f7dd963394c",' '"state":' '"stopped",' '"checkpoint_tso":' 449546819654647810, '"checkpoint_time":' '"2024-05-05' '12:57:03.707",' '"error":' null '}'
++ jq -r .state
+ state=stopped
+ [[ ! stopped == \s\t\o\p\p\e\d ]]
++ echo '{' '"upstream_id":' 7365374996549856445, '"namespace":' '"default",' '"id":' '"ecb9af18-3e60-45b0-a052-3f7dd963394c",' '"state":' '"stopped",' '"checkpoint_tso":' 449546819654647810, '"checkpoint_time":' '"2024-05-05' '12:57:03.707",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
run task successfully
check_changefeed_state http://127.0.0.1:2379 12680d01-2340-4c9c-8b65-1d6c9518bf46 normal null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=12680d01-2340-4c9c-8b65-1d6c9518bf46
+ expected_state=normal
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c 12680d01-2340-4c9c-8b65-1d6c9518bf46 -s
+ info='{
  "upstream_id": 7365374996549856445,
  "namespace": "default",
  "id": "12680d01-2340-4c9c-8b65-1d6c9518bf46",
  "state": "normal",
  "checkpoint_tso": 449546821227249665,
  "checkpoint_time": "2024-05-05 12:57:09.706",
  "error": null
}'
+ echo '{
  "upstream_id": 7365374996549856445,
  "namespace": "default",
  "id": "12680d01-2340-4c9c-8b65-1d6c9518bf46",
  "state": "normal",
  "checkpoint_tso": 449546821227249665,
  "checkpoint_time": "2024-05-05 12:57:09.706",
  "error": null
}'
{
  "upstream_id": 7365374996549856445,
  "namespace": "default",
  "id": "12680d01-2340-4c9c-8b65-1d6c9518bf46",
  "state": "normal",
  "checkpoint_tso": 449546821227249665,
  "checkpoint_time": "2024-05-05 12:57:09.706",
  "error": null
}
++ echo '{' '"upstream_id":' 7365374996549856445, '"namespace":' '"default",' '"id":' '"12680d01-2340-4c9c-8b65-1d6c9518bf46",' '"state":' '"normal",' '"checkpoint_tso":' 449546821227249665, '"checkpoint_time":' '"2024-05-05' '12:57:09.706",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365374996549856445, '"namespace":' '"default",' '"id":' '"12680d01-2340-4c9c-8b65-1d6c9518bf46",' '"state":' '"normal",' '"checkpoint_tso":' 449546821227249665, '"checkpoint_time":' '"2024-05-05' '12:57:09.706",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
run task successfully
check_safepoint_equal http://127.0.0.1:2379 7365374996549856445
check diff failed 2-th time, retry later
table foreign_key.finish_mark not exists for 1-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_basic_avro/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
table test.finish_mark not exists for 2-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
go: downloading github.com/google/s2a-go v0.1.7
go: downloading go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.49.0
go: downloading github.com/googleapis/enterprise-certificate-proxy v0.3.2
go: downloading go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc v0.49.0
go: downloading github.com/felixge/httpsnoop v1.0.4
go: downloading github.com/jmespath/go-jmespath v0.4.0
table row_format.finish_mark not exists for 1-th check, retry later
check diff successfully
wait process cdc.test exit for 1-th time...
table foreign_key.finish_mark not exists for 2-th check, retry later
wait process cdc.test exit for 2-th time...
table row_format.finish_mark not exists for 2-th check, retry later
table test.finish_mark exists
check diff successfully
run task successfully
table test.finish_mark not exists for 3-th check, retry later
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:57:13 CST 2024] <<<<<< run test case processor_err_chan success! >>>>>>
Changefeed remove successfully.
ID: ecb9af18-3e60-45b0-a052-3f7dd963394c
CheckpointTs: 449546819654647810
SinkURI: kafka://127.0.0.1:9092/ticdc-gc-safepoint-10045?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760
check_safepoint_forward http://127.0.0.1:2379 7365374996549856445 449546821227249664 449546821227249665 449546819654647810
start tidb cluster in /tmp/tidb_cdc_test/kafka_simple_basic_avro
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
cdc.test: no process found
wait process cdc.test exit for 1-th time...
process cdc.test already exit
[Sun May  5 12:57:13 CST 2024] <<<<<< run test case mq_sink_dispatcher success! >>>>>>
run task successfully
table foreign_key.finish_mark not exists for 3-th check, retry later
table row_format.finish_mark not exists for 3-th check, retry later
Changefeed remove successfully.
ID: 12680d01-2340-4c9c-8b65-1d6c9518bf46
CheckpointTs: 449546822275825667
SinkURI: kafka://127.0.0.1:9092/ticdc-gc-safepoint-10045?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760
check_safepoint_cleared http://127.0.0.1:2379 7365374996549856445
run task successfully
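(Annotation: check_safepoint_forward / check_safepoint_equal / check_safepoint_cleared are not expanded in this trace. They compare the service GC safepoint TiCDC registers in PD against the changefeed checkpoint TSOs passed as arguments. One way to read that value, assuming pd-ctl is available; the real helpers may query PD differently, and the jq path is an assumption about the output shape:)

    # List PD's service GC safepoints and pick the ticdc entry.
    pd-ctl -u http://127.0.0.1:2379 service-gc-safepoint | \
        jq -r '.service_gc_safe_points[] | select(.service_id | test("ticdc")) | .safe_point'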
table test.finish_mark not exists for 4-th check, retry later
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
table foreign_key.finish_mark not exists for 4-th check, retry later
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:57:16 CST 2024] <<<<<< run test case gc_safepoint success! >>>>>>
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
table test.finish_mark exists
check table exists success
check diff successfully
table row_format.finish_mark not exists for 4-th check, retry later
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
table foreign_key.finish_mark not exists for 5-th check, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:57:18 CST 2024] <<<<<< run test case charset_gbk success! >>>>>>
table row_format.finish_mark not exists for 5-th check, retry later
table foreign_key.finish_mark not exists for 6-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table row_format.finish_mark not exists for 6-th check, retry later
table foreign_key.finish_mark not exists for 7-th check, retry later
table row_format.finish_mark not exists for 7-th check, retry later
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 26.53 secs (140331721 bytes/sec)
[Pipeline] {
[Pipeline] cache
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table foreign_key.finish_mark not exists for 8-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7497c80013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:8775, start at 2024-05-05 12:57:23.725717568 +0800 CST m=+5.092606885	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:23.732 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:23.698 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:23.698 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7497c80013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:8775, start at 2024-05-05 12:57:23.725717568 +0800 CST m=+5.092606885	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:23.732 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:23.698 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:23.698 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7498580015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:8858, start at 2024-05-05 12:57:23.755588399 +0800 CST m=+5.073220493	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:23.763 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:23.734 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:23.734 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
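The repeated ERROR 2003 lines and the VARIABLE_NAME / VARIABLE_VALUE dumps above are the TiDB readiness check: the harness retries a MySQL connection until the server accepts it, then dumps the bootstrap and GC variables from the mysql.tidb table. A rough equivalent, assuming the standard mysql client and the ports used throughout this log:
  # sketch: wait for TiDB to accept connections, then dump the variables shown above
  for i in $(seq 1 60); do
    mysql -h 127.0.0.1 -P 4000 -u root -e 'SELECT 1' >/dev/null 2>&1 && break
    sleep 1   # each failed attempt is one "ERROR 2003 (HY000): Can't connect ..." line
  done
  mysql -h 127.0.0.1 -P 4000 -u root -e \
    'SELECT VARIABLE_NAME, VARIABLE_VALUE, COMMENT FROM mysql.tidb;'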
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kafka_simple_basic_avro/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kafka_simple_basic_avro/tiflash/log/error.log
arg matches is ArgMatches { args: {"pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kafka_simple_basic_avro/tiflash/db/proxy"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kafka_simple_basic_avro/tiflash-proxy.toml"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kafka_simple_basic_avro/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table row_format.finish_mark not exists for 8-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/changefeed_reconstruct/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/mq_sink_dispatcher/run.sh: line 1: 14355 Killed                  cdc_kafka_consumer --upstream-uri $SINK_URI --downstream-uri="mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" --upstream-tidb-dsn="root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" --config="$CUR/conf/new_changefeed.toml" 2>&1
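The "Killed" line above is mq_sink_dispatcher's cleanup stopping its cdc_kafka_consumer; in these tests the consumer reads the changefeed's Kafka topic and applies the events to the downstream MySQL endpoint so the later diff check can compare both sides. Reconstructed from that line (environment variable values here are illustrative, not the actual ones):
  # sketch: how the consumer is launched by the test scripts (values illustrative)
  SINK_URI='kafka://127.0.0.1:9092/ticdc-dispatcher-test?protocol=canal-json'   # hypothetical topic
  cdc_kafka_consumer --upstream-uri "$SINK_URI" \
    --downstream-uri "mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" \
    --upstream-tidb-dsn "root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" \
    --config "$CUR/conf/new_changefeed.toml" 2>&1 &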
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_column_selector/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
table foreign_key.finish_mark exists
check diff successfully
[Sun May  5 12:57:26 CST 2024] <<<<<< START cdc server in kafka_simple_basic_avro case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_basic_avro.1019510197.out server --log-file /tmp/tidb_cdc_test/kafka_simple_basic_avro/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kafka_simple_basic_avro/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
wait process cdc.test exit for 1-th time...
table row_format.finish_mark not exists for 9-th check, retry later
wait process cdc.test exit for 2-th time...
start tidb cluster in /tmp/tidb_cdc_test/changefeed_reconstruct
Starting Upstream PD...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:57:28 CST 2024] <<<<<< run test case foreign_key success! >>>>>>
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
start tidb cluster in /tmp/tidb_cdc_test/kafka_column_selector
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table row_format.finish_mark not exists for 10-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:57:30 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/6bfc5b43-6665-44fe-b649-212f8e94dadd
	{"id":"6bfc5b43-6665-44fe-b649-212f8e94dadd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885047}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471cfe46c8
	6bfc5b43-6665-44fe-b649-212f8e94dadd

/tidb/cdc/default/default/upstream/7365375144998087327
	{"id":7365375144998087327,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/6bfc5b43-6665-44fe-b649-212f8e94dadd
	{"id":"6bfc5b43-6665-44fe-b649-212f8e94dadd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885047}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471cfe46c8
	6bfc5b43-6665-44fe-b649-212f8e94dadd

/tidb/cdc/default/default/upstream/7365375144998087327
	{"id":7365375144998087327,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/6bfc5b43-6665-44fe-b649-212f8e94dadd
	{"id":"6bfc5b43-6665-44fe-b649-212f8e94dadd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885047}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471cfe46c8
	6bfc5b43-6665-44fe-b649-212f8e94dadd

/tidb/cdc/default/default/upstream/7365375144998087327
	{"id":7365375144998087327,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
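The set -x trace above (repeated before every test case's cdc server in this log) is the harness waiting for the freshly started server: it polls the /debug/info endpoint with basic auth until the response contains 'etcd info', meaning the capture/owner metadata is registered, retrying up to 50 times. Condensed into a sketch using the endpoint and credentials visible in the trace:
  # sketch of the readiness loop traced above
  for i in $(seq 0 50); do
    res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
    echo "$res" | grep -q 'failed to get info:' && exit 1   # server answered with an error
    echo "$res" | grep -q 'etcd info' && break              # metadata visible, server is ready
    [ "$i" -eq 50 ] && exit 1                               # give up after 50 attempts
    sleep 3
  done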
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_basic_avro.cli.10254.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-simple-basic-avro-12004?protocol=simple&encoding-format=avro' --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_basic_avro/conf/changefeed.toml
Create changefeed successfully!
ID: d2b849ae-4ec7-489a-b5bc-ff65c54324d4
Info: {"upstream_id":7365375144998087327,"namespace":"default","id":"d2b849ae-4ec7-489a-b5bc-ff65c54324d4","sink_uri":"kafka://127.0.0.1:9092/ticdc-simple-basic-avro-12004?protocol=simple\u0026encoding-format=avro","create_time":"2024-05-05T12:57:30.481600261+08:00","start_ts":449546826638426117,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"simple","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"correctness","corruption_handle_level":"error"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546826638426117,"checkpoint_ts":449546826638426117,"checkpoint_time":"2024-05-05 12:57:30.348"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
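The JSON above is the CLI echoing back the effective configuration of the changefeed it just created for kafka_simple_basic_avro: the sink URI selects protocol=simple with encoding-format=avro, and the referenced changefeed.toml supplies the rest. Stripped to its essentials, the creation step is (paths shortened from the command above):
  # sketch: the essential invocation behind the output above
  cdc cli changefeed create \
    --sink-uri 'kafka://127.0.0.1:9092/ticdc-simple-basic-avro-12004?protocol=simple&encoding-format=avro' \
    --config tests/integration_tests/kafka_simple_basic_avro/conf/changefeed.toml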
check diff failed 1-th time, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/ddl_manager/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
Verifying downstream PD is started...
table row_format.finish_mark exists
check diff successfully
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/changefeed_pause_resume/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
wait process cdc.test exit for 1-th time...
+ set +x
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
table sink_retry.finish_mark_1 exists
check diff successfully
***************** properties *****************
"dotransactions"="false"
"mysql.port"="4000"
"readproportion"="0"
"workload"="core"
"mysql.host"="127.0.0.1"
"mysql.db"="sink_retry"
"operationcount"="0"
"threadcount"="2"
"mysql.user"="root"
"insertproportion"="0"
"readallfields"="true"
"updateproportion"="0"
"requestdistribution"="uniform"
"recordcount"="10"
"scanproportion"="0"
**********************************************
Run finished, takes 3.798355ms
INSERT - Takes(s): 0.0, Count: 10, OPS: 3776.7, Avg(us): 676, Min(us): 480, Max(us): 1479, 95th(us): 2000, 99th(us): 2000
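The properties block and the "Run finished" / INSERT summary above have the shape of a go-ycsb load phase seeding the sink_retry workload: 10 records inserted by 2 threads into the upstream TiDB on port 4000. Assuming that tool, an equivalent invocation would look roughly like this (the workload file path is illustrative):
  # sketch: a go-ycsb load matching the properties printed above (assumed tooling)
  go-ycsb load mysql -P tests/integration_tests/sink_retry/conf/workload \
    -p mysql.host=127.0.0.1 -p mysql.port=4000 -p mysql.user=root \
    -p mysql.db=sink_retry -p recordcount=10 -p threadcount=2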
wait process cdc.test exit for 2-th time...
check diff successfully
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:57:32 CST 2024] <<<<<< run test case row_format success! >>>>>>
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
start tidb cluster in /tmp/tidb_cdc_test/ddl_manager
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
wait process cdc.test exit for 3-th time...
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 8.87 secs (419932389 bytes/sec)
[Pipeline] {
[Pipeline] cache
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 12:57:34 CST 2024] <<<<<< run test case kv_client_stream_reconnect success! >>>>>>
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
start tidb cluster in /tmp/tidb_cdc_test/changefeed_pause_resume
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[Sun May  5 12:57:36 CST 2024] <<<<<< START kafka consumer in kafka_simple_basic_avro case >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c758a980019	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:15532, start at 2024-05-05 12:57:39.285504813 +0800 CST m=+5.184738139	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:39.292 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:39.288 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:39.288 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c758f08001a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:4383, start at 2024-05-05 12:57:39.57181183 +0800 CST m=+5.166746623	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:39.578 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:39.572 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:39.572 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c758a980019	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:15532, start at 2024-05-05 12:57:39.285504813 +0800 CST m=+5.184738139	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:39.292 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:39.288 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:39.288 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c758a700015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:15618, start at 2024-05-05 12:57:39.275843955 +0800 CST m=+5.127954966	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:39.285 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:39.278 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:39.278 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kafka_column_selector/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kafka_column_selector/tiflash/log/error.log
arg matches is ArgMatches { args: {"advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kafka_column_selector/tiflash/log/proxy.log"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kafka_column_selector/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kafka_column_selector/tiflash/db/proxy"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table test.finish_mark not exists for 1-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c758f08001a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:4383, start at 2024-05-05 12:57:39.57181183 +0800 CST m=+5.166746623	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:39.578 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:39.572 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:39.572 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c758fa40013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:4461, start at 2024-05-05 12:57:39.589385596 +0800 CST m=+5.128504341	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:39.596 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:39.561 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:39.561 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/changefeed_reconstruct/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/changefeed_reconstruct/tiflash/log/error.log
arg matches is ArgMatches { args: {"advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/changefeed_reconstruct/tiflash/db/proxy"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/changefeed_reconstruct/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/changefeed_reconstruct/tiflash-proxy.toml"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/ddl_puller_lag/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
table test.finish_mark not exists for 2-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c75d2440018	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85, pid:4229, start at 2024-05-05 12:57:43.867265909 +0800 CST m=+5.122990768	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:43.875 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:43.874 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:43.874 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c75d2440018	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85, pid:4229, start at 2024-05-05 12:57:43.867265909 +0800 CST m=+5.122990768	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:43.875 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:43.874 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:43.874 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c75d2f0000e	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85, pid:4310, start at 2024-05-05 12:57:43.885359624 +0800 CST m=+5.095548785	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:43.893 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:43.868 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:43.868 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/drop_many_tables/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
[Sun May  5 12:57:44 CST 2024] <<<<<< START cdc server in kafka_column_selector case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_column_selector.1705517057.out server --log-file /tmp/tidb_cdc_test/kafka_column_selector/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kafka_column_selector/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/ddl_manager/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/ddl_manager/tiflash/log/error.log
arg matches is ArgMatches { args: {"config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/ddl_manager/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/ddl_manager/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/ddl_manager/tiflash/log/proxy.log"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[Sun May  5 12:57:44 CST 2024] <<<<<< START cdc server in changefeed_reconstruct case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_reconstruct.58645866.out server --log-file /tmp/tidb_cdc_test/changefeed_reconstruct/cdcserver1.log --log-level debug --data-dir /tmp/tidb_cdc_test/changefeed_reconstruct/cdc_dataserver1 --cluster-id default --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
start tidb cluster in /tmp/tidb_cdc_test/ddl_puller_lag
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table test.finish_mark not exists for 3-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/cdc/run.sh using Sink-Type: kafka... <<=================
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c75ec180016	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:4867, start at 2024-05-05 12:57:45.527939701 +0800 CST m=+5.111071176	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:45.534 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:45.528 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:45.528 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c75ec180016	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:4867, start at 2024-05-05 12:57:45.527939701 +0800 CST m=+5.111071176	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:45.534 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:45.528 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:45.528 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c75ee7c0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:4953, start at 2024-05-05 12:57:45.65304093 +0800 CST m=+5.183574159	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:45.659 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:45.631 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:45.631 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/changefeed_pause_resume/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/changefeed_pause_resume/tiflash/log/error.log
arg matches is ArgMatches { args: {"addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/changefeed_pause_resume/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/changefeed_pause_resume/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/changefeed_pause_resume/tiflash-proxy.toml"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 10.83 secs (343705070 bytes/sec)
[Pipeline] {
[Pipeline] cache
[Sun May  5 12:57:47 CST 2024] <<<<<< START cdc server in ddl_manager case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/ExecuteDDLSlowly=return(true)'
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_manager.56295631.out server --log-file /tmp/tidb_cdc_test/ddl_manager/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/ddl_manager/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:57:47 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/91aadcfc-ec96-4b31-8961-fa45746efe35
	{"id":"91aadcfc-ec96-4b31-8961-fa45746efe35","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885064}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d38a3cd
	91aadcfc-ec96-4b31-8961-fa45746efe35

/tidb/cdc/default/default/upstream/7365375204508423971
	{"id":7365375204508423971,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/91aadcfc-ec96-4b31-8961-fa45746efe35
	{"id":"91aadcfc-ec96-4b31-8961-fa45746efe35","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885064}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d38a3cd
	91aadcfc-ec96-4b31-8961-fa45746efe35

/tidb/cdc/default/default/upstream/7365375204508423971
	{"id":7365375204508423971,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/91aadcfc-ec96-4b31-8961-fa45746efe35
	{"id":"91aadcfc-ec96-4b31-8961-fa45746efe35","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885064}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d38a3cd
	91aadcfc-ec96-4b31-8961-fa45746efe35

/tidb/cdc/default/default/upstream/7365375204508423971
	{"id":7365375204508423971,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
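The trace above is the readiness poll for the ddl_manager CDC server: GO_FAILPOINTS injects the ExecuteDDLSlowly failpoint, cdc.test server is launched in the background, and the loop then curls /debug/info with the ticdc:ticdc_secret basic-auth pair up to 50 times, treating 'failed to get info:' in the response as a failure marker and 'etcd info' as the success marker, sleeping 3 seconds between attempts (connection-refused attempts simply fall through to the sleep). A minimal sketch of that polling pattern follows; the helper name wait_cdc_ready is illustrative, not the suite's own function.

# Illustrative sketch of the readiness poll traced above (helper name is hypothetical).
wait_cdc_ready() {
  local url="http://127.0.0.1:8300/debug/info"
  local i res
  for ((i = 0; i <= 50; i++)); do
    # Same endpoint and basic-auth credentials as in the trace; -v output is discarded here.
    res=$(curl -vsL --max-time 20 "$url" --user ticdc:ticdc_secret 2>/dev/null) || true
    if echo "$res" | grep -q 'failed to get info:'; then
      return 1                    # endpoint answered but reported an error
    fi
    if echo "$res" | grep -q 'etcd info'; then
      return 0                    # /debug/info served the etcd dump, so the server is up
    fi
    [ "$i" -eq 50 ] && return 1   # retry budget exhausted
    sleep 3
  done
  return 1
}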
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_column_selector.cli.17110.out cli changefeed create --start-ts=449546830345142273 '--sink-uri=kafka://127.0.0.1:9092/column-selector-test?protocol=canal-json&partition-num=1&enable-tidb-extension=true' -c test --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_column_selector/conf/changefeed.toml
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:57:47 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/5e579050-db85-4d68-b92e-bd6623e1830c
	{"id":"5e579050-db85-4d68-b92e-bd6623e1830c","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885065}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d37fdca
	5e579050-db85-4d68-b92e-bd6623e1830c

/tidb/cdc/default/default/upstream/7365375212295188035
	{"id":7365375212295188035,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/5e579050-db85-4d68-b92e-bd6623e1830c
	{"id":"5e579050-db85-4d68-b92e-bd6623e1830c","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885065}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d37fdca
	5e579050-db85-4d68-b92e-bd6623e1830c

/tidb/cdc/default/default/upstream/7365375212295188035
	{"id":7365375212295188035,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/5e579050-db85-4d68-b92e-bd6623e1830c
	{"id":"5e579050-db85-4d68-b92e-bd6623e1830c","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885065}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d37fdca
	5e579050-db85-4d68-b92e-bd6623e1830c

/tidb/cdc/default/default/upstream/7365375212295188035
	{"id":7365375212295188035,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
Create changefeed successfully!
ID: test
Info: {"upstream_id":7365375204508423971,"namespace":"default","id":"test","sink_uri":"kafka://127.0.0.1:9092/column-selector-test?protocol=canal-json\u0026partition-num=1\u0026enable-tidb-extension=true","create_time":"2024-05-05T12:57:48.067096668+08:00","start_ts":449546830345142273,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"column_selectors":[{"matcher":["test.t1"],"columns":["a","b"]},{"matcher":["test.*"],"columns":["*","!b"]},{"matcher":["test1.t1"],"columns":["column*","!column1"]}],"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546830345142273,"checkpoint_ts":449546830345142273,"checkpoint_time":"2024-05-05 12:57:44.488"}
PASS
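The Info line above is the full effective configuration echoed back by `cli changefeed create` for the kafka_column_selector case: canal-json protocol, one partition, TiDB extensions enabled, and the three column_selectors rules taken from the changefeed.toml (the \u0026 sequences in sink_uri are JSON-escaped & characters). The same record can be fetched again from the CLI; a minimal sketch, assuming the PD endpoint used elsewhere in this log and the plain cdc binary rather than the coverage-instrumented cdc.test:

# Illustrative follow-up query for the changefeed created above.
cdc cli changefeed query -c test --pd=http://127.0.0.1:2379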
[Sun May  5 12:57:48 CST 2024] <<<<<< START kafka consumer in changefeed_reconstruct case >>>>>>
***************** properties *****************
"readproportion"="0"
"mysql.port"="4000"
"recordcount"="50"
"dotransactions"="false"
"requestdistribution"="uniform"
"operationcount"="0"
"insertproportion"="0"
"updateproportion"="0"
"mysql.db"="changefeed_reconstruct"
"threadcount"="4"
"workload"="core"
"readallfields"="true"
"mysql.host"="127.0.0.1"
"mysql.user"="root"
"scanproportion"="0"
**********************************************
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
Run finished, takes 31.938724ms
INSERT - Takes(s): 0.0, Count: 47, OPS: 3631.0, Avg(us): 2264, Min(us): 900, Max(us): 18915, 95th(us): 19000, 99th(us): 19000
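The properties block above is a go-ycsb workload definition: the core workload with 50 records, 4 threads, and insert-only proportions, loaded into the changefeed_reconstruct database over the MySQL protocol (hence the INSERT-only stats line). A rough reproduction follows, assuming go-ycsb is on PATH; the property file path is illustrative:

# Hypothetical reproduction of the workload above with go-ycsb.
cat > /tmp/ycsb.properties <<'EOF'
workload=core
recordcount=50
operationcount=0
threadcount=4
readproportion=0
updateproportion=0
insertproportion=0
scanproportion=0
requestdistribution=uniform
dotransactions=false
readallfields=true
EOF
go-ycsb load mysql -P /tmp/ycsb.properties \
  -p mysql.host=127.0.0.1 -p mysql.port=4000 \
  -p mysql.user=root -p mysql.db=changefeed_reconstruct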
table changefeed_reconstruct.usertable not exists for 1-th check, retry later
table test.finish_mark not exists for 4-th check, retry later
[Sun May  5 12:57:48 CST 2024] <<<<<< START cdc server in changefeed_pause_resume case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_pause_resume.63306332.out server --log-file /tmp/tidb_cdc_test/changefeed_pause_resume/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/changefeed_pause_resume/cdc_data --cluster-id default --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
+ set +x
The 1 times to try to start tidb cluster...
Starting build checksum checker...
go: downloading github.com/pingcap/log v1.1.1-0.20240314023424-862ccc32f18d
go: downloading github.com/pingcap/tidb v1.1.0-beta.0.20240415145106-cd9c676e9ba4
go: downloading go.uber.org/zap v1.27.0
go: downloading github.com/aws/aws-sdk-go-v2 v1.19.1
go: downloading github.com/pingcap/tidb/pkg/parser v0.0.0-20240410110152-5fc42c9be2f5
go: downloading github.com/pingcap/tidb-tools v0.0.0-20240305021104-9f9bea84490b
go: downloading github.com/klauspost/compress v1.17.8
go: downloading github.com/tinylib/msgp v1.1.6
go: downloading golang.org/x/net v0.24.0
go: downloading google.golang.org/grpc v1.62.1
go: downloading github.com/BurntSushi/toml v1.3.2
go: downloading github.com/spf13/cobra v1.8.0
go: downloading github.com/tikv/client-go/v2 v2.0.8-0.20240409022718-714958ccd4d5
go: downloading github.com/go-sql-driver/mysql v1.7.1
go: downloading github.com/gin-gonic/gin v1.9.1
go: downloading github.com/gogo/protobuf v1.3.2
go: downloading github.com/xdg/scram v1.0.5
go: downloading github.com/pierrec/lz4/v4 v4.1.18
go: downloading github.com/IBM/sarama v1.41.2
go: downloading github.com/apache/pulsar-client-go v0.11.0
go: downloading github.com/pingcap/kvproto v0.0.0-20240227073058-929ab83f9754
go: downloading github.com/tikv/pd/client v0.0.0-20240322051414-fb9e2d561b6e
go: downloading github.com/coreos/go-semver v0.3.1
go: downloading github.com/json-iterator/go v1.1.12
go: downloading golang.org/x/sync v0.7.0
go: downloading github.com/Azure/azure-sdk-for-go/sdk/azcore v1.9.1
go: downloading github.com/KimMachineGun/automemlimit v0.2.4
go: downloading github.com/pingcap/failpoint v0.0.0-20220801062533-2eaa32854a6c
go: downloading github.com/shirou/gopsutil/v3 v3.24.2
go: downloading github.com/grpc-ecosystem/go-grpc-prometheus v1.2.0
go: downloading cloud.google.com/go/storage v1.39.1
go: downloading github.com/aws/aws-sdk-go v1.50.0
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:57:50 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/986154aa-a174-46d6-9e0d-0addbb2aa78c
	{"id":"986154aa-a174-46d6-9e0d-0addbb2aa78c","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885067}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d4cb4c9
	986154aa-a174-46d6-9e0d-0addbb2aa78c

/tidb/cdc/default/default/upstream/7365375225435268540
	{"id":7365375225435268540,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/986154aa-a174-46d6-9e0d-0addbb2aa78c
	{"id":"986154aa-a174-46d6-9e0d-0addbb2aa78c","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885067}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d4cb4c9
	986154aa-a174-46d6-9e0d-0addbb2aa78c

/tidb/cdc/default/default/upstream/7365375225435268540
	{"id":7365375225435268540,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/986154aa-a174-46d6-9e0d-0addbb2aa78c
	{"id":"986154aa-a174-46d6-9e0d-0addbb2aa78c","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885067}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d4cb4c9
	986154aa-a174-46d6-9e0d-0addbb2aa78c

/tidb/cdc/default/default/upstream/7365375225435268540
	{"id":7365375225435268540,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_manager.cli.5691.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' -c=ddl-manager
table changefeed_reconstruct.usertable not exists for 2-th check, retry later
go: downloading github.com/modern-go/reflect2 v1.0.2
go: downloading github.com/phayes/freeport v0.0.0-20180830031419-95f893ade6f2
go: downloading github.com/prometheus/client_golang v1.19.0
go: downloading github.com/stretchr/testify v1.9.0
go: downloading golang.org/x/time v0.5.0
go: downloading gopkg.in/natefinch/lumberjack.v2 v2.2.1
go: downloading github.com/containerd/cgroups v1.0.4
go: downloading github.com/xdg/stringprep v1.0.3
go: downloading golang.org/x/crypto v0.22.0
go: downloading github.com/tikv/pd v1.1.0-beta.0.20240407022249-7179657d129b
go: downloading github.com/philhofer/fwd v1.1.1
go: downloading github.com/spf13/pflag v1.0.5
go: downloading github.com/stretchr/objx v0.5.2
go: downloading github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc
go: downloading golang.org/x/text v0.14.0
go: downloading github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2
go: downloading gopkg.in/yaml.v3 v3.0.1
go: downloading github.com/godbus/dbus/v5 v5.0.4
go: downloading github.com/cilium/ebpf v0.4.0
go: downloading github.com/docker/go-units v0.5.0
go: downloading github.com/coreos/go-systemd/v22 v22.5.0
go: downloading github.com/opencontainers/runtime-spec v1.0.2
go: downloading golang.org/x/sys v0.19.0
go: downloading github.com/sirupsen/logrus v1.9.3
go: downloading github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd
go: downloading github.com/gin-contrib/sse v0.1.0
go: downloading github.com/mattn/go-isatty v0.0.20
Create changefeed successfully!
ID: ddl-manager
Info: {"upstream_id":7365375225435268540,"namespace":"default","id":"ddl-manager","sink_uri":"kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:57:50.590091651+08:00","start_ts":449546831901229061,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546831901229061,"checkpoint_ts":449546831901229061,"checkpoint_time":"2024-05-05 12:57:50.424"}
PASS
table test.finish_mark not exists for 5-th check, retry later
go: downloading github.com/go-playground/validator/v10 v10.14.0
go: downloading github.com/pelletier/go-toml/v2 v2.0.8
go: downloading github.com/ugorji/go/codec v1.2.11
go: downloading cloud.google.com/go v0.112.2
go: downloading google.golang.org/protobuf v1.33.0
go: downloading go.uber.org/multierr v1.11.0
go: downloading github.com/Azure/azure-sdk-for-go/sdk/internal v1.5.1
go: downloading github.com/godbus/dbus v0.0.0-20190726142602-4481cbc300e2
go: downloading github.com/beorn7/perks v1.0.1
go: downloading github.com/cespare/xxhash/v2 v2.3.0
go: downloading github.com/prometheus/client_model v0.6.1
go: downloading github.com/prometheus/common v0.52.2
go: downloading github.com/prometheus/procfs v0.13.0
go: downloading github.com/aws/smithy-go v1.13.5
go: downloading github.com/golang/protobuf v1.5.4
go: downloading github.com/go-playground/universal-translator v0.18.1
go: downloading github.com/gabriel-vasile/mimetype v1.4.2
go: downloading github.com/leodido/go-urn v1.2.4
go: downloading github.com/bits-and-blooms/bitset v1.4.0
go: downloading github.com/linkedin/goavro/v2 v2.11.1
go: downloading github.com/pkg/errors v0.9.1
go: downloading github.com/eapache/go-resiliency v1.4.0
go: downloading github.com/eapache/go-xerial-snappy v0.0.0-20230731223053-c322873962e3
go: downloading github.com/eapache/queue v1.1.0
go: downloading github.com/hashicorp/go-multierror v1.1.1
go: downloading github.com/jcmturner/gofork v1.7.6
go: downloading github.com/jcmturner/gokrb5/v8 v8.4.4
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
go: downloading github.com/rcrowley/go-metrics v0.0.0-20201227073835-cf1acfcdf475
go: downloading github.com/AthenZ/athenz v1.10.39
go: downloading github.com/spaolacci/murmur3 v1.1.0
go: downloading golang.org/x/mod v0.17.0
go: downloading golang.org/x/oauth2 v0.18.0
go: downloading github.com/DataDog/zstd v1.5.5
go: downloading github.com/pierrec/lz4 v2.6.1+incompatible
go: downloading github.com/go-playground/locales v0.14.1
go: downloading cloud.google.com/go/compute/metadata v0.2.3
go: downloading cloud.google.com/go/iam v1.1.7
go: downloading github.com/google/uuid v1.6.0
go: downloading github.com/googleapis/gax-go/v2 v2.12.3
go: downloading cloud.google.com/go/compute v1.25.1
go: downloading google.golang.org/api v0.170.0
go: downloading google.golang.org/genproto v0.0.0-20240401170217-c3f982113cda
go: downloading github.com/hashicorp/errwrap v1.0.0
go: downloading github.com/golang/snappy v0.0.4
go: downloading github.com/jcmturner/dnsutils/v2 v2.0.0
go: downloading github.com/99designs/keyring v1.2.1
go: downloading github.com/hashicorp/go-uuid v1.0.3
go: downloading github.com/golang-jwt/jwt v3.2.2+incompatible
go: downloading go.opencensus.io v0.23.1-0.20220331163232-052120675fac
go: downloading go.opentelemetry.io/otel v1.24.0
go: downloading go.opentelemetry.io/otel/trace v1.24.0
go: downloading google.golang.org/genproto/googleapis/rpc v0.0.0-20240401170217-c3f982113cda
go: downloading google.golang.org/genproto/googleapis/api v0.0.0-20240401170217-c3f982113cda
go: downloading github.com/opentracing/opentracing-go v1.2.0
go: downloading github.com/jcmturner/rpc/v2 v2.0.3
go: downloading github.com/dvsekhvalnov/jose2go v1.5.0
go: downloading github.com/gsterjov/go-libsecret v0.0.0-20161001094733-a6f4afe4910c
go: downloading github.com/mtibben/percent v0.2.1
go: downloading golang.org/x/term v0.19.0
go: downloading github.com/jcmturner/aescts/v2 v2.0.0
go: downloading github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
go: downloading github.com/go-logr/logr v1.4.1
go: downloading go.opentelemetry.io/otel/metric v1.24.0
go: downloading github.com/go-logr/stdr v1.2.2
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:57:51 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/2f9a8644-b818-416b-a9dd-2048a5e01a22
	{"id":"2f9a8644-b818-416b-a9dd-2048a5e01a22","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885069}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d4f73c3
	2f9a8644-b818-416b-a9dd-2048a5e01a22

/tidb/cdc/default/default/upstream/7365375237395232766
	{"id":7365375237395232766,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/2f9a8644-b818-416b-a9dd-2048a5e01a22
	{"id":"2f9a8644-b818-416b-a9dd-2048a5e01a22","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885069}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d4f73c3
	2f9a8644-b818-416b-a9dd-2048a5e01a22

/tidb/cdc/default/default/upstream/7365375237395232766
	{"id":7365375237395232766,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/2f9a8644-b818-416b-a9dd-2048a5e01a22
	{"id":"2f9a8644-b818-416b-a9dd-2048a5e01a22","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885069}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d4f73c3
	2f9a8644-b818-416b-a9dd-2048a5e01a22

/tidb/cdc/default/default/upstream/7365375237395232766
	{"id":7365375237395232766,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
[Sun May  5 12:57:52 CST 2024] <<<<<< START kafka consumer in changefeed_pause_resume case >>>>>>
+ set +x
[Sun May  5 12:57:52 CST 2024] <<<<<< START kafka consumer in ddl_manager case >>>>>>
table changefeed_reconstruct.usertable exists
table test.finish_mark not exists for 6-th check, retry later
check diff successfully
table changefeed_pause_resume.t1 not exists for 1-th check, retry later
go: downloading github.com/ardielle/ardielle-go v1.5.2
start tidb cluster in /tmp/tidb_cdc_test/drop_many_tables
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
wait process 5869 exit for 1-th time...
start tidb cluster in /tmp/tidb_cdc_test/cdc
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
go: downloading github.com/cznic/mathutil v0.0.0-20181122101859-297441e03548
go: downloading golang.org/x/exp v0.0.0-20240409090435-93d18d7e34b8
go: downloading github.com/Azure/azure-sdk-for-go/sdk/azidentity v1.5.1
go: downloading github.com/aliyun/alibaba-cloud-sdk-go v1.61.1581
go: downloading github.com/coocood/freecache v1.2.1
go: downloading github.com/grpc-ecosystem/go-grpc-middleware v1.4.0
go: downloading github.com/uber/jaeger-client-go v2.30.0+incompatible
go: downloading github.com/pingcap/tipb v0.0.0-20240318032315-55a7867ddd50
go: downloading github.com/google/btree v1.1.2
go: downloading github.com/jellydator/ttlcache/v3 v3.0.1
go: downloading github.com/tiancaiamao/gp v0.0.0-20221230034425-4025bc8a4d4a
go: downloading github.com/Azure/azure-sdk-for-go/sdk/storage/azblob v1.0.0
go: downloading github.com/pingcap/sysutil v1.0.1-0.20240311050922-ae81ee01f3a5
go: downloading github.com/cloudfoundry/gosigar v1.3.6
go: downloading github.com/ks3sdklib/aws-sdk-go v1.2.9
go: downloading github.com/danjacques/gofslock v0.0.0-20240212154529-d899e02bfe22
go: downloading github.com/go-resty/resty/v2 v2.11.0
go: downloading github.com/cockroachdb/errors v1.11.1
go: downloading go.etcd.io/etcd/client/v3 v3.5.12
go: downloading go.etcd.io/etcd/api/v3 v3.5.12
go: downloading github.com/opentracing/basictracer-go v1.1.0
go: downloading github.com/influxdata/tdigest v0.0.1
go: downloading github.com/twmb/murmur3 v1.1.6
go: downloading github.com/dgryski/go-farm v0.0.0-20200201041132-a6ae2369ad13
go: downloading github.com/dolthub/swiss v0.2.1
go: downloading golang.org/x/tools v0.20.0
go: downloading github.com/ngaut/pools v0.0.0-20180318154953-b7bc8c42aac7
go: downloading github.com/yangkeao/ldap/v3 v3.4.5-0.20230421065457-369a3bab1117
go: downloading gopkg.in/yaml.v2 v2.4.0
go: downloading github.com/tklauser/go-sysconf v0.3.12
go: downloading github.com/ngaut/sync2 v0.0.0-20141008032647-7a24ed77b2ef
go: downloading github.com/dolthub/maphash v0.1.0
go: downloading github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec
go: downloading github.com/go-asn1-ber/asn1-ber v1.5.4
go: downloading github.com/Azure/go-ntlmssp v0.0.0-20221128193559-754e69321358
go: downloading github.com/AzureAD/microsoft-authentication-library-for-go v1.2.1
go: downloading github.com/tklauser/numcpus v0.6.1
go: downloading go.etcd.io/etcd/client/pkg/v3 v3.5.12
go: downloading github.com/kylelemons/godebug v1.1.0
go: downloading github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c
go: downloading github.com/golang-jwt/jwt/v5 v5.2.0
go: downloading github.com/uber/jaeger-lib v2.4.1+incompatible
wait process 5869 exit for 2-th time...
go: downloading github.com/getsentry/sentry-go v0.27.0
go: downloading github.com/cockroachdb/logtags v0.0.0-20230118201751-21c54148d20b
go: downloading github.com/cockroachdb/redact v1.1.5
go: downloading github.com/kr/pretty v0.3.1
go: downloading github.com/kr/text v0.2.0
go: downloading github.com/rogpeppe/go-internal v1.12.0
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils/kill_cdc_pid: line 19: kill: (5869) - No such process
wait process 5869 exit for 3-th time...
process 5869 already exit
check_no_capture http://127.0.0.1:2379
parse error: Invalid numeric literal at line 1, column 6
run task successfully
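check_no_capture polls PD until no capture is registered; the 'parse error: Invalid numeric literal' line above is most likely jq rejecting a non-JSON line of CLI output on an early attempt, before the task eventually succeeds. A sketch of such a check follows, with the helper body and the jq usage as assumptions (the suite's real helper is not reproduced here):

# Illustrative "no captures remain" check (assumes the plain cdc binary and jq).
check_no_capture() {
  local pd=$1
  local captures
  captures=$(cdc cli capture list --pd="$pd" 2>/dev/null)
  # An empty cluster lists captures as an empty JSON array.
  [ "$(echo "$captures" | jq 'length')" = "0" ]
}
check_no_capture http://127.0.0.1:2379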
[Sun May  5 12:57:54 CST 2024] <<<<<< START cdc server in changefeed_reconstruct case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_reconstruct.61476149.out server --log-file /tmp/tidb_cdc_test/changefeed_reconstruct/cdcserver2.log --log-level debug --data-dir /tmp/tidb_cdc_test/changefeed_reconstruct/cdc_dataserver2 --cluster-id default --addr 127.0.0.1:8300
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table test.finish_mark not exists for 7-th check, retry later
table changefeed_pause_resume.t1 not exists for 2-th check, retry later
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
go: downloading github.com/jmespath/go-jmespath v0.4.0
go: downloading go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.49.0
go: downloading github.com/googleapis/enterprise-certificate-proxy v0.3.2
go: downloading github.com/google/s2a-go v0.1.7
go: downloading go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc v0.49.0
go: downloading github.com/felixge/httpsnoop v1.0.4
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Cache restored successfully (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc)
3723625472 bytes in 6.95 secs (535618670 bytes/sec)
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
[Pipeline] {
[Pipeline] timeout
Timeout set to expire in 6 min 0 sec
table test.finish_mark not exists for 8-th check, retry later
[Pipeline] {
[Pipeline] sh
[Pipeline] sh
Starting Upstream TiDB...
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
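The steps above are the Kafka readiness gate each parallel job runs before starting consumers: probe ZooKeeper on 2181 and the Kafka listener on 9092 with nc -z, then send ZooKeeper's four-letter dump command and confirm the broker has registered under /brokers/ids/1. Condensed into a sketch (same local ports and a single broker with id 1 assumed; the until loops are an illustrative hardening, the trace shows single probes):

# Sketch of the broker readiness gate traced above.
until nc -z localhost 2181; do sleep 1; done    # ZooKeeper reachable
until nc -z localhost 9092; do sleep 1; done    # Kafka listener reachable
# ZooKeeper's "dump" command lists ephemeral nodes; a joined broker shows up
# as /brokers/ids/<broker.id>.
echo dump | nc localhost 2181 | grep brokers | awk '{$1=$1;print}' | grep -F -w /brokers/ids/1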
[Pipeline] sh
table changefeed_pause_resume.t1 exists
table changefeed_pause_resume.t2 exists
table changefeed_pause_resume.t3 not exists for 1-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c768f78000c	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km, pid:4150, start at 2024-05-05 12:57:55.947178636 +0800 CST m=+5.119209888	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:55.954 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:55.934 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:55.934 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c768f78000c	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km, pid:4150, start at 2024-05-05 12:57:55.947178636 +0800 CST m=+5.119209888	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:55.954 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:55.934 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:55.934 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c768f7c0015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km, pid:4239, start at 2024-05-05 12:57:55.977948651 +0800 CST m=+5.093368831	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-12:59:55.984 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:57:55.985 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:47:55.985 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
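The VARIABLE_NAME / VARIABLE_VALUE / COMMENT blocks above are rows of the mysql.tidb system table (bootstrap flag, GC leader, GC intervals, and safe point), dumped while verifying that the freshly started upstream and downstream TiDB servers accept connections. A sketch of a query that yields this output, assuming the upstream listens on 127.0.0.1:4000 as elsewhere in this log:

# Illustrative check: dump the bootstrap/GC rows once TiDB answers on its MySQL port.
mysql -h 127.0.0.1 -P 4000 -u root \
  -e "SELECT VARIABLE_NAME, VARIABLE_VALUE, COMMENT FROM mysql.tidb;"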
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
[Pipeline] sh
Logging trace to /tmp/tidb_cdc_test/ddl_puller_lag/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/ddl_puller_lag/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/ddl_puller_lag/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/ddl_puller_lag/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/ddl_puller_lag/tiflash-proxy.toml"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:57:57 GMT
< Content-Type: text/plain; charset=utf-8
< Transfer-Encoding: chunked
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:

changefeedID: default/da2634f4-a2b2-4471-9cf9-deab938703f7
{UpstreamID:7365375212295188035 Namespace:default ID:da2634f4-a2b2-4471-9cf9-deab938703f7 SinkURI:kafka://127.0.0.1:9092/ticdc-changefeed-reconstruct-17687?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:57:48.006710747 +0800 CST StartTs:449546831232499714 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc004093680 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546831258451973}
{CheckpointTs:449546833486938115 MinTableBarrierTs:449546833486938117 AdminJobType:noop}
span: {table_id:106,start_key:7480000000000000ff6a5f720000000000fa,end_key:7480000000000000ff6a5f730000000000fa}, resolvedTs: 449546833486938115, checkpointTs: 449546833486938115, state: Replicating



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4072d829-0675-43d2-a28c-9e5b29e0ccac
	{"id":"4072d829-0675-43d2-a28c-9e5b29e0ccac","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885074}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d37fe60
	4072d829-0675-43d2-a28c-9e5b29e0ccac

/tidb/cdc/default/default/changefeed/info/da2634f4-a2b2-4471-9cf9-deab938703f7
	{"upstream-id":7365375212295188035,"namespace":"default","changefeed-id":"da2634f4-a2b2-4471-9cf9-deab938703f7","sink-uri":"kafka://127.0.0.1:9092/ticdc-changefeed-reconstruct-17687?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:57:48.006710747+08:00","start-ts":449546831232499714,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546831258451973}

/tidb/cdc/default/default/changefeed/status/da2634f4-a2b2-4471-9cf9-deab938703f7
	{"checkpoint-ts":449546833486938115,"min-table-barrier-ts":449546833486938117,"admin-job-type":0}

/tidb/cdc/default/default/task/position/4072d829-0675-43d2-a28c-9e5b29e0ccac/da2634f4-a2b2-4471-9cf9-deab938703f7
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375212295188035
	{"id":7365375212295188035,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:

changefeedID: default/da2634f4-a2b2-4471-9cf9-deab938703f7
{UpstreamID:7365375212295188035 Namespace:default ID:da2634f4-a2b2-4471-9cf9-deab938703f7 SinkURI:kafka://127.0.0.1:9092/ticdc-changefeed-reconstruct-17687?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:57:48.006710747 +0800 CST StartTs:449546831232499714 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc004093680 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546831258451973}
{CheckpointTs:449546833486938115 MinTableBarrierTs:449546833486938117 AdminJobType:noop}
span: {table_id:106,start_key:7480000000000000ff6a5f720000000000fa,end_key:7480000000000000ff6a5f730000000000fa}, resolvedTs: 449546833486938115, checkpointTs: 449546833486938115, state: Replicating



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4072d829-0675-43d2-a28c-9e5b29e0ccac
	{"id":"4072d829-0675-43d2-a28c-9e5b29e0ccac","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885074}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d37fe60
	4072d829-0675-43d2-a28c-9e5b29e0ccac

/tidb/cdc/default/default/changefeed/info/da2634f4-a2b2-4471-9cf9-deab938703f7
	{"upstream-id":7365375212295188035,"namespace":"default","changefeed-id":"da2634f4-a2b2-4471-9cf9-deab938703f7","sink-uri":"kafka://127.0.0.1:9092/ticdc-changefeed-reconstruct-17687?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:57:48.006710747+08:00","start-ts":449546831232499714,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546831258451973}

/tidb/cdc/default/default/changefeed/status/da2634f4-a2b2-4471-9cf9-deab938703f7
	{"checkpoint-ts":449546833486938115,"min-table-barrier-ts":449546833486938117,"admin-job-type":0}

/tidb/cdc/default/default/task/position/4072d829-0675-43d2-a28c-9e5b29e0ccac/da2634f4-a2b2-4471-9cf9-deab938703f7
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warnin+ grep -q 'failed to get info:'
g":null}

/tidb/cdc/default/default/upstream/7365375212295188035
	{"id":7365375212295188035,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ echo '

*** owner info ***:



*** processors info ***:

changefeedID: default/da2634f4-a2b2-4471-9cf9-deab938703f7
{UpstreamID:7365375212295188035 Namespace:default ID:da2634f4-a2b2-4471-9cf9-deab938703f7 SinkURI:kafka://127.0.0.1:9092/ticdc-changefeed-reconstruct-17687?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:57:48.006710747 +0800 CST StartTs:449546831232499714 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc004093680 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546831258451973}
{CheckpointTs:449546833486938115 MinTableBarrierTs:449546833486938117 AdminJobType:noop}
span: {table_id:106,start_key:7480000000000000ff6a5f720000000000fa,end_key:7480000000000000ff6a5f730000000000fa}, resolvedTs: 449546833486938115, checkpointTs: 449546833486938115, state: Replicating



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4072d829-0675-43d2-a28c-9e5b29e0ccac
	{"id":"4072d829-0675-43d2-a28c-9e5b29e0ccac","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885074}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d37fe60
	4072d829-0675-43d2-a28c-9e5b29e0ccac

/tidb/cdc/default/default/changefeed/info/da2634f4-a2b2-4471-9cf9-deab938703f7
	{"upstream-id":7365375212295188035,"namespace":"default","changefeed-id":"da2634f4-a2b2-4471-9cf9-deab938703f7","sink-uri":"kafka://127.0.0.1:9092/ticdc-changefeed-reconstruct-17687?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:57:48.006710747+08:00","start-ts":449546831232499714,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546831258451973}

/tidb/cdc/default/default/changefeed/status/da2634f4-a2b2-4471-9cf9-deab938703f7
	{"checkpoint-ts":449546833486938115,"min-table-barrier-ts":449546833486938117,"admin-job-type":0}

/tidb/cdc/default/default/task/position/4072d829-0675-43d2-a28c-9e5b29e0ccac/da2634f4-a2b2-4471-9cf9-deab938703f7
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375212295188035
	{"id":7365375212295188035,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ break
+ set +x
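For reference, the TiCDC metadata keys dumped above under "*** etcd info ***" live in the etcd instance embedded in PD and can be inspected directly with etcdctl, if it is available on the host; a minimal sketch (endpoint and key prefix are the defaults used by this test setup, not taken from the run itself):

# list every TiCDC metadata key/value for the default cluster and namespace
etcdctl --endpoints=http://127.0.0.1:2379 get --prefix /tidb/cdc/default/

# fetch only the changefeed status document shown in the dump above
etcdctl --endpoints=http://127.0.0.1:2379 \
  get /tidb/cdc/default/default/changefeed/status/da2634f4-a2b2-4471-9cf9-deab938703f7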
cdc.test cli capture list --pd=http://127.0.0.1:2379 2>&1 | grep id
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
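The readiness probe traced above can be written as a standalone script; a minimal sketch assuming the local ports used throughout this job (2181 for ZooKeeper, 9092 for Kafka, broker id 1):

#!/usr/bin/env bash
set -eu

echo "Waiting for zookeeper to be ready..."
until nc -z localhost 2181; do sleep 1; done

echo "Waiting for kafka to be ready..."
until nc -z localhost 9092; do sleep 1; done

echo "Waiting for kafka-broker to be ready..."
# ask ZooKeeper (four-letter "dump" command) whether the broker znode is registered
until echo dump | nc localhost 2181 | grep brokers | awk '{$1=$1;print}' | grep -F -w "/brokers/ids/1"; do
  sleep 1
done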
[Pipeline] sh
[Pipeline] sh
table test.finish_mark not exists for 9-th check, retry later
    "id": "4072d829-0675-43d2-a28c-9e5b29e0ccac",
    "cluster-id": "default"
run task successfully
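The capture_id echoed further down appears to be parsed out of the `cli capture list` JSON shown above; a minimal sketch of one way to extract it (the grep/sed pipeline is an illustration, not necessarily what the test script itself does):

# capture list returns a JSON array; keep the first capture's id field
capture_id=$(cdc.test cli capture list --pd=http://127.0.0.1:2379 2>&1 |
  grep '"id"' | head -n1 | sed 's/.*"id": "\([^"]*\)".*/\1/')
echo "capture_id: $capture_id"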
table changefeed_pause_resume.t3 exists
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_puller_lag.cli.5616.out cli tso query --pd=http://127.0.0.1:2379
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff failed 1-th time, retry later
capture_id: 4072d829-0675-43d2-a28c-9e5b29e0ccac
check_processor_table_count http://127.0.0.1:2379 da2634f4-a2b2-4471-9cf9-deab938703f7 4072d829-0675-43d2-a28c-9e5b29e0ccac 1
run task successfully
table test.finish_mark not exists for 10-th check, retry later
check_processor_table_count http://127.0.0.1:2379 da2634f4-a2b2-4471-9cf9-deab938703f7 4072d829-0675-43d2-a28c-9e5b29e0ccac 0
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
table count 1 does not equal the expected count 0
run task failed 1-th time, retry later
+ set +x
+ tso='449546834276515841
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546834276515841 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
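The block above captures the output of `cdc cli tso query` (the TSO followed by the Go test harness PASS/coverage lines) and keeps only the first field; a minimal sketch of the same extraction (the coverage-profile path is hypothetical):

# query a start TSO from PD, then strip the trailing PASS/coverage noise
tso=$(cdc.test -test.coverprofile=/tmp/cov.out cli tso query --pd=http://127.0.0.1:2379)
start_ts=$(echo $tso | awk -F ' ' '{print $1}')   # first field of the flattened output is the TSO
echo "start-ts: $start_ts"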
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ echo Waiting for zookeeper to be ready...
Waiting for zookeeper to be ready...
+ nc -z localhost 2181
+ echo Waiting for kafka to be ready...
Waiting for kafka to be ready...
+ nc -z localhost 9092
+ echo Waiting for kafka-broker to be ready...
Waiting for kafka-broker to be ready...
+ echo dump
+ nc localhost 2181
+ grep brokers
+ awk {$1=$1;print}
+ grep -F -w /brokers/ids/1
/brokers/ids/1
[Sun May  5 12:58:01 CST 2024] <<<<<< START cdc server in ddl_puller_lag case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/processor/processorDDLResolved=1*sleep(180000)'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_puller_lag.56715673.out server --log-file /tmp/tidb_cdc_test/ddl_puller_lag/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/ddl_puller_lag/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
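The startup sequence traced above launches the cdc server and then polls /debug/info up to 50 times before giving up; a minimal sketch of that wait-for-ready loop under the same assumptions (port 8300, basic auth ticdc:ticdc_secret, 3-second back-off):

for ((i = 0; i <= 50; i++)); do
  res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
  # the server is considered ready once the debug page contains the etcd dump
  if echo "$res" | grep -q 'etcd info' && ! echo "$res" | grep -q 'failed to get info:'; then
    break
  fi
  if [ "$i" -eq 50 ]; then
    echo "cdc server did not become ready"
    exit 1
  fi
  sleep 3
done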
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 11-th check, retry later
check diff failed 2-th time, retry later
check_processor_table_count http://127.0.0.1:2379 da2634f4-a2b2-4471-9cf9-deab938703f7 4072d829-0675-43d2-a28c-9e5b29e0ccac 0
[Sun May  5 12:57:56 CST 2024] <<<<<< START kafka consumer in multi_topics_v2 case >>>>>>
schema registry uri found: 1
[Sun May  5 12:57:56 CST 2024] <<<<<< START kafka consumer in multi_topics_v2 case >>>>>>
schema registry uri found: 2
[Sun May  5 12:57:56 CST 2024] <<<<<< START kafka consumer in multi_topics_v2 case >>>>>>
schema registry uri found: 3
table test.table1 not exists for 1-th check, retry later
table test.table1 not exists for 2-th check, retry later
table test.table1 exists
table test.table2 exists
table test.table3 exists
check diff successfully
table test.table10 not exists for 1-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
run task successfully
table test.table10 exists
table test.table20 exists
check diff successfully
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
***************** properties *****************
"insertproportion"="0"
"readproportion"="0"
"mysql.host"="127.0.0.1"
"mysql.user"="root"
"workload"="core"
"recordcount"="50"
"updateproportion"="0"
"operationcount"="0"
"dotransactions"="false"
"requestdistribution"="uniform"
"readallfields"="true"
"threadcount"="4"
"scanproportion"="0"
"mysql.port"="4000"
"mysql.db"="changefeed_reconstruct"
**********************************************
Run finished, takes 17.021463ms
INSERT - Takes(s): 0.0, Count: 48, OPS: 3730.9, Avg(us): 1349, Min(us): 802, Max(us): 4346, 95th(us): 5000, 99th(us): 5000
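The properties banner and INSERT summary above come from a go-ycsb load against the upstream TiDB; a minimal sketch of an equivalent invocation (the property-file path is hypothetical, the -p overrides mirror the banner values):

# load 50 rows into changefeed_reconstruct with 4 threads, insert-only workload
go-ycsb load mysql -P ./workload \
  -p mysql.host=127.0.0.1 -p mysql.port=4000 -p mysql.user=root \
  -p mysql.db=changefeed_reconstruct \
  -p recordcount=50 -p threadcount=4 -p operationcount=0 \
  -p workload=core -p readproportion=0 -p updateproportion=0 -p scanproportion=0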
table changefeed_reconstruct.usertable not exists for 1-th check, retry later
table sink_retry.finish_mark_2 exists
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:58:02 CST 2024] <<<<<< run test case sink_retry success! >>>>>>
table test.finish_mark not exists for 12-th check, retry later
check diff failed 3-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c77062c0009	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg, pid:4158, start at 2024-05-05 12:58:03.539103578 +0800 CST m=+6.617220205	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:03.547 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:03.531 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:03.531 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c77062c0009	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg, pid:4158, start at 2024-05-05 12:58:03.539103578 +0800 CST m=+6.617220205	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:03.547 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:03.531 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:03.531 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c76eeb40015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg, pid:4238, start at 2024-05-05 12:58:02.069018863 +0800 CST m=+5.089890844	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:02.077 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:02.080 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:02.080 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
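The VARIABLE_NAME / VARIABLE_VALUE tables above are the bootstrap and GC variables read from mysql.tidb on the freshly started clusters; a minimal sketch of how such a readiness check might look (the retry policy and host/port are assumptions based on the test defaults, not taken from the script):

# wait until the TiDB server answers SQL, then dump its bootstrap/GC variables
for i in $(seq 1 60); do
  if mysql -h 127.0.0.1 -P 4000 -u root \
       -e 'SELECT VARIABLE_NAME, VARIABLE_VALUE, COMMENT FROM mysql.tidb;'; then
    break
  fi
  echo "TiDB not ready yet ($i), retrying..."
  sleep 1
done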
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/drop_many_tables/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/drop_many_tables/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/drop_many_tables/tiflash/db/proxy"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/drop_many_tables/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/drop_many_tables/tiflash-proxy.toml"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:05 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/be2a4ebb-5e9d-431a-a98c-b89b0441ed83
	{"id":"be2a4ebb-5e9d-431a-a98c-b89b0441ed83","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885081}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d7a0eea
	be2a4ebb-5e9d-431a-a98c-b89b0441ed83

/tidb/cdc/default/default/upstream/7365375281953843541
	{"id":7365375281953843541,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/be2a4ebb-5e9d-431a-a98c-b89b0441ed83
	{"id":"be2a4ebb-5e9d-431a-a98c-b89b0441ed83","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885081}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d7a0eea
	be2a4ebb-5e9d-431a-a98c-b89b0441ed83

/tidb/cdc/default/default/upstream/7365375281953843541
	{"id":7365375281953843541,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/be2a4ebb-5e9d-431a-a98c-b89b0441ed83
	{"id":"be2a4ebb-5e9d-431a-a98c-b89b0441ed83","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885081}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d7a0eea
	be2a4ebb-5e9d-431a-a98c-b89b0441ed83

/tidb/cdc/default/default/upstream/7365375281953843541
	{"id":7365375281953843541,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_puller_lag.cli.5730.out cli changefeed create --start-ts=449546834276515841 '--sink-uri=kafka+ssl://127.0.0.1:9092/ticdc-ddl-puller-lag-test-23102?protocol=open-protocol&partition-num=4&kafka-client-id=ddl_puller_lag&kafka-version=2.4.1&max-message-bytes=10485760'
table changefeed_reconstruct.usertable exists
check diff failed 1-th time, retry later
Create changefeed successfully!
ID: 89967a4c-6ceb-4fc2-803a-abe75d556dfb
Info: {"upstream_id":7365375281953843541,"namespace":"default","id":"89967a4c-6ceb-4fc2-803a-abe75d556dfb","sink_uri":"kafka+ssl://127.0.0.1:9092/ticdc-ddl-puller-lag-test-23102?protocol=open-protocol\u0026partition-num=4\u0026kafka-client-id=ddl_puller_lag\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:58:05.986238567+08:00","start_ts":449546834276515841,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546834276515841,"checkpoint_ts":449546834276515841,"checkpoint_time":"2024-05-05 12:57:59.485"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
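The changefeed created above ties the previously queried start-ts to a Kafka sink URI using the open-protocol encoder; a minimal sketch of the same cli call with the URI split out for readability (topic name and start-ts are the ones from this run, and --pd is spelled out even though it is the cli default):

SINK_URI="kafka+ssl://127.0.0.1:9092/ticdc-ddl-puller-lag-test-23102?protocol=open-protocol&partition-num=4&kafka-client-id=ddl_puller_lag&kafka-version=2.4.1&max-message-bytes=10485760"

cdc.test cli changefeed create \
  --pd=http://127.0.0.1:2379 \
  --start-ts=449546834276515841 \
  --sink-uri="$SINK_URI"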
table test.finish_mark not exists for 13-th check, retry later
check diff failed 4-th time, retry later
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.drop_many_tables.cli.5697.out cli tso query --pd=http://127.0.0.1:2379
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c770f9c0013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv, pid:4545, start at 2024-05-05 12:58:04.183931021 +0800 CST m=+5.303373392	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:04.191 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:04.185 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:04.185 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c770f9c0013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv, pid:4545, start at 2024-05-05 12:58:04.183931021 +0800 CST m=+5.303373392	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:04.191 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:04.185 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:04.185 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c77111c0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv, pid:4627, start at 2024-05-05 12:58:04.259873418 +0800 CST m=+5.328057840	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:04.269 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:04.231 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:04.231 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/cdc/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/cdc/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/cdc/tiflash-proxy.toml"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/cdc/tiflash/db/proxy"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/cdc/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ set +x
[Sun May  5 12:58:07 CST 2024] <<<<<< START kafka consumer in ddl_puller_lag case >>>>>>
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
table test.finish_mark not exists for 14-th check, retry later
check diff failed 5-th time, retry later
+ set +x
+ tso='449546836385726467
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546836385726467 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 12:58:08 CST 2024] <<<<<< START cdc server in drop_many_tables case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.drop_many_tables.57355737.out server --log-file /tmp/tidb_cdc_test/drop_many_tables/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/drop_many_tables/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:58:09 CST 2024] <<<<<< run test case changefeed_reconstruct success! >>>>>>
[Sun May  5 12:58:09 CST 2024] <<<<<< START cdc server in cdc case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc.60256027.out server --log-file /tmp/tidb_cdc_test/cdc/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/cdc/cdc_data --cluster-id default
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
wait process 5634 exit for 1-th time...
wait process 5634 exit for 2-th time...
wait process 5634 exit for 3-th time...
wait process 5634 exit for 4-th time...
wait process 5634 exit for 5-th time...
wait process 5634 exit for 6-th time...
wait process 5634 exit for 7-th time...
table test.finish_mark not exists for 15-th check, retry later
check diff successfully
wait process 5634 exit for 8-th time...
wait process 5634 exit for 9-th time...
wait process 5634 exit for 10-th time...
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:12 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c0dea2bd-447c-47f8-9daa-3757e1a111ea
	{"id":"c0dea2bd-447c-47f8-9daa-3757e1a111ea","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885089}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d93d3d4
	c0dea2bd-447c-47f8-9daa-3757e1a111ea

/tidb/cdc/default/default/upstream/7365375307019892374
	{"id":7365375307019892374,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c0dea2bd-447c-47f8-9daa-3757e1a111ea
	{"id":"c0dea2bd-447c-47f8-9daa-3757e1a111ea","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885089}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d93d3d4
	c0dea2bd-447c-47f8-9daa-3757e1a111ea

/tidb/cdc/default/default/upstream/7365375307019892374
	{"id":7365375307019892374,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c0dea2bd-447c-47f8-9daa-3757e1a111ea
	{"id":"c0dea2bd-447c-47f8-9daa-3757e1a111ea","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885089}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d93d3d4
	c0dea2bd-447c-47f8-9daa-3757e1a111ea

/tidb/cdc/default/default/upstream/7365375307019892374
	{"id":7365375307019892374,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.drop_many_tables.cli.5790.out cli changefeed create --start-ts=449546836385726467 '--sink-uri=kafka://127.0.0.1:9092/ticdc-drop-tables-test-12223?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
wait process 5634 exit for 11-th time...
Create changefeed successfully!
ID: 1493282b-7f3d-4270-b641-080de44cc4c4
Info: {"upstream_id":7365375307019892374,"namespace":"default","id":"1493282b-7f3d-4270-b641-080de44cc4c4","sink_uri":"kafka://127.0.0.1:9092/ticdc-drop-tables-test-12223?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:58:12.560382422+08:00","start_ts":449546836385726467,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546836385726467,"checkpoint_ts":449546836385726467,"checkpoint_time":"2024-05-05 12:58:07.531"}
PASS
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:12 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/9fa7890b-d92f-4781-b8bb-978e43234441
	{"id":"9fa7890b-d92f-4781-b8bb-978e43234441","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885089}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d978ccc
	9fa7890b-d92f-4781-b8bb-978e43234441

/tidb/cdc/default/default/upstream/7365375317453085148
	{"id":7365375317453085148,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/9fa7890b-d92f-4781-b8bb-978e43234441
	{"id":"9fa7890b-d92f-4781-b8bb-978e43234441","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885089}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d978ccc
	9fa7890b-d92f-4781-b8bb-978e43234441

/tidb/cdc/default/default/upstream/7365375317453085148
	{"id":7365375317453085148,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/9fa7890b-d92f-4781-b8bb-978e43234441
	{"id":"9fa7890b-d92f-4781-b8bb-978e43234441","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885089}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d978ccc
	9fa7890b-d92f-4781-b8bb-978e43234441

/tidb/cdc/default/default/upstream/7365375317453085148
	{"id":7365375317453085148,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc.cli.6085.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-cdc-test-13029?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' --config /tmp/tidb_cdc_test/cdc/pulsar_test.toml
table test.finish_mark not exists for 16-th check, retry later
check diff failed 1-th time, retry later
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
Create changefeed successfully!
ID: 269a7428-106c-4d29-827b-e1d83015778b
Info: {"upstream_id":7365375317453085148,"namespace":"default","id":"269a7428-106c-4d29-827b-e1d83015778b","sink_uri":"kafka://127.0.0.1:9092/ticdc-cdc-test-13029?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:58:12.737525181+08:00","start_ts":449546837710864389,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546837710864389,"checkpoint_ts":449546837710864389,"checkpoint_time":"2024-05-05 12:58:12.586"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
wait process 5634 exit for 12-th time...
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils/kill_cdc_pid: line 19: kill: (5634) - No such process
wait process 5634 exit for 13-th time...
process 5634 already exit
[Sun May  5 12:58:13 CST 2024] <<<<<< START cdc server in ddl_manager case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/ExecuteDDLSlowly=return(true)'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_manager.58395841.out server --log-file /tmp/tidb_cdc_test/ddl_manager/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/ddl_manager/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/changefeed_error/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
+ set +x
[Sun May  5 12:58:14 CST 2024] <<<<<< START kafka consumer in drop_many_tables case >>>>>>
+ set +x
[Sun May  5 12:58:14 CST 2024] <<<<<< START kafka consumer in cdc case >>>>>>
go: downloading github.com/pingcap/errors v0.11.5-0.20240318064555-6bd07397691f
go: downloading github.com/pingcap/log v1.1.1-0.20240314023424-862ccc32f18d
go: downloading github.com/go-sql-driver/mysql v1.7.1
go: downloading github.com/BurntSushi/toml v1.3.2
go: downloading github.com/pingcap/tidb v1.1.0-beta.0.20240415145106-cd9c676e9ba4
go: downloading github.com/pingcap/tidb-tools v0.0.0-20240305021104-9f9bea84490b
go: downloading go.uber.org/zap v1.27.0
go: downloading github.com/pingcap/tidb/pkg/parser v0.0.0-20240410110152-5fc42c9be2f5
go: downloading gopkg.in/natefinch/lumberjack.v2 v2.2.1
go: downloading go.uber.org/atomic v1.11.0
go: downloading go.uber.org/multierr v1.11.0
go: downloading github.com/pingcap/failpoint v0.0.0-20220801062533-2eaa32854a6c
go: downloading google.golang.org/grpc v1.62.1
go: downloading github.com/coreos/go-semver v0.3.1
table test.finish_mark exists
check diff successfully
check diff failed 2-th time, retry later
go: downloading github.com/golang/protobuf v1.5.4
go: downloading golang.org/x/net v0.24.0
go: downloading google.golang.org/protobuf v1.33.0
go: downloading golang.org/x/sys v0.19.0
go: downloading google.golang.org/genproto/googleapis/rpc v0.0.0-20240401170217-c3f982113cda
go: downloading google.golang.org/genproto v0.0.0-20240401170217-c3f982113cda
go: downloading golang.org/x/text v0.14.0
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
table drop_tables.c not exists for 1-th check, retry later
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] sh
check diff failed 3-th time, retry later
[Pipeline] sh
go: downloading github.com/cznic/mathutil v0.0.0-20181122101859-297441e03548
go: downloading golang.org/x/exp v0.0.0-20240409090435-93d18d7e34b8
go: downloading golang.org/x/sync v0.7.0
go: downloading github.com/pingcap/tipb v0.0.0-20240318032315-55a7867ddd50
go: downloading github.com/influxdata/tdigest v0.0.1
go: downloading go.etcd.io/etcd/client/v3 v3.5.12
go: downloading github.com/opentracing/opentracing-go v1.2.0
go: downloading github.com/tikv/client-go/v2 v2.0.8-0.20240409022718-714958ccd4d5
go: downloading github.com/pingcap/kvproto v0.0.0-20240227073058-929ab83f9754
go: downloading github.com/grpc-ecosystem/go-grpc-middleware v1.4.0
go: downloading github.com/ngaut/pools v0.0.0-20180318154953-b7bc8c42aac7
go: downloading github.com/uber/jaeger-client-go v2.30.0+incompatible
go: downloading github.com/tiancaiamao/gp v0.0.0-20221230034425-4025bc8a4d4a
go: downloading github.com/spf13/pflag v1.0.5
go: downloading github.com/prometheus/client_golang v1.19.0
go: downloading github.com/pingcap/sysutil v1.0.1-0.20240311050922-ae81ee01f3a5
go: downloading github.com/stretchr/testify v1.9.0
go: downloading github.com/twmb/murmur3 v1.1.6
go: downloading github.com/jellydator/ttlcache/v3 v3.0.1
go: downloading github.com/tikv/pd/client v0.0.0-20240322051414-fb9e2d561b6e
go: downloading github.com/shirou/gopsutil/v3 v3.24.2
go: downloading github.com/danjacques/gofslock v0.0.0-20240212154529-d899e02bfe22
go: downloading github.com/docker/go-units v0.5.0
go: downloading github.com/google/uuid v1.6.0
go: downloading github.com/scalalang2/golang-fifo v0.1.5
go: downloading github.com/tidwall/btree v1.7.0
go: downloading github.com/coocood/freecache v1.2.1
go: downloading gopkg.in/yaml.v2 v2.4.0
go: downloading github.com/gorilla/mux v1.8.0
go: downloading github.com/cockroachdb/errors v1.11.1
go: downloading github.com/google/btree v1.1.2
go: downloading github.com/yangkeao/ldap/v3 v3.4.5-0.20230421065457-369a3bab1117
go: downloading cloud.google.com/go/storage v1.39.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/azcore v1.9.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/azidentity v1.5.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/storage/azblob v1.0.0
go: downloading github.com/aliyun/alibaba-cloud-sdk-go v1.61.1581
go: downloading github.com/aws/aws-sdk-go v1.50.0
go: downloading github.com/go-resty/resty/v2 v2.11.0
go: downloading github.com/klauspost/compress v1.17.8
go: downloading github.com/ks3sdklib/aws-sdk-go v1.2.9
go: downloading golang.org/x/oauth2 v0.18.0
go: downloading google.golang.org/api v0.170.0
go: downloading github.com/tikv/pd v1.1.0-beta.0.20240407022249-7179657d129b
go: downloading github.com/prometheus/client_model v0.6.1
go: downloading go.etcd.io/etcd/api/v3 v3.5.12
go: downloading github.com/cockroachdb/pebble v1.1.0
go: downloading go.uber.org/mock v0.4.0
go: downloading github.com/dgraph-io/ristretto v0.1.1
go: downloading github.com/gogo/protobuf v1.3.2
go: downloading github.com/carlmjohnson/flagext v0.21.0
go: downloading github.com/jfcg/sorty/v2 v2.1.0
go: downloading github.com/golang/snappy v0.0.4
go: downloading golang.org/x/time v0.5.0
go: downloading github.com/dolthub/swiss v0.2.1
go: downloading github.com/opentracing/basictracer-go v1.1.0
go: downloading cloud.google.com/go v0.112.2
go: downloading golang.org/x/tools v0.20.0
go: downloading github.com/ngaut/sync2 v0.0.0-20141008032647-7a24ed77b2ef
go: downloading github.com/cespare/xxhash/v2 v2.3.0
go: downloading github.com/Azure/go-ntlmssp v0.0.0-20221128193559-754e69321358
go: downloading github.com/go-asn1-ber/asn1-ber v1.5.4
go: downloading github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec
go: downloading github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc
go: downloading github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2
go: downloading gopkg.in/yaml.v3 v3.0.1
go: downloading go.etcd.io/etcd/client/pkg/v3 v3.5.12
go: downloading github.com/Azure/azure-sdk-for-go/sdk/internal v1.5.1
go: downloading github.com/AzureAD/microsoft-authentication-library-for-go v1.2.1
go: downloading github.com/cloudfoundry/gosigar v1.3.6
go: downloading golang.org/x/crypto v0.22.0
go: downloading github.com/otiai10/copy v1.2.0
go: downloading github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2
go: downloading cloud.google.com/go/compute/metadata v0.2.3
go: downloading github.com/google/pprof v0.0.0-20240117000934-35fc243c5815
go: downloading github.com/joho/sqltocsv v0.0.0-20210428211105-a6d6801d59df
go: downloading github.com/wangjohn/quickselect v0.0.0-20161129230411-ed8402a42d5f
go: downloading github.com/jedib0t/go-pretty/v6 v6.2.2
go: downloading github.com/spkg/bom v1.0.0
go: downloading github.com/xitongsys/parquet-go v1.6.0
go: downloading cloud.google.com/go/compute v1.25.1
go: downloading github.com/jfcg/sixb v1.3.8
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:16 GMT
< Content-Type: text/plain; charset=utf-8
< Transfer-Encoding: chunked
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:

changefeedID: default/ddl-manager
{UpstreamID:7365375225435268540 Namespace:default ID:ddl-manager SinkURI:kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:57:50.590091651 +0800 CST StartTs:449546831901229061 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc001265dd0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546831940550659}
{CheckpointTs:449546832595910682 MinTableBarrierTs:449546832595910682 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/78a816f5-b374-4db9-8b20-ead5a8c57a9e
	{"id":"78a816f5-b374-4db9-8b20-ead5a8c57a9e","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885093}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d4cb7c1
	78a816f5-b374-4db9-8b20-ead5a8c57a9e

/tidb/cdc/default/default/changefeed/info/ddl-manager
	{"upstream-id":7365375225435268540,"namespace":"default","changefeed-id":"ddl-manager","sink-uri":"kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:57:50.590091651+08:00","start-ts":449546831901229061,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546831940550659}

/tidb/cdc/default/default/changefeed/status/ddl-manager
	{"checkpoint-ts":449546832740089857,"min-table-barrier-ts":449546832740089857,"admin-job-type":0}

/tidb/cdc/default/default/task/position/78a816f5-b374-4db9-8b20-ead5a8c57a9e/ddl-manager
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375225435268540
	{"id":7365375225435268540,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:

changefeedID: default/ddl-manager
{UpstreamID:7365375225435268540 Namespace:default ID:ddl-manager SinkURI:kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:57:50.590091651 +0800 CST StartTs:449546831901229061 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc001265dd0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546831940550659}
{CheckpointTs:449546832595910682 MinTableBarrierTs:449546832595910682 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/78a816f5-b374-4db9-8b20-ead5a8c57a9e
	{"id":"78a816f5-b374-4db9-8b20-ead5a8c57a9e","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885093}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d4cb7c1
	78a816f5-b374-4db9-8b20-ead5a8c57a9e

/tidb/cdc/default/default/changefeed/info/ddl-manager
	{"upstream-id":7365375225435268540,"namespace":"default","changefeed-id":"ddl-manager","sink-uri":"kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:57:50.590091651+08:00","start-ts":449546831901229061,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546831940550659}

/tidb/cdc/default/default/changefeed/status/ddl-manager
	{"checkpoint-ts":449546832740089857,"min-table-barrier-ts":449546832740089857,"admin-job-type":0}

/tidb/cdc/default/default/task/position/78a816f5-b374-4db9-8b20-ead5a8c57a9e/ddl-manager
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375225435268540
+ grep -q 'failed to get info:'
	{"id":7365375225435268540,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:

changefeedID: default/ddl-manager
{UpstreamID:7365375225435268540 Namespace:default ID:ddl-manager SinkURI:kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:57:50.590091651 +0800 CST StartTs:449546831901229061 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc001265dd0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546831940550659}
{CheckpointTs:449546832595910682 MinTableBarrierTs:449546832595910682 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/78a816f5-b374-4db9-8b20-ead5a8c57a9e
	{"id":"78a816f5-b374-4db9-8b20-ead5a8c57a9e","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885093}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d4cb7c1
	78a816f5-b374-4db9-8b20-ead5a8c57a9e

/tidb/cdc/default/default/changefeed/info/ddl-manager
	{"upstream-id":7365375225435268540,"namespace":"default","changefeed-id":"ddl-manager","sink-uri":"kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:57:50.590091651+08:00","start-ts":449546831901229061,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546831940550659}

/tidb/cdc/default/default/changefeed/status/ddl-manager
	{"checkpoint-ts":449546832740089857,"min-table-barrier-ts":449546832740089857,"admin-job-type":0}

/tidb/cdc/default/default/task/position/78a816f5-b374-4db9-8b20-ead5a8c57a9e/ddl-manager
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375225435268540
+ grep -q 'etcd info'
	{"id":7365375225435268540,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ break
+ set +x
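The key/value pairs dumped above are TiCDC's metadata as stored in PD's embedded etcd: capture registration under /tidb/cdc/default/__cdc_meta__/capture/, owner election under __cdc_meta__/owner/, and per-changefeed info, status, task positions and upstream records under /tidb/cdc/default/default/. As a rough sketch (not part of the test script), the same keys could be listed directly with etcdctl, assuming an etcdctl v3 binary and the PD endpoint used in this run:

# Sketch only: inspect TiCDC metadata held in PD's etcd (assumes etcdctl v3
# is installed and PD serves etcd on 127.0.0.1:2379, as in this job).
export ETCDCTL_API=3
etcdctl --endpoints=http://127.0.0.1:2379 get /tidb/cdc/default/ --prefix --keys-only
# Fetch one changefeed's persisted config and its replication progress:
etcdctl --endpoints=http://127.0.0.1:2379 get /tidb/cdc/default/default/changefeed/info/ddl-manager
etcdctl --endpoints=http://127.0.0.1:2379 get /tidb/cdc/default/default/changefeed/status/ddl-manager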
[Sun May  5 12:58:16 CST 2024] <<<<<< START cdc server in ddl_manager case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/ExecuteDDLSlowly=return(true)'
+ (( i = 0 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_manager.58975899.out server --log-file /tmp/tidb_cdc_test/ddl_manager/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/ddl_manager/cdc_data --cluster-id default
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
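The trace above is the harness waiting for the capture it just launched (with the ExecuteDDLSlowly failpoint enabled via GO_FAILPOINTS) to become ready: it polls http://127.0.0.1:8300/debug/info with basic auth up to 50 times and only breaks out once the response contains 'etcd info' rather than 'failed to get info:'. A minimal sketch of that retry pattern, reconstructed from the traced variables (the sleep interval and messages are assumptions, not the suite's actual helper):

# Sketch of the readiness loop traced above; assumes the cdc server was started
# in the background and exposes /debug/info on 127.0.0.1:8300.
curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret'
for ((i = 0; i <= 50; i++)); do
    res=$($curl_status_cmd)
    if echo "$res" | grep -q 'failed to get info:'; then
        echo "cdc server not ready yet"       # owner info cannot be served yet
    elif echo "$res" | grep -q 'etcd info'; then
        break                                 # owner/processors/etcd info is served; server is up
    fi
    sleep 3                                   # retry interval is an assumption
done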
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G17
Run cases: clustered_index processor_resolved_ts_fallback
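Each test pod wipes /tmp/tidb_cdc_test and then runs one group of integration cases through run_group.sh, passing the sink type and the group name (G17 here; the pods below pick up G16, G12, G14, G15 and G13); the script echoes the cases in the group and executes each case's run.sh in turn. The equivalent local invocation, taken from the commands traced above (it assumes a built tiflow checkout with the test binaries in place):

# Run one kafka-sink integration group locally, mirroring the traced commands.
cd tiflow                                    # a built tiflow checkout is assumed
rm -rf /tmp/tidb_cdc_test && mkdir -p /tmp/tidb_cdc_test
chmod +x ./tests/integration_tests/run_group.sh
./tests/integration_tests/run_group.sh kafka G17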
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=13fc3065-35cd-4e7b-a746-4cd0ec024549
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-z64vj-z69rl
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G17
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-z64vj
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-z64vj-z69rl pingcap_tiflow_pull_cdc_integration_kafka_test_1856-z64vj
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-z64vj-z69rl
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/clustered_index/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:58:16 CST 2024] <<<<<< skip test case clustered_index for kafka! >>>>>>
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/processor_resolved_ts_fallback/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:58:16 CST 2024] <<<<<< run test case processor_resolved_ts_fallback success! >>>>>>
go: downloading github.com/lestrrat-go/jwx/v2 v2.0.21
go: downloading github.com/dolthub/maphash v0.1.0
go: downloading github.com/beorn7/perks v1.0.1
go: downloading github.com/prometheus/common v0.52.2
go: downloading github.com/prometheus/procfs v0.13.0
go: downloading github.com/pkg/errors v0.9.1
go: downloading github.com/uber/jaeger-lib v2.4.1+incompatible
go: downloading github.com/cockroachdb/logtags v0.0.0-20230118201751-21c54148d20b
go: downloading github.com/cockroachdb/redact v1.1.5
go: downloading github.com/getsentry/sentry-go v0.27.0
go: downloading github.com/tklauser/go-sysconf v0.3.12
go: downloading google.golang.org/genproto/googleapis/api v0.0.0-20240401170217-c3f982113cda
go: downloading github.com/pingcap/badger v1.5.1-0.20230103063557-828f39b09b6d
go: downloading github.com/dgryski/go-farm v0.0.0-20200201041132-a6ae2369ad13
go: downloading github.com/cheggaaa/pb/v3 v3.0.8
go: downloading github.com/robfig/cron/v3 v3.0.1
go: downloading github.com/kr/pretty v0.3.1
go: downloading github.com/dustin/go-humanize v1.0.1
go: downloading github.com/golang/glog v1.2.0
go: downloading github.com/coreos/go-systemd/v22 v22.5.0
go: downloading github.com/pingcap/goleveldb v0.0.0-20191226122134-f82aafb29989
go: downloading github.com/robfig/cron v1.2.0
go: downloading cloud.google.com/go/iam v1.1.7
go: downloading github.com/googleapis/gax-go/v2 v2.12.3
go: downloading github.com/kr/text v0.2.0
go: downloading github.com/rogpeppe/go-internal v1.12.0
go: downloading github.com/VividCortex/ewma v1.2.0
go: downloading github.com/fatih/color v1.16.0
go: downloading github.com/mattn/go-colorable v0.1.13
go: downloading github.com/mattn/go-isatty v0.0.20
go: downloading github.com/mattn/go-runewidth v0.0.15
go: downloading github.com/apache/thrift v0.16.0
go: downloading github.com/tklauser/numcpus v0.6.1
go: downloading github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c
go: downloading github.com/kylelemons/godebug v1.1.0
go: downloading go.opencensus.io v0.23.1-0.20220331163232-052120675fac
go: downloading go.opentelemetry.io/otel v1.24.0
go: downloading go.opentelemetry.io/otel/trace v1.24.0
go: downloading github.com/golang-jwt/jwt/v5 v5.2.0
go: downloading github.com/rivo/uniseg v0.4.7
go: downloading github.com/golang-jwt/jwt v3.2.2+incompatible
go: downloading github.com/klauspost/cpuid v1.3.1
go: downloading github.com/ncw/directio v1.0.5
go: downloading github.com/coocood/bbloom v0.0.0-20190830030839-58deb6228d64
go: downloading github.com/coocood/rtutil v0.0.0-20190304133409-c84515f646f2
go: downloading github.com/lestrrat-go/iter v1.0.2
go: downloading github.com/lestrrat-go/option v1.0.1
go: downloading github.com/lestrrat-go/blackmagic v1.0.2
[Pipeline] sh
table drop_tables.c not exists for 2-th check, retry later
go: downloading github.com/lestrrat-go/httprc v1.0.5
go: downloading github.com/lestrrat-go/httpcc v1.0.1
go: downloading github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da
go: downloading github.com/go-logr/stdr v1.2.2
go: downloading go.opentelemetry.io/otel/metric v1.24.0
go: downloading github.com/go-logr/logr v1.4.1
[Pipeline] sh
start tidb cluster in /tmp/tidb_cdc_test/changefeed_error
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
go: downloading github.com/cockroachdb/tokenbucket v0.0.0-20230807174530-cc333fc44b06
go: downloading github.com/DataDog/zstd v1.5.5
[Pipeline] sh
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G16
Run cases: owner_resign processor_etcd_worker_delay sink_hang
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=0a046476-d338-4664-be30-930c2a5e3cdb
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l2kvf-qx8r0
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT=tcp://10.233.0.1:443
KUBERNETES_PORT_443_TCP_PORT=443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G16
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-l2kvf
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-l2kvf pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l2kvf-qx8r0
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l2kvf-qx8r0
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/owner_resign/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:58:17 CST 2024] <<<<<< run test case owner_resign success! >>>>>>
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G12
[Pipeline] sh
Run cases: many_pk_or_uk capture_session_done_during_task ddl_attributes
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=9123beee-4372-404d-b75e-aad25b8ae748
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G12
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-11vs6
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-11vs6 pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/many_pk_or_uk/run.sh using Sink-Type: kafka... <<=================
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G14
Run cases: changefeed_finish force_replicate_table
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=0e2dd3c1-d9fc-4909-beb0-bc5cbfa19689
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT=tcp://10.233.0.1:443
KUBERNETES_PORT_443_TCP_PORT=443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G14
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-rgkc6
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-rgkc6 pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/changefeed_finish/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:18 GMT
< Content-Type: text/plain; charset=utf-8
< Transfer-Encoding: chunked
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:

changefeedID: default/ddl-manager
{UpstreamID:7365375225435268540 Namespace:default ID:ddl-manager SinkURI:kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:57:50.590091651 +0800 CST StartTs:449546831901229061 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc001265dd0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546831940550659}
{CheckpointTs:449546832740089857 MinTableBarrierTs:449546832740089857 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/78a816f5-b374-4db9-8b20-ead5a8c57a9e
	{"id":"78a816f5-b374-4db9-8b20-ead5a8c57a9e","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885093}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d4cb7c1
	78a816f5-b374-4db9-8b20-ead5a8c57a9e

/tidb/cdc/default/default/changefeed/info/ddl-manager
	{"upstream-id":7365375225435268540,"namespace":"default","changefeed-id":"ddl-manager","sink-uri":"kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:57:50.590091651+08:00","start-ts":449546831901229061,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546831940550659}

/tidb/cdc/default/default/changefeed/status/ddl-manager
	{"checkpoint-ts":449546832740089857,"min-table-barrier-ts":449546832740089857,"admin-job-type":0}

/tidb/cdc/default/default/task/position/78a816f5-b374-4db9-8b20-ead5a8c57a9e/ddl-manager
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375225435268540
	{"id":7365375225435268540,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:

changefeedID: default/ddl-manager
{UpstreamID:7365375225435268540 Namespace:default ID:ddl-manager SinkURI:kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:57:50.590091651 +0800 CST StartTs:449546831901229061 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc001265dd0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546831940550659}
{CheckpointTs:449546832740089857 MinTableBarrierTs:449546832740089857 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/78a816f5-b374-4db9-8b20-ead5a8c57a9e
	{"id":"78a816f5-b374-4db9-8b20-ead5a8c57a9e","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885093}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d4cb7c1
	78a816f5-b374-4db9-8b20-ead5a8c57a9e

/tidb/cdc/default/default/changefeed/info/ddl-manager
	{"upstream-id":7365375225435268540,"namespace":"default","changefeed-id":"ddl-manager","sink-uri":"kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:57:50.590091651+08:00","start-ts":449546831901229061,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546831940550659}

/tidb/cdc/default/default/changefeed/status/ddl-manager
	{"checkpoint-ts":449546832740089857,"min-table-barrier-ts":449546832740089857,"admin-job-type":0}

/tidb/cdc/default/default/task/position/78a816f5-b374-4db9-8b20-ead5a8c57a9e/ddl-manager
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375225435268540
	{"id":7365375225435268540,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ grep -q 'etcd info'
+ echo '

*** owner info ***:



*** processors info ***:

changefeedID: default/ddl-manager
{UpstreamID:7365375225435268540 Namespace:default ID:ddl-manager SinkURI:kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:57:50.590091651 +0800 CST StartTs:449546831901229061 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc001265dd0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546831940550659}
{CheckpointTs:449546832740089857 MinTableBarrierTs:449546832740089857 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/78a816f5-b374-4db9-8b20-ead5a8c57a9e
	{"id":"78a816f5-b374-4db9-8b20-ead5a8c57a9e","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885093}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471d4cb7c1
	78a816f5-b374-4db9-8b20-ead5a8c57a9e

/tidb/cdc/default/default/changefeed/info/ddl-manager
	{"upstream-id":7365375225435268540,"namespace":"default","changefeed-id":"ddl-manager","sink-uri":"kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:57:50.590091651+08:00","start-ts":449546831901229061,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546831940550659}

/tidb/cdc/default/default/changefeed/status/ddl-manager
	{"checkpoint-ts":449546832740089857,"min-table-barrier-ts":449546832740089857,"admin-job-type":0}

/tidb/cdc/default/default/task/position/78a816f5-b374-4db9-8b20-ead5a8c57a9e/ddl-manager
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375225435268540
	{"id":7365375225435268540,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ break
+ set +x
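The changefeed info echoed above shows the ddl-manager changefeed writing to a Kafka sink whose URI carries the test's tuning: protocol=open-protocol, partition-num=4, kafka-version=2.4.1 and max-message-bytes=10485760. For reference, creating a changefeed with an equivalent sink URI through the cdc CLI would look roughly like the following; the flag set is an assumption based on the current CLI, not something taken from this log:

# Hypothetical invocation (not from this run): create a changefeed with the same
# Kafka sink parameters that appear in the etcd dump above.
cdc cli changefeed create \
  --server=http://127.0.0.1:8300 \
  --changefeed-id="ddl-manager" \
  --sink-uri="kafka://127.0.0.1:9092/ticdc-ddl-mamager-test-9783?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760"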
table ddl_manager.finish_mark not exists for 1-th check, retry later
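Lines like the one above come from the harness polling the downstream TiDB until the changefeed has replicated a marker table; the case proceeds once the table appears, or gives up after a bounded number of checks (the suite's own helpers live under tests/integration_tests/_utils, which is on the PATH shown in the environment dumps above). A minimal sketch of that kind of poll, with the downstream host, port and credentials as assumptions:

# Sketch of a "wait until table exists downstream" poll; host/port/user are assumptions.
table=ddl_manager.finish_mark
for i in $(seq 1 60); do
    if mysql -h 127.0.0.1 -P 3306 -u root -e "DESC ${table}" >/dev/null 2>&1; then
        echo "table ${table} exists"
        break
    fi
    echo "table ${table} not found yet (check ${i}), retry later"
    sleep 2
done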
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G15
Run cases: new_ci_collation batch_add_table multi_rocks
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=05ff5ef9-d008-4c0b-9a1f-8e475f49330f
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G15
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-wrhxv
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-wrhxv pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/new_ci_collation/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
check diff failed 4-th time, retry later
Verifying downstream PD is started...
table drop_tables.c not exists for 3-th check, retry later
table test.finish_mark not exists for 1-th check, retry later
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
table ddl_manager.finish_mark not exists for 2-th check, retry later
+ rm -rf /tmp/tidb_cdc_test
+ mkdir -p /tmp/tidb_cdc_test
+ chmod +x ./tests/integration_tests/run_group.sh
+ ./tests/integration_tests/run_group.sh kafka G13
Run cases: tiflash region_merge common_1
PROW_JOB_ID=3d559389-be1a-48e0-8a90-a6526f498ff5
JENKINS_NODE_COOKIE=ff7d420e-1d6f-42fc-ba7e-d006c7f99e1b
BUILD_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/
GOLANG_VERSION=1.21.0
HOSTNAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5
HUDSON_SERVER_COOKIE=83ef27fe9acccc92
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT=tcp://10.233.0.1:443
TERM=xterm
STAGE_NAME=Test
BUILD_TAG=jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856
KUBERNETES_SERVICE_PORT=443
GIT_PREVIOUS_COMMIT=03312178c534dce949face80c69812d989e55009
JOB_SPEC={"type":"presubmit","job":"pingcap/tiflow/pull_cdc_integration_kafka_test","buildid":"1786980986911723520","prowjobid":"3d559389-be1a-48e0-8a90-a6526f498ff5","refs":{"org":"pingcap","repo":"tiflow","repo_link":"https://github.com/pingcap/tiflow","base_ref":"master","base_sha":"be1553484fe4c03594eabb8d7435c694e5fd7224","base_link":"https://github.com/pingcap/tiflow/commit/be1553484fe4c03594eabb8d7435c694e5fd7224","pulls":[{"number":10919,"author":"lidezhu","sha":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","title":"*(ticdc): split old update kv entry after restarting changefeed","link":"https://github.com/pingcap/tiflow/pull/10919","commit_link":"https://github.com/pingcap/tiflow/pull/10919/commits/0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","author_link":"https://github.com/lidezhu"}]}}
KUBERNETES_SERVICE_HOST=10.233.0.1
WORKSPACE=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test
JOB_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/
RUN_CHANGES_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=changes
RUN_ARTIFACTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=artifacts
FILE_SERVER_URL=http://fileserver.pingcap.net
JENKINS_HOME=/var/jenkins_home
GIT_COMMIT=03312178c534dce949face80c69812d989e55009
PATH=/go/bin:/usr/local/go/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../bin:/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/../../scripts/bin
RUN_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect
GOPROXY=http://goproxy.apps.svc,https://proxy.golang.org,direct
POD_CONTAINER=golang
PWD=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow
HUDSON_URL=https://do.pingcap.net/jenkins/
TICDC_COVERALLS_TOKEN=****
JOB_NAME=pingcap/tiflow/pull_cdc_integration_kafka_test
TZ=Asia/Shanghai
BUILD_DISPLAY_NAME=#1856
TEST_GROUP=G13
JENKINS_URL=https://do.pingcap.net/jenkins/
BUILD_ID=1786980986911723520
TICDC_CODECOV_TOKEN=****
GOLANG_DOWNLOAD_SHA256=d0398903a16ba2232b389fb31032ddf57cac34efda306a0eebac34f0965a0742
JOB_BASE_NAME=pull_cdc_integration_kafka_test
GIT_PREVIOUS_SUCCESSFUL_COMMIT=03312178c534dce949face80c69812d989e55009
RUN_TESTS_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/1856/display/redirect?page=tests
SHLVL=5
HOME=/home/jenkins
POD_LABEL=pingcap_tiflow_pull_cdc_integration_kafka_test_1856-ww4ds
GOROOT=/usr/local/go
GIT_BRANCH=origin/main
KUBERNETES_PORT_443_TCP_PROTO=tcp
TINI_VERSION=v0.19.0
CI=true
KUBERNETES_SERVICE_PORT_HTTPS=443
WORKSPACE_TMP=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test@tmp
EXECUTOR_NUMBER=0
JENKINS_SERVER_COOKIE=durable-8f1d433d6527e85b4c28b432b07fa0c56dd3090c9176381d29a5b4531676247a
NODE_LABELS=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5 pingcap_tiflow_pull_cdc_integration_kafka_test_1856-ww4ds
GIT_URL=https://github.com/PingCAP-QE/ci.git
HUDSON_HOME=/var/jenkins_home
CLASSPATH=
NODE_NAME=pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5
GOPATH=/go
JOB_DISPLAY_URL=https://do.pingcap.net/jenkins/job/pingcap/job/tiflow/job/pull_cdc_integration_kafka_test/display/redirect
BUILD_NUMBER=1856
KUBERNETES_PORT_443_TCP_ADDR=10.233.0.1
KUBERNETES_PORT_443_TCP=tcp://10.233.0.1:443
GOLANG_DOWNLOAD_URL=https://dl.google.com/go/go1.21.0.linux-amd64.tar.gz
_=/usr/bin/env
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/tiflash/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
go: downloading github.com/google/s2a-go v0.1.7
go: downloading go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.49.0
go: downloading github.com/googleapis/enterprise-certificate-proxy v0.3.2
go: downloading go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc v0.49.0
go: downloading github.com/felixge/httpsnoop v1.0.4
go: downloading github.com/jmespath/go-jmespath v0.4.0
check diff failed 5-th time, retry later
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
table test.finish_mark not exists for 2-th check, retry later
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
table drop_tables.c not exists for 4-th check, retry later
[Pipeline] }
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/multi_capture/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
go: downloading github.com/modern-go/reflect2 v1.0.2
go: downloading github.com/json-iterator/go v1.1.12
go: downloading github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/processor_etcd_worker_delay/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:58:21 CST 2024] <<<<<< run test case processor_etcd_worker_delay success! >>>>>>
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
start tidb cluster in /tmp/tidb_cdc_test/changefeed_finish
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
Starting Upstream TiDB...
[Pipeline] }
table ddl_manager.finish_mark not exists for 3-th check, retry later
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff failed for the 6th time, retry later
table test.finish_mark does not exist on the 3rd check, retry later
table drop_tables.c does not exist on the 5th check, retry later
Verifying downstream PD is started...
start tidb cluster in /tmp/tidb_cdc_test/tiflash
Starting Upstream PD...
start tidb cluster in /tmp/tidb_cdc_test/new_ci_collation
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table ddl_manager.finish_mark does not exist on the 4th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
The 1st attempt to start the tidb cluster...
start tidb cluster in /tmp/tidb_cdc_test/multi_capture
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
table test.finish_mark does not exist on the 4th check, retry later
table drop_tables.c exists
check diff successfully
find: '/tmp/tidb_cdc_test/*/*': No such file or directory
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/sink_hang/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:58:24 CST 2024] <<<<<< run test case sink_hang success! >>>>>>
check diff successfully
Verifying downstream PD is started...
waiting for process cdc.test to exit, 1st attempt...
waiting for process cdc.test to exit, 2nd attempt...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
cdc.test: no process found
waiting for process cdc.test to exit, 3rd attempt...
process cdc.test has already exited
[Sun May  5 12:58:26 CST 2024] <<<<<< run test case drop_many_tables success! >>>>>>
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark does not exist on the 5th check, retry later
table ddl_manager.finish_mark does not exist on the 5th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
start tidb cluster in /tmp/tidb_cdc_test/many_pk_or_uk
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
check diff failed for the 1st time, retry later
<<< Run all test success >>>
[Pipeline] }
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table ddl_manager.finish_mark does not exist on the 6th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c787ed4000f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q, pid:4201, start at 2024-05-05 12:58:27.649266138 +0800 CST m=+5.135788691	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:27.656 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:27.637 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:27.637 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c787ed4000f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q, pid:4201, start at 2024-05-05 12:58:27.649266138 +0800 CST m=+5.135788691	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:27.656 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:27.637 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:27.637 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c787ec00010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q, pid:4286, start at 2024-05-05 12:58:27.652710871 +0800 CST m=+5.086040829	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:27.659 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:27.632 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:27.632 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
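The two blocks above are dumps of the mysql.tidb system table for the upstream and downstream TiDB instances of the changefeed_error case; the tikv_gc_* rows record the GC leader, its lease, and the current safe point. A minimal sketch of checking these rows by hand, assuming the upstream TiDB listens on 127.0.0.1:4000 with the root user as in these tests:
# Hypothetical manual check of the GC bookkeeping rows dumped above
# (host, port and user are assumptions taken from the test defaults).
mysql -h 127.0.0.1 -P 4000 -u root -e \
  "SELECT VARIABLE_NAME, VARIABLE_VALUE FROM mysql.tidb WHERE VARIABLE_NAME LIKE 'tikv_gc%'"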
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/changefeed_error/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/changefeed_error/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/changefeed_error/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/changefeed_error/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/changefeed_error/tiflash-proxy.toml"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
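The "arg matches" line above is the TiFlash proxy echoing the flags it was launched with. Read back from the argument indices, the invocation is roughly the following; the binary name and exact flag spellings are assumptions, since only the parsed argument names and values appear in the log:
# Hypothetical reconstruction of the proxy launch implied by the arg dump above
# (binary name and flag spellings assumed; values copied from the dump).
tiflash proxy \
  --engine-addr 127.0.0.1:9500 \
  --advertise-addr 127.0.0.1:9000 \
  --data-dir /tmp/tidb_cdc_test/changefeed_error/tiflash/db/proxy \
  --config /tmp/tidb_cdc_test/changefeed_error/tiflash-proxy.toml \
  --engine-git-hash 8e170090fad91c94bef8d908e21c195c1d145b02 \
  --engine-version v8.2.0-alpha-16-g8e170090f \
  --engine-label tiflash \
  --pd-endpoints 127.0.0.1:2379 \
  --log-file /tmp/tidb_cdc_test/changefeed_error/tiflash/log/proxy.log \
  --addr 127.0.0.1:9000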
check diff failed for the 2nd time, retry later
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark does not exist on the 6th check, retry later
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table ddl_manager.finish_mark does not exist on the 7th check, retry later
[2024/05/05 12:58:22.371 +08:00] [INFO] [main.go:99] ["running ddl test: 1 modifyColumnDefaultValueDDL2"]
[2024/05/05 12:58:22.371 +08:00] [INFO] [main.go:99] ["running ddl test: 0 modifyColumnDefaultValueDDL1"]
[2024/05/05 12:58:22.889 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs858f2d21_d300_48d9_baec_7ad98360d5c7"]
[2024/05/05 12:58:22.894 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs054660a4_497f_490c_9525_bf6c60956cad"]
[2024/05/05 12:58:22.895 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs4c0eff19_ba50_4dd8_ac26_a64ef2630388"]
[2024/05/05 12:58:22.896 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLse26b79cd_8560_413e_9228_7eba28434d3e"]
[2024/05/05 12:58:22.897 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs0ebaef15_d8bb_4165_b506_0a2c342c9516"]
[2024/05/05 12:58:22.898 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsa330f299_d574_43a6_b5a8_1d62aa38575c"]
[2024/05/05 12:58:22.899 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs81400044_de95_4f81_af71_3a5472f8c0c9"]
[2024/05/05 12:58:22.900 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs43fd46ea_bcd4_4a75_afc5_77b04a4a81f6"]
[2024/05/05 12:58:22.927 +08:00] [INFO] [main.go:178] ["1 insert success: 100"]
[2024/05/05 12:58:22.927 +08:00] [INFO] [main.go:178] ["1 insert success: 100"]
[2024/05/05 12:58:22.998 +08:00] [INFO] [main.go:178] ["0 insert success: 100"]
[2024/05/05 12:58:22.999 +08:00] [INFO] [main.go:178] ["0 insert success: 100"]
[2024/05/05 12:58:23.432 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:23.439 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:23.439 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:23.439 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:23.440 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:23.441 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:23.441 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:23.443 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:23.443 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:23.444 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:23.444 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:23.444 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:23.445 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:23.445 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:23.448 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:23.448 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:23.467 +08:00] [INFO] [main.go:178] ["1 insert success: 200"]
[2024/05/05 12:58:23.467 +08:00] [INFO] [main.go:178] ["1 insert success: 200"]
[2024/05/05 12:58:23.647 +08:00] [INFO] [main.go:178] ["0 insert success: 200"]
[2024/05/05 12:58:23.649 +08:00] [INFO] [main.go:178] ["0 insert success: 200"]
[2024/05/05 12:58:23.651 +08:00] [INFO] [main.go:199] ["0 delete success: 100"]
[2024/05/05 12:58:23.653 +08:00] [INFO] [main.go:199] ["0 delete success: 100"]
[2024/05/05 12:58:23.941 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:23.949 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:23.949 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:23.950 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:23.950 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:23.950 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:23.950 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:23.950 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:23.951 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:23.952 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:23.952 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:23.952 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:23.953 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:24.011 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:24.015 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:24.017 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:24.028 +08:00] [INFO] [main.go:178] ["1 insert success: 300"]
[2024/05/05 12:58:24.029 +08:00] [INFO] [main.go:178] ["1 insert success: 300"]
[2024/05/05 12:58:24.319 +08:00] [INFO] [main.go:178] ["0 insert success: 300"]
[2024/05/05 12:58:24.321 +08:00] [INFO] [main.go:178] ["0 insert success: 300"]
[2024/05/05 12:58:24.415 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:24.418 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:24.419 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:24.425 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:24.538 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:24.539 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:24.542 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:24.543 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:24.543 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:24.558 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:24.613 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:24.614 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:24.615 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:24.618 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:24.620 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:24.621 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:24.633 +08:00] [INFO] [main.go:178] ["1 insert success: 400"]
[2024/05/05 12:58:24.635 +08:00] [INFO] [main.go:178] ["1 insert success: 400"]
[2024/05/05 12:58:24.922 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:24.922 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:24.923 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:24.941 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:25.031 +08:00] [INFO] [main.go:178] ["0 insert success: 400"]
[2024/05/05 12:58:25.031 +08:00] [INFO] [main.go:178] ["0 insert success: 400"]
[2024/05/05 12:58:25.036 +08:00] [INFO] [main.go:199] ["0 delete success: 200"]
[2024/05/05 12:58:25.036 +08:00] [INFO] [main.go:199] ["0 delete success: 200"]
[2024/05/05 12:58:25.113 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:25.114 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:25.114 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:25.115 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:25.121 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:25.167 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:25.168 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:25.168 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:25.217 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:25.217 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:25.221 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:25.222 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:25.230 +08:00] [INFO] [main.go:178] ["1 insert success: 500"]
[2024/05/05 12:58:25.230 +08:00] [INFO] [main.go:178] ["1 insert success: 500"]
[2024/05/05 12:58:25.424 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:25.424 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:25.430 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:25.445 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:25.639 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:25.643 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:25.643 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:25.646 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:25.656 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:25.750 +08:00] [INFO] [main.go:178] ["0 insert success: 500"]
[2024/05/05 12:58:25.751 +08:00] [INFO] [main.go:178] ["0 insert success: 500"]
[2024/05/05 12:58:25.812 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:25.816 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:25.820 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:25.821 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:25.822 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:25.826 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:25.828 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:25.830 +08:00] [INFO] [main.go:178] ["1 insert success: 600"]
[2024/05/05 12:58:25.831 +08:00] [INFO] [main.go:178] ["1 insert success: 600"]
[2024/05/05 12:58:25.952 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:25.955 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:25.956 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:26.020 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:26.151 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:26.156 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:26.157 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:26.211 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:26.228 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:26.345 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:26.349 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:26.350 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:26.350 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:26.355 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:26.411 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:26.415 +08:00] [INFO] [main.go:178] ["1 insert success: 700"]
[2024/05/05 12:58:26.418 +08:00] [INFO] [main.go:178] ["1 insert success: 700"]
[2024/05/05 12:58:26.418 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:26.467 +08:00] [INFO] [main.go:178] ["0 insert success: 600"]
[2024/05/05 12:58:26.469 +08:00] [INFO] [main.go:178] ["0 insert success: 600"]
[2024/05/05 12:58:26.472 +08:00] [INFO] [main.go:199] ["0 delete success: 300"]
[2024/05/05 12:58:26.511 +08:00] [INFO] [main.go:199] ["0 delete success: 300"]
[2024/05/05 12:58:26.527 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:26.527 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:26.529 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:26.542 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:26.717 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:26.725 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:26.727 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:26.731 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:26.750 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:26.853 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:26.913 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:26.916 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:26.916 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:26.939 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:26.941 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:26.946 +08:00] [INFO] [main.go:178] ["1 insert success: 800"]
[2024/05/05 12:58:26.952 +08:00] [INFO] [main.go:178] ["1 insert success: 800"]
[2024/05/05 12:58:26.953 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:27.113 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:27.116 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:27.118 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:27.134 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:27.244 +08:00] [INFO] [main.go:178] ["0 insert success: 700"]
[2024/05/05 12:58:27.244 +08:00] [INFO] [main.go:178] ["0 insert success: 700"]
[2024/05/05 12:58:27.323 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:27.334 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:27.334 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:27.338 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:27.355 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:27.436 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:27.437 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:27.443 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:27.447 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:27.513 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:27.515 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:27.524 +08:00] [INFO] [main.go:178] ["1 insert success: 900"]
[2024/05/05 12:58:27.531 +08:00] [INFO] [main.go:178] ["1 insert success: 900"]
[2024/05/05 12:58:27.534 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:27.630 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:27.645 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:27.646 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:27.714 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:27.875 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:27.953 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:27.957 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:27.962 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:27.977 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:27.999 +08:00] [INFO] [main.go:178] ["0 insert success: 800"]
[2024/05/05 12:58:27.999 +08:00] [INFO] [main.go:178] ["0 insert success: 800"]
[2024/05/05 12:58:28.003 +08:00] [INFO] [main.go:199] ["0 delete success: 400"]
[2024/05/05 12:58:28.003 +08:00] [INFO] [main.go:199] ["0 delete success: 400"]
[2024/05/05 12:58:28.097 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:28.101 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:28.108 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:28.243 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:28.252 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:28.255 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:28.267 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:28.287 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:28.385 +08:00] [INFO] [main.go:178] ["0 insert success: 900"]
[2024/05/05 12:58:28.388 +08:00] [INFO] [main.go:178] ["0 insert success: 900"]
[2024/05/05 12:58:28.402 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:28.405 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:28.411 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:28.533 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:28.545 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:28.559 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:28.561 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:28.584 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:28.630 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:28.636 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:28.644 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:28.647 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:28.662 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:28.717 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:28.724 +08:00] [INFO] [main.go:178] ["1 insert success: 1000"]
[2024/05/05 12:58:28.726 +08:00] [INFO] [main.go:178] ["1 insert success: 1000"]
[2024/05/05 12:58:28.752 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:28.756 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:28.758 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:28.765 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:28.815 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:28.913 +08:00] [INFO] [main.go:178] ["0 insert success: 1000"]
[2024/05/05 12:58:28.918 +08:00] [INFO] [main.go:199] ["0 delete success: 500"]
[2024/05/05 12:58:28.920 +08:00] [INFO] [main.go:178] ["0 insert success: 1000"]
[2024/05/05 12:58:28.925 +08:00] [INFO] [main.go:199] ["0 delete success: 500"]
[2024/05/05 12:58:29.042 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:29.117 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:29.135 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:29.141 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:29.141 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:29.220 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:29.223 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:29.234 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:29.243 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:29.251 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:29.312 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:29.321 +08:00] [INFO] [main.go:178] ["1 insert success: 1100"]
[2024/05/05 12:58:29.324 +08:00] [INFO] [main.go:178] ["1 insert success: 1100"]
[2024/05/05 12:58:29.353 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:29.356 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:29.361 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:29.413 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:29.419 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:29.630 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:29.644 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:29.713 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:29.718 +08:00] [INFO] [main.go:178] ["0 insert success: 1100"]
[2024/05/05 12:58:29.720 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:29.722 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:29.723 +08:00] [INFO] [main.go:178] ["0 insert success: 1100"]
[2024/05/05 12:58:29.759 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:29.813 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:29.832 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:29.837 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:29.844 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:29.913 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:29.913 +08:00] [INFO] [main.go:178] ["1 insert success: 1200"]
[2024/05/05 12:58:29.920 +08:00] [INFO] [main.go:178] ["1 insert success: 1200"]
[2024/05/05 12:58:29.933 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:29.943 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:29.949 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:29.951 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:30.017 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:30.148 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:30.218 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:30.237 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:30.249 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:30.249 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:30.428 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:30.430 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:30.452 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:30.454 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:30.519 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:30.520 +08:00] [INFO] [main.go:178] ["0 insert success: 1200"]
[2024/05/05 12:58:30.525 +08:00] [INFO] [main.go:199] ["0 delete success: 600"]
[2024/05/05 12:58:30.528 +08:00] [INFO] [main.go:178] ["0 insert success: 1200"]
[2024/05/05 12:58:30.533 +08:00] [INFO] [main.go:199] ["0 delete success: 600"]
[2024/05/05 12:58:30.536 +08:00] [INFO] [main.go:178] ["1 insert success: 1300"]
[2024/05/05 12:58:30.538 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:30.541 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:30.542 +08:00] [INFO] [main.go:178] ["1 insert success: 1300"]
[2024/05/05 12:58:30.549 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:30.557 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:30.615 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:30.626 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:30.727 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:30.740 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:30.754 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:30.826 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:30.838 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.cli.5706.out cli tso query --pd=http://127.0.0.1:2379
table test.finish_mark does not exist on the 7th check, retry later
check diff failed for the 3rd time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 12:58:30.942 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:30.947 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
[2024/05/05 12:58:31.120 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:31.138 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:31.138 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:31.212 +08:00] [INFO] [main.go:178] ["1 insert success: 1400"]
[2024/05/05 12:58:31.214 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:31.214 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:31.218 +08:00] [INFO] [main.go:178] ["1 insert success: 1400"]
[2024/05/05 12:58:31.225 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:31.230 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:31.250 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:31.257 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:31.334 +08:00] [INFO] [main.go:178] ["0 insert success: 1300"]
[2024/05/05 12:58:31.341 +08:00] [INFO] [main.go:178] ["0 insert success: 1300"]
[2024/05/05 12:58:31.343 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:31.353 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:31.426 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:31.444 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:31.514 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:31.543 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:31.549 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:31.654 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:31.728 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:31.736 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:31.756 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:31.812 +08:00] [INFO] [main.go:178] ["1 insert success: 1500"]
[2024/05/05 12:58:31.816 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:31.820 +08:00] [INFO] [main.go:178] ["1 insert success: 1500"]
[2024/05/05 12:58:31.828 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:31.832 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:31.852 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:31.855 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 12:58:31.942 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:31.949 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:32.020 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:32.041 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:32.113 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:32.140 +08:00] [INFO] [main.go:178] ["0 insert success: 1400"]
[2024/05/05 12:58:32.140 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:32.142 +08:00] [INFO] [main.go:178] ["0 insert success: 1400"]
[2024/05/05 12:58:32.144 +08:00] [INFO] [main.go:199] ["0 delete success: 700"]
[2024/05/05 12:58:32.146 +08:00] [INFO] [main.go:199] ["0 delete success: 700"]
[2024/05/05 12:58:32.150 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:32.254 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:32.324 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:32.341 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:32.352 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:32.497 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:32.503 +08:00] [INFO] [main.go:178] ["1 insert success: 1600"]
[2024/05/05 12:58:32.503 +08:00] [INFO] [main.go:178] ["1 insert success: 1600"]
[2024/05/05 12:58:32.538 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:32.540 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:32.561 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:32.562 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
table ddl_manager.finish_mark does not exist on the 8th check, retry later
+ set +x
+ tso='449546842573897729
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546842573897729 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
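The trace above asks PD for a TSO through the cdc CLI and keeps only the first whitespace-separated field of the output; the trailing PASS/coverage lines come from the coverage-instrumented cdc.test binary and are stripped by the awk. A minimal sketch of the same pattern, assuming a plain cdc binary on the PATH and the same PD endpoint:
# Sketch of the TSO-capture pattern traced above (cdc binary and PD address assumed).
start_ts=$(cdc cli tso query --pd=http://127.0.0.1:2379 | awk -F ' ' 'NR==1 {print $1}')
echo "using start-ts ${start_ts}"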
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 12:58:32.628 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:32.635 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:32.650 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:32.717 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:32.732 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:32.814 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:32.826 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
***************** properties *****************
"operationcount"="0"
"mysql.host"="127.0.0.1"
"mysql.db"="changefeed_error"
"recordcount"="20"
"workload"="core"
"insertproportion"="0"
"readproportion"="0"
"updateproportion"="0"
"scanproportion"="0"
"mysql.port"="4000"
"threadcount"="4"
"requestdistribution"="uniform"
"readallfields"="true"
"dotransactions"="false"
"mysql.user"="root"
**********************************************
Run finished, took 12.933464ms
INSERT - Takes(s): 0.0, Count: 20, OPS: 4104.2, Avg(us): 2508, Min(us): 934, Max(us): 8024, 95th(us): 9000, 99th(us): 9000
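The properties block above is a YCSB-style workload definition (20 records, 4 threads, pure inserts against the changefeed_error database), followed by the insert latency summary. A hedged sketch of replaying it, assuming the harness is go-ycsb with its usual property-file flag; the log itself only shows the parsed properties:
# Hypothetical replay of the workload above (tool name and flags are assumptions).
cat > changefeed_error.properties <<EOF
workload=core
recordcount=20
operationcount=0
threadcount=4
requestdistribution=uniform
mysql.host=127.0.0.1
mysql.port=4000
mysql.user=root
mysql.db=changefeed_error
EOF
go-ycsb load mysql -P changefeed_error.properties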
[Sun May  5 12:58:32 CST 2024] <<<<<< START cdc server in changefeed_error case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/NewChangefeedNoRetryError=1*return(true)'
+ (( i = 0 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.57645766.out server --log-file /tmp/tidb_cdc_test/changefeed_error/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/changefeed_error/cdc_data --cluster-id default
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
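The trace above starts the cdc server in the background for the changefeed_error case (with a GO_FAILPOINTS failpoint injected) and then polls http://127.0.0.1:8300/debug/info with basic-auth credentials until the response contains "etcd info", sleeping 3 seconds between attempts and giving up after 50 tries. A minimal standalone sketch of that readiness loop, with the endpoint, credentials and retry budget taken from the trace and the error handling simplified:
# Sketch of the readiness loop traced above (simplified; values taken from the trace).
for i in $(seq 1 50); do
  res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret 2>&1)
  echo "$res" | grep -q 'etcd info' && { echo "cdc server is ready"; break; }
  echo "$res" | grep -q 'failed to get info:' && echo "cdc server not ready yet"
  [ "$i" -eq 50 ] && { echo "cdc server did not become ready"; exit 1; }
  sleep 3
done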
[2024/05/05 12:58:32.928 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:32.931 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:32.936 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:32.948 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:58:33.121 +08:00] [INFO] [main.go:178] ["1 insert success: 1700"]
[2024/05/05 12:58:33.121 +08:00] [INFO] [main.go:178] ["0 insert success: 1500"]
[2024/05/05 12:58:33.123 +08:00] [INFO] [main.go:178] ["1 insert success: 1700"]
[2024/05/05 12:58:33.124 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:33.126 +08:00] [INFO] [main.go:178] ["0 insert success: 1500"]
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c78c164001a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4, pid:1353, start at 2024-05-05 12:58:31.943793393 +0800 CST m=+5.218775253	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:31.952 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:31.947 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:31.947 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c78c164001a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4, pid:1353, start at 2024-05-05 12:58:31.943793393 +0800 CST m=+5.218775253	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:31.952 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:31.947 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:31.947 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c78c2ec0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4, pid:1438, start at 2024-05-05 12:58:32.024455074 +0800 CST m=+5.237852158	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:32.030 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:31.995 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:31.995 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/changefeed_finish/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/changefeed_finish/tiflash/log/error.log
arg matches is ArgMatches { args: {"pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/changefeed_finish/tiflash/db/proxy"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/changefeed_finish/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/changefeed_finish/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 12:58:33.220 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:58:33.228 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:33.260 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:58:33.262 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:58:33.326 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:58:33.339 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:58:33.362 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 8-th check, retry later
check diff failed 4-th time, retry later
[2024/05/05 12:58:33.417 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:58:33.425 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:33.514 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:33.519 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:33.520 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:33.524 +08:00] [INFO] [main.go:178] ["72 insert success: 2000"]
[2024/05/05 12:58:33.623 +08:00] [INFO] [main.go:178] ["1 insert success: 1800"]
[2024/05/05 12:58:33.627 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 12:58:33.727 +08:00] [INFO] [main.go:178] ["72 insert success: 2000"]
[2024/05/05 12:58:33.728 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
[2024/05/05 12:58:33.762 +08:00] [INFO] [main.go:178] ["72 insert success: 2000"]
[2024/05/05 12:58:33.813 +08:00] [INFO] [main.go:178] ["72 insert success: 2000"]
[2024/05/05 12:58:33.820 +08:00] [INFO] [main.go:178] ["0 insert success: 1600"]
[2024/05/05 12:58:33.823 +08:00] [INFO] [main.go:178] ["72 insert success: 2000"]
[2024/05/05 12:58:33.825 +08:00] [INFO] [main.go:199] ["0 delete success: 800"]
[2024/05/05 12:58:33.836 +08:00] [INFO] [main.go:178] ["72 insert success: 2000"]
[2024/05/05 12:58:33.850 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
[2024/05/05 12:58:33.865 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
[2024/05/05 12:58:33.911 +08:00] [INFO] [main.go:178] ["72 insert success: 2000"]
[2024/05/05 12:58:33.953 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
[2024/05/05 12:58:33.959 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
[2024/05/05 12:58:33.962 +08:00] [INFO] [main.go:178] ["72 insert success: 2100"]
[2024/05/05 12:58:34.011 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
[2024/05/05 12:58:34.052 +08:00] [INFO] [main.go:178] ["1 insert success: 1900"]
[2024/05/05 12:58:34.058 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
[2024/05/05 12:58:34.140 +08:00] [INFO] [main.go:178] ["73 insert success: 1900"]
[2024/05/05 12:58:34.170 +08:00] [INFO] [main.go:178] ["72 insert success: 2100"]
[2024/05/05 12:58:34.191 +08:00] [INFO] [main.go:178] ["72 insert success: 2100"]
[2024/05/05 12:58:34.220 +08:00] [INFO] [main.go:178] ["72 insert success: 2100"]
[2024/05/05 12:58:34.231 +08:00] [INFO] [main.go:178] ["72 insert success: 2100"]
[2024/05/05 12:58:34.255 +08:00] [INFO] [main.go:178] ["73 insert success: 1900"]
[2024/05/05 12:58:34.298 +08:00] [INFO] [main.go:178] ["73 insert success: 1900"]
[2024/05/05 12:58:34.305 +08:00] [INFO] [main.go:178] ["0 insert success: 1700"]
[2024/05/05 12:58:34.365 +08:00] [INFO] [main.go:178] ["1 insert success: 2000"]
[2024/05/05 12:58:34.372 +08:00] [INFO] [main.go:178] ["73 insert success: 1900"]
table ddl_manager.finish_mark not exists for 9-th check, retry later
[2024/05/05 12:58:34.447 +08:00] [INFO] [main.go:178] ["73 insert success: 2000"]
[2024/05/05 12:58:34.464 +08:00] [INFO] [main.go:178] ["72 insert success: 2200"]
[2024/05/05 12:58:34.527 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsd2af00b9_ca68_4ff1_a053_6ca190aee7e0"]
[2024/05/05 12:58:34.528 +08:00] [INFO] [main.go:178] ["72 insert success: 2200"]
[2024/05/05 12:58:34.541 +08:00] [INFO] [main.go:178] ["72 insert success: 2200"]
[2024/05/05 12:58:34.549 +08:00] [INFO] [main.go:178] ["72 insert success: 2200"]
[2024/05/05 12:58:34.581 +08:00] [INFO] [main.go:178] ["73 insert success: 2000"]
[2024/05/05 12:58:34.633 +08:00] [INFO] [main.go:178] ["73 insert success: 2000"]
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c78da80001f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj, pid:1298, start at 2024-05-05 12:58:33.550174346 +0800 CST m=+5.129121931	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:33.558 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:33.554 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:33.554 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c78dc0c0017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj, pid:1380, start at 2024-05-05 12:58:33.639626265 +0800 CST m=+5.165879977	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:33.648 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:33.652 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:33.652 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/new_ci_collation/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/new_ci_collation/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/new_ci_collation/tiflash/db/proxy"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/new_ci_collation/tiflash-proxy.toml"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/new_ci_collation/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[2024/05/05 12:58:34.690 +08:00] [INFO] [main.go:178] ["1 insert success: 2100"]
[2024/05/05 12:58:34.717 +08:00] [INFO] [main.go:178] ["73 insert success: 2000"]
[2024/05/05 12:58:34.733 +08:00] [INFO] [main.go:178] ["0 insert success: 1800"]
[2024/05/05 12:58:34.738 +08:00] [INFO] [main.go:199] ["0 delete success: 900"]
[2024/05/05 12:58:34.772 +08:00] [INFO] [main.go:178] ["73 insert success: 2100"]
[2024/05/05 12:58:34.818 +08:00] [INFO] [main.go:178] ["72 insert success: 2300"]
[2024/05/05 12:58:34.868 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:34.874 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:34.875 +08:00] [INFO] [main.go:178] ["72 insert success: 2300"]
[2024/05/05 12:58:34.886 +08:00] [INFO] [main.go:178] ["72 insert success: 2300"]
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c78e070000f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5, pid:1282, start at 2024-05-05 12:58:33.900199496 +0800 CST m=+5.283743553	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:33.907 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:33.884 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:33.884 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 12:58:34.915 +08:00] [INFO] [main.go:178] ["72 insert success: 2300"]
[2024/05/05 12:58:34.950 +08:00] [INFO] [main.go:178] ["73 insert success: 2100"]
[2024/05/05 12:58:35.056 +08:00] [INFO] [main.go:178] ["1 insert success: 2200"]
[2024/05/05 12:58:35.076 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs04cfea0e_75b2_4b97_930d_91bca5c8c2da"]
[2024/05/05 12:58:35.080 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs1912210b_1dc7_4fee_afa5_b253e5d21c8e"]
[2024/05/05 12:58:35.126 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLscd5412f8_4f40_4ca6_b6c0_f838365101da"]
[2024/05/05 12:58:35.152 +08:00] [INFO] [main.go:178] ["0 insert success: 1900"]
[Sun May  5 12:58:33 CST 2024] <<<<<< START kafka consumer in multi_topics_v2 case >>>>>>
schema registry uri found: 10
[Sun May  5 12:58:33 CST 2024] <<<<<< START kafka consumer in multi_topics_v2 case >>>>>>
schema registry uri found: 20
[Sun May  5 12:58:33 CST 2024] <<<<<< START kafka consumer in multi_topics_v2 case >>>>>>
schema registry uri found: finish
table test.finish not exists for 1-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark exists
check diff successfully
check diff failed 5-th time, retry later
[2024/05/05 12:58:35.181 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:35.184 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:35.348 +08:00] [INFO] [main.go:178] ["1 insert success: 2300"]
[2024/05/05 12:58:35.368 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:35.369 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:35.372 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:35.376 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
wait process cdc.test exit for 1-th time...
[Sun May  5 12:58:35 CST 2024] <<<<<< START cdc server in changefeed_finish case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_finish.28172819.out server --log-file /tmp/tidb_cdc_test/changefeed_finish/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/changefeed_finish/cdc_data --cluster-id default --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
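The probe traced above is repeated until the server answers; condensed into one place, the retry logic shown here (and resumed further down once the port opens) amounts to a small shell loop. A minimal sketch, assuming the same endpoint, credentials, and 50-attempt budget that appear in the trace:

    for i in $(seq 0 50); do
        res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
        # the server is considered ready once the debug page lists the etcd info
        if echo "$res" | grep -q 'etcd info'; then
            break
        fi
        [ "$i" -eq 50 ] && echo 'cdc server failed to start in time' && exit 1
        sleep 3
    done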
[2024/05/05 12:58:35.436 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:35.438 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:35.489 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs5263fbc9_8f20_4341_87e3_2d43d30b787b"]
[2024/05/05 12:58:35.491 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:35.494 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:35.527 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsc6656974_f0d8_4ce5_8b91_f89de5f7224e"]
[2024/05/05 12:58:35.541 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsc7dd7275_d474_4d5f_91c9_47aff71b9927"]
[2024/05/05 12:58:35.547 +08:00] [INFO] [main.go:178] ["0 insert success: 2000"]
[2024/05/05 12:58:35.551 +08:00] [INFO] [main.go:199] ["0 delete success: 1000"]
[2024/05/05 12:58:35.565 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs51a7d924_b821_4357_93ee_eb2efd039b0a"]
table test.finish not exists for 2-th check, retry later
[2024/05/05 12:58:35.717 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:35.718 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:35.723 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:35.724 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:35.823 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:35.826 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
wait process cdc.test exit for 2-th time...
[2024/05/05 12:58:35.929 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:35.931 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:35.934 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:35.936 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:35.956 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:35.959 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:35.971 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:35.972 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:36.021 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:36.024 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:36.121 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:36.121 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:36.126 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:36.129 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:35 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8
	{"id":"cd3ec868-b5ac-485a-92c6-ae45a61099c8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885113}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b2eb
	cd3ec868-b5ac-485a-92c6-ae45a61099c8

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8
	{"id":"cd3ec868-b5ac-485a-92c6-ae45a61099c8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885113}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b2eb
	cd3ec868-b5ac-485a-92c6-ae45a61099c8

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8
	{"id":"cd3ec868-b5ac-485a-92c6-ae45a61099c8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885113}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b2eb
	cd3ec868-b5ac-485a-92c6-ae45a61099c8

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.cli.5827.out cli changefeed create --start-ts=449546842573897729 '--sink-uri=kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' -c changefeed-error
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c78ef100007	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:7672, start at 2024-05-05 12:58:34.827067571 +0800 CST m=+5.134532151	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:34.835 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:34.820 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:34.820 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c78f210000f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:7749, start at 2024-05-05 12:58:35.027442259 +0800 CST m=+5.279101628	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:35.036 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:35.012 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:35.012 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/multi_capture/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/multi_capture/tiflash/log/error.log
arg matches is ArgMatches { args: {"data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/multi_capture/tiflash/db/proxy"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/multi_capture/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/multi_capture/tiflash-proxy.toml"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[2024/05/05 12:58:36.212 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:36.217 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:36.324 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:36.335 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:36.341 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:36.347 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:36.361 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:36.367 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:36.419 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:36.426 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
Create changefeed successfully!
ID: changefeed-error
Info: {"upstream_id":7365375408703361060,"namespace":"default","id":"changefeed-error","sink_uri":"kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:58:36.343129959+08:00","start_ts":449546842573897729,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546842573897729,"checkpoint_ts":449546842573897729,"checkpoint_time":"2024-05-05 12:58:31.137"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
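Stripped of the coverage wrapper (cdc.test -test.coverprofile=...), the changefeed created above corresponds to a plain cli invocation along these lines; the start-ts and Kafka sink URI are the ones recorded in the trace, and the --pd flag simply makes the default PD endpoint explicit:

    cdc cli changefeed create \
        --pd=http://127.0.0.1:2379 \
        --start-ts=449546842573897729 \
        --sink-uri='kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' \
        -c changefeed-error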
wait process cdc.test exit for 3-th time...
[2024/05/05 12:58:36.442 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:36.453 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:36.534 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:36.537 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:36.538 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:36.549 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:36.626 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:36.637 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:36.726 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:36.742 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:36.744 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:36.758 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:36.764 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:36.827 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:36.828 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:36.845 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:36.850 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:36.868 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c78e070000f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5, pid:1282, start at 2024-05-05 12:58:33.900199496 +0800 CST m=+5.283743553	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:33.907 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:33.884 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:33.884 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c78e2b40014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5, pid:1369, start at 2024-05-05 12:58:34.056781855 +0800 CST m=+5.384376298	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:34.064 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:34.029 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:34.029 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/tiflash/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/tiflash/tiflash/log/error.log
arg matches is ArgMatches { args: {"log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/tiflash/tiflash/log/proxy.log"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/tiflash/tiflash-proxy.toml"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/tiflash/tiflash/db/proxy"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table ddl_manager.finish_mark not exists for 10-th check, retry later
[2024/05/05 12:58:36.941 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:36.945 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:36.945 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:36.960 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:37.038 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:37.050 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:37.146 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:37.167 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
check diff successfully
[Sun May  5 12:58:36 CST 2024] <<<<<< START cdc server in new_ci_collation case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.new_ci_collation.28612863.out server --log-file /tmp/tidb_cdc_test/new_ci_collation/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/new_ci_collation/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 12:58:37 CST 2024] <<<<<< run test case kafka_simple_basic_avro success! >>>>>>
[2024/05/05 12:58:37.215 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:37.223 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:37.228 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:37.258 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:37.265 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:37.279 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:37.281 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:37.329 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:37.359 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:37.364 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:37.368 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:37.411 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:37.440 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:37.457 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:37.546 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:37.630 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:37.638 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:37.643 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:37.649 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:37.663 +08:00] [INFO] [main.go:88] ["testGetDefaultValue take 15.292791626s"]
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
[Sun May  5 12:58:37 CST 2024] <<<<<< START kafka consumer in changefeed_error case >>>>>>
check_changefeed_state http://127.0.0.1:2379 changefeed-error failed [CDC:ErrStartTsBeforeGC]
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=changefeed-error
+ expected_state=failed
+ error_msg='[CDC:ErrStartTsBeforeGC]'
+ tls_dir='[CDC:ErrStartTsBeforeGC]'
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c changefeed-error -s
[2024/05/05 12:58:37.718 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:37.725 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:37.734 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:37.741 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:37.763 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:37.812 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:37.821 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:37.828 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:37.840 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:37.856 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:37.879 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:37.952 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
+ info='{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-error",
  "state": "failed",
  "checkpoint_tso": 449546842573897729,
  "checkpoint_time": "2024-05-05 12:58:31.137",
  "error": {
    "time": "2024-05-05T12:58:36.426398234+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrStartTsBeforeGC",
    "message": "[CDC:ErrStartTsBeforeGC]fail to create or maintain changefeed because start-ts 449546842573897429 is earlier than or equal to GC safepoint at 449546842573897729"
  }
}'
+ echo '{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-error",
  "state": "failed",
  "checkpoint_tso": 449546842573897729,
  "checkpoint_time": "2024-05-05 12:58:31.137",
  "error": {
    "time": "2024-05-05T12:58:36.426398234+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrStartTsBeforeGC",
    "message": "[CDC:ErrStartTsBeforeGC]fail to create or maintain changefeed because start-ts 449546842573897429 is earlier than or equal to GC safepoint at 449546842573897729"
  }
}'
{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-error",
  "state": "failed",
  "checkpoint_tso": 449546842573897729,
  "checkpoint_time": "2024-05-05 12:58:31.137",
  "error": {
    "time": "2024-05-05T12:58:36.426398234+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrStartTsBeforeGC",
    "message": "[CDC:ErrStartTsBeforeGC]fail to create or maintain changefeed because start-ts 449546842573897429 is earlier than or equal to GC safepoint at 449546842573897729"
  }
}
++ jq -r .state
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-error",' '"state":' '"failed",' '"checkpoint_tso":' 449546842573897729, '"checkpoint_time":' '"2024-05-05' '12:58:31.137",' '"error":' '{' '"time":' '"2024-05-05T12:58:36.426398234+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrStartTsBeforeGC",' '"message":' '"[CDC:ErrStartTsBeforeGC]fail' to create or maintain changefeed because start-ts 449546842573897429 is earlier than or equal to GC safepoint at '449546842573897729"' '}' '}'
+ state=failed
+ [[ ! failed == \f\a\i\l\e\d ]]
++ jq -r .error.message
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-error",' '"state":' '"failed",' '"checkpoint_tso":' 449546842573897729, '"checkpoint_time":' '"2024-05-05' '12:58:31.137",' '"error":' '{' '"time":' '"2024-05-05T12:58:36.426398234+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrStartTsBeforeGC",' '"message":' '"[CDC:ErrStartTsBeforeGC]fail' to create or maintain changefeed because start-ts 449546842573897429 is earlier than or equal to GC safepoint at '449546842573897729"' '}' '}'
+ message='[CDC:ErrStartTsBeforeGC]fail to create or maintain changefeed because start-ts 449546842573897429 is earlier than or equal to GC safepoint at 449546842573897729'
+ [[ ! [CDC:ErrStartTsBeforeGC]fail to create or maintain changefeed because start-ts 449546842573897429 is earlier than or equal to GC safepoint at 449546842573897729 =~ \[CDC:ErrStartTsBeforeGC] ]]
run task successfully
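The check_changefeed_state call traced above boils down to one query plus two jq assertions; a minimal equivalent of what just ran, using the same endpoint and the expectations of this run:

    info=$(cdc cli changefeed query --pd=http://127.0.0.1:2379 -c changefeed-error -s)
    state=$(echo "$info" | jq -r .state)
    message=$(echo "$info" | jq -r .error.message)
    # this run expects the changefeed to be failed with an ErrStartTsBeforeGC error
    [ "$state" = failed ] || { echo "unexpected state: $state"; exit 1; }
    echo "$message" | grep -q '\[CDC:ErrStartTsBeforeGC\]' || { echo "unexpected error: $message"; exit 1; }
    echo 'run task successfully'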
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.cli.5955.out cli changefeed resume -c changefeed-error
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:38 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/68233595-6eac-4796-af70-8e9e8eb3974b
	{"id":"68233595-6eac-4796-af70-8e9e8eb3974b","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885115}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e0818c9
	68233595-6eac-4796-af70-8e9e8eb3974b

/tidb/cdc/default/default/upstream/7365375431770261931
	{"id":7365375431770261931,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/68233595-6eac-4796-af70-8e9e8eb3974b
	{"id":"68233595-6eac-4796-af70-8e9e8eb3974b","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885115}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e0818c9
	68233595-6eac-4796-af70-8e9e8eb3974b

/tidb/cdc/default/default/upstream/7365375431770261931
	{"id":7365375431770261931,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/68233595-6eac-4796-af70-8e9e8eb3974b
	{"id":"68233595-6eac-4796-af70-8e9e8eb3974b","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885115}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e0818c9
	68233595-6eac-4796-af70-8e9e8eb3974b

/tidb/cdc/default/default/upstream/7365375431770261931
	{"id":7365375431770261931,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
[2024/05/05 12:58:38.046 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:38.058 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:38.058 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:38.063 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:38.131 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:38.145 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:38.148 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:38.152 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:38.174 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_capture.cli.9064.out cli tso query --pd=http://127.0.0.1:2379
table test.finish exists
check diff successfully
[2024/05/05 12:58:38.227 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:38.235 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:38.238 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:38.250 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:38.260 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:38.281 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:38.322 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:38.429 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:38.438 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:38.440 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:38.441 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
wait process cdc.test exit for 1-th time...
[Sun May  5 12:58:38 CST 2024] <<<<<< START kafka consumer in changefeed_finish case >>>>>>
[2024/05/05 12:58:38.523 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:38.540 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:38.541 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:38.549 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:38.569 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:38.642 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:38.650 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:38.651 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:38.664 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:38.674 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:38.715 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
table ddl_manager.finish_mark not exists for 11-th check, retry later
check diff failed 1-th time, retry later
[2024/05/05 12:58:38.727 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:38.818 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:38.824 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:38.829 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:38.830 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:38.927 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:38.934 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:38.940 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:38.943 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:38.962 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
PASS
coverage: 2.1% of statements in github.com/pingcap/tiflow/...
check diff failed 1-th time, retry later
wait process cdc.test exit for 2-th time...
[2024/05/05 12:58:39.058 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:39.114 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:39.122 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:39.130 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:39.142 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:39.157 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:39.170 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:58:39 CST 2024] <<<<<< run test case multi_topics_v2 success! >>>>>>
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.tiflash.cli.2821.out cli tso query --pd=http://127.0.0.1:2379
[2024/05/05 12:58:39.259 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:39.264 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:39.264 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:39.272 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:39.355 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:39.362 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:39.365 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:39.371 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:39.417 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:39.467 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 12:58:39.520 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:39.525 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:39.535 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:39.550 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:39.572 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:39.620 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:39.720 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:39.727 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:39.729 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:39.737 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:39.815 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:39.823 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:39.831 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:39.839 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:39.850 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:39.923 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:39.934 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:39.943 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:39.949 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:39.965 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:39.976 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
+ set +x
+ tso='449546844457140225
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546844457140225 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
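For context, the tso-query trace above captures a start timestamp for the next changefeed: the coverage-instrumented cdc.test binary appends PASS/coverage lines to its output, so the script keeps only the first field. A condensed, illustrative sketch of the same parsing (the helper name and the plain cdc binary are assumptions, not part of the test suite):

# illustrative sketch, not the literal test helper: query a TSO from PD and
# drop the PASS/coverage noise appended by the coverage-instrumented binary
get_start_ts() {
    local pd_addr="$1"
    cdc cli tso query --pd="${pd_addr}" | head -n1 | awk '{print $1}'
}
# hypothetical usage: start_ts=$(get_start_ts http://127.0.0.1:2379)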
[2024/05/05 12:58:40.024 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:40.119 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:40.124 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:40.127 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:40.138 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:40.224 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:40.234 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:40.241 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
+ set +x
table changefeed_error.usertable not exists for 1-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:39 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/80123d2f-afd4-4d88-8cf4-fee660b25f6f
	{"id":"80123d2f-afd4-4d88-8cf4-fee660b25f6f","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885117}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e0ec5c3
	80123d2f-afd4-4d88-8cf4-fee660b25f6f

/tidb/cdc/default/default/upstream/7365375443314580976
	{"id":7365375443314580976,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/80123d2f-afd4-4d88-8cf4-fee660b25f6f
	{"id":"80123d2f-afd4-4d88-8cf4-fee660b25f6f","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885117}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e0ec5c3
	80123d2f-afd4-4d88-8cf4-fee660b25f6f

/tidb/cdc/default/default/upstream/7365375443314580976
	{"id":7365375443314580976,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/80123d2f-afd4-4d88-8cf4-fee660b25f6f
	{"id":"80123d2f-afd4-4d88-8cf4-fee660b25f6f","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885117}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e0ec5c3
	80123d2f-afd4-4d88-8cf4-fee660b25f6f

/tidb/cdc/default/default/upstream/7365375443314580976
	{"id":7365375443314580976,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
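The curl loop traced above is the standard readiness probe for a freshly started capture: it polls /debug/info with basic auth until the response contains the "etcd info" section, retrying up to 50 times with a 3-second pause. A condensed sketch of that logic, assuming the endpoint and credentials shown in the log:

# condensed, illustrative form of the readiness probe traced above
for i in $(seq 1 50); do
    res=$(curl -sL --max-time 20 --user ticdc:ticdc_secret \
        http://127.0.0.1:8300/debug/info || true)
    if echo "$res" | grep -q 'etcd info'; then
        break                       # capture is up and registered in etcd
    fi
    sleep 3                         # not ready yet, retry
done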
Create changefeed successfully!
ID: 66a2a673-fe39-4270-8427-350be434bd3d
Info: {"upstream_id":7365375443314580976,"namespace":"default","id":"66a2a673-fe39-4270-8427-350be434bd3d","sink_uri":"kafka://127.0.0.1:9092/ticdc-new_ci_collation-test-11922?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:58:40.128484163+08:00","start_ts":449546844046622721,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546844046622721,"checkpoint_ts":449546844046622721,"checkpoint_time":"2024-05-05 12:58:36.755"}
[Sun May  5 12:58:40 CST 2024] <<<<<< START kafka consumer in new_ci_collation case >>>>>>
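The changefeed above targets a Kafka sink; the query parameters in the logged sink_uri (protocol, partition-num, kafka-version, max-message-bytes) are what the consumer started on the previous line relies on. An illustrative invocation with the same parameters (topic name and start-ts copied from the log, PD address assumed):

# illustrative: create a changefeed with an open-protocol Kafka sink
cdc cli changefeed create \
    --pd=http://127.0.0.1:2379 \
    --start-ts=449546844046622721 \
    --sink-uri='kafka://127.0.0.1:9092/ticdc-new_ci_collation-test-11922?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'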
[2024/05/05 12:58:40.248 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:40.260 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:40.327 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:40.345 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:40.346 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:40.358 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:40.369 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:40.416 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:40.427 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:40.516 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:40.523 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:40.528 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:40.531 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:40.631 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:40.641 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:40.642 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:40.654 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:40.670 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:40.730 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:40.749 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
+ set +x
+ tso='449546844709584898
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546844709584898 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 12:58:40 CST 2024] <<<<<< START cdc server in tiflash case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.tiflash.28542856.out server --log-file /tmp/tidb_cdc_test/tiflash/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/tiflash/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table ddl_manager.finish_mark not exists for 12-th check, retry later
[2024/05/05 12:58:40.763 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:40.767 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:40.775 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:40.790 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:40.794 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:40.874 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:40.876 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:40.946 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:40.951 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:40.963 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:40.969 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
***************** properties *****************
"mysql.host"="127.0.0.1"
"mysql.port"="4000"
"dotransactions"="false"
"mysql.user"="root"
"operationcount"="0"
"readallfields"="true"
"mysql.db"="multi_capture_1"
"updateproportion"="0"
"workload"="core"
"recordcount"="10"
"threadcount"="2"
"readproportion"="0"
"scanproportion"="0"
"insertproportion"="0"
"requestdistribution"="uniform"
**********************************************
Run finished, takes 8.18994ms
INSERT - Takes(s): 0.0, Count: 10, OPS: 2215.7, Avg(us): 1548, Min(us): 932, Max(us): 3643, 95th(us): 4000, 99th(us): 4000
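The properties dump and INSERT summary above come from the go-ycsb loader that seeds each multi_capture_* database with ten rows. A sketch of an equivalent invocation (the property-file path is an assumption; the -p values mirror the dump):

# illustrative go-ycsb load matching the property dump above
go-ycsb load mysql -P workload.core \
    -p mysql.host=127.0.0.1 -p mysql.port=4000 -p mysql.user=root \
    -p mysql.db=multi_capture_1 \
    -p recordcount=10 -p threadcount=2 \
    -p readproportion=0 -p updateproportion=0 -p insertproportion=0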
[2024/05/05 12:58:41.013 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:41.053 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:41.075 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:41.113 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:41.122 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:41.129 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:41.150 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:41.159 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
check diff failed 2-th time, retry later
[2024/05/05 12:58:41.422 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:41.429 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7925500019	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s, pid:1482, start at 2024-05-05 12:58:38.328596133 +0800 CST m=+5.184962402	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:38.337 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:38.343 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:38.343 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c792858000a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s, pid:1564, start at 2024-05-05 12:58:38.497466015 +0800 CST m=+5.300498698	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:38.506 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:38.486 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:38.486 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
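The VARIABLE_NAME/VARIABLE_VALUE/COMMENT tables above are the bootstrap and GC state of the upstream and downstream TiDB instances, read from the mysql.tidb system table while the script waits for both to come up. A sketch of the query that yields such a table (host/port taken from the log):

# illustrative: dump bootstrap/GC state from a TiDB instance; mysql.tidb
# carries the VARIABLE_NAME / VARIABLE_VALUE / COMMENT columns shown above
mysql -h 127.0.0.1 -P 4000 -u root -e 'SELECT * FROM mysql.tidb;'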
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/many_pk_or_uk/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/many_pk_or_uk/tiflash/log/error.log
arg matches is ArgMatches { args: {"log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/many_pk_or_uk/tiflash/log/proxy.log"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/many_pk_or_uk/tiflash/db/proxy"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/many_pk_or_uk/tiflash-proxy.toml"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
check diff failed 2-th time, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/processor_stop_delay/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
[2024/05/05 12:58:41.621 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:41.637 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:41.646 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:41.720 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:41.743 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
table new_ci_collation_test.t1 not exists for 1-th check, retry later
***************** properties *****************
"operationcount"="0"
"threadcount"="2"
"mysql.port"="4000"
"workload"="core"
"updateproportion"="0"
"insertproportion"="0"
"mysql.user"="root"
"requestdistribution"="uniform"
"readallfields"="true"
"mysql.host"="127.0.0.1"
"dotransactions"="false"
"recordcount"="10"
"scanproportion"="0"
"mysql.db"="multi_capture_2"
"readproportion"="0"
**********************************************
Run finished, takes 8.774423ms
INSERT - Takes(s): 0.0, Count: 10, OPS: 2103.3, Avg(us): 1682, Min(us): 989, Max(us): 3900, 95th(us): 4000, 99th(us): 4000
[2024/05/05 12:58:41.851 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:41.925 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:41.933 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:41.936 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:41.944 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:41.945 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:41.952 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
table changefeed_error.usertable exists
[2024/05/05 12:58:42.032 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:42.041 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:42.117 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:42.135 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:42.139 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:42.148 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:42.172 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:42.254 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
check diff failed 1-th time, retry later
[2024/05/05 12:58:42.275 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:42.278 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:42.313 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:42.313 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:42.316 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:42.323 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:42.369 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:42.383 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:42.443 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:42.464 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:42.468 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:42.476 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:42.529 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
table ddl_manager.finish_mark not exists for 13-th check, retry later
[2024/05/05 12:58:42.613 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:42.636 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:42.637 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs45c1dcbb_4dd3_480c_9726_6c07e65944c4"]
[2024/05/05 12:58:42.641 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
[2024/05/05 12:58:42.659 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:42.668 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
[2024/05/05 12:58:42.671 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:42.756 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:42.817 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:42.844 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:42.914 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:42.914 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:42.925 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:42.952 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:43.033 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:43.041 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:43.055 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:58:43.060 +08:00] [INFO] [main.go:178] ["73 insert success: 1900"]
[2024/05/05 12:58:43.061 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:43.117 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:43.179 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:43.257 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:43.279 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
[2024/05/05 12:58:43.305 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
check diff successfully
***************** properties *****************
"operationcount"="0"
"mysql.port"="4000"
"updateproportion"="0"
"threadcount"="2"
"mysql.host"="127.0.0.1"
"workload"="core"
"insertproportion"="0"
"dotransactions"="false"
"scanproportion"="0"
"recordcount"="10"
"readproportion"="0"
"readallfields"="true"
"mysql.user"="root"
"mysql.db"="multi_capture_3"
"requestdistribution"="uniform"
**********************************************
Run finished, takes 11.610274ms
INSERT - Takes(s): 0.0, Count: 10, OPS: 1313.4, Avg(us): 2162, Min(us): 1072, Max(us): 4061, 95th(us): 5000, 99th(us): 5000
check diff failed 3-th time, retry later
[2024/05/05 12:58:43.369 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:43.373 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:43.467 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:58:43.541 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:58:43.569 +08:00] [INFO] [main.go:178] ["73 insert success: 1900"]
table new_ci_collation_test.t1 exists
table new_ci_collation_test.t2 not exists for 1-th check, retry later
[2024/05/05 12:58:43.596 +08:00] [INFO] [main.go:178] ["73 insert success: 1900"]
[2024/05/05 12:58:43.631 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsce2b5e5f_9d5a_4659_b3a3_959ac9482374"]
[2024/05/05 12:58:43.656 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:43.668 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:43.676 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsce665d1c_8c8f_4344_9cc4_128983ed6ab7"]
[2024/05/05 12:58:43.731 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsb6386e1f_1a34_4c95_9fda_beaae7b588a7"]
[2024/05/05 12:58:43.752 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs6bcf89f9_8899_4cd5_a4b8_5d9672770849"]
[2024/05/05 12:58:43.757 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs7ebd0e8e_1db8_4592_abe1_c4d50b69446d"]
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.many_pk_or_uk.cli.2909.out cli tso query --pd=http://127.0.0.1:2379
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:43 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/8874314a-c9cb-4182-8a67-67edb84107a3
	{"id":"8874314a-c9cb-4182-8a67-67edb84107a3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885121}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e1191d6
	8874314a-c9cb-4182-8a67-67edb84107a3

/tidb/cdc/default/default/upstream/7365375445664432270
	{"id":7365375445664432270,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/8874314a-c9cb-4182-8a67-67edb84107a3
	{"id":"8874314a-c9cb-4182-8a67-67edb84107a3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885121}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e1191d6
	8874314a-c9cb-4182-8a67-67edb84107a3

/tidb/cdc/default/default/upstream/7365375445664432270
	{"id":7365375445664432270,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/8874314a-c9cb-4182-8a67-67edb84107a3
	{"id":"8874314a-c9cb-4182-8a67-67edb84107a3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885121}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e1191d6
	8874314a-c9cb-4182-8a67-67edb84107a3

/tidb/cdc/default/default/upstream/7365375445664432270
	{"id":7365375445664432270,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
Create changefeed successfully!
ID: aada63ad-41e6-400b-8af3-2e7bb346aceb
Info: {"upstream_id":7365375445664432270,"namespace":"default","id":"aada63ad-41e6-400b-8af3-2e7bb346aceb","sink_uri":"kafka://127.0.0.1:9092/ticdc-tiflash-test-13670?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:58:44.065356118+08:00","start_ts":449546844709584898,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546844709584898,"checkpoint_ts":449546844709584898,"checkpoint_time":"2024-05-05 12:58:39.284"}
[2024/05/05 12:58:43.967 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:43.971 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:44.025 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:44.042 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:44.046 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:44.056 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:44.090 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
***************** properties *****************
"dotransactions"="false"
"mysql.db"="multi_capture_4"
"mysql.host"="127.0.0.1"
"workload"="core"
"threadcount"="2"
"readproportion"="0"
"requestdistribution"="uniform"
"recordcount"="10"
"scanproportion"="0"
"operationcount"="0"
"mysql.user"="root"
"updateproportion"="0"
"mysql.port"="4000"
"insertproportion"="0"
"readallfields"="true"
**********************************************
Run finished, takes 9.698319ms
INSERT - Takes(s): 0.0, Count: 9, OPS: 1793.1, Avg(us): 1559, Min(us): 1114, Max(us): 4538, 95th(us): 5000, 99th(us): 5000
[Sun May  5 12:58:44 CST 2024] <<<<<< START cdc server in multi_capture case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8301/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_capture.91939195.out server --log-file /tmp/tidb_cdc_test/multi_capture/cdc1.log --log-level debug --data-dir /tmp/tidb_cdc_test/multi_capture/cdc_data1 --cluster-id default --addr 127.0.0.1:8301
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8301/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8301 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8301; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
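The multi_capture case starts several identical cdc servers that differ only in their address, log file, and data directory (8301 here, 8302 a little further down). An illustrative start command assembled from the trace above; the PD endpoint is left at its default, as in the trace:

# illustrative: each extra capture is the same binary on its own address,
# log file and data directory
cdc server \
    --addr=127.0.0.1:8301 \
    --log-file=/tmp/tidb_cdc_test/multi_capture/cdc1.log \
    --log-level=debug \
    --data-dir=/tmp/tidb_cdc_test/multi_capture/cdc_data1 \
    --cluster-id=default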
[Sun May  5 12:58:44 CST 2024] <<<<<< START kafka consumer in tiflash case >>>>>>
[2024/05/05 12:58:44.116 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:44.132 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:44.136 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:44.143 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:44.145 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:44.182 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs1c4240de_8001_4407_bce8_91167e95dbcf"]
[2024/05/05 12:58:44.217 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsdba39c2e_672f_4fca_9e18_bf367fc934e6"]
check diff successfully
***************** properties *****************
"mysql.host"="127.0.0.1"
"mysql.user"="root"
"dotransactions"="false"
"mysql.port"="4000"
"mysql.db"="changefeed_error"
"scanproportion"="0"
"insertproportion"="0"
"operationcount"="0"
"readproportion"="0"
"requestdistribution"="uniform"
"updateproportion"="0"
"readallfields"="true"
"threadcount"="4"
"recordcount"="20"
"workload"="core"
**********************************************
Run finished, takes 4.76728ms
INSERT - Takes(s): 0.0, Count: 20, OPS: 5934.6, Avg(us): 804, Min(us): 495, Max(us): 1913, 95th(us): 2000, 99th(us): 2000
table cdc_tiflash_test.multi_data_type not exists for 1-th check, retry later
table ddl_manager.finish_mark not exists for 14-th check, retry later
[2024/05/05 12:58:44.411 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:44.511 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:44.526 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:44.538 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:44.550 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:44.611 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
check diff successfully
{"id":"cd3ec868-b5ac-485a-92c6-ae45a61099c8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885113}
check_etcd_meta_not_exist '/tidb/cdc/default/__cdc_meta__/capture' 'capture'
+ key_prefix=/tidb/cdc/default/__cdc_meta__/capture
+ message=capture
++ etcdctl get /tidb/cdc/default/__cdc_meta__/capture --prefix --keys-only
+ info=/tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8
+ [[ /tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8 =~ capture ]]
+ echo 'capture contains in etcd /tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8'
capture contains in etcd /tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8
+ echo 'check failed'
check failed
+ exit 1
run task failed 1-th time, retry later
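check_etcd_meta_not_exist fails above because a capture key is still registered under /tidb/cdc/default/__cdc_meta__/capture; the task retries until the prefix query comes back empty. A condensed sketch of that check, using the same prefix as the trace:

# condensed, illustrative form of check_etcd_meta_not_exist as traced above
if etcdctl get /tidb/cdc/default/__cdc_meta__/capture --prefix --keys-only \
    | grep -q capture; then
    echo 'check failed'             # capture key still present, retry later
    exit 1
fi
echo 'capture metadata is gone'     # safe to continue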
[2024/05/05 12:58:44.637 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:44.648 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:44.655 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:44.659 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:44.670 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:44.672 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:44.726 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:44.731 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:44.734 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:44.748 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:44.782 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:44.919 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:44.930 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:44.942 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:45.016 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:45.120 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
check diff failed 4-th time, retry later
[2024/05/05 12:58:45.147 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:45.221 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:45.227 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:45.235 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:45.246 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:45.246 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:45.325 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:45.325 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:45.332 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:45.343 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:45.347 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
+ set +x
+ tso='449546845904175105
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546845904175105 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 12:58:45 CST 2024] <<<<<< START cdc server in many_pk_or_uk case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.many_pk_or_uk.29492951.out server --log-file /tmp/tidb_cdc_test/many_pk_or_uk/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/many_pk_or_uk/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table new_ci_collation_test.t2 exists
table new_ci_collation_test.t3 not exists for 1-th check, retry later
[2024/05/05 12:58:45.433 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:45.440 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:45.445 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:45.456 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:45.530 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:45.555 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
start tidb cluster in /tmp/tidb_cdc_test/processor_stop_delay
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
[2024/05/05 12:58:45.712 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:45.719 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:45.733 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:45.819 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:45.822 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:45.939 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:45.944 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:45.951 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:46.026 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:46.035 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:46.129 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:46.141 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:46.146 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:46.155 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:46.213 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:46.222 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:46.251 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:46.254 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:46.262 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:46.313 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:46.314 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
table cdc_tiflash_test.multi_data_type not exists for 2-th check, retry later
check_etcd_meta_not_exist '/tidb/cdc/default/__cdc_meta__/capture' 'capture'
+ key_prefix=/tidb/cdc/default/__cdc_meta__/capture
+ message=capture
++ etcdctl get /tidb/cdc/default/__cdc_meta__/capture --prefix --keys-only
+ info=/tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8
+ [[ /tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8 =~ capture ]]
+ echo 'capture contains in etcd /tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8'
capture contains in etcd /tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8
+ echo 'check failed'
check failed
+ exit 1
run task failed 2-th time, retry later
[2024/05/05 12:58:46.413 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:46.431 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:46.532 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:46.612 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:46.631 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:46.821 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:46.836 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:46.844 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:46.852 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
table ddl_manager.finish_mark not exists for 15-th check, retry later
Verifying downstream PD is started...
[2024/05/05 12:58:46.917 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:46.919 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:46.941 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:46.942 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:46.949 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:46.952 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:46.954 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:46.963 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:47.015 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:47.039 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:47.042 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:47.062 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
check diff failed 5-th time, retry later
[2024/05/05 12:58:47.331 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:47.335 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:47.352 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8301/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8301 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8301 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8301
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:47 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/85e80019-1a43-4797-b502-5a664e999f37
	{"id":"85e80019-1a43-4797-b502-5a664e999f37","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885124}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c24b
	85e80019-1a43-4797-b502-5a664e999f37

/tidb/cdc/default/default/upstream/7365375438406902655
	{"id":7365375438406902655,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/85e80019-1a43-4797-b502-5a664e999f37
	{"id":"85e80019-1a43-4797-b502-5a664e999f37","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885124}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c24b
	85e80019-1a43-4797-b502-5a664e999f37

/tidb/cdc/default/default/upstream/7365375438406902655
	{"id":7365375438406902655,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/85e80019-1a43-4797-b502-5a664e999f37
	{"id":"85e80019-1a43-4797-b502-5a664e999f37","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885124}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c24b
	85e80019-1a43-4797-b502-5a664e999f37

/tidb/cdc/default/default/upstream/7365375438406902655
	{"id":7365375438406902655,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
[Sun May  5 12:58:47 CST 2024] <<<<<< START cdc server in multi_capture case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8302/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_capture.92459247.out server --log-file /tmp/tidb_cdc_test/multi_capture/cdc2.log --log-level debug --data-dir /tmp/tidb_cdc_test/multi_capture/cdc_data2 --cluster-id default --addr 127.0.0.1:8302
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8302/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8302 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8302; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
[2024/05/05 12:58:47.425 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:47.462 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:47.513 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:47.528 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:47.532 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:47.542 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:47.542 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:47.550 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:47.551 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:47.569 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:47.620 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:47.621 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:47.625 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:47.752 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:47.819 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
table new_ci_collation_test.t3 exists
table new_ci_collation_test.t4 not exists for 1-th check, retry later
[2024/05/05 12:58:47.924 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:47.942 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:48.037 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:48.123 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:48.125 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:48.139 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:48.222 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:48.225 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:48.227 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:48.229 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:48.248 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:48.314 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:48.314 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:48.314 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:48 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/2d63f1d9-ba46-4129-8006-0f9dad6a9cb6
	{"id":"2d63f1d9-ba46-4129-8006-0f9dad6a9cb6","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885125}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e1d5cd5
	2d63f1d9-ba46-4129-8006-0f9dad6a9cb6

/tidb/cdc/default/default/upstream/7365375468139763572
	{"id":7365375468139763572,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/2d63f1d9-ba46-4129-8006-0f9dad6a9cb6
	{"id":"2d63f1d9-ba46-4129-8006-0f9dad6a9cb6","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885125}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e1d5cd5
	2d63f1d9-ba46-4129-8006-0f9dad6a9cb6

/tidb/cdc/default/default/upstream/7365375468139763572
	{"id":7365375468139763572,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/2d63f1d9-ba46-4129-8006-0f9dad6a9cb6
	{"id":"2d63f1d9-ba46-4129-8006-0f9dad6a9cb6","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885125}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e1d5cd5
	2d63f1d9-ba46-4129-8006-0f9dad6a9cb6

/tidb/cdc/default/default/upstream/7365375468139763572
	{"id":7365375468139763572,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.many_pk_or_uk.cli.3007.out cli changefeed create --start-ts=449546845904175105 '--sink-uri=kafka://127.0.0.1:9092/ticdc-many-pk-or-uk-test-15350?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
[2024/05/05 12:58:48.430 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:48.435 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:48.616 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:48.636 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
Create changefeed successfully!
ID: be7100cb-87a2-43ad-b724-282067a2ad3a
Info: {"upstream_id":7365375468139763572,"namespace":"default","id":"be7100cb-87a2-43ad-b724-282067a2ad3a","sink_uri":"kafka://127.0.0.1:9092/ticdc-many-pk-or-uk-test-15350?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:58:48.870763465+08:00","start_ts":449546845904175105,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546845904175105,"checkpoint_ts":449546845904175105,"checkpoint_time":"2024-05-05 12:58:43.841"}
PASS
[2024/05/05 12:58:48.735 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:48.837 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:48.916 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:48.919 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
table cdc_tiflash_test.multi_data_type exists
table ddl_manager.finish_mark not exists for 16-th check, retry later
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
[2024/05/05 12:58:49.016 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:49.018 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:49.027 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:49.030 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:49.036 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:49.052 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:49.052 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:49.052 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:49.141 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:49.146 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
check diff successfully
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_handle_key_only/run.sh using Sink-Type: kafka... <<=================
Trying to start the tidb cluster for the 1st time...
[2024/05/05 12:58:49.237 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:49.413 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
[2024/05/05 12:58:49.441 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:49.513 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:49.523 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:49.636 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:49.639 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:49.652 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:49.653 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:49.659 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:49.664 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:49.665 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:49.669 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
check diff successfully
wait process cdc.test exit for 1-th time...
[2024/05/05 12:58:49.734 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:49.755 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:49.831 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:49.920 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
table new_ci_collation_test.t4 exists
table new_ci_collation_test.t5 not exists for 1-th check, retry later
[2024/05/05 12:58:50.011 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:50.039 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
wait process cdc.test exit for 2-th time...
[2024/05/05 12:58:50.221 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:50.235 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:50.312 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:50.313 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:50.315 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:50.315 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:50.321 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:50.332 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:50.360 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:50.366 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:50.381 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:50.429 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
+ set +x
[Sun May  5 12:58:50 CST 2024] <<<<<< START kafka consumer in many_pk_or_uk case >>>>>>
go: downloading github.com/pingcap/log v1.1.1-0.20240314023424-862ccc32f18d
go: downloading github.com/pingcap/errors v0.11.5-0.20240318064555-6bd07397691f
go: downloading github.com/pingcap/tidb-tools v0.0.0-20240305021104-9f9bea84490b
go: downloading github.com/BurntSushi/toml v1.3.2
go: downloading go.uber.org/zap v1.27.0
go: downloading github.com/pingcap/tidb v1.1.0-beta.0.20240415145106-cd9c676e9ba4
go: downloading gopkg.in/natefinch/lumberjack.v2 v2.2.1
go: downloading go.uber.org/atomic v1.11.0
go: downloading go.uber.org/multierr v1.11.0
go: downloading github.com/pingcap/tidb/pkg/parser v0.0.0-20240410110152-5fc42c9be2f5
go: downloading github.com/pingcap/failpoint v0.0.0-20220801062533-2eaa32854a6c
go: downloading github.com/go-sql-driver/mysql v1.7.1
go: downloading google.golang.org/grpc v1.62.1
go: downloading github.com/coreos/go-semver v0.3.1
check_etcd_meta_not_exist '/tidb/cdc/default/__cdc_meta__/capture' 'capture'
+ key_prefix=/tidb/cdc/default/__cdc_meta__/capture
+ message=capture
++ etcdctl get /tidb/cdc/default/__cdc_meta__/capture --prefix --keys-only
+ info=/tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8
+ [[ /tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8 =~ capture ]]
+ echo 'capture found in etcd /tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8'
capture found in etcd /tidb/cdc/default/__cdc_meta__/capture/cd3ec868-b5ac-485a-92c6-ae45a61099c8
+ echo 'check failed'
check failed
+ exit 1
run task failed 3-th time, retry later
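The failing check above comes from the harness helper check_etcd_meta_not_exist, which asserts that no key is left under a given etcd prefix once the captures have exited; the caller retries the whole task when it fails. A minimal sketch of such a helper, reconstructed from the trace (the function name, etcdctl flags and messages are taken from the log; the real _utils implementation may differ):

# fail if any key under $1 still exists in etcd; $2 is the human-readable name
check_etcd_meta_not_exist() {
    key_prefix=$1
    message=$2
    info=$(etcdctl get "$key_prefix" --prefix --keys-only)
    if [[ $info =~ $message ]]; then
        echo "$message found in etcd $info"
        echo 'check failed'
        exit 1
    fi
    echo 'check pass'
    exit 0
}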
[2024/05/05 12:58:50.512 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:50.638 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
go: downloading github.com/golang/protobuf v1.5.4
go: downloading golang.org/x/net v0.24.0
go: downloading google.golang.org/protobuf v1.33.0
go: downloading google.golang.org/genproto/googleapis/rpc v0.0.0-20240401170217-c3f982113cda
go: downloading golang.org/x/sys v0.19.0
go: downloading google.golang.org/genproto v0.0.0-20240401170217-c3f982113cda
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8302/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8302 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8302 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8302
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:50 GMT
< Content-Length: 1271
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/85e80019-1a43-4797-b502-5a664e999f37
	{"id":"85e80019-1a43-4797-b502-5a664e999f37","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885124}

/tidb/cdc/default/__cdc_meta__/capture/8d94a679-9f71-4056-af0b-8f8661b22571
	{"id":"8d94a679-9f71-4056-af0b-8f8661b22571","address":"127.0.0.1:8302","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885127}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c24b
	85e80019-1a43-4797-b502-5a664e999f37

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c281
	8d94a679-9f71-4056-af0b-8f8661b22571

/tidb/cdc/default/default/upstream/7365375438406902655
	{"id":7365375438406902655,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/85e80019-1a43-4797-b502-5a664e999f37
	{"id":"85e80019-1a43-4797-b502-5a664e999f37","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885124}

/tidb/cdc/default/__cdc_meta__/capture/8d94a679-9f71-4056-af0b-8f8661b22571
	{"id":"8d94a679-9f71-4056-af0b-8f8661b22571","address":"127.0.0.1:8302","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885127}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c24b
	85e80019-1a43-4797-b502-5a664e999f37

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c281
	8d94a679-9f71-4056-af0b-8f8661b22571

/tidb/cdc/default/default/upstream/7365375438406902655
	{"id":7365375438406902655,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/85e80019-1a43-4797-b502-5a664e999f37
	{"id":"85e80019-1a43-4797-b502-5a664e999f37","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885124}

/tidb/cdc/default/__cdc_meta__/capture/8d94a679-9f71-4056-af0b-8f8661b22571
	{"id":"8d94a679-9f71-4056-af0b-8f8661b22571","address":"127.0.0.1:8302","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885127}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c24b
	85e80019-1a43-4797-b502-5a664e999f37

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c281
	8d94a679-9f71-4056-af0b-8f8661b22571

/tidb/cdc/default/default/upstream/7365375438406902655
	{"id":7365375438406902655,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
[Sun May  5 12:58:50 CST 2024] <<<<<< START cdc server in multi_capture case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8303/debug/info --user ticdc:ticdc_secret -vsL'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_capture.93069308.out server --log-file /tmp/tidb_cdc_test/multi_capture/cdc3.log --log-level debug --data-dir /tmp/tidb_cdc_test/multi_capture/cdc_data3 --cluster-id default --addr 127.0.0.1:8303
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8303/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8303 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8303; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
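Each "START cdc server" block is followed by this polling loop: the harness probes the new server's /debug/info endpoint until the response contains the "etcd info" dump, retrying up to 50 times with a 3 second pause. A rough sketch of the pattern as it appears in the trace (endpoint, basic-auth credentials and retry limit are the ones shown above; the actual helper may differ):

# wait until the cdc server at 127.0.0.1:8303 serves its etcd metadata
i=0
while [ "$i" -le 50 ]; do
    res=$(curl -vsL --max-time 20 http://127.0.0.1:8303/debug/info --user ticdc:ticdc_secret)
    if echo "$res" | grep -q 'etcd info'; then
        break                              # server is up and registered in etcd
    fi
    if [ "$i" -eq 50 ]; then
        echo 'cdc server failed to become ready'
        exit 1
    fi
    sleep 3
    i=$((i + 1))
done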
wait process cdc.test exit for 3-th time...
table ddl_manager.finish_mark not exists for 17-th check, retry later
[2024/05/05 12:58:50.832 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:50.851 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:50.951 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:50.952 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:50.956 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:50.957 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
go: downloading golang.org/x/text v0.14.0
check diff failed 1-th time, retry later
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 12:58:51 CST 2024] <<<<<< run test case tiflash success! >>>>>>
[2024/05/05 12:58:50.966 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:50.972 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:51.022 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:51.023 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:51.026 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:51.032 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:51.043 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:51.126 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:51.312 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:51.340 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/storage_cleanup/run.sh using Sink-Type: kafka... <<=================
+++ dirname /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/storage_cleanup/run.sh
++ cd /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/storage_cleanup
++ pwd
+ CUR=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/storage_cleanup
+ source /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/storage_cleanup/../_utils/test_prepare
++ UP_TIDB_HOST=127.0.0.1
++ UP_TIDB_PORT=4000
++ UP_TIDB_OTHER_PORT=4001
++ UP_TIDB_STATUS=10080
++ UP_TIDB_OTHER_STATUS=10081
++ DOWN_TIDB_HOST=127.0.0.1
++ DOWN_TIDB_PORT=3306
++ DOWN_TIDB_STATUS=20080
++ TLS_TIDB_HOST=127.0.0.1
++ TLS_TIDB_PORT=3307
++ TLS_TIDB_STATUS=30080
++ UP_PD_HOST_1=127.0.0.1
++ UP_PD_PORT_1=2379
++ UP_PD_PEER_PORT_1=2380
++ UP_PD_HOST_2=127.0.0.1
++ UP_PD_PORT_2=2679
++ UP_PD_PEER_PORT_2=2680
++ UP_PD_HOST_3=127.0.0.1
++ UP_PD_PORT_3=2779
++ UP_PD_PEER_PORT_3=2780
++ DOWN_PD_HOST=127.0.0.1
++ DOWN_PD_PORT=2479
++ DOWN_PD_PEER_PORT=2480
++ TLS_PD_HOST=127.0.0.1
++ TLS_PD_PORT=2579
++ TLS_PD_PEER_PORT=2580
++ UP_TIKV_HOST_1=127.0.0.1
++ UP_TIKV_PORT_1=20160
++ UP_TIKV_STATUS_PORT_1=20181
++ UP_TIKV_HOST_2=127.0.0.1
++ UP_TIKV_PORT_2=20161
++ UP_TIKV_STATUS_PORT_2=20182
++ UP_TIKV_HOST_3=127.0.0.1
++ UP_TIKV_PORT_3=20162
++ UP_TIKV_STATUS_PORT_3=20183
++ DOWN_TIKV_HOST=127.0.0.1
++ DOWN_TIKV_PORT=21160
++ DOWN_TIKV_STATUS_PORT=21180
++ TLS_TIKV_HOST=127.0.0.1
++ TLS_TIKV_PORT=22160
++ TLS_TIKV_STATUS_PORT=22180
+++ cat /tmp/tidb_cdc_test/KAFKA_VERSION
+++ echo 2.4.1
++ KAFKA_VERSION=2.4.1
+ WORK_DIR=/tmp/tidb_cdc_test/storage_cleanup
+ CDC_BINARY=cdc.test
+ SINK_TYPE=kafka
+ EXIST_FILES=()
+ CLEANED_FILES=()
+ trap stop_tidb_cluster EXIT
+ run kafka
+ '[' kafka '!=' storage ']'
+ return
+ check_logs /tmp/tidb_cdc_test/storage_cleanup
++ date
+ echo '[Sun May  5 12:58:50 CST 2024] <<<<<< run test case storage_cleanup success! >>>>>>'
[Sun May  5 12:58:50 CST 2024] <<<<<< run test case storage_cleanup success! >>>>>>
+ stop_tidb_cluster
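The storage_cleanup case only applies to the storage sink, so under the kafka sink its run function returns immediately and the case is recorded as a success after a log check. The guard visible in the trace boils down to roughly this (a sketch, not the exact run.sh source; SINK_TYPE and WORK_DIR are the variables set a few lines above):

run() {
    # the storage-specific cleanup assertions only make sense for the storage sink
    if [ "$1" != "storage" ]; then
        return
    fi
    # ... storage cleanup checks would go here ...
}

trap stop_tidb_cluster EXIT
run "$SINK_TYPE"
check_logs "$WORK_DIR"
echo "[$(date)] <<<<<< run test case storage_cleanup success! >>>>>>"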
[2024/05/05 12:58:51.513 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:51.526 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:51.528 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:51.544 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:51.545 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:51.611 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:51.615 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:51.620 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:51.631 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:51.670 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
table new_ci_collation_test.t5 exists
check diff failed 1-th time, retry later
start tidb cluster in /tmp/tidb_cdc_test/kafka_simple_handle_key_only
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
[2024/05/05 12:58:51.756 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:51.819 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:51.823 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsdb8f092b_80b7_40b8_968a_50f45bc50196"]
[2024/05/05 12:58:51.944 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:52.031 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:52.034 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:52.115 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:52.119 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:52.218 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:52.226 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:52.229 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:52.235 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:52.434 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:52.450 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:52.454 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:52.459 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:52.533 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:52.618 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:52.620 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:52.632 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:52.640 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:58:52.713 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:52.725 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:52.735 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
table ddl_manager.finish_mark not exists for 18-th check, retry later
go: downloading github.com/cznic/mathutil v0.0.0-20181122101859-297441e03548
go: downloading golang.org/x/exp v0.0.0-20240409090435-93d18d7e34b8
go: downloading github.com/tikv/client-go/v2 v2.0.8-0.20240409022718-714958ccd4d5
go: downloading github.com/pingcap/kvproto v0.0.0-20240227073058-929ab83f9754
go: downloading github.com/prometheus/client_golang v1.19.0
go: downloading github.com/influxdata/tdigest v0.0.1
go: downloading github.com/tiancaiamao/gp v0.0.0-20221230034425-4025bc8a4d4a
go: downloading go.etcd.io/etcd/client/v3 v3.5.12
go: downloading github.com/opentracing/opentracing-go v1.2.0
go: downloading github.com/pingcap/tipb v0.0.0-20240318032315-55a7867ddd50
go: downloading golang.org/x/sync v0.7.0
go: downloading github.com/tidwall/btree v1.7.0
go: downloading github.com/ngaut/pools v0.0.0-20180318154953-b7bc8c42aac7
go: downloading github.com/scalalang2/golang-fifo v0.1.5
go: downloading github.com/grpc-ecosystem/go-grpc-middleware v1.4.0
go: downloading github.com/danjacques/gofslock v0.0.0-20240212154529-d899e02bfe22
go: downloading github.com/uber/jaeger-client-go v2.30.0+incompatible
go: downloading github.com/pingcap/sysutil v1.0.1-0.20240311050922-ae81ee01f3a5
go: downloading github.com/google/uuid v1.6.0
go: downloading github.com/docker/go-units v0.5.0
go: downloading github.com/gorilla/mux v1.8.0
go: downloading github.com/prometheus/client_model v0.6.1
go: downloading github.com/shirou/gopsutil/v3 v3.24.2
go: downloading github.com/tikv/pd/client v0.0.0-20240322051414-fb9e2d561b6e
go: downloading github.com/spf13/pflag v1.0.5
go: downloading github.com/google/btree v1.1.2
go: downloading github.com/coocood/freecache v1.2.1
go: downloading github.com/jellydator/ttlcache/v3 v3.0.1
go: downloading gopkg.in/yaml.v2 v2.4.0
go: downloading github.com/yangkeao/ldap/v3 v3.4.5-0.20230421065457-369a3bab1117
go: downloading github.com/stretchr/testify v1.9.0
go: downloading github.com/cockroachdb/errors v1.11.1
go: downloading github.com/twmb/murmur3 v1.1.6
go: downloading cloud.google.com/go/storage v1.39.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/azcore v1.9.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/azidentity v1.5.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/storage/azblob v1.0.0
go: downloading go.etcd.io/etcd/api/v3 v3.5.12
go: downloading github.com/gogo/protobuf v1.3.2
go: downloading github.com/aliyun/alibaba-cloud-sdk-go v1.61.1581
go: downloading github.com/aws/aws-sdk-go v1.50.0
go: downloading github.com/tikv/pd v1.1.0-beta.0.20240407022249-7179657d129b
go: downloading github.com/go-resty/resty/v2 v2.11.0
go: downloading github.com/klauspost/compress v1.17.8
go: downloading golang.org/x/tools v0.20.0
go: downloading github.com/ks3sdklib/aws-sdk-go v1.2.9
go: downloading cloud.google.com/go v0.112.2
go: downloading golang.org/x/oauth2 v0.18.0
go: downloading google.golang.org/api v0.170.0
go: downloading github.com/dolthub/swiss v0.2.1
go: downloading github.com/golang/snappy v0.0.4
go: downloading github.com/opentracing/basictracer-go v1.1.0
go: downloading github.com/ngaut/sync2 v0.0.0-20141008032647-7a24ed77b2ef
go: downloading github.com/cespare/xxhash/v2 v2.3.0
go: downloading go.uber.org/mock v0.4.0
go: downloading github.com/cockroachdb/pebble v1.1.0
go: downloading github.com/jfcg/sorty/v2 v2.1.0
go: downloading golang.org/x/time v0.5.0
go: downloading github.com/joho/sqltocsv v0.0.0-20210428211105-a6d6801d59df
go: downloading github.com/carlmjohnson/flagext v0.21.0
go: downloading github.com/jedib0t/go-pretty/v6 v6.2.2
go: downloading github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec
go: downloading github.com/Azure/go-ntlmssp v0.0.0-20221128193559-754e69321358
go: downloading github.com/go-asn1-ber/asn1-ber v1.5.4
go: downloading go.etcd.io/etcd/client/pkg/v3 v3.5.12
go: downloading github.com/dgraph-io/ristretto v0.1.1
go: downloading github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc
go: downloading github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2
go: downloading gopkg.in/yaml.v3 v3.0.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/internal v1.5.1
go: downloading github.com/AzureAD/microsoft-authentication-library-for-go v1.2.1
go: downloading golang.org/x/crypto v0.22.0
go: downloading github.com/beorn7/perks v1.0.1
go: downloading github.com/prometheus/common v0.52.2
go: downloading github.com/prometheus/procfs v0.13.0
go: downloading github.com/pkg/errors v0.9.1
go: downloading github.com/uber/jaeger-lib v2.4.1+incompatible
go: downloading github.com/cockroachdb/logtags v0.0.0-20230118201751-21c54148d20b
go: downloading github.com/cockroachdb/redact v1.1.5
go: downloading github.com/getsentry/sentry-go v0.27.0
go: downloading github.com/lestrrat-go/jwx/v2 v2.0.21
go: downloading github.com/cloudfoundry/gosigar v1.3.6
go: downloading github.com/dgryski/go-farm v0.0.0-20200201041132-a6ae2369ad13
go: downloading github.com/tklauser/go-sysconf v0.3.12
go: downloading github.com/otiai10/copy v1.2.0
go: downloading github.com/spkg/bom v1.0.0
go: downloading github.com/xitongsys/parquet-go v1.6.0
go: downloading github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2
go: downloading github.com/dolthub/maphash v0.1.0
go: downloading google.golang.org/genproto/googleapis/api v0.0.0-20240401170217-c3f982113cda
go: downloading github.com/google/pprof v0.0.0-20240117000934-35fc243c5815
go: downloading github.com/wangjohn/quickselect v0.0.0-20161129230411-ed8402a42d5f
go: downloading github.com/jfcg/sixb v1.3.8
go: downloading github.com/kr/pretty v0.3.1
go: downloading github.com/coreos/go-systemd/v22 v22.5.0
go: downloading cloud.google.com/go/compute/metadata v0.2.3
go: downloading cloud.google.com/go/iam v1.1.7
go: downloading cloud.google.com/go/compute v1.25.1
go: downloading github.com/googleapis/gax-go/v2 v2.12.3
go: downloading github.com/cheggaaa/pb/v3 v3.0.8
go: downloading github.com/mattn/go-runewidth v0.0.15
go: downloading github.com/robfig/cron/v3 v3.0.1
go: downloading github.com/pingcap/goleveldb v0.0.0-20191226122134-f82aafb29989
go: downloading github.com/pingcap/badger v1.5.1-0.20230103063557-828f39b09b6d
go: downloading github.com/robfig/cron v1.2.0
go: downloading github.com/kylelemons/godebug v1.1.0
go: downloading github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c
go: downloading github.com/tklauser/numcpus v0.6.1
go: downloading github.com/kr/text v0.2.0
go: downloading github.com/rogpeppe/go-internal v1.12.0
go: downloading go.opencensus.io v0.23.1-0.20220331163232-052120675fac
go: downloading go.opentelemetry.io/otel v1.24.0
go: downloading go.opentelemetry.io/otel/trace v1.24.0
go: downloading github.com/apache/thrift v0.16.0
go: downloading github.com/rivo/uniseg v0.4.7
go: downloading github.com/VividCortex/ewma v1.2.0
go: downloading github.com/fatih/color v1.16.0
[2024/05/05 12:58:53.016 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:53.030 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:53.052 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:53.059 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:53.071 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:53.143 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:53.146 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:53.155 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:58:53.161 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:53.215 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8303/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8303 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8303 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8303
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:53 GMT
< Content-Length: 1750
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/85e80019-1a43-4797-b502-5a664e999f37
	{"id":"85e80019-1a43-4797-b502-5a664e999f37","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885124}

/tidb/cdc/default/__cdc_meta__/capture/8d94a679-9f71-4056-af0b-8f8661b22571
	{"id":"8d94a679-9f71-4056-af0b-8f8661b22571","address":"127.0.0.1:8302","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885127}

/tidb/cdc/default/__cdc_meta__/capture/e8090f07-04e8-4889-8c52-004ac93d6906
	{"id":"e8090f07-04e8-4889-8c52-004ac93d6906","address":"127.0.0.1:8303","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885130}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c24b
	85e80019-1a43-4797-b502-5a664e999f37

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c281
	8d94a679-9f71-4056-af0b-8f8661b22571

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c2a3
	e8090f07-04e8-4889-8c52-004ac93d6906

/tidb/cdc/default/default/upstream/7365375438406902655
	{"id":7365375438406902655,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/85e80019-1a43-4797-b502-5a664e999f37
	{"id":"85e80019-1a43-4797-b502-5a664e999f37","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885124}

/tidb/cdc/default/__cdc_meta__/capture/8d94a679-9f71-4056-af0b-8f8661b22571
	{"id":"8d94a679-9f71-4056-af0b-8f8661b22571","address":"127.0.0.1:8302","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885127}

/tidb/cdc/default/__cdc_meta__/capture/e8090f07-04e8-4889-8c52-004ac93d6906
	{"id":"e8090f07-04e8-4889-8c52-004ac93d6906","address":"127.0.0.1:8303","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885130}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c24b
	85e80019-1a43-4797-b502-5a664e999f37

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c281
	8d94a679-9f71-4056-af0b-8f8661b22571

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c2a3
	e8090f07-04e8-4889-8c52-004ac93d6906

/tidb/cdc/default/default/upstream/7365375438406902655
	{"id":7365375438406902655,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/85e80019-1a43-4797-b502-5a664e999f37
	{"id":"85e80019-1a43-4797-b502-5a664e999f37","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885124}

/tidb/cdc/default/__cdc_meta__/capture/8d94a679-9f71-4056-af0b-8f8661b22571
	{"id":"8d94a679-9f71-4056-af0b-8f8661b22571","address":"127.0.0.1:8302","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885127}

/tidb/cdc/default/__cdc_meta__/capture/e8090f07-04e8-4889-8c52-004ac93d6906
	{"id":"e8090f07-04e8-4889-8c52-004ac93d6906","address":"127.0.0.1:8303","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885130}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c24b
	85e80019-1a43-4797-b502-5a664e999f37

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c281
	8d94a679-9f71-4056-af0b-8f8661b22571

/tidb/cdc/default/__cdc_meta__/owner/22318f471e11c2a3
	e8090f07-04e8-4889-8c52-004ac93d6906

/tidb/cdc/default/default/upstream/7365375438406902655
	{"id":7365375438406902655,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_capture.cli.9358.out cli changefeed create --start-ts=449546844457140225 '--sink-uri=kafka://127.0.0.1:9092/ticdc-multi-capture-test-11466?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' --server=127.0.0.1:8301
go: downloading github.com/mattn/go-colorable v0.1.13
go: downloading github.com/mattn/go-isatty v0.0.20
go: downloading github.com/lestrrat-go/blackmagic v1.0.2
go: downloading github.com/lestrrat-go/httprc v1.0.5
go: downloading github.com/lestrrat-go/iter v1.0.2
go: downloading github.com/lestrrat-go/option v1.0.1
go: downloading github.com/dustin/go-humanize v1.0.1
go: downloading github.com/golang/glog v1.2.0
go: downloading github.com/golang-jwt/jwt/v5 v5.2.0
go: downloading github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da
go: downloading github.com/ncw/directio v1.0.5
go: downloading github.com/coocood/rtutil v0.0.0-20190304133409-c84515f646f2
go: downloading github.com/coocood/bbloom v0.0.0-20190830030839-58deb6228d64
go: downloading github.com/lestrrat-go/httpcc v1.0.1
go: downloading github.com/klauspost/cpuid v1.3.1
go: downloading github.com/golang-jwt/jwt v3.2.2+incompatible
go: downloading github.com/go-logr/logr v1.4.1
go: downloading go.opentelemetry.io/otel/metric v1.24.0
go: downloading github.com/go-logr/stdr v1.2.2
[2024/05/05 12:58:53.316 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsee1f5db7_41c3_44d8_984d_086f5461610f"]
check diff failed 2-th time, retry later
go: downloading github.com/DataDog/zstd v1.5.5
go: downloading github.com/cockroachdb/tokenbucket v0.0.0-20230807174530-cc333fc44b06
check diff failed 2-th time, retry later
[2024/05/05 12:58:53.613 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:53.631 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:53.655 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:53.712 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:58:53.714 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:53.750 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:53.757 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:53.763 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:58:53.767 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:53.775 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
Create changefeed successfully!
ID: 698561b7-6c6e-401a-883e-1a97a40a67fe
Info: {"upstream_id":7365375438406902655,"namespace":"default","id":"698561b7-6c6e-401a-883e-1a97a40a67fe","sink_uri":"kafka://127.0.0.1:9092/ticdc-multi-capture-test-11466?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:58:53.644500015+08:00","start_ts":449546844457140225,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546844457140225,"checkpoint_ts":449546844457140225,"checkpoint_time":"2024-05-05 12:58:38.321"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 12:58:53.822 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:53.828 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:54.023 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:54.045 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/csv_storage_basic/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:58:53 CST 2024] <<<<<< run test case csv_storage_basic success! >>>>>>
[2024/05/05 12:58:54.125 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:58:54.131 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:54.163 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:54.182 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:58:54.219 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:54.231 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
[2024/05/05 12:58:54.326 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:58:54.342 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsbb175dd2_528e_4508_9537_0e258466882e"]
[2024/05/05 12:58:54.350 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:54.422 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsf2c80e91_3d06_40e5_a45c_50afab4eb3d8"]
[2024/05/05 12:58:54.444 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs71e1fae0_6852_477e_b9db_c0357ec71760"]
[2024/05/05 12:58:54.451 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:54.536 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
table ddl_manager.finish_mark not exists for 19-th check, retry later
[2024/05/05 12:58:54.642 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:54.714 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:54.836 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:58:54.849 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:54.853 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:54.853 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:54.872 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:54.912 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:54.924 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:54.930 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:54.939 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:54.950 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs1fa375a9_bf4f_4935_9e49_afd2fa004cfb"]
[2024/05/05 12:58:55.018 +08:00] [INFO] [main.go:178] ["73 insert success: 1900"]
+ set +x
[Sun May  5 12:58:55 CST 2024] <<<<<< START kafka consumer in multi_capture case >>>>>>
table multi_capture_1.usertable not exists for 1-th check, retry later
[2024/05/05 12:58:55.118 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:55.247 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:55.334 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs79acc791_56d5_4054_8b2e_f17f287754e1"]
[2024/05/05 12:58:55.643 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:55.645 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:55.648 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:55.655 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:55.661 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:55.719 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:55.733 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:55.744 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:55.747 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:55.751 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:55.772 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 12:58:55.861 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:55.925 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:55.933 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
check diff failed 3-th time, retry later
check diff failed 3-th time, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 12:58:56.213 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:56.214 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:56.221 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:56.228 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:56.250 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:56.254 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:56.333 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:56.338 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
table ddl_manager.finish_mark not exists for 20-th check, retry later
[2024/05/05 12:58:56.345 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:56.350 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:56.352 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:56.523 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:56.533 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:56.547 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:56.726 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:56.730 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:56.735 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:56.742 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:56.819 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:56.819 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
go: downloading github.com/google/s2a-go v0.1.7
go: downloading go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.49.0
go: downloading github.com/googleapis/enterprise-certificate-proxy v0.3.2
go: downloading go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc v0.49.0
go: downloading github.com/felixge/httpsnoop v1.0.4
go: downloading github.com/jmespath/go-jmespath v0.4.0
[2024/05/05 12:58:56.848 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:56.849 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:56.913 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:56.918 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:56.921 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:57.056 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:57.060 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
table multi_capture_1.usertable not exists for 2-th check, retry later
check_etcd_meta_not_exist '/tidb/cdc/default/__cdc_meta__/capture' 'capture'
+ key_prefix=/tidb/cdc/default/__cdc_meta__/capture
+ message=capture
++ etcdctl get /tidb/cdc/default/__cdc_meta__/capture --prefix --keys-only
+ info=
+ [[ '' =~ capture ]]
+ echo 'check pass'
check pass
+ exit 0
run task successfully
check_etcd_meta_not_exist '/tidb/cdc/default/__cdc_meta__/owner' 'owner'
+ key_prefix=/tidb/cdc/default/__cdc_meta__/owner
+ message=owner
++ etcdctl get /tidb/cdc/default/__cdc_meta__/owner --prefix --keys-only
+ info=
+ [[ '' =~ owner ]]
+ echo 'check pass'
check pass
+ exit 0
run task successfully
[Sun May  5 12:58:56 CST 2024] <<<<<< START cdc server in changefeed_error case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/NewChangefeedRetryError=return(true)'
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.62146216.out server --log-file /tmp/tidb_cdc_test/changefeed_error/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/changefeed_error/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
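Unlike the earlier server starts, the changefeed_error case launches cdc with a failpoint enabled through the GO_FAILPOINTS environment variable, so that owner/NewChangefeedRetryError fires and the changefeed retry and error-handling path gets exercised. The startup reduces to roughly the following (all arguments are the ones shown in the trace; running the server in the background so the poll loop can probe it is an assumption about the helper):

# inject the owner failpoint, then start the cdc server under coverage
export GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/NewChangefeedRetryError=return(true)'
cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.62146216.out server \
    --log-file /tmp/tidb_cdc_test/changefeed_error/cdc.log \
    --log-level debug \
    --data-dir /tmp/tidb_cdc_test/changefeed_error/cdc_data \
    --cluster-id default &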
[2024/05/05 12:58:57.125 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:57.229 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:57.233 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:57.246 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:57.246 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:57.259 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:57.275 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:57.322 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:57.333 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:57.338 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:57.342 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:57.346 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/csv_storage_multi_tables_ddl/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:58:57 CST 2024] <<<<<< run test case csv_storage_multi_tables_ddl success! >>>>>>
[2024/05/05 12:58:57.433 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs74758b61_de85_48a7_893a_ffea8a9c2921"]
[2024/05/05 12:58:57.526 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7a41500012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg, pid:6987, start at 2024-05-05 12:58:56.495135206 +0800 CST m=+5.104363127	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:56.502 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:56.468 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:56.468 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7a41500012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg, pid:6987, start at 2024-05-05 12:58:56.495135206 +0800 CST m=+5.104363127	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:56.502 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:56.468 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:56.468 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7a42bc0015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg, pid:7074, start at 2024-05-05 12:58:56.606716892 +0800 CST m=+5.162188031	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:00:56.613 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:58:56.610 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:48:56.610 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/processor_stop_delay/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/processor_stop_delay/tiflash/log/error.log
arg matches is ArgMatches { args: {"pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/processor_stop_delay/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/processor_stop_delay/tiflash-proxy.toml"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/processor_stop_delay/tiflash/log/proxy.log"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
go: downloading github.com/json-iterator/go v1.1.12
go: downloading github.com/modern-go/reflect2 v1.0.2
go: downloading github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd
[2024/05/05 12:58:57.640 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:57.643 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:57.655 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:57.669 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:57.690 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:58:57.701 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:57.738 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:58:57.821 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
check diff failed 4-th time, retry later
check diff failed 4-th time, retry later
[2024/05/05 12:58:57.978 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:57.985 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:58:58.012 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:58.028 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:58.052 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:58.060 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:58.062 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 12:58:58.135 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:58:58.145 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:58:58.238 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:58:58.332 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:58.434 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:58.450 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:58:58.519 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:58.637 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:58.638 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:58:58.643 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:58.714 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:58.718 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:58:58.719 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:58.744 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:58:58.751 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:58:58.823 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:58:58.828 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:58:58.852 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:58:58.867 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
table ddl_manager.finish_mark not exists for 21-th check, retry later
[2024/05/05 12:58:58.944 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:59.012 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:59.020 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:58:59.032 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:59.213 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:59.213 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:59.214 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:58:59.248 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:58:59.311 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:59.312 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:58:59.330 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:59.342 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
table multi_capture_1.usertable exists
table multi_capture_2.usertable exists
table multi_capture_3.usertable not exists for 1-th check, retry later
[2024/05/05 12:58:59.428 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:58:59.439 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:58:59.523 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:58:59.523 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:58:59.620 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:59.625 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:58:59.627 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:58:59.641 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:58:59 GMT
< Content-Type: text/plain; charset=utf-8
< Transfer-Encoding: chunked
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/9828fe1f-adab-4652-aeb0-dddd5e8868d9
	{"id":"9828fe1f-adab-4652-aeb0-dddd5e8868d9","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885136}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/meta/ticdc-delete-etcd-key-count
	3

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b3d2
	9828fe1f-adab-4652-aeb0-dddd5e8868d9

/tidb/cdc/default/default/changefeed/info/changefeed-error
	{"upstream-id":7365375408703361060,"namespace":"default","changefeed-id":"changefeed-error","sink-uri":"kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:58:36.343129959+08:00","start-ts":449546842573897729,"target-ts":0,"admin-job-type":1,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"pending","error":{"time":"2024-05-05T12:58:57.200574991+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrOwnerUnknown","message":"failpoint injected retriable error"},"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546849402748930}

/tidb/cdc/default/default/changefeed/status/changefeed-error
	{"checkpoint-ts":449546845824483339,"min-table-barrier-ts":449546845824483339,"admin-job-type":1}

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/9828fe1f-adab-4652-aeb0-dddd5e8868d9
	{"id":"9828fe1f-adab-4652-aeb0-dddd5e8868d9","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885136}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/meta/ticdc-delete-etcd-key-count
	3

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b3d2
	9828fe1f-adab-4652-aeb0-dddd5e8868d9

/tidb/cdc/default/default/changefeed/info/changefeed-error
	{"upstream-id":7365375408703361060,"namespace":"default","changefeed-id":"changefeed-error","sink-uri":"kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:58:36.343129959+08:00","start-ts":449546842573897729,"target-ts":0,"admin-job-type":1,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"pending","error":{"time":"2024-05-05T12:58:57.200574991+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrOwnerUnknown","message":"failpoint injected retriable error"},"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546849402748930}

/tidb/cdc/default/default/changefeed/status/changefeed-error
	{"checkpoint-ts":449546845824483339,"min-table-barrier-ts":449546845824483339,"admin-job-type":1}

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/9828fe1f-adab-4652-aeb0-dddd5e8868d9
	{"id":"9828fe1f-adab-4652-aeb0-dddd5e8868d9","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885136}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/meta/ticdc-delete-etcd-key-count
	3

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b3d2
	9828fe1f-adab-4652-aeb0-dddd5e8868d9

/tidb/cdc/default/default/changefeed/info/changefeed-error
	{"upstream-id":7365375408703361060,"namespace":"default","changefeed-id":"changefeed-error","sink-uri":"kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:58:36.343129959+08:00","start-ts":449546842573897729,"target-ts":0,"admin-job-type":1,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"pending","error":{"time":"2024-05-05T12:58:57.200574991+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrOwnerUnknown","message":"failpoint injected retriable error"},"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546849402748930}

/tidb/cdc/default/default/changefeed/status/changefeed-error
	{"checkpoint-ts":449546845824483339,"min-table-barrier-ts":449546845824483339,"admin-job-type":1}

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
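The probe traced above is a bounded retry loop against the CDC debug endpoint; a minimal sketch of the pattern, using the URL, credentials, and 50-attempt limit visible in the trace (the function name is hypothetical):

  probe_cdc_debug_info() {
      # succeed once /debug/info returns the etcd metadata dump
      for ((i = 0; i <= 50; i++)); do
          res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret 2>&1)
          echo "$res" | grep -q 'failed to get info:' && { sleep 3; continue; }
          echo "$res" | grep -q 'etcd info' && return 0
          sleep 3
      done
      return 1
  }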
check_changefeed_state http://127.0.0.1:2379 changefeed-error warning failpoint injected retriable error
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=changefeed-error
+ expected_state=warning
+ error_msg=failpoint
+ tls_dir=error
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c changefeed-error -s
check diff failed 5-th time, retry later
[2024/05/05 12:58:59.813 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:58:59.817 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:59.817 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:58:59.851 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
+ info='{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-error",
  "state": "warning",
  "checkpoint_tso": 449546845824483339,
  "checkpoint_time": "2024-05-05 12:58:43.537",
  "error": {
    "time": "2024-05-05T12:58:57.200574991+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrOwnerUnknown",
    "message": "failpoint injected retriable error"
  }
}'
+ echo '{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-error",
  "state": "warning",
  "checkpoint_tso": 449546845824483339,
  "checkpoint_time": "2024-05-05 12:58:43.537",
  "error": {
    "time": "2024-05-05T12:58:57.200574991+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrOwnerUnknown",
    "message": "failpoint injected retriable error"
  }
}'
{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-error",
  "state": "warning",
  "checkpoint_tso": 449546845824483339,
  "checkpoint_time": "2024-05-05 12:58:43.537",
  "error": {
    "time": "2024-05-05T12:58:57.200574991+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrOwnerUnknown",
    "message": "failpoint injected retriable error"
  }
}
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-error",' '"state":' '"warning",' '"checkpoint_tso":' 449546845824483339, '"checkpoint_time":' '"2024-05-05' '12:58:43.537",' '"error":' '{' '"time":' '"2024-05-05T12:58:57.200574991+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrOwnerUnknown",' '"message":' '"failpoint' injected retriable 'error"' '}' '}'
++ jq -r .state
+ state=warning
+ [[ ! warning == \w\a\r\n\i\n\g ]]
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-error",' '"state":' '"warning",' '"checkpoint_tso":' 449546845824483339, '"checkpoint_time":' '"2024-05-05' '12:58:43.537",' '"error":' '{' '"time":' '"2024-05-05T12:58:57.200574991+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrOwnerUnknown",' '"message":' '"failpoint' injected retriable 'error"' '}' '}'
++ jq -r .error.message
+ message='failpoint injected retriable error'
+ [[ ! failpoint injected retriable error =~ failpoint ]]
run task successfully
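check_changefeed_state, as traced above, queries the changefeed through the CLI and asserts on its JSON fields; a minimal sketch built from the commands in the trace (jq filters as shown there, error handling condensed):

  endpoints=http://127.0.0.1:2379
  changefeed_id=changefeed-error
  expected_state=warning
  info=$(cdc cli changefeed query --pd=$endpoints -c $changefeed_id -s)
  state=$(echo "$info" | jq -r .state)
  [[ "$state" == "$expected_state" ]] || exit 1
  message=$(echo "$info" | jq -r .error.message)
  [[ "$message" =~ "failpoint injected retriable error" ]] || exit 1
  echo 'run task successfully'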
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.cli.6325.out cli changefeed remove -c changefeed-error
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 12:58:59.940 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:58:59.941 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:58:59.946 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:59:00.012 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:59:00.041 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:59:00.051 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:59:00.123 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:59:00.130 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:59:00.165 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:59:00.167 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:59:00.167 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:59:00.173 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[Sun May  5 12:58:59 CST 2024] <<<<<< START cdc server in processor_stop_delay case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/processor/processorStopDelay=1*sleep(10000)'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.processor_stop_delay.84598461.out server --log-file /tmp/tidb_cdc_test/processor_stop_delay/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/processor_stop_delay/cdc_data --cluster-id default --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
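The cdc server for the processor_stop_delay case is started with a Go failpoint injected through the GO_FAILPOINTS environment variable ('1*sleep(10000)' fires a 10-second sleep exactly once). A minimal sketch of the launch, using the arguments from the trace (backgrounding with & is an assumption):

  GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/processor/processorStopDelay=1*sleep(10000)' \
  cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.processor_stop_delay.84598461.out server \
      --log-file /tmp/tidb_cdc_test/processor_stop_delay/cdc.log --log-level debug \
      --data-dir /tmp/tidb_cdc_test/processor_stop_delay/cdc_data --cluster-id default \
      --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379 &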
[2024/05/05 12:59:00.248 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:59:00.251 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:59:00.252 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:59:00.322 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:59:00.358 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:59:00.360 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:59:00.364 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:59:00.437 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
check diff failed 5-th time, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/csv_storage_partition_table/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 12:59:00 CST 2024] <<<<<< run test case csv_storage_partition_table success! >>>>>>
[2024/05/05 12:59:00.532 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:59:00.614 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:59:00.645 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
Changefeed remove successfully.
ID: changefeed-error
CheckpointTs: 449546845824483339
SinkURI: kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
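The removal above was issued through the coverage-instrumented binary (cdc.test ... cli changefeed remove -c changefeed-error); one way to confirm by hand that nothing is left is to list the remaining changefeeds (the list subcommand and jq filter here are assumptions, not part of this run):

  cdc cli changefeed list --pd=http://127.0.0.1:2379 | jq -r '.[].id'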
[2024/05/05 12:59:00.726 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:59:00.830 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:59:00.831 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:59:00.835 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:59:00.838 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:59:00.862 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:59:00.915 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:59:00.916 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:59:00.936 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
table ddl_manager.finish_mark not exists for 22-th check, retry later
[2024/05/05 12:59:00.952 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:59:00.954 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:59:00.962 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:59:00.980 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:59:01.032 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:59:01.057 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:59:01.121 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:59:01.223 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:59:01.344 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:59:01.348 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:59:01.351 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:59:01.355 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:59:01.440 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:59:01.445 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:59:01.448 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
table multi_capture_3.usertable exists
table multi_capture_4.usertable not exists for 1-th check, retry later
[2024/05/05 12:59:01.517 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:59:01.536 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:59:01.538 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:59:01.542 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:59:01.551 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:59:01.563 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:59:01.571 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:59:01.624 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:59:01.660 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7a88cc0016	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:11641, start at 2024-05-05 12:59:01.061755321 +0800 CST m=+5.133850899	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:01.069 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:01.043 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:01.043 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7a88cc0016	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:11641, start at 2024-05-05 12:59:01.061755321 +0800 CST m=+5.133850899	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:01.069 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:01.043 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:01.043 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
[2024/05/05 12:59:01.919 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:59:01.923 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:59:01.927 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:59:01.933 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
+ set +x
check_no_changefeed 127.0.0.1:2379
parse error: Invalid numeric literal at line 1, column 6
run task successfully
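The 'parse error: Invalid numeric literal' above comes from jq being fed CLI output that is not the JSON it expects; check_no_changefeed tolerates that and counts it as "no changefeed present". A minimal sketch of the idea (the helper body is assumed, not copied from the harness):

  check_no_changefeed() {
      local pd=$1
      # a jq parse failure or an empty list both count as "no changefeed"
      count=$(cdc cli changefeed list --pd=http://$pd 2>/dev/null | jq 'length' 2>/dev/null)
      [[ -z "$count" || "$count" -eq 0 ]]
  }
  check_no_changefeed 127.0.0.1:2379 && echo 'run task successfully'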
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7a88a00016	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:11729, start at 2024-05-05 12:59:01.070263402 +0800 CST m=+5.093759857	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:01.078 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:01.082 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:01.082 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kafka_simple_handle_key_only/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kafka_simple_handle_key_only/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kafka_simple_handle_key_only/tiflash-proxy.toml"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kafka_simple_handle_key_only/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kafka_simple_handle_key_only/tiflash/db/proxy"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[2024/05/05 12:59:02.028 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:59:02.036 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:59:02.135 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:59:02.154 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:59:02.158 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:59:02.162 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:59:02.214 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:59:02.228 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:59:02.245 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:59:02.254 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:59:02.273 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
check diff successfully
check diff successfully
check diff failed 1-th time, retry later
[2024/05/05 12:59:02.376 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:59:02.381 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:59:02.419 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:59:02.453 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:59:02.458 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:59:02.535 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
table test1.finishmark exists
[2024/05/05 12:58:52.202 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test.t1] [checkSum=3030946575]
[2024/05/05 12:58:52.207 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test.t2] [checkSum=718014124]
[2024/05/05 12:58:52.209 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test.t3] [checkSum=718014124]
[2024/05/05 12:58:52.214 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test1.finishmark] [checkSum=0]
[2024/05/05 12:58:52.215 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test1.t1] [checkSum=718014124]
[2024/05/05 12:58:52.216 +08:00] [INFO] [main.go:107] ["get checksum for the upstream success"] [elapsed=17.310692ms]
[2024/05/05 12:58:52.218 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test.t1] [checkSum=3030946575]
[2024/05/05 12:58:52.220 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test.t2] [checkSum=718014124]
[2024/05/05 12:58:52.222 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test.t3] [checkSum=718014124]
[2024/05/05 12:58:52.226 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test1.finishmark] [checkSum=0]
[2024/05/05 12:58:52.228 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test1.t1] [checkSum=718014124]
[2024/05/05 12:58:52.228 +08:00] [INFO] [main.go:116] ["get checksum for the downstream success"] [elapsed=12.249255ms]
[2024/05/05 12:58:52.228 +08:00] [INFO] [main.go:95] ["compare checksum passed"]
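The checker above computes a checksum per table on the upstream and the downstream and requires them to match. A minimal sketch of the same comparison for a single table from the shell, assuming ADMIN CHECKSUM TABLE as the checksum source and the usual 4000/3306 ports (both are assumptions):

  up=$(mysql -h 127.0.0.1 -P 4000 -u root -N -e 'ADMIN CHECKSUM TABLE test.t1' | awk '{print $3}')
  down=$(mysql -h 127.0.0.1 -P 3306 -u root -N -e 'ADMIN CHECKSUM TABLE test.t1' | awk '{print $3}')
  [[ "$up" == "$down" ]] && echo 'compare checksum passed' || echo "checksum mismatch: $up vs $down"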
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:58:53 CST 2024] <<<<<< run test case kafka_column_selector success! >>>>>>
[2024/05/05 12:59:02.617 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:59:02.624 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:59:02.644 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:59:02.647 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:59:02.715 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:59:02.743 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:59:02.748 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:59:02.774 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
wait process cdc.test exit for 1-th time...
table ddl_manager.finish_mark not exists for 23-th check, retry later
[2024/05/05 12:58:48.347 +08:00] [INFO] [pd_service_discovery.go:1016] ["[pd] switch leader"] [new-leader=http://127.0.0.1:2379] [old-leader=]
< HTTP/1.1 200 OK
[2024/05/05 12:58:48.348 +08:00] [INFO] [pd_service_discovery.go:498] ["[pd] init cluster id"] [cluster-id=7365375018838600413]
< Date: Sun, 05 May 2024 04:59:02 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
[2024/05/05 12:58:48.348 +08:00] [INFO] [client.go:606] ["[pd] changing service mode"] [old-mode=UNKNOWN_SVC_MODE] [new-mode=PD_SVC_MODE]
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/3aa3f859-704d-4ddc-b75c-e17dbfd04433
[2024/05/05 12:58:48.348 +08:00] [INFO] [tso_client.go:236] ["[tso] switch dc tso global allocator serving url"] [dc-location=global] [new-url=http://127.0.0.1:2379]
[2024/05/05 12:58:48.348 +08:00] [INFO] [tso_dispatcher.go:359] ["[tso] tso dispatcher created"] [dc-location=global]
[2024/05/05 12:58:48.348 +08:00] [INFO] [client.go:612] ["[pd] service mode changed"] [old-mode=UNKNOWN_SVC_MODE] [new-mode=PD_SVC_MODE]
	{"id":"3aa3f859-704d-4ddc-b75c-e17dbfd04433","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885141}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1
[2024/05/05 12:58:48.349 +08:00] [INFO] [pd_service_discovery.go:1016] ["[pd] switch leader"] [new-leader=http://127.0.0.1:2379] [old-leader=]

/tidb/cdc/default/__cdc_meta__/owner/22318f471e648fd3
	3aa3f859-704d-4ddc-b75c-e17dbfd04433

[2024/05/05 12:58:48.349 +08:00] [INFO] [pd_service_discovery.go:498] ["[pd] init cluster id"] [cluster-id=7365375018838600413]
/tidb/cdc/default/default/upstream/7365375529494091078
[2024/05/05 12:58:48.349 +08:00] [INFO] [client.go:606] ["[pd] changing service mode"] [old-mode=UNKNOWN_SVC_MODE] [new-mode=PD_SVC_MODE]
	{"id":7365375529494091078,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

[2024/05/05 12:58:48.349 +08:00] [INFO] [tso_client.go:236] ["[tso] switch dc tso global allocator serving url"] [dc-location=global] [new-url=http://127.0.0.1:2379]
/tidb/cdc/default/__cdc_meta__/capture/3aa3f859-704d-4ddc-b75c-e17dbfd04433
[2024/05/05 12:58:48.350 +08:00] [INFO] [tso_dispatcher.go:359] ["[tso] tso dispatcher created"] [dc-location=global]
[2024/05/05 12:58:48.350 +08:00] [INFO] [client.go:612] ["[pd] service mode changed"] [old-mode=UNKNOWN_SVC_MODE] [new-mode=PD_SVC_MODE]
[2024/05/05 12:58:48.350 +08:00] [INFO] [tikv_driver.go:197] ["using API V1."]
[2024/05/05 12:58:48.350 +08:00] [INFO] [main.go:180] ["genLock started"]
	{"id":"3aa3f859-704d-4ddc-b75c-e17dbfd04433","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885141}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e648fd3
	3aa3f859-704d-4ddc-b75c-e17dbfd04433

/tidb/cdc/default/default/upstream/7365375529494091078
[2024/05/05 12:58:48.351 +08:00] [INFO] [store_cache.go:477] ["change store resolve state"] [store=1] [addr=127.0.0.1:20161] [from=unresolved] [to=resolved] [liveness-state=reachable]
	{"id":7365375529494091078,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

[2024/05/05 12:58:58.358 +08:00] [INFO] [main.go:196] ["genLock done"]
*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/3aa3f859-704d-4ddc-b75c-e17dbfd04433
[2024/05/05 12:58:58.358 +08:00] [INFO] [pd_service_discovery.go:550] ["[pd] exit member loop due to context canceled"]
[2024/05/05 12:58:58.358 +08:00] [INFO] [resource_manager_client.go:295] ["[resource manager] exit resource token dispatcher"]
	{"id":"3aa3f859-704d-4ddc-b75c-e17dbfd04433","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885141}

[2024/05/05 12:58:58.358 +08:00] [INFO] [tso_client.go:140] ["closing tso client"]
/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e648fd3
	3aa3f859-704d-4ddc-b75c-e17dbfd04433

/tidb/cdc/default/default/upstream/7365375529494091078
[2024/05/05 12:58:58.358 +08:00] [INFO] [tso_dispatcher.go:455] ["[tso] stop fetching the pending tso requests due to context canceled"] [dc-location=global]
	{"id":7365375529494091078,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
[2024/05/05 12:58:58.358 +08:00] [INFO] [tso_dispatcher.go:268] ["exit tso dispatcher loop"]
[2024/05/05 12:58:58.358 +08:00] [INFO] [tso_dispatcher.go:214] ["exit tso requests cancel loop"]
[2024/05/05 12:58:58.358 +08:00] [INFO] [tso_dispatcher.go:380] ["[tso] exit tso dispatcher"] [dc-location=global]
[2024/05/05 12:58:58.358 +08:00] [INFO] [tso_batch_controller.go:158] ["[pd] clear the tso batch controller"] [max-batch-size=10000] [best-batch-size=1] [collected-request-count=0] [pending-request-count=0]
[2024/05/05 12:58:58.358 +08:00] [INFO] [tso_client.go:145] ["close tso client"]
[2024/05/05 12:58:58.358 +08:00] [INFO] [tso_batch_controller.go:158] ["[pd] clear the tso batch controller"] [max-batch-size=10000] [best-batch-size=1] [collected-request-count=0] [pending-request-count=0]
[2024/05/05 12:58:58.358 +08:00] [INFO] [tso_client.go:155] ["tso client is closed"]
[2024/05/05 12:58:58.358 +08:00] [INFO] [pd_service_discovery.go:637] ["[pd] close pd service discovery client"]
[2024/05/05 12:59:02.915 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:59:02.920 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:59:02.928 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:59:02.949 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:59:02.949 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:59:02.972 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:59:03.022 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:59:03.028 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:59:03.049 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:59:03.052 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[Sun May  5 12:59:03 CST 2024] <<<<<< START kafka consumer in processor_stop_delay case >>>>>>
[2024/05/05 12:59:03.113 +08:00] [INFO] [main.go:178] ["72 insert success: 1100"]
[2024/05/05 12:59:03.144 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:59:03.147 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:59:03.226 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
wait process cdc.test exit for 2-th time...
table multi_capture_4.usertable exists
check diff failed 1-th time, retry later
table processor_stop_delay.t not exists for 1-th check, retry later
[2024/05/05 12:59:03.348 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:59:03.352 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:59:03.423 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:59:03.433 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:59:03.438 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:59:03.459 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:59:03.529 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:59:03.539 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:59:03.560 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:59:03.572 +08:00] [INFO] [main.go:178] ["72 insert success: 1200"]
[2024/05/05 12:59:03.620 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:59:03.773 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
<<< Run all test success >>>
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
check_etcd_meta_not_exist '/tidb/cdc/default/__cdc_meta__/owner' 'owner'
+ key_prefix=/tidb/cdc/default/__cdc_meta__/owner
+ message=owner
++ etcdctl get /tidb/cdc/default/__cdc_meta__/owner --prefix --keys-only
+ info=
+ [[ '' =~ owner ]]
+ echo 'check pass'
check pass
+ exit 0
run task successfully
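check_etcd_meta_not_exist, traced above, asserts that the CDC owner key has been cleaned out of etcd after shutdown; a minimal sketch of the same assertion using the key prefix from the trace:

  key_prefix=/tidb/cdc/default/__cdc_meta__/owner
  info=$(etcdctl get $key_prefix --prefix --keys-only)
  if [[ "$info" =~ owner ]]; then
      echo 'check failed: owner key still present in etcd'
      exit 1
  fi
  echo 'check pass'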
[Sun May  5 12:59:03 CST 2024] <<<<<< START cdc server in changefeed_error case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/InjectChangefeedDDLError=return(true)'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.64276429.out server --log-file /tmp/tidb_cdc_test/changefeed_error/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/changefeed_error/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
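This restart of the changefeed_error server enables a different failpoint: 'return(true)' makes github.com/pingcap/tiflow/cdc/owner/InjectChangefeedDDLError fire on every hit, unlike the one-shot '1*sleep(10000)' term used earlier. A minimal sketch of scoping the failpoint to a single invocation (backgrounding with & is an assumption):

  GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/InjectChangefeedDDLError=return(true)' \
  cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.64276429.out server \
      --log-file /tmp/tidb_cdc_test/changefeed_error/cdc.log --log-level debug \
      --data-dir /tmp/tidb_cdc_test/changefeed_error/cdc_data --cluster-id default &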
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[2024/05/05 12:59:03.860 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:59:03.925 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:59:03.939 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:59:03.954 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:59:03.954 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:59:04.023 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:59:04.057 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:59:04.064 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:59:04.065 +08:00] [INFO] [main.go:178] ["72 insert success: 1300"]
[2024/05/05 12:59:04.065 +08:00] [INFO] [main.go:178] ["73 insert success: 1100"]
[2024/05/05 12:59:04.069 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[2024/05/05 12:59:04.173 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:59:04.239 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:59:04.251 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:59:04.323 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:59:04.333 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:59:04.334 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[Pipeline] // withCredentials
[Pipeline] }
check diff successfully
table test.t2 not exists for 1-th check, retry later
[Pipeline] // timeout
[Pipeline] }
[2024/05/05 12:59:04.450 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:59:04.527 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:59:04.534 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:59:04.534 +08:00] [INFO] [main.go:178] ["72 insert success: 1400"]
[2024/05/05 12:59:04.535 +08:00] [INFO] [main.go:178] ["73 insert success: 1200"]
[2024/05/05 12:59:04.536 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:59:04.618 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[Pipeline] // stage
[Pipeline] }
table ddl_manager.finish_mark not exists for 24-th check, retry later
[Sun May  5 12:59:04 CST 2024] <<<<<< START cdc server in kafka_simple_handle_key_only case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_handle_key_only.1313313135.out server --log-file /tmp/tidb_cdc_test/kafka_simple_handle_key_only/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kafka_simple_handle_key_only/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
wait process cdc.test exit for 1-th time...
check diff failed 1-th time, retry later
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[2024/05/05 12:59:04.923 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:59:04.930 +08:00] [INFO] [main.go:178] ["72 insert success: 1900"]
[2024/05/05 12:59:04.933 +08:00] [INFO] [main.go:178] ["72 insert success: 1500"]
[2024/05/05 12:59:04.940 +08:00] [INFO] [main.go:178] ["73 insert success: 1300"]
[2024/05/05 12:59:04.944 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:59:04.966 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsb221d687_a920_401c_b7a0_973fb0b0c3f6"]
[2024/05/05 12:59:05.017 +08:00] [INFO] [main.go:178] ["73 insert success: 1700"]
[2024/05/05 12:59:05.043 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLsf966fe1b_aed9_4342_a899_beeb2835c27b"]
[Pipeline] }
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_column_selector/run.sh: line 1: 17158 Killed                  cdc_kafka_consumer --upstream-uri $SINK_URI --downstream-uri="mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" --upstream-tidb-dsn="root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" --config="$CUR/conf/changefeed.toml" 2>&1  (wd: /tmp/tidb_cdc_test/kafka_column_selector)
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_column_selector_avro/run.sh using Sink-Type: kafka... <<=================
Starting schema registry...
* About to connect() to 127.0.0.1 port 8088 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8088; Connection refused
* Closing connection 0
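The kafka_column_selector_avro case first waits for the schema registry on 127.0.0.1:8088 to come up; the refused connection above is just the first poll. A minimal sketch of such a wait loop (the 60-second budget is an assumption):

  for i in $(seq 1 60); do
      curl -s -o /dev/null http://127.0.0.1:8088 && { echo 'schema registry is ready'; break; }
      sleep 1
  done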
[Pipeline] // withEnv
[Pipeline] }
table processor_stop_delay.t not exists for 2-th check, retry later
[Pipeline] // stage
[2024/05/05 12:59:05.228 +08:00] [INFO] [main.go:178] ["72 insert success: 2000"]
[2024/05/05 12:59:05.236 +08:00] [INFO] [main.go:178] ["72 insert success: 1600"]
[2024/05/05 12:59:05.246 +08:00] [INFO] [main.go:178] ["73 insert success: 1400"]
[2024/05/05 12:59:05.272 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:59:05.364 +08:00] [INFO] [main.go:178] ["73 insert success: 1800"]
[2024/05/05 12:59:05.384 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:59:05.392 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[Pipeline] }
wait process cdc.test exit for 2-th time...
[2024/05/05 12:59:05.487 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:59:05.545 +08:00] [INFO] [main.go:178] ["72 insert success: 1700"]
[2024/05/05 12:59:05.557 +08:00] [INFO] [main.go:178] ["73 insert success: 1500"]
[2024/05/05 12:59:05.562 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs415b5ed6_8a86_4f65_a605_0111488da77d"]
[2024/05/05 12:59:05.580 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:59:05.677 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:59:05.722 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
check diff failed 2-th time, retry later
[2024/05/05 12:59:05.789 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:59:05.847 +08:00] [INFO] [main.go:178] ["72 insert success: 1800"]
[2024/05/05 12:59:05.867 +08:00] [INFO] [main.go:178] ["73 insert success: 1600"]
[2024/05/05 12:59:05.868 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:59:05.870 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:59:05.885 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:59:05.965 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:59:05.979 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs49b22e73_2f7e_4859_ae9f_b05f463634a7"]
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:59:05 CST 2024] <<<<<< run test case new_ci_collation success! >>>>>>
[2024/05/05 12:59:06.014 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:59:06.074 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:59:06.159 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:59:06.162 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:59:06.173 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:59:06.262 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:59:06.273 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:59:06.276 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:59:06.307 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:59:06.361 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:59:06.446 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:59:06.456 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:59:06.467 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
table ddl_manager.finish_mark not exists for 25-th check, retry later
[2024/05/05 12:59:06.500 +08:00] [INFO] [main.go:835] ["running ddl test: testMultiDDLs41e964f9_9fb9_4ca1_9d9b_388e39fbfc87"]
[2024/05/05 12:59:06.550 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:59:06.562 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:59:06.573 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:59:06.601 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:59:06.648 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:59:06.742 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
check diff failed 2-th time, retry later
table test.t2 not exists for 2-th check, retry later
[2024/05/05 12:59:06.772 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:59:06.784 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:59:06.833 +08:00] [INFO] [main.go:178] ["72 insert success: 100"]
[2024/05/05 12:59:06.849 +08:00] [INFO] [main.go:178] ["73 insert success: 100"]
[2024/05/05 12:59:06.883 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:59:06.895 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:59:06.921 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:59:06.943 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:59:06.971 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:59:06 GMT
< Content-Length: 883
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ec7ec5f3-3ce1-40fc-a747-e66772e2956c
	{"id":"ec7ec5f3-3ce1-40fc-a747-e66772e2956c","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885144}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/meta/ticdc-delete-etcd-key-count
	4

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b431
	ec7ec5f3-3ce1-40fc-a747-e66772e2956c

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ec7ec5f3-3ce1-40fc-a747-e66772e2956c
	{"id":"ec7ec5f3-3ce1-40fc-a747-e66772e2956c","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885144}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/meta/ticdc-delete-etcd-key-count
	4

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b431
	ec7ec5f3-3ce1-40fc-a747-e66772e2956c

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ec7ec5f3-3ce1-40fc-a747-e66772e2956c
	{"id":"ec7ec5f3-3ce1-40fc-a747-e66772e2956c","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885144}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/meta/ticdc-delete-etcd-key-count
	4

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b431
	ec7ec5f3-3ce1-40fc-a747-e66772e2956c

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
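The traced block above is the test harness waiting for a freshly started cdc server: it polls the /debug/info endpoint with basic auth (ticdc/ticdc_secret, as shown in the trace) up to 50 times until the dump contains the "etcd info" section. A minimal standalone sketch of that readiness probe, using only the endpoint and credentials visible in the log (the real helper lives in the test utilities):

    #!/usr/bin/env bash
    # Poll the CDC server's /debug/info endpoint until it reports the etcd
    # section, giving up after 50 attempts (3s apart), mirroring the loop above.
    endpoint="http://127.0.0.1:8300/debug/info"
    for ((i = 0; i <= 50; i++)); do
        res=$(curl -vsL --max-time 20 "$endpoint" --user ticdc:ticdc_secret || true)
        if echo "$res" | grep -q 'etcd info' && ! echo "$res" | grep -q 'failed to get info:'; then
            echo "CDC server is ready"
            break
        fi
        if [ "$i" -eq 50 ]; then
            echo "CDC server did not become ready in time" >&2
            exit 1
        fi
        sleep 3
    done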
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.cli.6483.out cli changefeed create --start-ts=449546842573897729 '--sink-uri=kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' -c changefeed-error-1
* About to connect() to 127.0.0.1 port 8088 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8088; Connection refused
* Closing connection 0
[2024/05/05 12:59:07.030 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:59:07.074 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:59:07.131 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:59:07.137 +08:00] [INFO] [main.go:178] ["72 insert success: 200"]
[2024/05/05 12:59:07.166 +08:00] [INFO] [main.go:178] ["73 insert success: 200"]
[2024/05/05 12:59:07.184 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:59:07.206 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
[2024/05/05 12:59:07.248 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:59:07 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ac06df6f-146a-4da0-81d6-b9b3fc4cfc52
	{"id":"ac06df6f-146a-4da0-81d6-b9b3fc4cfc52","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885144}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e7e3fc3
	ac06df6f-146a-4da0-81d6-b9b3fc4cfc52

/tidb/cdc/default/default/upstream/7365375562342503600
	{"id":7365375562342503600,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ac06df6f-146a-4da0-81d6-b9b3fc4cfc52
	{"id":"ac06df6f-146a-4da0-81d6-b9b3fc4cfc52","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885144}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e7e3fc3
	ac06df6f-146a-4da0-81d6-b9b3fc4cfc52

/tidb/cdc/default/default/upstream/7365375562342503600
	{"id":7365375562342503600,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ac06df6f-146a-4da0-81d6-b9b3fc4cfc52
	{"id":"ac06df6f-146a-4da0-81d6-b9b3fc4cfc52","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885144}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471e7e3fc3
	ac06df6f-146a-4da0-81d6-b9b3fc4cfc52

/tidb/cdc/default/default/upstream/7365375562342503600
	{"id":7365375562342503600,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_handle_key_only.cli.13194.out cli tso query --pd=http://127.0.0.1:2379
Create changefeed successfully!
ID: changefeed-error-1
Info: {"upstream_id":7365375408703361060,"namespace":"default","id":"changefeed-error-1","sink_uri":"kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:59:07.375807788+08:00","start_ts":449546842573897729,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546842573897729,"checkpoint_ts":449546842573897729,"checkpoint_time":"2024-05-05 12:58:31.137"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
[2024/05/05 12:59:07.267 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:59:07.282 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:59:07.331 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:59:07.416 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:59:07.460 +08:00] [INFO] [main.go:178] ["72 insert success: 300"]
[2024/05/05 12:59:07.464 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:59:07.527 +08:00] [INFO] [main.go:178] ["73 insert success: 300"]
[2024/05/05 12:59:07.562 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:59:07.612 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:59:07.631 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:59:07.641 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:59:07.667 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:59:07.730 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:59:07.755 +08:00] [INFO] [main.go:178] ["72 insert success: 400"]
check diff successfully
***************** properties *****************
"scanproportion"="0"
"recordcount"="20"
"mysql.host"="127.0.0.1"
"mysql.port"="4000"
"readproportion"="0"
"readallfields"="true"
"insertproportion"="0"
"mysql.user"="root"
"dotransactions"="false"
"workload"="core"
"requestdistribution"="uniform"
"updateproportion"="0"
"mysql.db"="multi_capture_1"
"threadcount"="2"
"operationcount"="0"
**********************************************
Run finished, takes 22.726071ms
INSERT - Takes(s): 0.0, Count: 20, OPS: 1381.8, Avg(us): 2228, Min(us): 580, Max(us): 8310, 95th(us): 9000, 99th(us): 9000
***************** properties *****************
"operationcount"="0"
"mysql.db"="multi_capture_2"
"insertproportion"="0"
"mysql.host"="127.0.0.1"
"mysql.port"="4000"
"scanproportion"="0"
"dotransactions"="false"
"readallfields"="true"
"updateproportion"="0"
"threadcount"="2"
"readproportion"="0"
"mysql.user"="root"
"recordcount"="20"
"workload"="core"
"requestdistribution"="uniform"
**********************************************
Run finished, takes 10.446365ms
INSERT - Takes(s): 0.0, Count: 20, OPS: 2173.9, Avg(us): 987, Min(us): 504, Max(us): 1779, 95th(us): 2000, 99th(us): 2000
***************** properties *****************
"requestdistribution"="uniform"
"threadcount"="2"
"insertproportion"="0"
"mysql.host"="127.0.0.1"
"mysql.db"="multi_capture_3"
"recordcount"="20"
"operationcount"="0"
"updateproportion"="0"
"mysql.user"="root"
"scanproportion"="0"
"readallfields"="true"
"mysql.port"="4000"
"dotransactions"="false"
"readproportion"="0"
"workload"="core"
**********************************************
Run finished, takes 21.648255ms
INSERT - Takes(s): 0.0, Count: 20, OPS: 1451.6, Avg(us): 2117, Min(us): 626, Max(us): 7874, 95th(us): 8000, 99th(us): 8000
***************** properties *****************
"readproportion"="0"
"dotransactions"="false"
"mysql.host"="127.0.0.1"
"requestdistribution"="uniform"
"insertproportion"="0"
"recordcount"="20"
"threadcount"="2"
"workload"="core"
"mysql.port"="4000"
"operationcount"="0"
"mysql.db"="multi_capture_4"
"scanproportion"="0"
"updateproportion"="0"
"readallfields"="true"
"mysql.user"="root"
**********************************************
Run finished, takes 11.508881ms
INSERT - Takes(s): 0.0, Count: 20, OPS: 1929.1, Avg(us): 1014, Min(us): 546, Max(us): 1866, 95th(us): 2000, 99th(us): 2000
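The four properties blocks above are go-ycsb workload settings used to seed the multi_capture_1 through multi_capture_4 upstream databases with 20 rows each before the diff check. Assuming the pingcap/go-ycsb binary is available, a roughly equivalent standalone invocation for the first database would look like this (property names copied from the log; the workload file the test actually passes is not shown here):

    # Load 20 rows into multi_capture_1 through the upstream TiDB MySQL port.
    go-ycsb load mysql \
        -p mysql.host=127.0.0.1 -p mysql.port=4000 -p mysql.user=root \
        -p mysql.db=multi_capture_1 \
        -p workload=core -p recordcount=20 -p operationcount=0 \
        -p threadcount=2 -p dotransactions=false \
        -p readproportion=0 -p updateproportion=0 -p scanproportion=0 \
        -p insertproportion=0 -p readallfields=true \
        -p requestdistribution=uniform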
check diff failed 1-th time, retry later
table processor_stop_delay.t exists
check diff successfully
[2024/05/05 12:59:07.830 +08:00] [INFO] [main.go:178] ["73 insert success: 400"]
[2024/05/05 12:59:07.849 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:59:07.907 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:59:07.925 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:59:07.936 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:59:07.958 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:59:08.016 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:59:08.044 +08:00] [INFO] [main.go:178] ["72 insert success: 500"]
[2024/05/05 12:59:08.119 +08:00] [INFO] [main.go:178] ["73 insert success: 500"]
[2024/05/05 12:59:08.135 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:59:08.199 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
[2024/05/05 12:59:08.239 +08:00] [INFO] [main.go:178] ["72 insert success: 900"]
[2024/05/05 12:59:08.301 +08:00] [INFO] [main.go:178] ["73 insert success: 900"]
[2024/05/05 12:59:08.320 +08:00] [INFO] [main.go:178] ["72 insert success: 600"]
[2024/05/05 12:59:08.413 +08:00] [INFO] [main.go:178] ["73 insert success: 600"]
[2024/05/05 12:59:08.421 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:59:08.495 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
[2024/05/05 12:59:08.525 +08:00] [INFO] [main.go:178] ["72 insert success: 1000"]
[2024/05/05 12:59:08.589 +08:00] [INFO] [main.go:178] ["73 insert success: 1000"]
[2024/05/05 12:59:08.601 +08:00] [INFO] [main.go:178] ["72 insert success: 700"]
[2024/05/05 12:59:08.696 +08:00] [INFO] [main.go:178] ["73 insert success: 700"]
table test.t2 not exists for 3-th check, retry later
+ set +x
[2024/05/05 12:59:08.878 +08:00] [INFO] [main.go:178] ["72 insert success: 800"]
[2024/05/05 12:59:08.972 +08:00] [INFO] [main.go:178] ["73 insert success: 800"]
check_changefeed_status 127.0.0.1:8300 changefeed-error-1 warning last_warning ErrExecDDLFailed
+ endpoint=127.0.0.1:8300
+ changefeed_id=changefeed-error-1
+ expected_state=warning
+ field=last_warning
+ error_pattern=ErrExecDDLFailed
++ curl 127.0.0.1:8300/api/v2/changefeeds/changefeed-error-1/status
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   253  100   253    0     0   1942      0 --:--:-- --:--:-- --:--:--  1946
+ info='{"state":"warning","resolved_ts":449546842954006541,"checkpoint_ts":449546842954006541,"last_warning":{"time":"2024-05-05T12:59:08.807740556+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrExecDDLFailed","message":"[CDC:ErrExecDDLFailed]exec DDL failed"}}'
+ echo '{"state":"warning","resolved_ts":449546842954006541,"checkpoint_ts":449546842954006541,"last_warning":{"time":"2024-05-05T12:59:08.807740556+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrExecDDLFailed","message":"[CDC:ErrExecDDLFailed]exec DDL failed"}}'
{"state":"warning","resolved_ts":449546842954006541,"checkpoint_ts":449546842954006541,"last_warning":{"time":"2024-05-05T12:59:08.807740556+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrExecDDLFailed","message":"[CDC:ErrExecDDLFailed]exec DDL failed"}}
++ jq -r .state
++ echo '{"state":"warning","resolved_ts":449546842954006541,"checkpoint_ts":449546842954006541,"last_warning":{"time":"2024-05-05T12:59:08.807740556+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrExecDDLFailed","message":"[CDC:ErrExecDDLFailed]exec' DDL 'failed"}}'
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/region_merge/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
table ddl_manager.finish_mark not exists for 26-th check, retry later
+ set +x
+ tso='449546852130619398
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546852130619398 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_handle_key_only.cli.13237.out cli changefeed create --start-ts=449546852130619398 '--sink-uri=kafka://127.0.0.1:9092/simple-handle-key-only-14016?protocol=simple' -c simple-handle-key-only --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_handle_key_only/conf/changefeed.toml
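The tso query step above is how the test picks the --start-ts for the new changefeed: the CLI prints the current TSO, and because the coverage-instrumented binary also emits PASS/coverage lines, the harness keeps only the first field. A minimal sketch of the same capture, assuming the same PD endpoint (the topic and changefeed name below are placeholders):

    # Query the current TSO from PD and strip the trailing PASS/coverage output.
    start_ts=$(cdc cli tso query --pd=http://127.0.0.1:2379 | head -n1 | awk '{print $1}')
    cdc cli changefeed create --start-ts="${start_ts}" \
        --sink-uri='kafka://127.0.0.1:9092/example-topic?protocol=simple' -c example-changefeed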
+ state=warning
+ [[ ! warning == \w\a\r\n\i\n\g ]]
+ [[ -z last_warning ]]
++ jq -r .last_warning.message
++ echo '{"state":"warning","resolved_ts":449546842954006541,"checkpoint_ts":449546842954006541,"last_warning":{"time":"2024-05-05T12:59:08.807740556+08:00","addr":"127.0.0.1:8300","code":"CDC:ErrExecDDLFailed","message":"[CDC:ErrExecDDLFailed]exec' DDL 'failed"}}'
+ error_msg='[CDC:ErrExecDDLFailed]exec DDL failed'
+ [[ ! [CDC:ErrExecDDLFailed]exec DDL failed =~ ErrExecDDLFailed ]]
run task successfully
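check_changefeed_status above drives the v2 open API: it GETs /api/v2/changefeeds/<id>/status, asserts the state field, and then matches the expected error code inside last_warning. A condensed equivalent of that check with curl and jq (endpoint and IDs taken from the trace; the real helper lives in the test utilities):

    # Expect changefeed-error-1 to be in "warning" state with an ErrExecDDLFailed warning.
    info=$(curl -s http://127.0.0.1:8300/api/v2/changefeeds/changefeed-error-1/status)
    state=$(echo "$info" | jq -r .state)
    message=$(echo "$info" | jq -r .last_warning.message)
    [ "$state" = "warning" ] || { echo "unexpected state: $state" >&2; exit 1; }
    echo "$message" | grep -q ErrExecDDLFailed || { echo "unexpected warning: $message" >&2; exit 1; }
    echo "run task successfully"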
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.cli.6533.out cli changefeed remove -c changefeed-error-1
check diff failed 3-th time, retry later
[2024/05/05 12:59:09.585 +08:00] [INFO] [main.go:812] ["testMultiDDLs take %v47.214012642s"]
* About to connect() to 127.0.0.1 port 8088 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8088 (#0)
> GET / HTTP/1.1
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8088
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:59:09 GMT
< Content-Type: application/vnd.schemaregistry.v1+json
< Vary: Accept-Encoding, User-Agent
< Content-Length: 2
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    49  100    24  100    25    317    330 --:--:-- --:--:-- --:--:--   333
{"compatibility":"NONE"}The 1 times to try to start tidb cluster...
Create changefeed successfully!
ID: simple-handle-key-only
Info: {"upstream_id":7365375562342503600,"namespace":"default","id":"simple-handle-key-only","sink_uri":"kafka://127.0.0.1:9092/simple-handle-key-only-14016?protocol=simple","create_time":"2024-05-05T12:59:09.498239819+08:00","start_ts":449546852130619398,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"simple","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"kafka_config":{"large_message_handle":{"large_message_handle_option":"handle-key-only","large_message_handle_compression":"lz4","claim_check_storage_uri":""}},"advance_timeout":150,"send_bootstrap_interval_in_sec":0,"send_bootstrap_in_msg_count":0,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546852130619398,"checkpoint_ts":449546852130619398,"checkpoint_time":"2024-05-05 12:59:07.593"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
check diff failed 1-th time, retry later
[2024/05/05 12:59:10.095 +08:00] [INFO] [main.go:74] ["DefaultValue integration tests take 47.723885478s"]
Changefeed remove successfully.
ID: changefeed-error-1
CheckpointTs: 449546842954006541
SinkURI: kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
check diff successfully
wait process cdc.test exit for 1-th time...
table mark.finish_mark_1 exists
table mark.finish_mark_2 not exists for 1-th check, retry later
wait process cdc.test exit for 2-th time...
table ddl_manager.finish_mark not exists for 27-th check, retry later
+ set +x
check diff failed 4-th time, retry later
wait process cdc.test exit for 3-th time...
+ set +x
check diff failed 2-th time, retry later
table test.t2 not exists for 4-th check, retry later
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 12:59:11 CST 2024] <<<<<< run test case multi_capture success! >>>>>>
wait process cdc.test exit for 1-th time...
start tidb cluster in /tmp/tidb_cdc_test/region_merge
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
wait process cdc.test exit for 2-th time...
table mark.finish_mark_2 not exists for 2-th check, retry later
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
check_etcd_meta_not_exist '/tidb/cdc/default/__cdc_meta__/owner' 'owner'
+ key_prefix=/tidb/cdc/default/__cdc_meta__/owner
+ message=owner
++ etcdctl get /tidb/cdc/default/__cdc_meta__/owner --prefix --keys-only
+ info=
+ [[ '' =~ owner ]]
+ echo 'check pass'
check pass
+ exit 0
run task successfully
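check_etcd_meta_not_exist above confirms that the capture cleaned up after itself: once cdc.test has exited, no key may remain under the CDC owner prefix in etcd. A minimal equivalent of the traced helper:

    # Fail if any key is still present under the CDC owner prefix.
    prefix=/tidb/cdc/default/__cdc_meta__/owner
    keys=$(etcdctl get "$prefix" --prefix --keys-only)
    if [[ "$keys" =~ owner ]]; then
        echo "owner key still present: $keys" >&2
        exit 1
    fi
    echo "check pass"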
[Sun May  5 12:59:12 CST 2024] <<<<<< START cdc server in changefeed_error case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
+ GO_FAILPOINTS='github.com/pingcap/tiflow/pkg/txnutil/gc/InjectActualGCSafePoint=return(9223372036854775807)'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.65896591.out server --log-file /tmp/tidb_cdc_test/changefeed_error/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/changefeed_error/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
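The server started above carries a GO_FAILPOINTS environment variable; for binaries built with pingcap/failpoint enabled, this switches on the named failpoint at startup. Here InjectActualGCSafePoint=return(9223372036854775807) makes the GC safepoint read back as max int64, which is what later pushes changefeed-error-2 into the ErrSnapshotLostByGC failure shown further down. A sketch of the same start-up step (paths as in the trace, coverage flag omitted):

    # Start a cdc server with an injected GC safepoint of max int64 so that any
    # changefeed whose checkpoint-ts lies at or below it fails with ErrSnapshotLostByGC.
    export GO_FAILPOINTS='github.com/pingcap/tiflow/pkg/txnutil/gc/InjectActualGCSafePoint=return(9223372036854775807)'
    cdc.test server \
        --log-file /tmp/tidb_cdc_test/changefeed_error/cdc.log \
        --log-level debug \
        --data-dir /tmp/tidb_cdc_test/changefeed_error/cdc_data \
        --cluster-id default &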
start tidb cluster in /tmp/tidb_cdc_test/kafka_column_selector_avro
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table ddl_manager.finish_mark not exists for 28-th check, retry later
check diff failed 5-th time, retry later
check diff failed 3-th time, retry later
table mark.finish_mark_2 not exists for 3-th check, retry later
table test.t2 not exists for 5-th check, retry later
Verifying downstream PD is started...
table ddl_manager.finish_mark not exists for 29-th check, retry later
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
check diff successfully
check diff failed 4-th time, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:59:15 GMT
< Content-Length: 883
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/5be7f9bd-5a10-42d7-9866-48aa71769629
	{"id":"5be7f9bd-5a10-42d7-9866-48aa71769629","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885153}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/meta/ticdc-delete-etcd-key-count
	5

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b4ae
	5be7f9bd-5a10-42d7-9866-48aa71769629

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/5be7f9bd-5a10-42d7-9866-48aa71769629
	{"id":"5be7f9bd-5a10-42d7-9866-48aa71769629","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885153}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/meta/ticdc-delete-etcd-key-count
	5

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b4ae
	5be7f9bd-5a10-42d7-9866-48aa71769629

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/5be7f9bd-5a10-42d7-9866-48aa71769629
	{"id":"5be7f9bd-5a10-42d7-9866-48aa71769629","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885153}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/meta/ticdc-delete-etcd-key-count
	5

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b4ae
	5be7f9bd-5a10-42d7-9866-48aa71769629

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.cli.6645.out cli changefeed create --start-ts=449546842573897729 '--sink-uri=kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' -c changefeed-error-2
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_handle_key_only.cli.13296.out cli changefeed pause -c simple-handle-key-only
table test.t2 not exists for 6-th check, retry later
Create changefeed successfully!
ID: changefeed-error-2
Info: {"upstream_id":7365375408703361060,"namespace":"default","id":"changefeed-error-2","sink_uri":"kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:59:16.411651705+08:00","start_ts":449546842573897729,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546842573897729,"checkpoint_ts":449546842573897729,"checkpoint_time":"2024-05-05 12:58:31.137"}
PASS
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
table ddl_manager.finish_mark not exists for 30-th check, retry later
table mark.finish_mark_2 not exists for 4-th check, retry later
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
check diff failed 1-th time, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
check diff failed 5-th time, retry later
+ set +x
check_changefeed_state http://127.0.0.1:2379 changefeed-error-2 failed [CDC:ErrSnapshotLostByGC]
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=changefeed-error-2
+ expected_state=failed
+ error_msg='[CDC:ErrSnapshotLostByGC]'
+ tls_dir='[CDC:ErrSnapshotLostByGC]'
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c changefeed-error-2 -s
+ info='{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-error-2",
  "state": "failed",
  "checkpoint_tso": 449546842573897729,
  "checkpoint_time": "2024-05-05 12:58:31.137",
  "error": {
    "time": "2024-05-05T12:59:16.495607209+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrSnapshotLostByGC",
    "message": "[CDC:ErrSnapshotLostByGC]fail to create or maintain changefeed due to snapshot loss caused by GC. checkpoint-ts 449546842573897729 is earlier than or equal to GC safepoint at 9223372036854775807"
  }
}'
+ echo '{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-error-2",
  "state": "failed",
  "checkpoint_tso": 449546842573897729,
  "checkpoint_time": "2024-05-05 12:58:31.137",
  "error": {
    "time": "2024-05-05T12:59:16.495607209+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrSnapshotLostByGC",
    "message": "[CDC:ErrSnapshotLostByGC]fail to create or maintain changefeed due to snapshot loss caused by GC. checkpoint-ts 449546842573897729 is earlier than or equal to GC safepoint at 9223372036854775807"
  }
}'
{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-error-2",
  "state": "failed",
  "checkpoint_tso": 449546842573897729,
  "checkpoint_time": "2024-05-05 12:58:31.137",
  "error": {
    "time": "2024-05-05T12:59:16.495607209+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrSnapshotLostByGC",
    "message": "[CDC:ErrSnapshotLostByGC]fail to create or maintain changefeed due to snapshot loss caused by GC. checkpoint-ts 449546842573897729 is earlier than or equal to GC safepoint at 9223372036854775807"
  }
}
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-error-2",' '"state":' '"failed",' '"checkpoint_tso":' 449546842573897729, '"checkpoint_time":' '"2024-05-05' '12:58:31.137",' '"error":' '{' '"time":' '"2024-05-05T12:59:16.495607209+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrSnapshotLostByGC",' '"message":' '"[CDC:ErrSnapshotLostByGC]fail' to create or maintain changefeed due to snapshot loss caused by GC. checkpoint-ts 449546842573897729 is earlier than or equal to GC safepoint at '9223372036854775807"' '}' '}'
++ jq -r .state
+ state=failed
+ [[ ! failed == \f\a\i\l\e\d ]]
++ jq -r .error.message
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-error-2",' '"state":' '"failed",' '"checkpoint_tso":' 449546842573897729, '"checkpoint_time":' '"2024-05-05' '12:58:31.137",' '"error":' '{' '"time":' '"2024-05-05T12:59:16.495607209+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrSnapshotLostByGC",' '"message":' '"[CDC:ErrSnapshotLostByGC]fail' to create or maintain changefeed due to snapshot loss caused by GC. checkpoint-ts 449546842573897729 is earlier than or equal to GC safepoint at '9223372036854775807"' '}' '}'
+ message='[CDC:ErrSnapshotLostByGC]fail to create or maintain changefeed due to snapshot loss caused by GC. checkpoint-ts 449546842573897729 is earlier than or equal to GC safepoint at 9223372036854775807'
+ [[ ! [CDC:ErrSnapshotLostByGC]fail to create or maintain changefeed due to snapshot loss caused by GC. checkpoint-ts 449546842573897729 is earlier than or equal to GC safepoint at 9223372036854775807 =~ \[CDC:ErrSnapshotLostByGC] ]]
run task successfully
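check_changefeed_state above performs the same kind of assertion through the CLI instead of the HTTP API: cdc cli changefeed query -s returns a compact JSON summary, and the helper checks both the state and the error message pattern. A condensed equivalent for the failed changefeed-error-2 case:

    # Expect changefeed-error-2 to have failed with an ErrSnapshotLostByGC error.
    info=$(cdc cli changefeed query --pd=http://127.0.0.1:2379 -c changefeed-error-2 -s)
    state=$(echo "$info" | jq -r .state)
    message=$(echo "$info" | jq -r .error.message)
    [ "$state" = "failed" ] || { echo "unexpected state: $state" >&2; exit 1; }
    [[ "$message" =~ ErrSnapshotLostByGC ]] || { echo "unexpected error: $message" >&2; exit 1; }
    echo "run task successfully"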
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.cli.6719.out cli changefeed remove -c changefeed-error-2
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.t2 exists
check diff successfully
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_handle_key_only.cli.13332.out cli changefeed update -c simple-handle-key-only '--sink-uri=kafka://127.0.0.1:9092/simple-handle-key-only-14016?protocol=simple&max-message-bytes=700' --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_handle_key_only/conf/changefeed.toml --no-confirm
table ddl_manager.finish_mark not exists for 31-th check, retry later
table mark.finish_mark_2 not exists for 5-th check, retry later
wait process cdc.test exit for 1-th time...
check diff failed 2-th time, retry later
Diff of changefeed config:
{Type:update Path:[SinkURI] From:kafka://127.0.0.1:9092/simple-handle-key-only-14016?protocol=simple To:kafka://127.0.0.1:9092/simple-handle-key-only-14016?protocol=simple&max-message-bytes=700}
{Type:update Path:[Config SyncPointInterval] From:<nil> To:0xc0038ac898}
{Type:update Path:[Config SyncPointRetention] From:<nil> To:0xc0038ac8a8}
{Type:update Path:[Config Consistent] From:<nil> To:0xc001267a40}
Update changefeed config successfully! 
ID: simple-handle-key-only
Info: {"upstream_id":7365375562342503600,"namespace":"default","id":"simple-handle-key-only","sink_uri":"kafka://127.0.0.1:9092/simple-handle-key-only-14016?protocol=simple\u0026max-message-bytes=700","create_time":"2024-05-05T12:59:09.498239819+08:00","start_ts":449546852130619398,"admin_job_type":1,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_table_monitor":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"simple","encoder_concurrency":32,"terminator":"\r\n","enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"kafka_config":{"large_message_handle":{"large_message_handle_option":"handle-key-only","large_message_handle_compression":"lz4","claim_check_storage_uri":""}},"advance_timeout":150,"send_bootstrap_interval_in_sec":0,"send_bootstrap_in_msg_count":0,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"stopped","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":0,"checkpoint_ts":449546854332628995,"checkpoint_time":"2024-05-05 12:59:15.993"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
Changefeed remove successfully.
ID: changefeed-error-2
CheckpointTs: 449546842573897729
SinkURI: kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
wait process cdc.test exit for 2-th time...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
wait process cdc.test exit for 3-th time...
check diff failed 6-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 12:59:20 CST 2024] <<<<<< run test case resolve_lock success! >>>>>>
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_handle_key_only.cli.13362.out cli changefeed resume -c simple-handle-key-only
+ set +x
table mark.finish_mark_2 not exists for 6-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/batch_add_table/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
wait process cdc.test exit for 1-th time...
check diff failed 3-th time, retry later
PASS
coverage: 2.1% of statements in github.com/pingcap/tiflow/...
table ddl_manager.finish_mark not exists for 32-th check, retry later
wait process cdc.test exit for 2-th time...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:59:21 CST 2024] <<<<<< START cdc server in changefeed_error case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/redo/ChangefeedNewRedoManagerError=2*return(true)'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.67626764.out server --log-file /tmp/tidb_cdc_test/changefeed_error/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/changefeed_error/cdc_data --cluster-id default
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
check diff successfully
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
table test.finish_mark not exists for 1-th check, retry later
table mark.finish_mark_2 not exists for 7-th check, retry later
wait process cdc.test exit for 1-th time...
table ddl_manager.finish_mark not exists for 33-th check, retry later
wait process cdc.test exit for 2-th time...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7bd23c0012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:30926, start at 2024-05-05 12:59:22.1563736 +0800 CST m=+5.191054059	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:22.163 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:22.127 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:22.127 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
start tidb cluster in /tmp/tidb_cdc_test/batch_add_table
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
check diff failed 4-th time, retry later
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:59:23 CST 2024] <<<<<< run test case processor_stop_delay success! >>>>>>
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/run.sh using Sink-Type: kafka... <<=================
+++ dirname /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/run.sh
++ cd /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo
++ pwd
+ CUR=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo
+ source /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/../_utils/test_prepare
++ UP_TIDB_HOST=127.0.0.1
++ UP_TIDB_PORT=4000
++ UP_TIDB_OTHER_PORT=4001
++ UP_TIDB_STATUS=10080
++ UP_TIDB_OTHER_STATUS=10081
++ DOWN_TIDB_HOST=127.0.0.1
++ DOWN_TIDB_PORT=3306
++ DOWN_TIDB_STATUS=20080
++ TLS_TIDB_HOST=127.0.0.1
++ TLS_TIDB_PORT=3307
++ TLS_TIDB_STATUS=30080
++ UP_PD_HOST_1=127.0.0.1
++ UP_PD_PORT_1=2379
++ UP_PD_PEER_PORT_1=2380
++ UP_PD_HOST_2=127.0.0.1
++ UP_PD_PORT_2=2679
++ UP_PD_PEER_PORT_2=2680
++ UP_PD_HOST_3=127.0.0.1
++ UP_PD_PORT_3=2779
++ UP_PD_PEER_PORT_3=2780
++ DOWN_PD_HOST=127.0.0.1
++ DOWN_PD_PORT=2479
++ DOWN_PD_PEER_PORT=2480
++ TLS_PD_HOST=127.0.0.1
++ TLS_PD_PORT=2579
++ TLS_PD_PEER_PORT=2580
++ UP_TIKV_HOST_1=127.0.0.1
++ UP_TIKV_PORT_1=20160
++ UP_TIKV_STATUS_PORT_1=20181
++ UP_TIKV_HOST_2=127.0.0.1
++ UP_TIKV_PORT_2=20161
++ UP_TIKV_STATUS_PORT_2=20182
++ UP_TIKV_HOST_3=127.0.0.1
++ UP_TIKV_PORT_3=20162
++ UP_TIKV_STATUS_PORT_3=20183
++ DOWN_TIKV_HOST=127.0.0.1
++ DOWN_TIKV_PORT=21160
++ DOWN_TIKV_STATUS_PORT=21180
++ TLS_TIKV_HOST=127.0.0.1
++ TLS_TIKV_PORT=22160
++ TLS_TIKV_STATUS_PORT=22180
+++ cat /tmp/tidb_cdc_test/KAFKA_VERSION
+++ echo 2.4.1
++ KAFKA_VERSION=2.4.1
+ WORK_DIR=/tmp/tidb_cdc_test/synced_status_with_redo
+ CDC_BINARY=cdc.test
+ SINK_TYPE=kafka
+ CDC_COUNT=3
+ DB_COUNT=4
+ trap stop_tidb_cluster EXIT
+ run_normal_case_and_unavailable_pd conf/changefeed-redo.toml
+ rm -rf /tmp/tidb_cdc_test/synced_status_with_redo
+ mkdir -p /tmp/tidb_cdc_test/synced_status_with_redo
+ start_tidb_cluster --workdir /tmp/tidb_cdc_test/synced_status_with_redo
The 1 times to try to start tidb cluster...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7be4540010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5, pid:4145, start at 2024-05-05 12:59:23.300206533 +0800 CST m=+5.382999061	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:23.307 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:23.285 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:23.285 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 2-th check, retry later
table ddl_manager.finish_mark not exists for 34-th check, retry later
table mark.finish_mark_2 not exists for 8-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7bd23c0012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:30926, start at 2024-05-05 12:59:22.1563736 +0800 CST m=+5.191054059	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:22.163 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:22.127 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:22.127 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7bd2f40016	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:31011, start at 2024-05-05 12:59:22.206848101 +0800 CST m=+5.191025935	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:22.214 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:22.173 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:22.173 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kafka_column_selector_avro/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kafka_column_selector_avro/tiflash/log/error.log
arg matches is ArgMatches { args: {"pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kafka_column_selector_avro/tiflash/db/proxy"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kafka_column_selector_avro/tiflash/log/proxy.log"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kafka_column_selector_avro/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:59:24 GMT
< Content-Length: 883
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/176cce97-1d5a-4536-ad25-fd9e30b81117
	{"id":"176cce97-1d5a-4536-ad25-fd9e30b81117","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885162}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/meta/ticdc-delete-etcd-key-count
	7

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b50a
	176cce97-1d5a-4536-ad25-fd9e30b81117

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/176cce97-1d5a-4536-ad25-fd9e30b81117
	{"id":"176cce97-1d5a-4536-ad25-fd9e30b81117","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885162}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/meta/ticdc-delete-etcd-key-count
	7

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b50a
	176cce97-1d5a-4536-ad25-fd9e30b81117

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/176cce97-1d5a-4536-ad25-fd9e30b81117
	{"id":"176cce97-1d5a-4536-ad25-fd9e30b81117","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885162}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/meta/ticdc-delete-etcd-key-count
	7

/tidb/cdc/default/__cdc_meta__/owner/22318f471df5b50a
	176cce97-1d5a-4536-ad25-fd9e30b81117

/tidb/cdc/default/default/upstream/7365375408703361060
	{"id":7365375408703361060,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
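
Condensed from the probe trace above: the harness loops curl against the cdc server's /debug/info endpoint until the response contains the etcd info dump (a sketch; variable names and limits follow the trace, error handling simplified):

# Sketch of the /debug/info readiness loop shown in the trace above.
get_info_fail_msg='failed to get info:'
etcd_info_msg='etcd info'
curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret'
for (( i = 0; i <= 50; i++ )); do
    res=$($curl_status_cmd)
    if ! echo "$res" | grep -q "$get_info_fail_msg" && echo "$res" | grep -q "$etcd_info_msg"; then
        break                                  # owner/processors/etcd info returned: server is up
    fi
    if (( i == 50 )); then exit 1; fi
    sleep 3
done
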
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.cli.6813.out cli changefeed create --start-ts=0 '--sink-uri=kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' -c changefeed-initialize-error
check diff failed 5-th time, retry later
Create changefeed successfully!
ID: changefeed-initialize-error
Info: {"upstream_id":7365375408703361060,"namespace":"default","id":"changefeed-initialize-error","sink_uri":"kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:59:25.418236227+08:00","start_ts":449546856769257474,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546856769257474,"checkpoint_ts":449546856769257474,"checkpoint_time":"2024-05-05 12:59:25.288"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
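
For readability, the create call issued above has this shape: the Kafka sink is addressed entirely through the --sink-uri query string (protocol, partition-num, kafka-version, max-message-bytes), and the changefeed id is set with -c (coverage flag omitted from this sketch):

# The changefeed create command above, re-wrapped for readability (same flags and sink URI).
cdc.test cli changefeed create \
    --start-ts=0 \
    '--sink-uri=kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' \
    -c changefeed-initialize-error
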
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7be4540010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5, pid:4145, start at 2024-05-05 12:59:23.300206533 +0800 CST m=+5.382999061	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:23.307 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:23.285 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:23.285 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7be4200015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5, pid:4226, start at 2024-05-05 12:59:23.311793423 +0800 CST m=+5.337967706	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:23.320 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:23.323 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:23.323 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/region_merge/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/region_merge/tiflash/log/error.log
arg matches is ArgMatches { args: {"log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/region_merge/tiflash/log/proxy.log"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/region_merge/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/region_merge/tiflash/db/proxy"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
start tidb cluster in /tmp/tidb_cdc_test/synced_status_with_redo
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
+ set +x
check_changefeed_state http://127.0.0.1:2379 changefeed-initialize-error normal null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=changefeed-initialize-error
+ expected_state=normal
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c changefeed-initialize-error -s
table ddl_manager.finish_mark not exists for 35-th check, retry later
table mark.finish_mark_2 not exists for 9-th check, retry later
table test.finish_mark exists
check diff successfully
+ info='{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-initialize-error",
  "state": "warning",
  "checkpoint_tso": 449546856769257474,
  "checkpoint_time": "2024-05-05 12:59:25.288",
  "error": {
    "time": "2024-05-05T12:59:25.600447619+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrProcessorUnknown",
    "message": "changefeed new redo manager injected error"
  }
}'
+ echo '{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-initialize-error",
  "state": "warning",
  "checkpoint_tso": 449546856769257474,
  "checkpoint_time": "2024-05-05 12:59:25.288",
  "error": {
    "time": "2024-05-05T12:59:25.600447619+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrProcessorUnknown",
    "message": "changefeed new redo manager injected error"
  }
}'
{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-initialize-error",
  "state": "warning",
  "checkpoint_tso": 449546856769257474,
  "checkpoint_time": "2024-05-05 12:59:25.288",
  "error": {
    "time": "2024-05-05T12:59:25.600447619+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrProcessorUnknown",
    "message": "changefeed new redo manager injected error"
  }
}
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-initialize-error",' '"state":' '"warning",' '"checkpoint_tso":' 449546856769257474, '"checkpoint_time":' '"2024-05-05' '12:59:25.288",' '"error":' '{' '"time":' '"2024-05-05T12:59:25.600447619+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrProcessorUnknown",' '"message":' '"changefeed' new redo manager injected 'error"' '}' '}'
++ jq -r .state
+ state=warning
+ [[ ! warning == \n\o\r\m\a\l ]]
+ echo 'changefeed state warning does not equal to normal'
changefeed state warning does not equal to normal
+ exit 1
run task failed 1-th time, retry later
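
The failing task above is the check_changefeed_state helper; its core, reconstructed from the trace (query the changefeed with cdc cli, then compare .state and .error.message via jq; the tls_dir argument is omitted in this sketch):

# Sketch of check_changefeed_state as it appears in the trace above.
check_changefeed_state() {
    local endpoints=$1 changefeed_id=$2 expected_state=$3 error_msg=$4
    local info state message
    info=$(cdc cli changefeed query --pd="$endpoints" -c "$changefeed_id" -s)
    state=$(echo "$info" | jq -r .state)
    if [[ ! "$state" == "$expected_state" ]]; then
        echo "changefeed state $state does not equal to $expected_state"
        return 1
    fi
    message=$(echo "$info" | jq -r .error.message)
    if [[ ! "$message" =~ $error_msg ]]; then
        echo "error message $message does not match $error_msg"
        return 1
    fi
}
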
check diff successfully
wait process cdc.test exit for 1-th time...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
wait process cdc.test exit for 2-th time...
[Sun May  5 12:59:27 CST 2024] <<<<<< START cdc server in kafka_column_selector_avro case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_column_selector_avro.3237132373.out server --log-file /tmp/tidb_cdc_test/kafka_column_selector_avro/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kafka_column_selector_avro/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
wait process cdc.test exit for 3-th time...
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 12:59:28 CST 2024] <<<<<< run test case kafka_simple_handle_key_only success! >>>>>>
table ddl_manager.finish_mark not exists for 36-th check, retry later
[Sun May  5 12:59:28 CST 2024] <<<<<< START cdc server in region_merge case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.region_merge.56875689.out server --log-file /tmp/tidb_cdc_test/region_merge/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/region_merge/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table mark.finish_mark_2 not exists for 10-th check, retry later
check diff failed 1-th time, retry later
Verifying downstream PD is started...
check_changefeed_state http://127.0.0.1:2379 changefeed-initialize-error normal null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=changefeed-initialize-error
+ expected_state=normal
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c changefeed-initialize-error -s
+ info='{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-initialize-error",
  "state": "normal",
  "checkpoint_tso": 449546857634332676,
  "checkpoint_time": "2024-05-05 12:59:28.588",
  "error": null
}'
+ echo '{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-initialize-error",
  "state": "normal",
  "checkpoint_tso": 449546857634332676,
  "checkpoint_time": "2024-05-05 12:59:28.588",
  "error": null
}'
{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-initialize-error",
  "state": "normal",
  "checkpoint_tso": 449546857634332676,
  "checkpoint_time": "2024-05-05 12:59:28.588",
  "error": null
}
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-initialize-error",' '"state":' '"normal",' '"checkpoint_tso":' 449546857634332676, '"checkpoint_time":' '"2024-05-05' '12:59:28.588",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-initialize-error",' '"state":' '"normal",' '"checkpoint_tso":' 449546857634332676, '"checkpoint_time":' '"2024-05-05' '12:59:28.588",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
run task successfully
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.cli.6949.out cli changefeed pause -c changefeed-initialize-error
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table ddl_manager.finish_mark not exists for 37-th check, retry later
table mark.finish_mark_2 not exists for 11-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:59:30 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c145c8bd-21ef-405d-bbd5-a393f2c60acd
	{"id":"c145c8bd-21ef-405d-bbd5-a393f2c60acd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885167}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471ed069ce
	c145c8bd-21ef-405d-bbd5-a393f2c60acd

/tidb/cdc/default/default/upstream/7365375651063177701
	{"id":7365375651063177701,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c145c8bd-21ef-405d-bbd5-a393f2c60acd
	{"id":"c145c8bd-21ef-405d-bbd5-a393f2c60acd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885167}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471ed069ce
	c145c8bd-21ef-405d-bbd5-a393f2c60acd

/tidb/cdc/default/default/upstream/7365375651063177701
	{"id":7365375651063177701,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/c145c8bd-21ef-405d-bbd5-a393f2c60acd
	{"id":"c145c8bd-21ef-405d-bbd5-a393f2c60acd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885167}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471ed069ce
	c145c8bd-21ef-405d-bbd5-a393f2c60acd

/tidb/cdc/default/default/upstream/7365375651063177701
	{"id":7365375651063177701,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_column_selector_avro.cli.32432.out cli changefeed create --start-ts=449546857316876289 '--sink-uri=kafka://127.0.0.1:9092/column-selector-avro-test?protocol=avro&enable-tidb-extension=true&avro-enable-watermark=true' -c test --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_column_selector_avro/conf/changefeed.toml --schema-registry=http://127.0.0.1:8088
Create changefeed successfully!
ID: test
Info: {"upstream_id":7365375651063177701,"namespace":"default","id":"test","sink_uri":"kafka://127.0.0.1:9092/column-selector-avro-test?protocol=avro\u0026enable-tidb-extension=true\u0026avro-enable-watermark=true","create_time":"2024-05-05T12:59:30.95294567+08:00","start_ts":449546857316876289,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"avro","schema_registry":"http://127.0.0.1:8088","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"column_selectors":[{"matcher":["test.*"],"columns":["*","!b"]}],"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546857316876289,"checkpoint_ts":449546857316876289,"checkpoint_time":"2024-05-05 12:59:27.377"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
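
The column_selectors entry in the changefeed info above is driven by the --config file passed on the command line; a sketch of the matching changefeed.toml section, assuming TiCDC's [sink] column-selectors TOML layout (values mirror the JSON printed by the CLI), written here as a heredoc:

# Sketch of the column selector config behind the "test" changefeed above.
cat > changefeed.toml <<'EOF'
[sink]
[[sink.column-selectors]]
matcher = ["test.*"]
columns = ["*", "!b"]
EOF
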
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
+ set +x
check_changefeed_state http://127.0.0.1:2379 changefeed-initialize-error stopped changefeed new redo manager injected error
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=changefeed-initialize-error
+ expected_state=stopped
+ error_msg=changefeed
+ tls_dir=error
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c changefeed-initialize-error -s
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
+ info='{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-initialize-error",
  "state": "stopped",
  "checkpoint_tso": 449546857634332680,
  "checkpoint_time": "2024-05-05 12:59:28.588",
  "error": {
    "time": "2024-05-05T12:59:25.600447619+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrProcessorUnknown",
    "message": "changefeed new redo manager injected error"
  }
}'
+ echo '{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-initialize-error",
  "state": "stopped",
  "checkpoint_tso": 449546857634332680,
  "checkpoint_time": "2024-05-05 12:59:28.588",
  "error": {
    "time": "2024-05-05T12:59:25.600447619+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrProcessorUnknown",
    "message": "changefeed new redo manager injected error"
  }
}'
{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-initialize-error",
  "state": "stopped",
  "checkpoint_tso": 449546857634332680,
  "checkpoint_time": "2024-05-05 12:59:28.588",
  "error": {
    "time": "2024-05-05T12:59:25.600447619+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrProcessorUnknown",
    "message": "changefeed new redo manager injected error"
  }
}
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-initialize-error",' '"state":' '"stopped",' '"checkpoint_tso":' 449546857634332680, '"checkpoint_time":' '"2024-05-05' '12:59:28.588",' '"error":' '{' '"time":' '"2024-05-05T12:59:25.600447619+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrProcessorUnknown",' '"message":' '"changefeed' new redo manager injected 'error"' '}' '}'
++ jq -r .state
+ state=stopped
+ [[ ! stopped == \s\t\o\p\p\e\d ]]
++ jq -r .error.message
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-initialize-error",' '"state":' '"stopped",' '"checkpoint_tso":' 449546857634332680, '"checkpoint_time":' '"2024-05-05' '12:59:28.588",' '"error":' '{' '"time":' '"2024-05-05T12:59:25.600447619+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrProcessorUnknown",' '"message":' '"changefeed' new redo manager injected 'error"' '}' '}'
+ message='changefeed new redo manager injected error'
+ [[ ! changefeed new redo manager injected error =~ changefeed ]]
run task successfully
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.cli.7029.out cli changefeed resume -c changefeed-initialize-error
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:59:31 GMT
< Content-Length: 859
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/29b16579-d3e4-4347-9864-a4fb6285da0b
	{"id":"29b16579-d3e4-4347-9864-a4fb6285da0b","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885168}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471ed3ef3f
	29b16579-d3e4-4347-9864-a4fb6285da0b

/tidb/cdc/default/default/upstream/7365375657000692638
	{"id":7365375657000692638,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2779,http://127.0.0.1:2679,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/29b16579-d3e4-4347-9864-a4fb6285da0b
	{"id":"29b16579-d3e4-4347-9864-a4fb6285da0b","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885168}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471ed3ef3f
	29b16579-d3e4-4347-9864-a4fb6285da0b

/tidb/cdc/default/default/upstream/7365375657000692638
	{"id":7365375657000692638,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2779,http://127.0.0.1:2679,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/29b16579-d3e4-4347-9864-a4fb6285da0b
	{"id":"29b16579-d3e4-4347-9864-a4fb6285da0b","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885168}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471ed3ef3f
	29b16579-d3e4-4347-9864-a4fb6285da0b

/tidb/cdc/default/default/upstream/7365375657000692638
	{"id":7365375657000692638,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2779,http://127.0.0.1:2679,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
check diff failed 2-th time, retry later
Create changefeed successfully!
ID: 528c3122-f988-4738-8058-b466df85e835
Info: {"upstream_id":7365375657000692638,"namespace":"default","id":"528c3122-f988-4738-8058-b466df85e835","sink_uri":"kafka://127.0.0.1:9092/ticdc-region-merge-test-10848?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:59:31.570919764+08:00","start_ts":449546858367811589,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546858367811589,"checkpoint_ts":449546858367811589,"checkpoint_time":"2024-05-05 12:59:31.386"}
[Sun May  5 12:59:31 CST 2024] <<<<<< START kafka consumer in region_merge case >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
split_and_random_merge scale: 20
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/move_table/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
PASS
coverage: 2.1% of statements in github.com/pingcap/tiflow/...
+ set +x
[Sun May  5 12:59:32 CST 2024] <<<<<< START kafka consumer in kafka_column_selector_avro case >>>>>>
consumer replica config found: /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_column_selector_avro/conf/changefeed.toml
schema registry uri found: http://127.0.0.1:8088
table mark.finish_mark_2 exists
table mark.finish_mark_3 not exists for 1-th check, retry later
Starting build checksum checker...
table test.finishmark not exists for 1-th check, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table ddl_manager.finish_mark not exists for 38-th check, retry later
check diff failed 3-th time, retry later
+ set +x
check_changefeed_state http://127.0.0.1:2379 changefeed-initialize-error normal null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=changefeed-initialize-error
+ expected_state=normal
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c changefeed-initialize-error -s
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ info='{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-initialize-error",
  "state": "normal",
  "checkpoint_tso": 449546857634332680,
  "checkpoint_time": "2024-05-05 12:59:28.588",
  "error": null
}'
+ echo '{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-initialize-error",
  "state": "normal",
  "checkpoint_tso": 449546857634332680,
  "checkpoint_time": "2024-05-05 12:59:28.588",
  "error": null
}'
{
  "upstream_id": 7365375408703361060,
  "namespace": "default",
  "id": "changefeed-initialize-error",
  "state": "normal",
  "checkpoint_tso": 449546857634332680,
  "checkpoint_time": "2024-05-05 12:59:28.588",
  "error": null
}
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-initialize-error",' '"state":' '"normal",' '"checkpoint_tso":' 449546857634332680, '"checkpoint_time":' '"2024-05-05' '12:59:28.588",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365375408703361060, '"namespace":' '"default",' '"id":' '"changefeed-initialize-error",' '"state":' '"normal",' '"checkpoint_tso":' 449546857634332680, '"checkpoint_time":' '"2024-05-05' '12:59:28.588",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
run task successfully
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_error.cli.7122.out cli changefeed remove -c changefeed-initialize-error
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Changefeed remove successfully.
ID: changefeed-initialize-error
CheckpointTs: 449546858944790533
SinkURI: kafka://127.0.0.1:9092/ticdc-sink-retry-test-27469?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
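
Together with the earlier create call, this block walks the changefeed through its full CLI lifecycle; condensed from the cdc.test invocations in the trace (coverage flags omitted):

# Lifecycle exercised by changefeed_error, condensed from the trace.
cdc cli changefeed pause  -c changefeed-initialize-error
cdc cli changefeed query  --pd=http://127.0.0.1:2379 -c changefeed-initialize-error -s   # expect "state": "stopped"
cdc cli changefeed resume -c changefeed-initialize-error
cdc cli changefeed query  --pd=http://127.0.0.1:2379 -c changefeed-initialize-error -s   # expect "state": "normal"
cdc cli changefeed remove -c changefeed-initialize-error
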
table mark.finish_mark_3 not exists for 2-th check, retry later
table ddl_manager.finish_mark not exists for 39-th check, retry later
check diff failed 4-th time, retry later
table test.finishmark not exists for 2-th check, retry later
start tidb cluster in /tmp/tidb_cdc_test/move_table
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7c9508001d	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj, pid:4364, start at 2024-05-05 12:59:34.635179863 +0800 CST m=+5.153289252	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:34.644 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:34.644 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:34.644 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7c9508001d	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj, pid:4364, start at 2024-05-05 12:59:34.635179863 +0800 CST m=+5.153289252	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:34.644 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:34.644 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:34.644 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7c9e400014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj, pid:4453, start at 2024-05-05 12:59:35.206590827 +0800 CST m=+5.675560941	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:35.213 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:35.184 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:35.184 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/batch_add_table/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/batch_add_table/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/batch_add_table/tiflash/log/proxy.log"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/batch_add_table/tiflash/db/proxy"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/batch_add_table/tiflash-proxy.toml"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ set +x
wait process cdc.test exit for 1-th time...
table mark.finish_mark_3 not exists for 3-th check, retry later
table ddl_manager.finish_mark not exists for 40-th check, retry later
check diff failed 5-th time, retry later
table test.finishmark exists
[2024/05/05 12:59:36.939 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test.finishmark] [checkSum=0]
[2024/05/05 12:59:36.941 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test.t1] [checkSum=718014124]
[2024/05/05 12:59:36.941 +08:00] [INFO] [main.go:107] ["get checksum for the upstream success"] [elapsed=8.37013ms]
[2024/05/05 12:59:36.947 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test.finishmark] [checkSum=0]
[2024/05/05 12:59:36.948 +08:00] [INFO] [main.go:186] ["do checkSum success"] [table=test.t1] [checkSum=718014124]
[2024/05/05 12:59:36.949 +08:00] [INFO] [main.go:116] ["get checksum for the downstream success"] [elapsed=7.315849ms]
[2024/05/05 12:59:36.949 +08:00] [INFO] [main.go:95] ["compare checksum passed"]
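
The lines above show the checksum checker computing a per-table checksum on the upstream and the downstream and comparing them; a rough shell equivalent, assuming ADMIN CHECKSUM TABLE and illustrative ports (the real checker is the Go program logging as main.go, whose checksum algorithm may differ):

# Rough equivalent of the upstream/downstream checksum comparison above (assumptions noted).
up=$(mysql -h 127.0.0.1 -P 4000 -u root -Nse 'ADMIN CHECKSUM TABLE test.t1' | awk '{print $3}')
down=$(mysql -h 127.0.0.1 -P 3306 -u root -Nse 'ADMIN CHECKSUM TABLE test.t1' | awk '{print $3}')
if [ "$up" = "$down" ]; then echo "compare checksum passed"; else echo "checksum mismatch: $up vs $down"; fi
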
wait process cdc.test exit for 2-th time...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
wait process cdc.test exit for 1-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:59:37 CST 2024] <<<<<< run test case changefeed_error success! >>>>>>
wait process cdc.test exit for 2-th time...
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/partition_table/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
*************************** 1. row ***************************
count(distinct region_id): 1
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:59:38 CST 2024] <<<<<< run test case kafka_column_selector_avro success! >>>>>>
table mark.finish_mark_3 not exists for 4-th check, retry later
table ddl_manager.finish_mark not exists for 41-th check, retry later
[Sun May  5 12:59:38 CST 2024] <<<<<< START cdc server in batch_add_table case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.batch_add_table.59255927.out server --log-file /tmp/tidb_cdc_test/batch_add_table/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/batch_add_table/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7cca08000f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:10853, start at 2024-05-05 12:59:38.00284446 +0800 CST m=+5.201911611	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:38.010 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:37.986 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:37.986 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff successfully
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
start tidb cluster in /tmp/tidb_cdc_test/partition_table
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table mark.finish_mark_3 not exists for 5-th check, retry later
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_handle_key_only/run.sh: line 1: 13405 Killed                  cdc_kafka_consumer --upstream-uri $SINK_URI --downstream-uri="mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" --upstream-tidb-dsn="root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" --config="$CUR/conf/changefeed.toml" 2>&1
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_handle_key_only_avro/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
table ddl_manager.finish_mark not exists for 42-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7cca08000f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:10853, start at 2024-05-05 12:59:38.00284446 +0800 CST m=+5.201911611	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:38.010 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:37.986 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:37.986 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7ccb700007	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:10933, start at 2024-05-05 12:59:38.08127657 +0800 CST m=+5.226906377	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:38.088 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:38.076 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:38.076 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/error.log
arg matches is ArgMatches { args: {"data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash/db/proxy"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/proxy.log"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
check diff failed 1-th time, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
Verifying downstream PD is started...
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:59:41 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4e9d5081-1727-4aab-a4cc-4682b921d695
	{"id":"4e9d5081-1727-4aab-a4cc-4682b921d695","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885179}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471ef97e13
	4e9d5081-1727-4aab-a4cc-4682b921d695

/tidb/cdc/default/default/upstream/7365375708308749907
	{"id":7365375708308749907,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4e9d5081-1727-4aab-a4cc-4682b921d695
	{"id":"4e9d5081-1727-4aab-a4cc-4682b921d695","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885179}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471ef97e13
	4e9d5081-1727-4aab-a4cc-4682b921d695

/tidb/cdc/default/default/upstream/7365375708308749907
	{"id":7365375708308749907,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4e9d5081-1727-4aab-a4cc-4682b921d695
	{"id":"4e9d5081-1727-4aab-a4cc-4682b921d695","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885179}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471ef97e13
	4e9d5081-1727-4aab-a4cc-4682b921d695

/tidb/cdc/default/default/upstream/7365375708308749907
	{"id":7365375708308749907,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.batch_add_table.cli.5986.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-batch-add-table-test-30677?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
Create changefeed successfully!
ID: 0af9b17f-afb8-4180-8a1a-1804a0b42247
Info: {"upstream_id":7365375708308749907,"namespace":"default","id":"0af9b17f-afb8-4180-8a1a-1804a0b42247","sink_uri":"kafka://127.0.0.1:9092/ticdc-batch-add-table-test-30677?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:59:42.279928259+08:00","start_ts":449546861187956739,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546861187956739,"checkpoint_ts":449546861187956739,"checkpoint_time":"2024-05-05 12:59:42.144"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
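The `cli changefeed create` call a few lines up is how these cases wire the upstream cluster into Kafka: everything after the topic name in the sink URI is a query parameter (encoding protocol, partition count, broker version, message size cap). Stripped of the coverage instrumentation, the same command has roughly this shape (topic and changefeed id here are placeholders, not values from this run):

# sketch: create a Kafka changefeed using the open-protocol encoder
cdc cli changefeed create \
  --sink-uri="kafka://127.0.0.1:9092/example-topic?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760" \
  --changefeed-id="example-changefeed"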
table mark.finish_mark_3 not exists for 6-th check, retry later
split_and_random_merge scale: 40
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
+ cd /tmp/tidb_cdc_test/synced_status_with_redo
++ run_cdc_cli_tso_query 127.0.0.1 2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.cli.12274.out cli tso query --pd=http://127.0.0.1:2379
start tidb cluster in /tmp/tidb_cdc_test/kafka_simple_handle_key_only_avro
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table ddl_manager.finish_mark not exists for 43-th check, retry later
check diff failed 2-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
[Sun May  5 12:59:43 CST 2024] <<<<<< START kafka consumer in batch_add_table case >>>>>>
table batch_add_table.finish_mark not exists for 1-th check, retry later
Starting Upstream TiDB...
+ set +x
+ tso='449546861526646785
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546861526646785 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ start_ts=449546861526646785
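The `tso query` above asks PD for a current timestamp oracle value; the script keeps only the first field of the output (the instrumented binary also prints PASS/coverage lines) and later passes it as `--start-ts`, so the changefeed replicates from exactly that point. Condensed, the pattern is roughly:

# sketch: capture a PD TSO and use it as the changefeed start point
start_ts=$(cdc cli tso query --pd=http://127.0.0.1:2379 | awk 'NR==1 {print $1}')
echo "using start_ts=$start_ts"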
+ run_cdc_server --workdir /tmp/tidb_cdc_test/synced_status_with_redo --binary cdc.test
[Sun May  5 12:59:44 CST 2024] <<<<<< START cdc server in synced_status_with_redo case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.1230612308.out server --log-file /tmp/tidb_cdc_test/synced_status_with_redo/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/synced_status_with_redo/cdc_data --cluster-id default
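The command above launches the coverage-instrumented cdc binary in server mode for this case. Without the test instrumentation, a plain TiCDC server start looks roughly like the following sketch (the PD address is assumed to be the local one used throughout this run):

# sketch: start a TiCDC server pointing at the local PD
cdc server \
  --pd=http://127.0.0.1:2379 \
  --addr=127.0.0.1:8300 \
  --log-file=cdc.log --log-level=debug \
  --data-dir=./cdc_data \
  --cluster-id=default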
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
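The trace above is one iteration of the server readiness loop: the script curls /debug/info with basic auth, treats an empty response (connection refused) as "not ready yet", and retries up to 50 times with a 3-second sleep. In isolation the loop looks roughly like this sketch, not the harness helper itself:

# sketch: wait until the cdc server serves /debug/info (up to 50 tries, 3s apart)
for i in $(seq 1 50); do
  if curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret 2>&1 | grep -q 'etcd info'; then
    echo "cdc server is ready"
    break
  fi
  sleep 3
done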
table mark.finish_mark_3 not exists for 7-th check, retry later
table ddl_manager.finish_mark not exists for 44-th check, retry later
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff failed 3-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table batch_add_table.finish_mark not exists for 2-th check, retry later
table mark.finish_mark_3 not exists for 8-th check, retry later
table ddl_manager.finish_mark not exists for 45-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
check diff failed 4-th time, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7d4d4c0013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7, pid:23748, start at 2024-05-05 12:59:46.404418775 +0800 CST m=+5.126984288	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:46.413 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:46.387 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:46.387 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7d4d4c0013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7, pid:23748, start at 2024-05-05 12:59:46.404418775 +0800 CST m=+5.126984288	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:46.413 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:46.387 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:46.387 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7d4d280014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7, pid:23831, start at 2024-05-05 12:59:46.405020802 +0800 CST m=+5.079218244	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:46.414 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:46.378 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:46.378 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/move_table/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/move_table/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/move_table/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/move_table/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/move_table/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table batch_add_table.finish_mark exists
check diff successfully
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:59:47 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/3a31c315-04ab-4008-823f-6803a54d4970
	{"id":"3a31c315-04ab-4008-823f-6803a54d4970","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885185}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f0659d6
	3a31c315-04ab-4008-823f-6803a54d4970

/tidb/cdc/default/default/upstream/7365375715196792913
	{"id":7365375715196792913,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/3a31c315-04ab-4008-823f-6803a54d4970
	{"id":"3a31c315-04ab-4008-823f-6803a54d4970","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885185}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f0659d6
	3a31c315-04ab-4008-823f-6803a54d4970

/tidb/cdc/default/default/upstream/7365375715196792913
	{"id":7365375715196792913,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/3a31c315-04ab-4008-823f-6803a54d4970
	{"id":"3a31c315-04ab-4008-823f-6803a54d4970","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885185}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f0659d6
	3a31c315-04ab-4008-823f-6803a54d4970

/tidb/cdc/default/default/upstream/7365375715196792913
	{"id":7365375715196792913,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ config_path=conf/changefeed-redo.toml
+ SINK_URI='mysql://root@127.0.0.1:3306/?max-txn-row=1'
+ run_cdc_cli changefeed create --start-ts=449546861526646785 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/conf/changefeed-redo.toml
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.cli.12368.out cli changefeed create --start-ts=449546861526646785 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/conf/changefeed-redo.toml
wait process cdc.test exit for 1-th time...
Create changefeed successfully!
ID: test-1
Info: {"upstream_id":7365375715196792913,"namespace":"default","id":"test-1","sink_uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create_time":"2024-05-05T12:59:48.407297361+08:00","start_ts":449546861526646785,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"eventual","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"storage":"file:///tmp/tidb_cdc_test/synced_status/redo","use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":120,"checkpoint_interval":20}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546861526646785,"checkpoint_ts":449546861526646785,"checkpoint_time":"2024-05-05 12:59:43.436"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
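Unlike the plain Kafka changefeeds elsewhere in this run, `test-1` above is created with a `changefeed-redo.toml` config: the returned Info shows `"consistent":{"level":"eventual", ..., "storage":"file:///tmp/tidb_cdc_test/synced_status/redo"}`, i.e. redo logging to a local file backend, plus tighter synced-status intervals (120s check / 20s checkpoint). A config file producing that shape might look roughly like the sketch below; the keys mirror the JSON above, but this is not the repository's actual file:

# sketch: write a redo-enabled changefeed config of the kind --config points at
cat > changefeed-redo.toml <<'EOF'
[consistent]
level = "eventual"
storage = "file:///tmp/tidb_cdc_test/synced_status/redo"
EOF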
table mark.finish_mark_3 not exists for 9-th check, retry later
table ddl_manager.finish_mark not exists for 46-th check, retry later
wait process cdc.test exit for 2-th time...
*************************** 1. row ***************************
count(distinct region_id): 1
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 12:59:49 CST 2024] <<<<<< run test case batch_add_table success! >>>>>>
check diff successfully
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.move_table.cli.25155.out cli tso query --pd=http://127.0.0.1:2379
+ set +x
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   221  100   221    0     0   2976      0 --:--:-- --:--:-- --:--:--  2986
+ synced_status='{"synced":true,"sink_checkpoint_ts":"2024-05-05 12:59:43.436","puller_resolved_ts":"1970-01-01 08:00:00.000","last_synced_ts":"1970-01-01 08:00:00.000","now_ts":"2024-05-05 12:59:49.000","info":"Data syncing is finished"}'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '12:59:43.436","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '12:59:49.000","info":"Data' syncing is 'finished"}'
++ jq .synced
+ status=true
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '12:59:43.436","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '12:59:49.000","info":"Data' syncing is 'finished"}'
++ jq -r .sink_checkpoint_ts
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/ddl_sequence/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
+ sink_checkpoint_ts='2024-05-05 12:59:43.436'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '12:59:43.436","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '12:59:49.000","info":"Data' syncing is 'finished"}'
++ jq -r .puller_resolved_ts
+ puller_resolved_ts='1970-01-01 08:00:00.000'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '12:59:43.436","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '12:59:49.000","info":"Data' syncing is 'finished"}'
++ jq -r .last_synced_ts
+ last_synced_ts='1970-01-01 08:00:00.000'
+ '[' true '!=' true ']'
+ '[' '1970-01-01 08:00:00.000' '!=' '1970-01-01 08:00:00.000' ']'
+ '[' '1970-01-01 08:00:00.000' '!=' '1970-01-01 08:00:00.000' ']'
++ date '+%Y-%m-%d %H:%M:%S'
+ current='2024-05-05 12:59:50'
+ echo 'sink_checkpoint_ts is 2024-05-05' 12:59:43.436
sink_checkpoint_ts is 2024-05-05 12:59:43.436
++ date -d '2024-05-05 12:59:43.436' +%s
+ checkpoint_timestamp=1714885183
++ date -d '2024-05-05 12:59:50' +%s
+ current_timestamp=1714885190
+ '[' 7 -gt 300 ']'
+ run_sql 'USE TEST;Create table t1(a int primary key, b int);insert into t1 values(1,2);insert into t1 values(2,3);'
+ check_table_exists test.t1 127.0.0.1 3306
table test.t1 not exists for 1-th check, retry later
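The burst above first polls the synced-status API for changefeed `test-1` (GET /api/v2/changefeeds/test-1/synced), pulls `synced`, `sink_checkpoint_ts`, `puller_resolved_ts` and `last_synced_ts` out of the JSON with `jq`, and converts the checkpoint and the wall clock to epoch seconds to assert the lag stays under 300 seconds (here it was 7s). It then writes rows into test.t1 so a later poll can observe the changefeed going back to a not-yet-synced state, as it does further down. Condensed into one sketch using the same endpoint and fields:

# sketch: poll the synced-status API and check checkpoint lag
status=$(curl -s -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced)
synced=$(echo "$status" | jq .synced)
ckpt=$(echo "$status" | jq -r .sink_checkpoint_ts)
lag=$(( $(date +%s) - $(date -d "$ckpt" +%s) ))
echo "synced=$synced sink_checkpoint_ts=$ckpt lag=${lag}s"
[ "$lag" -gt 300 ] && echo "checkpoint is lagging more than 300s"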
table mark.finish_mark_3 not exists for 10-th check, retry later
table ddl_manager.finish_mark not exists for 47-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/debezium/run.sh using Sink-Type: kafka... <<=================
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7d88d40010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg, pid:10047, start at 2024-05-05 12:59:50.210840217 +0800 CST m=+5.184411087	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:50.220 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:50.197 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:50.197 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7d88d40010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg, pid:10047, start at 2024-05-05 12:59:50.210840217 +0800 CST m=+5.184411087	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:50.220 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:50.197 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:50.197 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7d8b1c0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-xnvpx-w79sg, pid:10126, start at 2024-05-05 12:59:50.371773881 +0800 CST m=+5.291010225	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:50.378 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:50.343 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:50.343 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/partition_table/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/partition_table/tiflash/log/error.log
arg matches is ArgMatches { args: {"addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/partition_table/tiflash-proxy.toml"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/partition_table/tiflash/db/proxy"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/partition_table/tiflash/log/proxy.log"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
check diff failed 1-th time, retry later
start tidb cluster in /tmp/tidb_cdc_test/ddl_sequence
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table test.t1 exists
+ sleep 5
+ set +x
+ tso='449546863519203329
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546863519203329 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
table mark.finish_mark_3 not exists for 11-th check, retry later
table ddl_manager.finish_mark not exists for 48-th check, retry later
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
check diff failed 2-th time, retry later
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.partition_table.cli.11584.out cli tso query --pd=http://127.0.0.1:2379
***************** properties *****************
"mysql.db"="move_table"
"mysql.user"="root"
"scanproportion"="0"
"recordcount"="10000"
"insertproportion"="0"
"mysql.host"="127.0.0.1"
"mysql.port"="4000"
"requestdistribution"="uniform"
"workload"="core"
"dotransactions"="false"
"updateproportion"="0"
"operationcount"="0"
"readproportion"="0"
"threadcount"="10"
"readallfields"="true"
**********************************************
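The property dump above is go-ycsb's workload for this case: an insert-only load (all other proportions are 0) of 10,000 records into the `move_table` database over the MySQL protocol at 127.0.0.1:4000, with 10 threads and a uniform request distribution; the "Run finished" line a little further down reports the resulting throughput. Reproducing the same load by hand would look roughly like the sketch below (the harness builds its own property file; `workload.properties` is a placeholder):

# sketch: load the same insert-only workload with go-ycsb
go-ycsb load mysql -P workload.properties \
  -p mysql.host=127.0.0.1 -p mysql.port=4000 -p mysql.user=root \
  -p mysql.db=move_table -p recordcount=10000 -p threadcount=10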
Verifying downstream PD is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100  1415  100   678  100   737   1374   1493 --:--:-- --:--:-- --:--:--  1494
HTTP/1.1 201 Created
Date: Sun, 05 May 2024 04:59:53 GMT
Location: http://localhost:8083/connectors/my-connector
Content-Type: application/json
Content-Length: 678
Server: Jetty(9.4.51.v20230217)

{"name":"my-connector","config":{"connector.class":"io.debezium.connector.mysql.MySqlConnector","tasks.max":"1","database.hostname":"127.0.0.1","database.port":"3310","database.user":"debezium","database.password":"dbz","database.server.id":"184054","topic.prefix":"dbserver1","schema.history.internal.kafka.bootstrap.servers":"127.0.0.1:9092","schema.history.internal.kafka.topic":"schemahistory.test","transforms":"x","transforms.x.type":"org.apache.kafka.connect.transforms.RegexRouter","transforms.x.regex":"(.*)","transforms.x.replacement":"output_debezium","binary.handling.mode":"base64","decimal.handling.mode":"double","name":"my-connector"},"tasks":[],"type":"source"}The 1 times to try to start tidb cluster...
split_and_random_merge scale: 80
table mark.finish_mark_3 not exists for 12-th check, retry later
Run finished, takes 1.176156306s
INSERT - Takes(s): 1.2, Count: 10000, OPS: 8538.3, Avg(us): 1137, Min(us): 784, Max(us): 5648, 95th(us): 2000, 99th(us): 2000
[Sun May  5 12:59:54 CST 2024] <<<<<< START cdc server in move_table case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.move_table.2525025252.out server --log-file /tmp/tidb_cdc_test/move_table/cdc1.log --log-level debug --data-dir /tmp/tidb_cdc_test/move_table/cdc_data1 --cluster-id default --addr 127.0.0.1:8300
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table ddl_manager.finish_mark not exists for 49-th check, retry later
+ set +x
+ tso='449546864216506369
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546864216506369 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 12:59:55 CST 2024] <<<<<< START cdc server in partition_table case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.partition_table.1162911631.out server --log-file /tmp/tidb_cdc_test/partition_table/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/partition_table/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
check diff failed 3-th time, retry later
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7dcab0000c	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:14511, start at 2024-05-05 12:59:54.42246062 +0800 CST m=+5.068900743	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:54.429 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:54.412 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:54.412 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7dcab0000c	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:14511, start at 2024-05-05 12:59:54.42246062 +0800 CST m=+5.068900743	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:54.429 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:54.412 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:54.412 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7dcc180015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:14596, start at 2024-05-05 12:59:54.541626239 +0800 CST m=+5.129956362	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:01:54.548 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-12:59:54.551 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:49:54.551 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kafka_simple_handle_key_only_avro/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kafka_simple_handle_key_only_avro/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kafka_simple_handle_key_only_avro/tiflash-proxy.toml"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kafka_simple_handle_key_only_avro/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kafka_simple_handle_key_only_avro/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table ddl_manager.finish_mark not exists for 50-th check, retry later
table mark.finish_mark_3 not exists for 13-th check, retry later
start tidb cluster in /tmp/tidb_cdc_test/debezium
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
[Sun May  5 12:59:57 CST 2024] <<<<<< START cdc server in kafka_simple_handle_key_only_avro case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_handle_key_only_avro.1605616058.out server --log-file /tmp/tidb_cdc_test/kafka_simple_handle_key_only_avro/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kafka_simple_handle_key_only_avro/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
check diff failed 4-th time, retry later
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   243  100   243    0     0   3197      0 --:--:-- --:--:-- --:--:--  3197
100   243  100   243    0     0   3192      0 --:--:-- --:--:-- --:--:--  3155
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 12:59:55.835","puller_resolved_ts":"2024-05-05 12:59:50.086","last_synced_ts":"2024-05-05 12:59:50.136","now_ts":"2024-05-05 12:59:57.000","info":"The data syncing is not finished, please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '12:59:55.835","puller_resolved_ts":"2024-05-05' '12:59:50.086","last_synced_ts":"2024-05-05' '12:59:50.136","now_ts":"2024-05-05' '12:59:57.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '12:59:55.835","puller_resolved_ts":"2024-05-05' '12:59:50.086","last_synced_ts":"2024-05-05' '12:59:50.136","now_ts":"2024-05-05' '12:59:57.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq -r .info
+ info='The data syncing is not finished, please wait'
+ '[' 'The data syncing is not finished, please wait' '!=' 'The data syncing is not finished, please wait' ']'
+ sleep 130
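This is the counterpart of the earlier check: after the inserts into test.t1 the API now reports `"synced": false` with an explanatory `info` string, so the script verifies the message text and then sleeps 130 seconds, comfortably past the changefeed's 120-second `synced_check_interval`, before polling again. Extracting the message looks roughly like:

# sketch: read the human-readable status message from the synced API
info=$(curl -s http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced | jq -r .info)
echo "changefeed status: $info"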
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:59:57 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/aab420d9-ebf4-438d-af85-5168587f9ef1
	{"id":"aab420d9-ebf4-438d-af85-5168587f9ef1","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885195}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278ef4
	aab420d9-ebf4-438d-af85-5168587f9ef1

/tidb/cdc/default/default/upstream/7365375758487676008
	{"id":7365375758487676008,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/aab420d9-ebf4-438d-af85-5168587f9ef1
	{"id":"aab420d9-ebf4-438d-af85-5168587f9ef1","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885195}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278ef4
	aab420d9-ebf4-438d-af85-5168587f9ef1

/tidb/cdc/default/default/upstream/7365375758487676008
	{"id":7365375758487676008,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/aab420d9-ebf4-438d-af85-5168587f9ef1
	{"id":"aab420d9-ebf4-438d-af85-5168587f9ef1","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885195}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278ef4
	aab420d9-ebf4-438d-af85-5168587f9ef1

/tidb/cdc/default/default/upstream/7365375758487676008
	{"id":7365375758487676008,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.move_table.cli.25311.out cli changefeed create --start-ts=449546863519203329 '--sink-uri=kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Create changefeed successfully!
ID: 23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
Info: {"upstream_id":7365375758487676008,"namespace":"default","id":"23dc79ee-1d9d-4f53-b8fc-e276f6728fa1","sink_uri":"kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:59:58.279220052+08:00","start_ts":449546863519203329,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546863519203329,"checkpoint_ts":449546863519203329,"checkpoint_time":"2024-05-05 12:59:51.037"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
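For reference, the changefeed above is created with the Kafka sink URI shape used throughout this job; a sketch of how such an invocation is assembled (the topic suffix, the start-ts value, and the bare `cdc.test` call are illustrative, taken from the log rather than from the harness scripts):

start_ts=449546863519203329   # from an earlier 'cli tso query', see the trace above
topic="ticdc-move-table-test-$RANDOM"   # this run happened to pick suffix 9479
sink_uri="kafka://127.0.0.1:9092/${topic}?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760"
cdc.test cli changefeed create --start-ts="$start_ts" --sink-uri="$sink_uri"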
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 04:59:58 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ac38ce7e-aa9d-41c4-98e9-2d10ba718419
	{"id":"ac38ce7e-aa9d-41c4-98e9-2d10ba718419","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885195}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f39f4d3
	ac38ce7e-aa9d-41c4-98e9-2d10ba718419

/tidb/cdc/default/default/upstream/7365375767106011962
	{"id":7365375767106011962,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ac38ce7e-aa9d-41c4-98e9-2d10ba718419
	{"id":"ac38ce7e-aa9d-41c4-98e9-2d10ba718419","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885195}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f39f4d3
	ac38ce7e-aa9d-41c4-98e9-2d10ba718419

/tidb/cdc/default/default/upstream/7365375767106011962
	{"id":7365375767106011962,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ac38ce7e-aa9d-41c4-98e9-2d10ba718419
	{"id":"ac38ce7e-aa9d-41c4-98e9-2d10ba718419","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885195}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f39f4d3
	ac38ce7e-aa9d-41c4-98e9-2d10ba718419

/tidb/cdc/default/default/upstream/7365375767106011962
	{"id":7365375767106011962,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.partition_table.cli.11692.out cli changefeed create --start-ts=449546864216506369 '--sink-uri=kafka://127.0.0.1:9092/ticdc-partition-table-test-20270?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
Create changefeed successfully!
ID: aa5225a8-ed45-4cb2-9666-3ed8d907c9f9
Info: {"upstream_id":7365375767106011962,"namespace":"default","id":"aa5225a8-ed45-4cb2-9666-3ed8d907c9f9","sink_uri":"kafka://127.0.0.1:9092/ticdc-partition-table-test-20270?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T12:59:58.689522425+08:00","start_ts":449546864216506369,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546864216506369,"checkpoint_ts":449546864216506369,"checkpoint_time":"2024-05-05 12:59:53.697"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
table ddl_manager.finish_mark not exists for 51-th check, retry later
table mark.finish_mark_3 not exists for 14-th check, retry later
Verifying downstream PD is started...
*************************** 1. row ***************************
count(distinct region_id): 7
check diff successfully
+ set +x
[Sun May  5 12:59:59 CST 2024] <<<<<< START kafka consumer in move_table case >>>>>>
[Sun May  5 12:59:59 CST 2024] <<<<<< START cdc server in move_table case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8301/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.move_table.2534825354.out server --log-file /tmp/tidb_cdc_test/move_table/cdc2.log --log-level debug --data-dir /tmp/tidb_cdc_test/move_table/cdc_data2 --cluster-id default --addr 127.0.0.1:8301
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8301/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8301 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8301; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
wait process cdc.test exit for 1-th time...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
[Sun May  5 13:00:00 CST 2024] <<<<<< START kafka consumer in partition_table case >>>>>>
wait process cdc.test exit for 2-th time...
table ddl_manager.finish_mark not exists for 52-th check, retry later
table mark.finish_mark_3 not exists for 15-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:00:00 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/86618fbf-7df1-4bbc-92ad-dc7facf02ce3
	{"id":"86618fbf-7df1-4bbc-92ad-dc7facf02ce3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885198}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f471ac6
	86618fbf-7df1-4bbc-92ad-dc7facf02ce3

/tidb/cdc/default/default/upstream/7365375791173759494
	{"id":7365375791173759494,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/86618fbf-7df1-4bbc-92ad-dc7facf02ce3
	{"id":"86618fbf-7df1-4bbc-92ad-dc7facf02ce3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885198}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f471ac6
	86618fbf-7df1-4bbc-92ad-dc7facf02ce3

/tidb/cdc/default/default/upstream/7365375791173759494
	{"id":7365375791173759494,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/86618fbf-7df1-4bbc-92ad-dc7facf02ce3
	{"id":"86618fbf-7df1-4bbc-92ad-dc7facf02ce3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885198}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f471ac6
	86618fbf-7df1-4bbc-92ad-dc7facf02ce3

/tidb/cdc/default/default/upstream/7365375791173759494
	{"id":7365375791173759494,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_handle_key_only_avro.cli.16116.out cli tso query --pd=http://127.0.0.1:2379
wait process cdc.test exit for 3-th time...
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 13:00:01 CST 2024] <<<<<< run test case changefeed_pause_resume success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
table ddl_manager.finish_mark not exists for 53-th check, retry later
table mark.finish_mark_3 not exists for 16-th check, retry later
+ set +x
+ tso='449546866147196932
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546866147196932 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
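The trace lines above show how the start-ts is recovered from the `cli tso query` output, which also carries the PASS/coverage lines printed by the instrumented binary; a minimal sketch of that extraction (the coverage-profile path and variable names are illustrative):

raw=$(cdc.test -test.coverprofile=/tmp/cov.cli.out cli tso query --pd=http://127.0.0.1:2379)
start_ts=$(echo $raw | awk -F ' ' '{print $1}')   # unquoted echo flattens the output; field 1 is the TSO
echo "--start-ts=$start_ts"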
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_handle_key_only_avro.cli.16159.out cli changefeed create --start-ts=449546866147196932 '--sink-uri=kafka://127.0.0.1:9092/simple-handle-key-only-avro-14984?protocol=simple&encoding-format=avro' -c simple-handle-key-only-avro --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_handle_key_only_avro/conf/changefeed.toml
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8301/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8301 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8301 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8301
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:00:02 GMT
< Content-Type: text/plain; charset=utf-8
< Transfer-Encoding: chunked
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** processors info ***:

changefeedID: default/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
{UpstreamID:7365375758487676008 Namespace:default ID:23dc79ee-1d9d-4f53-b8fc-e276f6728fa1 SinkURI:kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:59:58.279220052 +0800 CST StartTs:449546863519203329 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc0000622d0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546865406902277}
{CheckpointTs:449546864174563577 MinTableBarrierTs:449546866219286532 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/119b8f6b-7703-4064-8285-e6838fca69af
	{"id":"119b8f6b-7703-4064-8285-e6838fca69af","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885200}

/tidb/cdc/default/__cdc_meta__/capture/aab420d9-ebf4-438d-af85-5168587f9ef1
	{"id":"aab420d9-ebf4-438d-af85-5168587f9ef1","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885195}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278ef4
	aab420d9-ebf4-438d-af85-5168587f9ef1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278f49
	119b8f6b-7703-4064-8285-e6838fca69af

/tidb/cdc/default/default/changefeed/info/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"upstream-id":7365375758487676008,"namespace":"default","changefeed-id":"23dc79ee-1d9d-4f53-b8fc-e276f6728fa1","sink-uri":"kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:59:58.279220052+08:00","start-ts":449546863519203329,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546865406902277}

/tidb/cdc/default/default/changefeed/status/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":449546864174563577,"min-table-barrier-ts":449546866219286532,"admin-job-type":0}

/tidb/cdc/default/default/task/position/119b8f6b-7703-4064-8285-e6838fca69af/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/task/position/aab420d9-ebf4-438d-af85-5168587f9ef1/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375758487676008
	{"id":7365375758487676008,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** processors info ***:

changefeedID: default/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
{UpstreamID:7365375758487676008 Namespace:default ID:23dc79ee-1d9d-4f53-b8fc-e276f6728fa1 SinkURI:kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:59:58.279220052 +0800 CST StartTs:449546863519203329 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc0000622d0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546865406902277}
{CheckpointTs:449546864174563577 MinTableBarrierTs:449546866219286532 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/119b8f6b-7703-4064-8285-e6838fca69af
	{"id":"119b8f6b-7703-4064-8285-e6838fca69af","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885200}

/tidb/cdc/default/__cdc_meta__/capture/aab420d9-ebf4-438d-af85-5168587f9ef1
	{"id":"aab420d9-ebf4-438d-af85-5168587f9ef1","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885195}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278ef4
	aab420d9-ebf4-438d-af85-5168587f9ef1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278f49
	119b8f6b-7703-4064-8285-e6838fca69af

/tidb/cdc/default/default/changefeed/info/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"upstream-id":7365375758487676008,"namespace":"default","changefeed-id":"23dc79ee-1d9d-4f53-b8fc-e276f6728fa1","sink-uri":"kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:59:58.279220052+08:00","start-ts":449546863519203329,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546865406902277}

/tidb/cdc/default/default/changefeed/status/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":449546864174563577,"m+ grep -q 'failed to get info:'
in-table-barrier-ts":449546866219286532,"admin-job-type":0}

/tidb/cdc/default/default/task/position/119b8f6b-7703-4064-8285-e6838fca69af/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/task/position/aab420d9-ebf4-438d-af85-5168587f9ef1/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375758487676008
	{"id":7365375758487676008,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** processors info ***:

changefeedID: default/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
{UpstreamID:7365375758487676008 Namespace:default ID:23dc79ee-1d9d-4f53-b8fc-e276f6728fa1 SinkURI:kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:59:58.279220052 +0800 CST StartTs:449546863519203329 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc0000622d0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546865406902277}
{CheckpointTs:449546864174563577 MinTableBarrierTs:449546866219286532 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/119b8f6b-7703-4064-8285-e6838fca69af
	{"id":"119b8f6b-7703-4064-8285-e6838fca69af","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885200}

/tidb/cdc/default/__cdc_meta__/capture/aab420d9-ebf4-438d-af85-5168587f9ef1
	{"id":"aab420d9-ebf4-438d-af85-5168587f9ef1","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885195}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278ef4
	aab420d9-ebf4-438d-af85-5168587f9ef1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278f49
	119b8f6b-7703-4064-8285-e6838fca69af

/tidb/cdc/default/default/changefeed/info/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"upstream-id":7365375758487676008,"namespace":"default","changefeed-id":"23dc79ee-1d9d-4f53-b8fc-e276f6728fa1","sink-uri":"kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:59:58.279220052+08:00","start-ts":449546863519203329,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546865406902277}

/tidb/cdc/default/default/changefeed/status/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":449546864174563577,"min-table-barrier-ts":449546866219286532,"admin-job-type":0}

/tidb/cdc/default/default/task/position/119b8f6b-7703-4064-8285-e6838fca69af/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/task/position/aab420d9-ebf4-438d-af85-5168587f9ef1/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375758487676008
	{"id":7365375758487676008,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
[Sun May  5 13:00:02 CST 2024] <<<<<< START cdc server in move_table case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ '[' -z '' ']'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.move_table.2546725469.out server --log-file /tmp/tidb_cdc_test/move_table/cdc3.log --log-level debug --data-dir /tmp/tidb_cdc_test/move_table/cdc_data3 --cluster-id default --addr 127.0.0.1:8302
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8302/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8302/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8302 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8302; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
Create changefeed successfully!
ID: simple-handle-key-only-avro
Info: {"upstream_id":7365375791173759494,"namespace":"default","id":"simple-handle-key-only-avro","sink_uri":"kafka://127.0.0.1:9092/simple-handle-key-only-avro-14984?protocol=simple\u0026encoding-format=avro","create_time":"2024-05-05T13:00:02.959554495+08:00","start_ts":449546866147196932,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"simple","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"kafka_config":{"large_message_handle":{"large_message_handle_option":"handle-key-only","large_message_handle_compression":"lz4","claim_check_storage_uri":""}},"advance_timeout":150,"send_bootstrap_interval_in_sec":0,"send_bootstrap_in_msg_count":0,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546866147196932,"checkpoint_ts":449546866147196932,"checkpoint_time":"2024-05-05 13:00:01.062"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7e50700003	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q, pid:8274, start at 2024-05-05 13:00:02.97798641 +0800 CST m=+5.083019284	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:02.985 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:02.972 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:02.972 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7e50700003	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q, pid:8274, start at 2024-05-05 13:00:02.97798641 +0800 CST m=+5.083019284	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:02.985 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:02.972 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:02.972 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7e5058001a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q, pid:8365, start at 2024-05-05 13:00:03.009539665 +0800 CST m=+5.061919506	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:03.016 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:03.016 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:03.016 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
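The VARIABLE_NAME/VARIABLE_VALUE/COMMENT tables above are what the "Verifying ... TiDB is started" steps print; the rows match the contents of the mysql.tidb system table, so the check can be reproduced roughly as follows (ports are an assumption: these tests conventionally put the upstream TiDB on 4000 and the downstream on 3306):

# rough manual equivalent of the startup check; 4000/3306 are assumed default ports
mysql -h 127.0.0.1 -P 4000 -u root -e 'SELECT VARIABLE_NAME, VARIABLE_VALUE FROM mysql.tidb;'   # upstream
mysql -h 127.0.0.1 -P 3306 -u root -e 'SELECT VARIABLE_NAME, VARIABLE_VALUE FROM mysql.tidb;'   # downstream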
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/ddl_sequence/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/ddl_sequence/tiflash/log/error.log
arg matches is ArgMatches { args: {"advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/ddl_sequence/tiflash/db/proxy"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/ddl_sequence/tiflash/log/proxy.log"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/ddl_sequence/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/multi_rocks/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
+ set +x
table mark.finish_mark_3 not exists for 17-th check, retry later
table region_merge.t1 exists
check diff failed 1-th time, retry later
table ddl_manager.finish_mark not exists for 54-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8302/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8302 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8302 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8302
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:00:05 GMT
< Content-Type: text/plain; charset=utf-8
< Transfer-Encoding: chunked
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** processors info ***:

changefeedID: default/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
{UpstreamID:7365375758487676008 Namespace:default ID:23dc79ee-1d9d-4f53-b8fc-e276f6728fa1 SinkURI:kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:59:58.279220052 +0800 CST StartTs:449546863519203329 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc0031245a0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546865406902277}
{CheckpointTs:449546867267862534 MinTableBarrierTs:449546867267862535 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/119b8f6b-7703-4064-8285-e6838fca69af
	{"id":"119b8f6b-7703-4064-8285-e6838fca69af","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885200}

/tidb/cdc/default/__cdc_meta__/capture/48091314-99cc-4c25-9e6a-eac1cba88989
	{"id":"48091314-99cc-4c25-9e6a-eac1cba88989","address":"127.0.0.1:8302","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885203}

/tidb/cdc/default/__cdc_meta__/capture/aab420d9-ebf4-438d-af85-5168587f9ef1
	{"id":"aab420d9-ebf4-438d-af85-5168587f9ef1","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885195}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278ef4
	aab420d9-ebf4-438d-af85-5168587f9ef1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278f49
	119b8f6b-7703-4064-8285-e6838fca69af

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278f78
	48091314-99cc-4c25-9e6a-eac1cba88989

/tidb/cdc/default/default/changefeed/info/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"upstream-id":7365375758487676008,"namespace":"default","changefeed-id":"23dc79ee-1d9d-4f53-b8fc-e276f6728fa1","sink-uri":"kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:59:58.279220052+08:00","start-ts":449546863519203329,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546865406902277}

/tidb/cdc/default/default/changefeed/status/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":449546867267862534,"min-table-barrier-ts":449546867267862535,"admin-job-type":0}

/tidb/cdc/default/default/task/position/119b8f6b-7703-4064-8285-e6838fca69af/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/task/position/48091314-99cc-4c25-9e6a-eac1cba88989/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/task/position/aab420d9-ebf4-438d-af85-5168587f9ef1/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375758487676008
	{"id":7365375758487676008,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** processors info ***:

changefeedID: default/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
{UpstreamID:7365375758487676008 Namespace:default ID:23dc79ee-1d9d-4f53-b8fc-e276f6728fa1 SinkURI:kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:59:58.279220052 +0800 CST StartTs:449546863519203329 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc0031245a0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546865406902277}
{CheckpointTs:449546867267862534 MinTableBarrierTs:449546867267862535 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/119b8f6b-7703-4064-8285-e6838fca69af
	{"id":"119b8f6b-7703-4064-8285-e6838fca69af","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885200}

/tidb/cdc/default/__cdc_meta__/capture/48091314-99cc-4c25-9e6a-eac1cba88989
	{"id":"48091314-99cc-4c25-9e6a-eac1cba88989","address":"127.0.0.1:8302","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885203}

/tidb/cdc/default/__cdc_meta__/capture/aab420d9-ebf4-438d-af85-5168587f9ef1
	{"id":"aab420d9-ebf4-438d-af85-5168587f9ef1","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885195}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278ef4
	aab420d9-ebf4-438d-af85-5168587f9ef1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278f49
	119b8f6b-7703-4064-8285-e6838fca69af

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278f78
	48091314-99cc-4c25-9e6a-eac1cba88989

/tidb/cdc/default/default/changefeed/info/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"upstream-id":7365375758487676008,"namespace":"default","changefeed-id":"23dc79ee-1d9d-4f53-b8fc-e276f6728fa1","sink-uri":"kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:59:58.279220052+08:00","start-ts":449546863519203329,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546865406902277}

/tidb/cdc/default/default/changefeed/status/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":449546867267862534,"min-table-barrier-ts":449546867267862535,"admin-job-type":0}

/tidb/cdc/default/default/task/position/119b8f6b-7703-4064-8285-e6838fca69af/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/task/position/48091314-99cc-4c25-9e6a-eac1cba88989/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/task/position/aab420d9-ebf4-438d-af85-5168587f9ef1/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375758487676008
	{"id":7365375758487676008,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ echo '

*** processors info ***:

changefeedID: default/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
{UpstreamID:7365375758487676008 Namespace:default ID:23dc79ee-1d9d-4f53-b8fc-e276f6728fa1 SinkURI:kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 12:59:58.279220052 +0800 CST StartTs:449546863519203329 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc0031245a0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546865406902277}
{CheckpointTs:449546867267862534 MinTableBarrierTs:449546867267862535 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/119b8f6b-7703-4064-8285-e6838fca69af
	{"id":"119b8f6b-7703-4064-8285-e6838fca69af","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885200}

/tidb/cdc/default/__cdc_meta__/capture/48091314-99cc-4c25-9e6a-eac1cba88989
	{"id":"48091314-99cc-4c25-9e6a-eac1cba88989","address":"127.0.0.1:8302","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885203}

/tidb/cdc/default/__cdc_meta__/capture/aab420d9-ebf4-438d-af85-5168587f9ef1
	{"id":"aab420d9-ebf4-438d-af85-5168587f9ef1","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885195}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278ef4
	aab420d9-ebf4-438d-af85-5168587f9ef1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278f49
	119b8f6b-7703-4064-8285-e6838fca69af

/tidb/cdc/default/__cdc_meta__/owner/22318f471f278f78
	48091314-99cc-4c25-9e6a-eac1cba88989

/tidb/cdc/default/default/changefeed/info/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"upstream-id":7365375758487676008,"namespace":"default","changefeed-id":"23dc79ee-1d9d-4f53-b8fc-e276f6728fa1","sink-uri":"kafka://127.0.0.1:9092/ticdc-move-table-test-9479?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T12:59:58.279220052+08:00","start-ts":449546863519203329,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546865406902277}

/tidb/cdc/default/default/changefeed/status/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":449546867267862534,"min-table-barrier-ts":449546867267862535,"admin-job-type":0}

/tidb/cdc/default/default/task/position/119b8f6b-7703-4064-8285-e6838fca69af/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/task/position/48091314-99cc-4c25-9e6a-eac1cba88989/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/task/position/aab420d9-ebf4-438d-af85-5168587f9ef1/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365375758487676008
	{"id":7365375758487676008,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ break
+ set +x
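The dump above is TiCDC's bookkeeping under /tidb/cdc/default/ in PD's etcd: capture registrations, owner-election keys, the changefeed info/status documents, per-capture task positions, and the upstream record. A minimal sketch of reading the same keys directly, assuming an etcdctl v3 client and the suite's default PD endpoint at 127.0.0.1:2379:

    # list every TiCDC metadata key under the default cluster (sketch)
    ETCDCTL_API=3 etcdctl --endpoints=http://127.0.0.1:2379 \
        get /tidb/cdc/default/ --prefix --keys-only

    # fetch one changefeed's persisted info document
    ETCDCTL_API=3 etcdctl --endpoints=http://127.0.0.1:2379 \
        get /tidb/cdc/default/default/changefeed/info/23dc79ee-1d9d-4f53-b8fc-e276f6728fa1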
table move_table.usertable exists
go: downloading github.com/benbjohnson/clock v1.3.5
go: downloading github.com/IBM/sarama v1.41.2
go: downloading github.com/xdg/scram v1.0.5
go: downloading go.etcd.io/etcd/server/v3 v3.5.12
go: downloading github.com/tinylib/msgp v1.1.6
go: downloading github.com/go-mysql-org/go-mysql v1.7.1-0.20240314115043-2199dfb0ba98
go: downloading github.com/cakturk/go-netstat v0.0.0-20200220111822-e5b49efee7a5
go: downloading github.com/KimMachineGun/automemlimit v0.2.4
go: downloading github.com/gavv/monotime v0.0.0-20190418164738-30dba4353424
go: downloading github.com/apache/pulsar-client-go v0.11.0
go: downloading gorm.io/gorm v1.24.5
go: downloading github.com/aws/aws-sdk-go-v2 v1.19.1
go: downloading github.com/edwingeng/deque v0.0.0-20191220032131-8596380dee17
go: downloading github.com/pierrec/lz4/v4 v4.1.18
go: downloading github.com/gin-gonic/gin v1.9.1
go: downloading github.com/grpc-ecosystem/go-grpc-prometheus v1.2.0
go: downloading github.com/phayes/freeport v0.0.0-20180830031419-95f893ade6f2
go: downloading github.com/stretchr/objx v0.5.2
go: downloading github.com/xdg/stringprep v1.0.3
go: downloading github.com/containerd/cgroups v1.0.4
go: downloading github.com/philhofer/fwd v1.1.1
go: downloading github.com/jinzhu/now v1.1.5
go: downloading github.com/jinzhu/inflection v1.0.0
go: downloading github.com/opencontainers/runtime-spec v1.0.2
go: downloading github.com/godbus/dbus/v5 v5.0.4
go: downloading github.com/sirupsen/logrus v1.9.3
go: downloading github.com/cilium/ebpf v0.4.0
go: downloading github.com/siddontang/go v0.0.0-20180604090527-bdc77568d726
go: downloading github.com/siddontang/go-log v0.0.0-20180807004314-8d05993dda07
go: downloading github.com/gin-contrib/sse v0.1.0
go: downloading github.com/go-playground/validator/v10 v10.14.0
go: downloading github.com/ugorji/go/codec v1.2.11
go: downloading github.com/pelletier/go-toml/v2 v2.0.8
go: downloading github.com/godbus/dbus v0.0.0-20190726142602-4481cbc300e2
go: downloading github.com/aws/smithy-go v1.13.5
go: downloading github.com/hashicorp/go-multierror v1.1.1
go: downloading github.com/eapache/go-xerial-snappy v0.0.0-20230731223053-c322873962e3
go: downloading github.com/eapache/go-resiliency v1.4.0
go: downloading github.com/jcmturner/gokrb5/v8 v8.4.4
go: downloading github.com/jcmturner/gofork v1.7.6
go: downloading github.com/rcrowley/go-metrics v0.0.0-20201227073835-cf1acfcdf475
go: downloading github.com/eapache/queue v1.1.0
go: downloading github.com/leodido/go-urn v1.2.4
go: downloading github.com/gabriel-vasile/mimetype v1.4.2
go: downloading github.com/go-playground/universal-translator v0.18.1
go: downloading github.com/linkedin/goavro/v2 v2.11.1
go: downloading github.com/bits-and-blooms/bitset v1.4.0
go: downloading github.com/pierrec/lz4 v2.6.1+incompatible
go: downloading github.com/AthenZ/athenz v1.10.39
go: downloading golang.org/x/mod v0.17.0
go: downloading github.com/99designs/keyring v1.2.1
go: downloading github.com/spaolacci/murmur3 v1.1.0
go: downloading github.com/hashicorp/errwrap v1.0.0
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_sequence.cli.9708.out cli tso query --pd=http://127.0.0.1:2379
go: downloading github.com/go-playground/locales v0.14.1
go: downloading github.com/gsterjov/go-libsecret v0.0.0-20161001094733-a6f4afe4910c
go: downloading github.com/dvsekhvalnov/jose2go v1.5.0
go: downloading github.com/mtibben/percent v0.2.1
go: downloading github.com/soheilhy/cmux v0.1.5
go: downloading github.com/grpc-ecosystem/grpc-gateway v1.16.0
go: downloading go.etcd.io/etcd/pkg/v3 v3.5.12
go: downloading github.com/tmc/grpc-websocket-proxy v0.0.0-20220101234140-673ab2c3ae75
go: downloading go.etcd.io/bbolt v1.3.9
go: downloading github.com/jonboulle/clockwork v0.4.0
go: downloading go.opentelemetry.io/otel/sdk v1.22.0
go: downloading go.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracegrpc v1.22.0
go: downloading sigs.k8s.io/yaml v1.4.0
go: downloading go.etcd.io/etcd/raft/v3 v3.5.12
go: downloading github.com/xiang90/probing v0.0.0-20221125231312-a49e3df8f510
go: downloading github.com/golang-jwt/jwt/v4 v4.5.0
go: downloading go.etcd.io/etcd/client/v2 v2.305.12
go: downloading github.com/jcmturner/aescts/v2 v2.0.0
go: downloading github.com/jcmturner/rpc/v2 v2.0.3
go: downloading github.com/jcmturner/dnsutils/v2 v2.0.0
go: downloading github.com/hashicorp/go-uuid v1.0.3
go: downloading github.com/gorilla/websocket v1.5.1
go: downloading go.opentelemetry.io/otel/exporters/otlp/otlptrace v1.22.0
go: downloading go.opentelemetry.io/proto/otlp v1.1.0
go: downloading github.com/cenkalti/backoff/v4 v4.2.1
go: downloading github.com/grpc-ecosystem/grpc-gateway/v2 v2.19.1
start tidb cluster in /tmp/tidb_cdc_test/multi_rocks
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table ddl_manager.finish_mark not exists for 55-th check, retry later
check diff successfully
wait process cdc.test exit for 1-th time...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
wait process cdc.test exit for 2-th time...
+ set +x
+ tso='449546867578503169
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546867578503169 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
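The tso query output above mixes the timestamp with the instrumented test binary's PASS and coverage lines, so the script keeps only the first field. A condensed sketch of that extraction, assuming the same cdc binary and PD endpoint:

    # query a start-ts and strip the trailing PASS/coverage noise (sketch)
    start_ts=$(cdc cli tso query --pd=http://127.0.0.1:2379 | awk 'NR==1 {print $1}')
    echo "start_ts=${start_ts}"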
[Sun May  5 13:00:07 CST 2024] <<<<<< START cdc server in ddl_sequence case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_sequence.97489750.out server --log-file /tmp/tidb_cdc_test/ddl_sequence/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/ddl_sequence/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
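The trace above is one round of the server readiness loop: curl the /debug/info endpoint and retry until the body contains 'etcd info', giving up after 50 attempts. A condensed sketch of the same loop, assuming the default listen address and the ticdc:ticdc_secret basic-auth pair used throughout this suite:

    # poll the TiCDC debug endpoint until the server reports its etcd info (sketch)
    for i in $(seq 1 50); do
        res=$(curl -vsL --max-time 20 --user ticdc:ticdc_secret \
              http://127.0.0.1:8300/debug/info || true)
        echo "$res" | grep -q 'etcd info' && break
        [ "$i" -eq 50 ] && { echo 'cdc server failed to start'; exit 1; }
        sleep 3
    done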
go: downloading github.com/ardielle/ardielle-go v1.5.2
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:00:08 CST 2024] <<<<<< run test case region_merge success! >>>>>>
table mark.finish_mark_3 not exists for 18-th check, retry later
table ddl_manager.finish_mark not exists for 56-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7eaac40011	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:33772, start at 2024-05-05 13:00:08.771058569 +0800 CST m=+5.142348887	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:08.778 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:08.753 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:08.753 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7eaac40011	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:33772, start at 2024-05-05 13:00:08.771058569 +0800 CST m=+5.142348887	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:08.778 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:08.753 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:08.753 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7eab6c0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-mwn3b-9ckdl, pid:33849, start at 2024-05-05 13:00:08.819679645 +0800 CST m=+5.143856554	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:08.826 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:08.795 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:08.795 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
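Both clusters print the same bootstrap and GC bookkeeping rows; they live in the mysql.tidb table and can be re-read at any point, assuming the upstream TiDB listens on the suite's usual port 4000:

    # re-read the GC variables shown above (sketch; port 4000 is an assumption)
    mysql -h 127.0.0.1 -P 4000 -u root -e \
        "SELECT VARIABLE_NAME, VARIABLE_VALUE FROM mysql.tidb WHERE VARIABLE_NAME LIKE 'tikv_gc%'"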
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/debezium/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/debezium/tiflash/log/error.log
arg matches is ArgMatches { args: {"config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/debezium/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/debezium/tiflash/db/proxy"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/debezium/tiflash/log/proxy.log"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table partition_table.t exists
table partition_table.t1 exists
table partition_table.t2 not exists for 1-th check, retry later
table mark.finish_mark_3 not exists for 19-th check, retry later
table ddl_manager.finish_mark not exists for 57-th check, retry later
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_handle_key_only_avro.cli.16221.out cli changefeed pause -c simple-handle-key-only-avro
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Debugger for raftstore-v2 is used
Debugger for raftstore-v2 is used
table partition_table.t2 not exists for 2-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:00:11 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b4260f97-28bd-41ae-8ab5-9cb3a976d3b5
	{"id":"b4260f97-28bd-41ae-8ab5-9cb3a976d3b5","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885208}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f6a4bd6
	b4260f97-28bd-41ae-8ab5-9cb3a976d3b5

/tidb/cdc/default/default/upstream/7365375817447408534
	{"id":7365375817447408534,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b4260f97-28bd-41ae-8ab5-9cb3a976d3b5
	{"id":"b4260f97-28bd-41ae-8ab5-9cb3a976d3b5","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885208}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f6a4bd6
	b4260f97-28bd-41ae-8ab5-9cb3a976d3b5

/tidb/cdc/default/default/upstream/7365375817447408534
	{"id":7365375817447408534,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b4260f97-28bd-41ae-8ab5-9cb3a976d3b5
	{"id":"b4260f97-28bd-41ae-8ab5-9cb3a976d3b5","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885208}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f6a4bd6
	b4260f97-28bd-41ae-8ab5-9cb3a976d3b5

/tidb/cdc/default/default/upstream/7365375817447408534
	{"id":7365375817447408534,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_sequence.cli.9807.out cli changefeed create --start-ts=449546867578503169 '--sink-uri=kafka://127.0.0.1:9092/ticdc-ddl-sequence-test-31580?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
Create changefeed successfully!
ID: 9aa1e024-cd45-4e27-bd9f-8e8fef575bb8
Info: {"upstream_id":7365375817447408534,"namespace":"default","id":"9aa1e024-cd45-4e27-bd9f-8e8fef575bb8","sink_uri":"kafka://127.0.0.1:9092/ticdc-ddl-sequence-test-31580?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:00:11.546221068+08:00","start_ts":449546867578503169,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546867578503169,"checkpoint_ts":449546867578503169,"checkpoint_time":"2024-05-05 13:00:06.522"}
PASS
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
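The create call above wires the ddl_sequence changefeed to the local Kafka broker with the open-protocol encoder; the returned Info document echoes the resolved sink and scheduler settings. A hedged sketch of the same invocation followed by a status check, reusing the start-ts and changefeed ID printed above:

    # create the changefeed, then confirm its state (sketch)
    cdc cli changefeed create --pd=http://127.0.0.1:2379 \
        --start-ts=449546867578503169 \
        --sink-uri='kafka://127.0.0.1:9092/ticdc-ddl-sequence-test-31580?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
    cdc cli changefeed query --pd=http://127.0.0.1:2379 -c 9aa1e024-cd45-4e27-bd9f-8e8fef575bb8 -s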
[Sun May  5 13:00:11 CST 2024] <<<<<< START cdc server in debezium case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.debezium.3518335185.out server --log-file /tmp/tidb_cdc_test/debezium/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/debezium/cdc_data --cluster-id default
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table mark.finish_mark_3 not exists for 20-th check, retry later
table ddl_manager.finish_mark not exists for 58-th check, retry later
table partition_table.t2 not exists for 3-th check, retry later
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_handle_key_only_avro.cli.16253.out cli changefeed update -c simple-handle-key-only-avro '--sink-uri=kafka://127.0.0.1:9092/simple-handle-key-only-avro-14984?protocol=simple&encoding-format=avro&max-message-bytes=650' --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_handle_key_only_avro/conf/changefeed.toml --no-confirm
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
[Sun May  5 13:00:12 CST 2024] <<<<<< START kafka consumer in ddl_sequence case >>>>>>
Diff of changefeed config:
{Type:update Path:[SinkURI] From:kafka://127.0.0.1:9092/simple-handle-key-only-avro-14984?protocol=simple&encoding-format=avro To:kafka://127.0.0.1:9092/simple-handle-key-only-avro-14984?protocol=simple&encoding-format=avro&max-message-bytes=650}
{Type:update Path:[Config SyncPointInterval] From:<nil> To:0xc00396b018}
{Type:update Path:[Config SyncPointRetention] From:<nil> To:0xc00396b028}
{Type:update Path:[Config Consistent] From:<nil> To:0xc0012b57a0}
Update changefeed config successfully! 
ID: simple-handle-key-only-avro
Info: {"upstream_id":7365375791173759494,"namespace":"default","id":"simple-handle-key-only-avro","sink_uri":"kafka://127.0.0.1:9092/simple-handle-key-only-avro-14984?protocol=simple\u0026encoding-format=avro\u0026max-message-bytes=650","create_time":"2024-05-05T13:00:02.959554495+08:00","start_ts":449546866147196932,"admin_job_type":1,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_table_monitor":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"simple","encoder_concurrency":32,"terminator":"\r\n","enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"kafka_config":{"large_message_handle":{"large_message_handle_option":"handle-key-only","large_message_handle_compression":"lz4","claim_check_storage_uri":""}},"advance_timeout":150,"send_bootstrap_interval_in_sec":0,"send_bootstrap_in_msg_count":0,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"stopped","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":0,"checkpoint_ts":449546868597981187,"checkpoint_time":"2024-05-05 13:00:10.411"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/cli_with_auth/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
Debugger for raftstore-v2 is used
Debugger for raftstore-v2 is used
table mark.finish_mark_3 not exists for 21-th check, retry later
table partition_table.t2 not exists for 4-th check, retry later
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_handle_key_only_avro.cli.16283.out cli changefeed resume -c simple-handle-key-only-avro
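Shrinking max-message-bytes on simple-handle-key-only-avro is done while the changefeed is stopped: pause, update the sink URI with --no-confirm, then resume (the three cli calls interleaved above). Collected into one hedged sketch with the same arguments:

    # pause, rewrite the sink URI, then resume the changefeed (sketch)
    cf=simple-handle-key-only-avro
    conf=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_handle_key_only_avro/conf/changefeed.toml
    cdc cli changefeed pause -c "$cf"
    cdc cli changefeed update -c "$cf" --no-confirm --config="$conf" \
        --sink-uri='kafka://127.0.0.1:9092/simple-handle-key-only-avro-14984?protocol=simple&encoding-format=avro&max-message-bytes=650'
    cdc cli changefeed resume -c "$cf"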
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
table ddl_sequence.finish_mark not exists for 1-th check, retry later
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:00:15 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/3dae4b6c-566b-4990-b7f2-28f9987cb64d
	{"id":"3dae4b6c-566b-4990-b7f2-28f9987cb64d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885212}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f7eb6ca
	3dae4b6c-566b-4990-b7f2-28f9987cb64d

/tidb/cdc/default/default/upstream/7365375842072144172
	{"id":7365375842072144172,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/3dae4b6c-566b-4990-b7f2-28f9987cb64d
	{"id":"3dae4b6c-566b-4990-b7f2-28f9987cb64d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885212}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f7eb6ca
	3dae4b6c-566b-4990-b7f2-28f9987cb64d

/tidb/cdc/default/default/upstream/7365375842072144172
	{"id":7365375842072144172,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/3dae4b6c-566b-4990-b7f2-28f9987cb64d
	{"id":"3dae4b6c-566b-4990-b7f2-28f9987cb64d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885212}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471f7eb6ca
	3dae4b6c-566b-4990-b7f2-28f9987cb64d

/tidb/cdc/default/default/upstream/7365375842072144172
	{"id":7365375842072144172,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.debezium.cli.35240.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/output_ticdc?protocol=debezium&kafka-version=2.4.0'
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table ddl_manager.finish_mark not exists for 59-th check, retry later
Create changefeed successfully!
ID: adc40299-e4df-4b7f-a32f-579913894f1f
Info: {"upstream_id":7365375842072144172,"namespace":"default","id":"adc40299-e4df-4b7f-a32f-579913894f1f","sink_uri":"kafka://127.0.0.1:9092/output_ticdc?protocol=debezium\u0026kafka-version=2.4.0","create_time":"2024-05-05T13:00:15.487884916+08:00","start_ts":449546869893234692,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"debezium","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546869893234692,"checkpoint_ts":449546869893234692,"checkpoint_time":"2024-05-05 13:00:15.352"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
PASS
coverage: 2.1% of statements in github.com/pingcap/tiflow/...
table mark.finish_mark_3 not exists for 22-th check, retry later
table ddl_sequence.finish_mark not exists for 2-th check, retry later
start tidb cluster in /tmp/tidb_cdc_test/cli_with_auth
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
+ set +x
go: downloading github.com/fatih/color v1.16.0
go: downloading github.com/alecthomas/chroma v0.10.0
go: downloading github.com/thessem/zap-prettyconsole v0.3.0
go: downloading github.com/google/uuid v1.3.1
go: downloading github.com/pingcap/tidb/pkg/parser v0.0.0-20231116213047-1f7c1e02bcd4
go: downloading github.com/segmentio/kafka-go v0.4.45
go: downloading github.com/pingcap/tidb v1.1.0-beta.0.20231117065153-a4f85c356873
go: downloading github.com/google/go-cmp v0.6.0
go: downloading go.uber.org/zap v1.26.0
go: downloading github.com/mattn/go-colorable v0.1.13
go: downloading golang.org/x/sys v0.14.0
go: downloading github.com/Code-Hex/dd v1.1.0
go: downloading github.com/klauspost/compress v1.17.1
go: downloading github.com/pierrec/lz4/v4 v4.1.15
go: downloading github.com/dlclark/regexp2 v1.4.0
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
table ddl_manager.finish_mark not exists for 60-th check, retry later
table partition_table.t2 exists
table partition_table.finish_mark not exists for 1-th check, retry later
table test.finish_mark not exists for 1-th check, retry later
table ddl_sequence.finish_mark not exists for 3-th check, retry later
table mark.finish_mark_3 not exists for 23-th check, retry later
go: downloading github.com/pingcap/errors v0.11.5-0.20221009092201-b66cddb77c32
go: downloading github.com/pingcap/log v1.1.1-0.20230317032135-a0d097d16e22
go: downloading golang.org/x/exp v0.0.0-20231006140011-7918f672742d
go: downloading github.com/shirou/gopsutil/v3 v3.23.10
go: downloading github.com/grpc-ecosystem/go-grpc-middleware v1.3.0
go: downloading github.com/cockroachdb/errors v1.8.1
go: downloading github.com/prometheus/client_golang v1.17.0
go: downloading github.com/tikv/client-go/v2 v2.0.8-0.20231114060955-8fc8a528217e
go: downloading github.com/golang/protobuf v1.5.3
go: downloading github.com/pingcap/sysutil v1.0.1-0.20230407040306-fb007c5aff21
go: downloading github.com/prometheus/client_model v0.5.0
go: downloading google.golang.org/protobuf v1.31.0
go: downloading github.com/pingcap/kvproto v0.0.0-20230925123611-87bebcc0d071
go: downloading google.golang.org/grpc v1.59.0
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 1-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
table test.finish_mark not exists for 3-th check, retry later
table ddl_manager.finish_mark not exists for 61-th check, retry later
table partition_table.finish_mark not exists for 2-th check, retry later
go: downloading github.com/prometheus/procfs v0.12.0
go: downloading github.com/cespare/xxhash/v2 v2.2.0
go: downloading github.com/prometheus/common v0.45.0
go: downloading github.com/cockroachdb/redact v1.0.8
go: downloading github.com/cockroachdb/sentry-go v0.6.1-cockroachdb.2
go: downloading github.com/cockroachdb/logtags v0.0.0-20190617123548-eb05cc24525f
go: downloading github.com/rogpeppe/go-internal v1.11.0
go: downloading github.com/tikv/pd/client v0.0.0-20231114041114-86831ce71865
go: downloading go.etcd.io/etcd/api/v3 v3.5.10
go: downloading go.etcd.io/etcd/client/v3 v3.5.10
go: downloading golang.org/x/sync v0.4.0
go: downloading github.com/matttproud/golang_protobuf_extensions/v2 v2.0.0
go: downloading go.etcd.io/etcd/client/pkg/v3 v3.5.10
go: downloading google.golang.org/genproto/googleapis/api v0.0.0-20231016165738-49dd2c1f3d0b
go: downloading google.golang.org/genproto v0.0.0-20231016165738-49dd2c1f3d0b
go: downloading golang.org/x/net v0.18.0
go: downloading google.golang.org/genproto/googleapis/rpc v0.0.0-20231016165738-49dd2c1f3d0b
check_changefeed_state http://127.0.0.1:2379 3711772d-5b74-468b-8f81-d58646fe860e finished null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=3711772d-5b74-468b-8f81-d58646fe860e
+ expected_state=finished
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c 3711772d-5b74-468b-8f81-d58646fe860e -s
+ info='{
  "upstream_id": 7365375431770261931,
  "namespace": "default",
  "id": "3711772d-5b74-468b-8f81-d58646fe860e",
  "state": "finished",
  "checkpoint_tso": 449546868017594374,
  "checkpoint_time": "2024-05-05 13:00:08.197",
  "error": null
}'
+ echo '{
  "upstream_id": 7365375431770261931,
  "namespace": "default",
  "id": "3711772d-5b74-468b-8f81-d58646fe860e",
  "state": "finished",
  "checkpoint_tso": 449546868017594374,
  "checkpoint_time": "2024-05-05 13:00:08.197",
  "error": null
}'
{
  "upstream_id": 7365375431770261931,
  "namespace": "default",
  "id": "3711772d-5b74-468b-8f81-d58646fe860e",
  "state": "finished",
  "checkpoint_tso": 449546868017594374,
  "checkpoint_time": "2024-05-05 13:00:08.197",
  "error": null
}
++ echo '{' '"upstream_id":' 7365375431770261931, '"namespace":' '"default",' '"id":' '"3711772d-5b74-468b-8f81-d58646fe860e",' '"state":' '"finished",' '"checkpoint_tso":' 449546868017594374, '"checkpoint_time":' '"2024-05-05' '13:00:08.197",' '"error":' null '}'
++ jq -r .state
+ state=finished
+ [[ ! finished == \f\i\n\i\s\h\e\d ]]
++ echo '{' '"upstream_id":' 7365375431770261931, '"namespace":' '"default",' '"id":' '"3711772d-5b74-468b-8f81-d58646fe860e",' '"state":' '"finished",' '"checkpoint_tso":' 449546868017594374, '"checkpoint_time":' '"2024-05-05' '13:00:08.197",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
run task successfully
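check_changefeed_state, traced above, queries the changefeed with cdc cli changefeed query -s and compares the .state and .error.message fields extracted by jq against the expected values. A simplified re-sketch of what that assertion amounts to (a hypothetical check_state, not the suite's actual helper), assuming jq is available:

    # assert a changefeed has reached the expected state (simplified sketch, not the suite's helper)
    check_state() {
        local pd=$1 id=$2 expected=$3
        local state
        state=$(cdc cli changefeed query --pd="$pd" -c "$id" -s | jq -r .state)
        if [ "$state" != "$expected" ]; then
            echo "changefeed $id is $state, expected $expected"
            return 1
        fi
        echo "run task successfully"
    }

    check_state http://127.0.0.1:2379 3711772d-5b74-468b-8f81-d58646fe860e finished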
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:00:14 CST 2024] <<<<<< run test case changefeed_finish success! >>>>>>
table test.finish_mark not exists for 4-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
[2024/05/05 13:00:20.133 +08:00] [INFO] [main.go:61] ["table mover started"]
[2024/05/05 13:00:20.136 +08:00] [INFO] [main.go:166] ["new cluster initialized"]
[2024/05/05 13:00:20.137 +08:00] [DEBUG] [main.go:192] ["retrieved owner ID"] [ownerID=aab420d9-ebf4-438d-af85-5168587f9ef1]
[2024/05/05 13:00:20.137 +08:00] [DEBUG] [main.go:199] ["retrieved owner addr"] [ownerAddr=127.0.0.1:8300]
[2024/05/05 13:00:20.137 +08:00] [DEBUG] [main.go:210] ["retrieved changefeeds"] [changefeedsError="json: unsupported type: map[model.ChangeFeedID]*mvccpb.KeyValue"]
[2024/05/05 13:00:20.313 +08:00] [DEBUG] [main.go:229] ["retrieved processor details"] [changefeed=23dc79ee-1d9d-4f53-b8fc-e276f6728fa1] [captureID=119b8f6b-7703-4064-8285-e6838fca69af] [processorDetail="{\"table_ids\":[]}"]
[2024/05/05 13:00:20.465 +08:00] [DEBUG] [main.go:229] ["retrieved processor details"] [changefeed=23dc79ee-1d9d-4f53-b8fc-e276f6728fa1] [captureID=48091314-99cc-4c25-9e6a-eac1cba88989] [processorDetail="{\"table_ids\":[]}"]
[2024/05/05 13:00:20.665 +08:00] [DEBUG] [main.go:229] ["retrieved processor details"] [changefeed=23dc79ee-1d9d-4f53-b8fc-e276f6728fa1] [captureID=aab420d9-ebf4-438d-af85-5168587f9ef1] [processorDetail="{\"table_ids\":[106,108]}"]
[2024/05/05 13:00:20.665 +08:00] [INFO] [main.go:75] ["task status"] [status="{\"119b8f6b-7703-4064-8285-e6838fca69af\":[],\"48091314-99cc-4c25-9e6a-eac1cba88989\":[],\"aab420d9-ebf4-438d-af85-5168587f9ef1\":[{\"ID\":106,\"Changefeed\":\"23dc79ee-1d9d-4f53-b8fc-e276f6728fa1\"},{\"ID\":108,\"Changefeed\":\"23dc79ee-1d9d-4f53-b8fc-e276f6728fa1\"}]}"]
[2024/05/05 13:00:20.665 +08:00] [DEBUG] [main.go:288] ["preparing HTTP API call to owner"] [formStr="cf-id=23dc79ee-1d9d-4f53-b8fc-e276f6728fa1&target-cp-id=48091314-99cc-4c25-9e6a-eac1cba88989&table-id=106"]
table mark.finish_mark_3 not exists for 24-th check, retry later
[2024/05/05 13:00:20.763 +08:00] [INFO] [main.go:180] ["moved table successful"] [tableID=106]
[2024/05/05 13:00:20.763 +08:00] [DEBUG] [main.go:288] ["preparing HTTP API call to owner"] [formStr="cf-id=23dc79ee-1d9d-4f53-b8fc-e276f6728fa1&target-cp-id=48091314-99cc-4c25-9e6a-eac1cba88989&table-id=108"]
[2024/05/05 13:00:20.825 +08:00] [INFO] [main.go:180] ["moved table successful"] [tableID=108]
[2024/05/05 13:00:20.825 +08:00] [INFO] [main.go:114] ["all tables are moved"] [sourceCapture=aab420d9-ebf4-438d-af85-5168587f9ef1] [targetCapture=48091314-99cc-4c25-9e6a-eac1cba88989]
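The mover resolves the owner's address from etcd, lists each capture's current tables, and then asks the owner to reassign specific table IDs to the target capture through a form-encoded HTTP call (the formStr lines above). A hedged curl equivalent of one of those requests; the /capture/owner/move_table path is an assumption here, only the form fields are taken from the log:

    # ask the owner to move table 106 to the target capture (sketch; endpoint path assumed)
    curl -s -X POST http://127.0.0.1:8300/capture/owner/move_table \
        -d 'cf-id=23dc79ee-1d9d-4f53-b8fc-e276f6728fa1' \
        -d 'target-cp-id=48091314-99cc-4c25-9e6a-eac1cba88989' \
        -d 'table-id=106'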
table move_table.check1 exists
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table ddl_manager.finish_mark not exists for 62-th check, retry later
table partition_table.finish_mark not exists for 3-th check, retry later
check diff successfully
table ddl_sequence.finish_mark not exists for 4-th check, retry later
table test.finish_mark exists
check diff successfully
wait process cdc.test exit for 1-th time...
table test.finish_mark not exists for 5-th check, retry later
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
table mark.finish_mark_3 not exists for 25-th check, retry later
wait process cdc.test exit for 2-th time...
table ddl_manager.finish_mark not exists for 63-th check, retry later
table partition_table.finish_mark not exists for 4-th check, retry later
table ddl_sequence.finish_mark not exists for 5-th check, retry later
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:00:23 CST 2024] <<<<<< run test case kafka_simple_handle_key_only_avro success! >>>>>>
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/common_1/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 6-th check, retry later
table mark.finish_mark_3 not exists for 26-th check, retry later
table partition_table.finish_mark not exists for 5-th check, retry later
table ddl_sequence.finish_mark exists
check diff successfully
start tidb cluster in /tmp/tidb_cdc_test/common_1
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 13:00:24.983 +08:00] [INFO] [main.go:61] ["table mover started"]
[2024/05/05 13:00:24.986 +08:00] [INFO] [main.go:166] ["new cluster initialized"]
[2024/05/05 13:00:24.986 +08:00] [DEBUG] [main.go:192] ["retrieved owner ID"] [ownerID=aab420d9-ebf4-438d-af85-5168587f9ef1]
[2024/05/05 13:00:24.987 +08:00] [DEBUG] [main.go:199] ["retrieved owner addr"] [ownerAddr=127.0.0.1:8300]
[2024/05/05 13:00:24.987 +08:00] [DEBUG] [main.go:210] ["retrieved changefeeds"] [changefeedsError="json: unsupported type: map[model.ChangeFeedID]*mvccpb.KeyValue"]
[2024/05/05 13:00:25.163 +08:00] [DEBUG] [main.go:229] ["retrieved processor details"] [changefeed=23dc79ee-1d9d-4f53-b8fc-e276f6728fa1] [captureID=119b8f6b-7703-4064-8285-e6838fca69af] [processorDetail="{\"table_ids\":[]}"]
table ddl_manager.finish_mark not exists for 64-th check, retry later
[2024/05/05 13:00:25.363 +08:00] [DEBUG] [main.go:229] ["retrieved processor details"] [changefeed=23dc79ee-1d9d-4f53-b8fc-e276f6728fa1] [captureID=48091314-99cc-4c25-9e6a-eac1cba88989] [processorDetail="{\"table_ids\":[108]}"]
[2024/05/05 13:00:25.563 +08:00] [DEBUG] [main.go:229] ["retrieved processor details"] [changefeed=23dc79ee-1d9d-4f53-b8fc-e276f6728fa1] [captureID=aab420d9-ebf4-438d-af85-5168587f9ef1] [processorDetail="{\"table_ids\":[110]}"]
[2024/05/05 13:00:25.563 +08:00] [INFO] [main.go:75] ["task status"] [status="{\"119b8f6b-7703-4064-8285-e6838fca69af\":[],\"48091314-99cc-4c25-9e6a-eac1cba88989\":[{\"ID\":108,\"Changefeed\":\"23dc79ee-1d9d-4f53-b8fc-e276f6728fa1\"}],\"aab420d9-ebf4-438d-af85-5168587f9ef1\":[{\"ID\":110,\"Changefeed\":\"23dc79ee-1d9d-4f53-b8fc-e276f6728fa1\"}]}"]
[2024/05/05 13:00:25.563 +08:00] [DEBUG] [main.go:288] ["preparing HTTP API call to owner"] [formStr="cf-id=23dc79ee-1d9d-4f53-b8fc-e276f6728fa1&target-cp-id=aab420d9-ebf4-438d-af85-5168587f9ef1&table-id=108"]
wait process cdc.test exit for 1-th time...
[2024/05/05 13:00:25.613 +08:00] [INFO] [main.go:180] ["moved table successful"] [tableID=108]
[2024/05/05 13:00:25.613 +08:00] [INFO] [main.go:114] ["all tables are moved"] [sourceCapture=48091314-99cc-4c25-9e6a-eac1cba88989] [targetCapture=aab420d9-ebf4-438d-af85-5168587f9ef1]
check diff successfully
table test.finish_mark not exists for 7-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
wait process cdc.test exit for 2-th time...
table mark.finish_mark_3 not exists for 27-th check, retry later
table move_table.check2 not exists for 1-th check, retry later
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:00:26 CST 2024] <<<<<< run test case ddl_sequence success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table ddl_manager.finish_mark not exists for 65-th check, retry later
table partition_table.finish_mark not exists for 6-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/force_replicate_table/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table mark.finish_mark_3 not exists for 28-th check, retry later
table move_table.check2 exists
check diff successfully
wait process cdc.test exit for 1-th time...
table ddl_manager.finish_mark not exists for 66-th check, retry later
table partition_table.finish_mark exists
check diff successfully
wait process cdc.test exit for 2-th time...
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
wait process cdc.test exit for 1-th time...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7fe2d40014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:11223, start at 2024-05-05 13:00:28.769690636 +0800 CST m=+5.182372686	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:28.775 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:28.775 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:28.775 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7fe2d40014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:11223, start at 2024-05-05 13:00:28.769690636 +0800 CST m=+5.182372686	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:28.775 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:28.775 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:28.775 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7feb480006	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:11301, start at 2024-05-05 13:00:29.271652648 +0800 CST m=+5.626829838	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:29.280 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:29.266 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:29.266 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/cli_with_auth/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/cli_with_auth/tiflash/log/error.log
arg matches is ArgMatches { args: {"pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/cli_with_auth/tiflash-proxy.toml"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/cli_with_auth/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/cli_with_auth/tiflash/db/proxy"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
wait process cdc.test exit for 3-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:00:30 CST 2024] <<<<<< run test case partition_table success! >>>>>>
table mark.finish_mark_3 not exists for 29-th check, retry later
start tidb cluster in /tmp/tidb_cdc_test/force_replicate_table
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table ddl_manager.finish_mark not exists for 67-th check, retry later
table test.finish_mark not exists for 8-th check, retry later
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 13:00:31 CST 2024] <<<<<< run test case move_table success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.12770.out cli tso query --pd=http://127.0.0.1:2379
table mark.finish_mark_3 not exists for 30-th check, retry later
table ddl_manager.finish_mark not exists for 68-th check, retry later
table test.finish_mark not exists for 9-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
+ set +x
+ tso='449546874329497601
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546874329497601 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
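The xtrace above shows how the harness obtains a start-ts: cdc cli tso query prints the TSO followed by the PASS/coverage lines appended by the instrumented binary, and only the first whitespace-separated field is kept. A minimal sketch of that extraction, assuming the same output shape (variable names are illustrative):

    # Query a TSO from PD and keep only the first field of the first line,
    # dropping the "PASS" and "coverage: ..." lines added by the coverage build.
    pd_addr="http://127.0.0.1:2379"
    start_ts=$(cdc cli tso query --pd="$pd_addr" | head -n 1 | awk '{print $1}')
    echo "using start-ts $start_ts"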
[Sun May  5 13:00:33 CST 2024] <<<<<< START cdc server in cli_with_auth case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.1282112823.out server --log-file /tmp/tidb_cdc_test/cli_with_auth/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/cli_with_auth/cdc_data --cluster-id default --config /tmp/tidb_cdc_test/cli_with_auth/server.toml
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
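For readers tracing the retry loop above: the harness launches the cdc server in the background and then polls the /debug/info endpoint with basic auth until the response contains "etcd info", giving up after 50 rounds. A condensed sketch of that readiness loop, with the endpoint, credentials and retry budget copied from the trace:

    # Poll the CDC status endpoint until the capture has registered in etcd.
    status_url="http://127.0.0.1:8300/debug/info"
    for i in $(seq 0 50); do
        res=$(curl -sL --max-time 20 "$status_url" --user ticdc:ticdc_secret)
        echo "$res" | grep -q 'failed to get info:' && { echo "cdc server unhealthy"; exit 1; }
        echo "$res" | grep -q 'etcd info' && break
        [ "$i" -eq 50 ] && { echo "cdc server did not come up"; exit 1; }
        sleep 3
    done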
table mark.finish_mark_3 not exists for 31-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
table ddl_manager.finish_mark not exists for 69-th check, retry later
table test.finish_mark not exists for 10-th check, retry later
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_handle_key_only_avro/run.sh: line 1: 16331 Killed                  cdc_kafka_consumer --upstream-uri $SINK_URI --downstream-uri="mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" --upstream-tidb-dsn="root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" --config="$CUR/conf/changefeed.toml" 2>&1
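The Killed line above also records the full cdc_kafka_consumer invocation that each Kafka test keeps running in the background: it consumes the changefeed's topic (SINK_URI) and replays the rows into the downstream MySQL-compatible endpoint. A sketch of that invocation, with the placeholders the harness supplies (SINK_URI, UP_TIDB_HOST, UP_TIDB_PORT, CUR):

    # Background consumer that replays Kafka messages into the downstream;
    # flags copied from the invocation recorded above.
    cdc_kafka_consumer \
        --upstream-uri "$SINK_URI" \
        --downstream-uri="mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" \
        --upstream-tidb-dsn="root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" \
        --config="$CUR/conf/changefeed.toml" 2>&1 &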
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_claim_check/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table mark.finish_mark_3 not exists for 32-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c803e440014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5, pid:7042, start at 2024-05-05 13:00:34.596318087 +0800 CST m=+5.222311481	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:34.602 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:34.577 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:34.577 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c803e440014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5, pid:7042, start at 2024-05-05 13:00:34.596318087 +0800 CST m=+5.222311481	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:34.602 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:34.577 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:34.577 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c803f20000d	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-ww4ds-mcdd5, pid:7125, start at 2024-05-05 13:00:34.644214316 +0800 CST m=+5.215964263	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:34.650 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:34.632 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:34.632 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/common_1/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/common_1/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/common_1/tiflash-proxy.toml"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/common_1/tiflash/log/proxy.log"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/common_1/tiflash/db/proxy"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 11-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:00:37 GMT
< Content-Length: 859
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/30b8f8f4-09e8-4f82-8c59-47f1f6225236
	{"id":"30b8f8f4-09e8-4f82-8c59-47f1f6225236","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885234}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471fd2983a
	30b8f8f4-09e8-4f82-8c59-47f1f6225236

/tidb/cdc/default/default/upstream/7365375936628896892
	{"id":7365375936628896892,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2779,http://127.0.0.1:2679,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/30b8f8f4-09e8-4f82-8c59-47f1f6225236
	{"id":"30b8f8f4-09e8-4f82-8c59-47f1f6225236","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885234}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471fd2983a
	30b8f8f4-09e8-4f82-8c59-47f1f6225236

/tidb/cdc/default/default/upstream/7365375936628896892
	{"id":7365375936628896892,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2779,http://127.0.0.1:2679,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/30b8f8f4-09e8-4f82-8c59-47f1f6225236
	{"id":"30b8f8f4-09e8-4f82-8c59-47f1f6225236","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885234}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471fd2983a
	30b8f8f4-09e8-4f82-8c59-47f1f6225236

/tidb/cdc/default/default/upstream/7365375936628896892
	{"id":7365375936628896892,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2779,http://127.0.0.1:2679,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.12878.out cli changefeed create --start-ts=449546874329497601 '--sink-uri=kafka://127.0.0.1:9092/ticdc-cli-test-31846?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' --tz=Asia/Shanghai -c=custom-changefeed-name
table ddl_manager.finish_mark not exists for 70-th check, retry later
[WARN] --tz is deprecated in changefeed settings.
Create changefeed successfully!
ID: custom-changefeed-name
Info: {"upstream_id":7365375936628896892,"namespace":"default","id":"custom-changefeed-name","sink_uri":"kafka://127.0.0.1:9092/ticdc-cli-test-31846?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:00:37.519711846+08:00","start_ts":449546874329497601,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546874329497601,"checkpoint_ts":449546874329497601,"checkpoint_time":"2024-05-05 13:00:32.275"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
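The create call above bundles all Kafka parameters into the sink URI and names the changefeed explicitly; note the deprecation warning for --tz. A minimal sketch of the same creation, with values copied from the trace, the deprecated --tz flag dropped, and the --pd endpoint added for completeness:

    # Create a changefeed that writes open-protocol messages to a Kafka topic.
    SINK_URI="kafka://127.0.0.1:9092/ticdc-cli-test-31846?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760"
    cdc cli changefeed create \
        --pd=http://127.0.0.1:2379 \
        --start-ts=449546874329497601 \
        --sink-uri="$SINK_URI" \
        -c=custom-changefeed-name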
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
start tidb cluster in /tmp/tidb_cdc_test/kafka_simple_claim_check
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 12-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/resourcecontrol/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
table mark.finish_mark_3 not exists for 33-th check, retry later
+ set +x
[Sun May  5 13:00:38 CST 2024] <<<<<< START kafka consumer in cli_with_auth case >>>>>>
table test.simple not exists for 1-th check, retry later
table ddl_manager.finish_mark not exists for 71-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
table test.finish_mark not exists for 13-th check, retry later
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.common_1.cli.8599.out cli tso query --pd=http://127.0.0.1:2379
table mark.finish_mark_3 not exists for 34-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.simple not exists for 2-th check, retry later
table ddl_manager.finish_mark not exists for 72-th check, retry later
start tidb cluster in /tmp/tidb_cdc_test/resourcecontrol
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
1:00PM INF > Info cdc.mysql=kafka://127.0.0.1:9092/output_debezium cdc.tidb=kafka://127.0.0.1:9092/output_ticdc db.mysql=root@tcp(127.0.0.1:3310)/{db}?allowNativePasswords=true db.tidb=root@tcp(127.0.0.1:4000)/{db}?allowNativePasswords=true
1:00PM INF > Run case=sql/data_types.sql
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
+ set +x
+ tso='449546876558245889
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546876558245889 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 13:00:42 CST 2024] <<<<<< START cdc server in common_1 case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.common_1.86388640.out server --log-file /tmp/tidb_cdc_test/common_1/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/common_1/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table mark.finish_mark_3 not exists for 35-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c80ac2c000d	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4, pid:4298, start at 2024-05-05 13:00:41.624097508 +0800 CST m=+5.130116159	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:41.633 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:41.611 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:41.611 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c80ac2c000d	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4, pid:4298, start at 2024-05-05 13:00:41.624097508 +0800 CST m=+5.130116159	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:41.633 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:41.611 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:41.611 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c80ac200014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-rgkc6-k4bm4, pid:4379, start at 2024-05-05 13:00:41.644037369 +0800 CST m=+5.100442461	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:41.650 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:41.608 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:41.608 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/force_replicate_table/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/force_replicate_table/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/force_replicate_table/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/force_replicate_table/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/force_replicate_table/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table ddl_manager.finish_mark not exists for 73-th check, retry later
table test.finish_mark not exists for 14-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/autorandom/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
table test.simple exists
table test.`simple-dash` exists
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=custom-changefeed-name
+ expected_state=normal
+ error_msg=null
+ tls_dir=
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c custom-changefeed-name -s
+ info='{
  "upstream_id": 7365375936628896892,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546877042688006,
  "checkpoint_time": "2024-05-05 13:00:42.625",
  "error": null
}'
+ echo '{
  "upstream_id": 7365375936628896892,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546877042688006,
  "checkpoint_time": "2024-05-05 13:00:42.625",
  "error": null
}'
{
  "upstream_id": 7365375936628896892,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546877042688006,
  "checkpoint_time": "2024-05-05 13:00:42.625",
  "error": null
}
++ echo '{' '"upstream_id":' 7365375936628896892, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"normal",' '"checkpoint_tso":' 449546877042688006, '"checkpoint_time":' '"2024-05-05' '13:00:42.625",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365375936628896892, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"normal",' '"checkpoint_tso":' 449546877042688006, '"checkpoint_time":' '"2024-05-05' '13:00:42.625",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
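The block above queries the changefeed and asserts, via jq, that its state is normal and that no error message is set. A condensed helper capturing that check; the function name and argument order are illustrative, not the harness's own:

    # Illustrative helper: verify a changefeed's state and error message.
    check_changefeed() {
        local pd=$1 id=$2 expected_state=$3 expected_error=$4
        local info state message
        info=$(cdc cli changefeed query --pd="$pd" -c "$id" -s)
        state=$(echo "$info" | jq -r .state)
        message=$(echo "$info" | jq -r .error.message)
        [ "$state" == "$expected_state" ] || { echo "state $state != $expected_state"; return 1; }
        [[ "$message" =~ $expected_error ]] || { echo "error '$message' does not match '$expected_error'"; return 1; }
    }

With the values from the trace, check_changefeed http://127.0.0.1:2379 custom-changefeed-name normal null would pass.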
changefeed count 1 check pass, pd_addr: http://127.0.0.1:2379
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
changefeed count 1 check pass, pd_addr: http://127.0.0.1:2679
changefeed count 1 check pass, pd_addr: http://127.0.0.1:2779
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c80b2dc0019	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj, pid:7177, start at 2024-05-05 13:00:42.075994645 +0800 CST m=+29.246001480	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:42.081 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:42.039 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:42.039 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c80b2dc0019	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj, pid:7177, start at 2024-05-05 13:00:42.075994645 +0800 CST m=+29.246001480	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:42.081 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:42.039 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:42.039 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c7f58b0000a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-wrhxv-rsnnj, pid:7262, start at 2024-05-05 13:00:19.893481563 +0800 CST m=+7.012322522	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:19.899 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:19.884 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:19.884 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/multi_rocks/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/multi_rocks/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/multi_rocks/tiflash/log/proxy.log"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/multi_rocks/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/multi_rocks/tiflash-proxy.toml"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
changefeed count 1 check pass, pd_addr: http://127.0.0.1:2379,http://127.0.0.1:2679,http://127.0.0.1:2779
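Each "changefeed count 1 check pass" line above corresponds to listing the changefeeds through a different PD endpoint and comparing the count against the expected value. A rough sketch of one such check, assuming cdc cli changefeed list prints a JSON array (the helper name is illustrative):

    # Illustrative count check against a single PD endpoint.
    check_changefeed_count() {
        local pd_addr=$1 expected=$2
        local count
        count=$(cdc cli changefeed list --pd="$pd_addr" | jq 'length')
        if [ "$count" -ne "$expected" ]; then
            echo "changefeed count $count != expected $expected, pd_addr: $pd_addr"
            return 1
        fi
        echo "changefeed count $count check pass, pd_addr: $pd_addr"
    }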
table mark.finish_mark_3 not exists for 36-th check, retry later
table ddl_manager.finish_mark not exists for 74-th check, retry later
Error: [CDC:ErrChangefeedUpdateRefused]changefeed update error: can only update changefeed config when it is stopped or failed
update changefeed config should fail when changefeed is running, got Diff of changefeed config:
{Type:update Path:[Config CaseSensitive] From:false To:true}
{Type:update Path:[Config SyncPointInterval] From:<nil> To:0xc003d0af80}
{Type:update Path:[Config SyncPointRetention] From:<nil> To:0xc003d0af88}
{Type:update Path:[Config Consistent] From:<nil> To:0xc000a2c1c0}
{Type:update Path:[Config Scheduler EnableTableAcrossNodes] From:false To:true}
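The ErrChangefeedUpdateRefused error above is the expected outcome here: configuration updates are only accepted while a changefeed is stopped or failed, so the harness treats a successful update on a running changefeed as a test failure. A hedged sketch of that negative check; the config path and the --no-confirm flag (to skip the interactive prompt) are assumptions:

    # Negative check: updating a running changefeed must be refused.
    if cdc cli changefeed update -c custom-changefeed-name \
        --config changefeed.toml --no-confirm; then
        echo "update unexpectedly succeeded on a running changefeed"
        exit 1
    fi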
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.13249.out cli changefeed --changefeed-id custom-changefeed-name pause
table test.finish_mark not exists for 15-th check, retry later
[Sun May  5 13:00:44 CST 2024] <<<<<< START cdc server in force_replicate_table case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.force_replicate_table.58765878.out server --log-file /tmp/tidb_cdc_test/force_replicate_table/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/force_replicate_table/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
PASS
coverage: 1.9% of statements in github.com/pingcap/tiflow/...
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:00:45 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ded3c234-59f1-4a83-920d-5d52ae364663
	{"id":"ded3c234-59f1-4a83-920d-5d52ae364663","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885242}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471feb37dc
	ded3c234-59f1-4a83-920d-5d52ae364663

/tidb/cdc/default/default/upstream/7365375963276977412
	{"id":7365375963276977412,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ded3c234-59f1-4a83-920d-5d52ae364663
	{"id":"ded3c234-59f1-4a83-920d-5d52ae364663","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885242}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471feb37dc
	ded3c234-59f1-4a83-920d-5d52ae364663

/tidb/cdc/default/default/upstream/7365375963276977412
	{"id":7365375963276977412,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ded3c234-59f1-4a83-920d-5d52ae364663
	{"id":"ded3c234-59f1-4a83-920d-5d52ae364663","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885242}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471feb37dc
	ded3c234-59f1-4a83-920d-5d52ae364663

/tidb/cdc/default/default/upstream/7365375963276977412
	{"id":7365375963276977412,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
Create changefeed successfully!
ID: 4115e00a-af8a-46b7-bada-2c4e56313b51
Info: {"upstream_id":7365375963276977412,"namespace":"default","id":"4115e00a-af8a-46b7-bada-2c4e56313b51","sink_uri":"kafka://127.0.0.1:9092/ticdc-common-1-test-25733?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:00:45.542170962+08:00","start_ts":449546876558245889,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546876558245889,"checkpoint_ts":449546876558245889,"checkpoint_time":"2024-05-05 13:00:40.777"}
[Sun May  5 13:00:45 CST 2024] <<<<<< START kafka consumer in common_1 case >>>>>>
[Sun May  5 13:00:45 CST 2024] <<<<<< START cdc server in multi_rocks case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_rocks.2088420886.out server --log-file /tmp/tidb_cdc_test/multi_rocks/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/multi_rocks/cdc_data --cluster-id default
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table test.finish_mark exists
check diff successfully
+ set +x
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
wait process cdc.test exit for 1-th time...
table ddl_manager.finish_mark exists
check diff successfully
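The "check diff successfully" lines are printed once the harness has compared upstream and downstream data with sync_diff_inspector. A rough sketch of that step, with the config path and log file as placeholders:

    # Compare upstream and downstream tables; a zero exit status means no diff.
    if sync_diff_inspector --config=./conf/diff_config.toml >sync_diff.log 2>&1; then
        echo "check diff successfully"
    else
        echo "check diff failed, see sync_diff.log"
        exit 1
    fi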
start tidb cluster in /tmp/tidb_cdc_test/autorandom
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
wait process cdc.test exit for 1-th time...
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
wait process cdc.test exit for 2-th time...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:00:47 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/1a623ca8-0a6d-43e8-960b-8d65ec369526
	{"id":"1a623ca8-0a6d-43e8-960b-8d65ec369526","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885245}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47200109c9
	1a623ca8-0a6d-43e8-960b-8d65ec369526

/tidb/cdc/default/default/upstream/7365375984231856705
	{"id":7365375984231856705,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/1a623ca8-0a6d-43e8-960b-8d65ec369526
	{"id":"1a623ca8-0a6d-43e8-960b-8d65ec369526","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885245}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47200109c9
	1a623ca8-0a6d-43e8-960b-8d65ec369526

/tidb/cdc/default/default/upstream/7365375984231856705
	{"id":7365375984231856705,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/1a623ca8-0a6d-43e8-960b-8d65ec369526
	{"id":"1a623ca8-0a6d-43e8-960b-8d65ec369526","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885245}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47200109c9
	1a623ca8-0a6d-43e8-960b-8d65ec369526

/tidb/cdc/default/default/upstream/7365375984231856705
	{"id":7365375984231856705,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
Create changefeed successfully!
ID: 56923ed1-ac3f-4cd7-b72d-2cb80c524815
Info: {"upstream_id":7365375984231856705,"namespace":"default","id":"56923ed1-ac3f-4cd7-b72d-2cb80c524815","sink_uri":"kafka://127.0.0.1:9092/ticdc-force_replicate_table-test-12543?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:00:48.187113253+08:00","start_ts":449546877628841985,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":true,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546877628841985,"checkpoint_ts":449546877628841985,"checkpoint_time":"2024-05-05 13:00:44.861"}
wait process cdc.test exit for 2-th time...
[Sun May  5 13:00:48 CST 2024] <<<<<< START kafka consumer in force_replicate_table case >>>>>>
consumer replica config found: /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/force_replicate_table/conf/changefeed.toml
wait process cdc.test exit for 3-th time...
wait process cdc.test exit for 3-th time...
<<< Run all test success >>>
table mark.finish_mark_3 not exists for 37-th check, retry later
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 13:00:48 CST 2024] <<<<<< run test case many_pk_or_uk success! >>>>>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
wait process cdc.test exit for 4-th time...
table common_1.v1 not exists for 1-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:00:48 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/d9cd96c8-3418-4078-b309-6231badc0ee2
	{"id":"d9cd96c8-3418-4078-b309-6231badc0ee2","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885246}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471fa2d005
	d9cd96c8-3418-4078-b309-6231badc0ee2

/tidb/cdc/default/default/upstream/7365375890307360376
	{"id":7365375890307360376,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/d9cd96c8-3418-4078-b309-6231badc0ee2
	{"id":"d9cd96c8-3418-4078-b309-6231badc0ee2","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885246}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471fa2d005
	d9cd96c8-3418-4078-b309-6231badc0ee2

/tidb/cdc/default/default/upstream/7365375890307360376
	{"id":7365375890307360376,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/d9cd96c8-3418-4078-b309-6231badc0ee2
	{"id":"d9cd96c8-3418-4078-b309-6231badc0ee2","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885246}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f471fa2d005
	d9cd96c8-3418-4078-b309-6231badc0ee2

/tidb/cdc/default/default/upstream/7365375890307360376
	{"id":7365375890307360376,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
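The trace above is the readiness probe used throughout these tests: poll the CDC server's /debug/info endpoint (basic auth ticdc/ticdc_secret) until the body contains "etcd info", with up to 50 attempts and a 3-second back-off. A minimal sketch of that pattern in bash; the helper name and retry budget are illustrative, not taken from the test scripts:

# Poll /debug/info until the CDC server reports its etcd metadata.
wait_cdc_ready() {
    local url="http://127.0.0.1:8300/debug/info"
    local i res
    for ((i = 0; i <= 50; i++)); do
        res=$(curl -sL --max-time 20 "$url" --user ticdc:ticdc_secret)
        if echo "$res" | grep -q 'etcd info'; then
            return 0        # server is up and registered in etcd
        fi
        sleep 3
    done
    echo "cdc server never became ready" >&2
    return 1
}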
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_rocks.cli.21165.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-multi-rocks-test-22663?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[Pipeline] }
Create changefeed successfully!
ID: 48eaf422-3271-435a-906a-42e342910f8a
Info: {"upstream_id":7365375890307360376,"namespace":"default","id":"48eaf422-3271-435a-906a-42e342910f8a","sink_uri":"kafka://127.0.0.1:9092/ticdc-multi-rocks-test-22663?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:00:49.336961641+08:00","start_ts":449546878763401219,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546878763401219,"checkpoint_ts":449546878763401219,"checkpoint_time":"2024-05-05 13:00:49.189"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8118540003	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:17481, start at 2024-05-05 13:00:48.536015997 +0800 CST m=+5.186661750	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:48.543 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:48.533 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:48.533 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[Pipeline] // timeout
cdc.test: no process found
wait process cdc.test exit for 5-th time...
process cdc.test already exit
[Sun May  5 13:00:49 CST 2024] <<<<<< run test case ddl_manager success! >>>>>>
[Pipeline] }
table mark.finish_mark_3 not exists for 38-th check, retry later
[Pipeline] // stage
[Pipeline] }
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=custom-changefeed-name
+ expected_state=stopped
+ error_msg=null
+ tls_dir=
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c custom-changefeed-name -s
+ info='{
  "upstream_id": 7365375936628896892,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "stopped",
  "checkpoint_tso": 449546877566976009,
  "checkpoint_time": "2024-05-05 13:00:44.625",
  "error": null
}'
+ echo '{
  "upstream_id": 7365375936628896892,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "stopped",
  "checkpoint_tso": 449546877566976009,
  "checkpoint_time": "2024-05-05 13:00:44.625",
  "error": null
}'
{
  "upstream_id": 7365375936628896892,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "stopped",
  "checkpoint_tso": 449546877566976009,
  "checkpoint_time": "2024-05-05 13:00:44.625",
  "error": null
}
++ echo '{' '"upstream_id":' 7365375936628896892, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"stopped",' '"checkpoint_tso":' 449546877566976009, '"checkpoint_time":' '"2024-05-05' '13:00:44.625",' '"error":' null '}'
++ jq -r .state
+ state=stopped
+ [[ ! stopped == \s\t\o\p\p\e\d ]]
++ echo '{' '"upstream_id":' 7365375936628896892, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"stopped",' '"checkpoint_tso":' 449546877566976009, '"checkpoint_time":' '"2024-05-05' '13:00:44.625",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
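The check traced above queries the changefeed with cdc cli and asserts its state and error message via jq. A compact sketch of the same assertion, assuming bash and jq; the function name is illustrative and the real scripts inline these steps:

# Query a changefeed and fail unless its state and error match expectations.
check_changefeed_state() {
    local pd=$1 id=$2 want_state=$3 want_error=$4
    local info state message
    info=$(cdc cli changefeed query --pd="$pd" -c "$id" -s)
    state=$(echo "$info" | jq -r .state)
    message=$(echo "$info" | jq -r .error.message)
    [[ "$state" == "$want_state" ]] || { echo "state=$state, want $want_state" >&2; return 1; }
    [[ "$message" =~ $want_error ]] || { echo "error=$message, want $want_error" >&2; return 1; }
}
# e.g. check_changefeed_state http://127.0.0.1:2379 custom-changefeed-name stopped null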
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.13337.out cli changefeed update --pd=http://127.0.0.1:2379,http://127.0.0.1:2679,http://127.0.0.1:2779 --config=/tmp/tidb_cdc_test/cli_with_auth/changefeed.toml --no-confirm --changefeed-id custom-changefeed-name
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
Diff of changefeed config:
{Type:update Path:[Config CaseSensitive] From:false To:true}
{Type:update Path:[Config SyncPointInterval] From:<nil> To:0xc003a90658}
{Type:update Path:[Config SyncPointRetention] From:<nil> To:0xc003a90668}
{Type:update Path:[Config Consistent] From:<nil> To:0xc0014321c0}
{Type:update Path:[Config Scheduler EnableTableAcrossNodes] From:false To:true}
Update changefeed config successfully! 
ID: custom-changefeed-name
Info: {"upstream_id":7365375936628896892,"namespace":"default","id":"custom-changefeed-name","sink_uri":"kafka://127.0.0.1:9092/ticdc-cli-test-31846?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:00:37.519711846+08:00","start_ts":449546874329497601,"admin_job_type":1,"config":{"memory_quota":1073741824,"case_sensitive":true,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_table_monitor":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","encoder_concurrency":32,"terminator":"\r\n","enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":true,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"stopped","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":0,"checkpoint_ts":449546877566976009,"checkpoint_time":"2024-05-05 13:00:44.625"}
PASS
[Pipeline] }
+ set +x
[Sun May  5 13:00:50 CST 2024] <<<<<< START kafka consumer in multi_rocks case >>>>>>
coverage: 2.8% of statements in github.com/pingcap/tiflow/...
table common_1.v1 not exists for 2-th check, retry later
***************** properties *****************
"insertproportion"="0"
"readproportion"="0"
"readallfields"="true"
"updateproportion"="0"
"recordcount"="1000"
"threadcount"="2"
"mysql.port"="4000"
"workload"="core"
"mysql.db"="multi_rocks"
"requestdistribution"="uniform"
"mysql.user"="root"
"table"="a1"
"mysql.host"="127.0.0.1"
"operationcount"="0"
"dotransactions"="false"
"scanproportion"="0"
**********************************************
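Each properties dump in this log (tables a1 through a5 of multi_rocks follow the same shape) is what go-ycsb echoes before loading data. A minimal sketch of an equivalent invocation, assuming the pingcap/go-ycsb CLI reading a property file; the file name is illustrative:

# Write the workload properties to a file and load table a1 with go-ycsb.
cat > multi_rocks_a1.properties <<'EOF'
workload=core
table=a1
recordcount=1000
operationcount=0
threadcount=2
readproportion=0
updateproportion=0
insertproportion=0
scanproportion=0
requestdistribution=uniform
dotransactions=false
readallfields=true
mysql.host=127.0.0.1
mysql.port=4000
mysql.user=root
mysql.db=multi_rocks
EOF
go-ycsb load mysql -P multi_rocks_a1.properties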
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8118540003	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:17481, start at 2024-05-05 13:00:48.536015997 +0800 CST m=+5.186661750	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:48.543 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:48.533 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:48.533 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8118f80014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:17565, start at 2024-05-05 13:00:48.618810891 +0800 CST m=+5.215333971	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:48.625 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:48.624 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:48.624 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kafka_simple_claim_check/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kafka_simple_claim_check/tiflash/log/error.log
arg matches is ArgMatches { args: {"log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kafka_simple_claim_check/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kafka_simple_claim_check/tiflash-proxy.toml"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kafka_simple_claim_check/tiflash/db/proxy"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
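The ERROR 2003 lines scattered through this log are expected: each "Verifying ... TiDB is started" step simply retries a MySQL-protocol probe until the server accepts connections. A minimal sketch of such a probe loop, assuming the mysql client is on PATH; the helper name, port default, and retry budget are illustrative:

# Probe TiDB until it answers a trivial query or the retry budget runs out.
wait_tidb_up() {
    local host=${1:-127.0.0.1} port=${2:-4000}
    local i
    for i in $(seq 1 60); do
        if mysql -h "$host" -P "$port" -u root -e 'SELECT 1' >/dev/null 2>&1; then
            return 0
        fi
        sleep 1
    done
    return 1
}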
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.13377.out cli changefeed --changefeed-id custom-changefeed-name resume
Run finished, takes 1.126635603s
INSERT - Takes(s): 1.1, Count: 1000, OPS: 898.5, Avg(us): 2221, Min(us): 740, Max(us): 586876, 95th(us): 2000, 99th(us): 2000
***************** properties *****************
"operationcount"="0"
"readallfields"="true"
"mysql.db"="multi_rocks"
"dotransactions"="false"
"scanproportion"="0"
"insertproportion"="0"
"mysql.user"="root"
"threadcount"="2"
"recordcount"="1000"
"readproportion"="0"
"requestdistribution"="uniform"
"table"="a2"
"mysql.host"="127.0.0.1"
"updateproportion"="0"
"mysql.port"="4000"
"workload"="core"
**********************************************
table mark.finish_mark_3 not exists for 39-th check, retry later
PASS
table common_1.v1 exists
table common_1.recover_and_insert not exists for 1-th check, retry later
Run finished, takes 544.560053ms
INSERT - Takes(s): 0.5, Count: 1000, OPS: 1898.2, Avg(us): 1056, Min(us): 782, Max(us): 17649, 95th(us): 2000, 99th(us): 2000
***************** properties *****************
"requestdistribution"="uniform"
"readallfields"="true"
"insertproportion"="0"
"readproportion"="0"
"mysql.db"="multi_rocks"
"table"="a3"
"dotransactions"="false"
"scanproportion"="0"
"threadcount"="2"
"updateproportion"="0"
"mysql.port"="4000"
"workload"="core"
"mysql.user"="root"
"mysql.host"="127.0.0.1"
"operationcount"="0"
"recordcount"="1000"
**********************************************
coverage: 2.1% of statements in github.com/pingcap/tiflow/...
Run finished, takes 520.072182ms
INSERT - Takes(s): 0.5, Count: 1000, OPS: 1985.4, Avg(us): 1009, Min(us): 728, Max(us): 16278, 95th(us): 2000, 99th(us): 2000
***************** properties *****************
"requestdistribution"="uniform"
"updateproportion"="0"
"scanproportion"="0"
"mysql.host"="127.0.0.1"
"threadcount"="2"
"mysql.user"="root"
"workload"="core"
"dotransactions"="false"
"readallfields"="true"
"insertproportion"="0"
"readproportion"="0"
"mysql.db"="multi_rocks"
"table"="a4"
"recordcount"="1000"
"operationcount"="0"
"mysql.port"="4000"
**********************************************
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8152340018	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q, pid:10966, start at 2024-05-05 13:00:52.268305913 +0800 CST m=+5.136113501	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:52.276 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:52.237 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:52.237 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8152340018	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q, pid:10966, start at 2024-05-05 13:00:52.268305913 +0800 CST m=+5.136113501	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:52.276 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:52.237 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:52.237 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8152dc0015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-vv6pz-b694q, pid:11052, start at 2024-05-05 13:00:52.32216826 +0800 CST m=+5.137711596	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:52.331 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:52.329 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:52.329 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/resourcecontrol/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/resourcecontrol/tiflash/log/error.log
arg matches is ArgMatches { args: {"advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/resourcecontrol/tiflash-proxy.toml"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/resourcecontrol/tiflash/log/proxy.log"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/resourcecontrol/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Run finished, takes 481.770719ms
INSERT - Takes(s): 0.5, Count: 1000, OPS: 2146.0, Avg(us): 931, Min(us): 710, Max(us): 15723, 95th(us): 2000, 99th(us): 2000
***************** properties *****************
"recordcount"="1000"
"requestdistribution"="uniform"
"operationcount"="0"
"readallfields"="true"
"scanproportion"="0"
"insertproportion"="0"
"mysql.db"="multi_rocks"
"mysql.user"="root"
"readproportion"="0"
"table"="a5"
"updateproportion"="0"
"threadcount"="2"
"mysql.port"="4000"
"dotransactions"="false"
"mysql.host"="127.0.0.1"
"workload"="core"
**********************************************
[Sun May  5 13:00:53 CST 2024] <<<<<< START cdc server in kafka_simple_claim_check case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_claim_check.1895018952.out server --log-file /tmp/tidb_cdc_test/kafka_simple_claim_check/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kafka_simple_claim_check/cdc_data --cluster-id default
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table mark.finish_mark_3 not exists for 40-th check, retry later
+ set +x
Run finished, takes 517.009994ms
INSERT - Takes(s): 0.5, Count: 999, OPS: 1995.6, Avg(us): 985, Min(us): 720, Max(us): 16306, 95th(us): 2000, 99th(us): 2000
table common_1.recover_and_insert not exists for 2-th check, retry later
table force_replicate_table.t0 exists
table force_replicate_table.t1 exists
table force_replicate_table.t2 exists
table force_replicate_table.t3 not exists for 1-th check, retry later
table multi_rocks.finish_mark not exists for 1-th check, retry later
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.resourcecontrol.cli.12489.out cli tso query --pd=http://127.0.0.1:2379
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table mark.finish_mark_3 not exists for 41-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:00:56 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/fec08838-8f67-4a5b-a007-177633aaba98
	{"id":"fec08838-8f67-4a5b-a007-177633aaba98","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885254}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47201be1d0
	fec08838-8f67-4a5b-a007-177633aaba98

/tidb/cdc/default/default/upstream/7365376015384860658
	{"id":7365376015384860658,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/fec08838-8f67-4a5b-a007-177633aaba98
	{"id":"fec08838-8f67-4a5b-a007-177633aaba98","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885254}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47201be1d0
	fec08838-8f67-4a5b-a007-177633aaba98

/tidb/cdc/default/default/upstream/7365376015384860658
	{"id":7365376015384860658,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/fec08838-8f67-4a5b-a007-177633aaba98
	{"id":"fec08838-8f67-4a5b-a007-177633aaba98","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885254}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47201be1d0
	fec08838-8f67-4a5b-a007-177633aaba98

/tidb/cdc/default/default/upstream/7365376015384860658
	{"id":7365376015384860658,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_claim_check.cli.19014.out cli tso query --pd=http://127.0.0.1:2379
table common_1.recover_and_insert not exists for 3-th check, retry later
+ set +x
+ tso='449546880479657985
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546880479657985 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
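The block above captures the current TSO with cdc cli tso query (the first field of its output) so it can be passed as --start-ts to the changefeed create that follows further down. A condensed sketch of those two steps; the sink URI and changefeed name are placeholders:

# Capture the current TSO and use it as the changefeed's start-ts.
start_ts=$(cdc cli tso query --pd=http://127.0.0.1:2379 | awk 'NR==1 {print $1}')
cdc cli changefeed create --start-ts="$start_ts" \
    -c example-changefeed \
    --sink-uri='kafka://127.0.0.1:9092/example-topic?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'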
[Sun May  5 13:00:57 CST 2024] <<<<<< START cdc server in resourcecontrol case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.resourcecontrol.1253212534.out server --log-file /tmp/tidb_cdc_test/resourcecontrol/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/resourcecontrol/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table force_replicate_table.t3 exists
table force_replicate_table.t4 not exists for 1-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c81961c001f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7, pid:30556, start at 2024-05-05 13:00:56.629138094 +0800 CST m=+5.058601523	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:56.636 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:56.633 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:56.633 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c81961c001f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7, pid:30556, start at 2024-05-05 13:00:56.629138094 +0800 CST m=+5.058601523	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:56.636 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:56.633 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:56.633 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8198480008	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7, pid:30645, start at 2024-05-05 13:00:56.730283122 +0800 CST m=+5.108321439	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:02:56.741 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:00:56.722 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:50:56.722 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
table multi_rocks.finish_mark not exists for 2-th check, retry later
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=custom-changefeed-name
+ expected_state=normal
+ error_msg=null
+ tls_dir=
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c custom-changefeed-name -s
+ info='{
  "upstream_id": 7365375936628896892,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546880712704005,
  "checkpoint_time": "2024-05-05 13:00:56.625",
  "error": null
}'
+ echo '{
  "upstream_id": 7365375936628896892,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546880712704005,
  "checkpoint_time": "2024-05-05 13:00:56.625",
  "error": null
}'
{
  "upstream_id": 7365375936628896892,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546880712704005,
  "checkpoint_time": "2024-05-05 13:00:56.625",
  "error": null
}
++ echo '{' '"upstream_id":' 7365375936628896892, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"normal",' '"checkpoint_tso":' 449546880712704005, '"checkpoint_time":' '"2024-05-05' '13:00:56.625",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365375936628896892, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"normal",' '"checkpoint_tso":' 449546880712704005, '"checkpoint_time":' '"2024-05-05' '13:00:56.625",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.13461.out cli changefeed --changefeed-id custom-changefeed-name remove
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/autorandom/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/autorandom/tiflash/log/error.log
arg matches is ArgMatches { args: {"addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/autorandom/tiflash/db/proxy"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/autorandom/tiflash-proxy.toml"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/autorandom/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table mark.finish_mark_3 not exists for 42-th check, retry later
Changefeed remove successfully.
ID: custom-changefeed-name
CheckpointTs: 449546880725811201
SinkURI: kafka://127.0.0.1:9092/ticdc-cli-test-31846?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
+ set +x
+ tso='449546880832765953
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546880832765953 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_claim_check.cli.19050.out cli changefeed create --start-ts=449546880832765953 '--sink-uri=kafka://127.0.0.1:9092/kafka-simple-claim-check-17868?protocol=simple' -c kafka-simple-claim-check --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_claim_check/conf/changefeed.toml
Create changefeed successfully!
ID: kafka-simple-claim-check
Info: {"upstream_id":7365376015384860658,"namespace":"default","id":"kafka-simple-claim-check","sink_uri":"kafka://127.0.0.1:9092/kafka-simple-claim-check-17868?protocol=simple","create_time":"2024-05-05T13:00:58.995262256+08:00","start_ts":449546880832765953,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"simple","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"kafka_config":{"large_message_handle":{"large_message_handle_option":"claim-check","large_message_handle_compression":"snappy","claim_check_storage_uri":"file:///tmp/kafka-simple-claim-check"}},"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546880832765953,"checkpoint_ts":449546880832765953,"checkpoint_time":"2024-05-05 13:00:57.083"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
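The kafka-simple-claim-check changefeed is created from conf/changefeed.toml, and the echoed JSON shows what that file enables: claim-check handling for oversized Kafka messages, snappy compression of the handle, and a local file URI as the external storage. A sketch of TOML that would yield that config, written as a heredoc to stay in shell; the key names follow the public TiCDC changefeed config format and are an assumption, the repo's actual file may differ:

# Assumed shape of conf/changefeed.toml for the claim-check case (not copied from the repo).
cat > changefeed.toml <<'EOF'
[sink.kafka-config.large-message-handle]
large-message-handle-option = "claim-check"
large-message-handle-compression = "snappy"
claim-check-storage-uri = "file:///tmp/kafka-simple-claim-check"
EOF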
table common_1.recover_and_insert exists
table common_1.finish_mark not exists for 1-th check, retry later
table force_replicate_table.t4 exists
table force_replicate_table.t5 not exists for 1-th check, retry later
table multi_rocks.finish_mark not exists for 3-th check, retry later
+ set +x
table mark.finish_mark_3 not exists for 43-th check, retry later
[Sun May  5 13:00:59 CST 2024] <<<<<< START cdc server in autorandom case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.autorandom.3209832100.out server --log-file /tmp/tidb_cdc_test/autorandom/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/autorandom/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ set +x
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:01:00 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/06f5a9c0-ee81-4777-bc70-34d2990a72f9
	{"id":"06f5a9c0-ee81-4777-bc70-34d2990a72f9","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885257}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472028c8ce
	06f5a9c0-ee81-4777-bc70-34d2990a72f9

/tidb/cdc/default/default/upstream/7365376039824108281
	{"id":7365376039824108281,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/06f5a9c0-ee81-4777-bc70-34d2990a72f9
	{"id":"06f5a9c0-ee81-4777-bc70-34d2990a72f9","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885257}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472028c8ce
	06f5a9c0-ee81-4777-bc70-34d2990a72f9

/tidb/cdc/default/default/upstream/7365376039824108281
	{"id":7365376039824108281,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/06f5a9c0-ee81-4777-bc70-34d2990a72f9
	{"id":"06f5a9c0-ee81-4777-bc70-34d2990a72f9","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885257}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472028c8ce
	06f5a9c0-ee81-4777-bc70-34d2990a72f9

/tidb/cdc/default/default/upstream/7365376039824108281
	{"id":7365376039824108281,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.resourcecontrol.cli.12585.out cli changefeed create --start-ts=449546880479657985 '--sink-uri=kafka://127.0.0.1:9092/ticdc-resourcecontrol-test-22877?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
Create changefeed successfully!
ID: 0d3fba5a-7c1d-43e2-84be-79e1b40f4f8b
Info: {"upstream_id":7365376039824108281,"namespace":"default","id":"0d3fba5a-7c1d-43e2-84be-79e1b40f4f8b","sink_uri":"kafka://127.0.0.1:9092/ticdc-resourcecontrol-test-22877?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:01:00.746037749+08:00","start_ts":449546880479657985,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546880479657985,"checkpoint_ts":449546880479657985,"checkpoint_time":"2024-05-05 13:00:55.736"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
table common_1.finish_mark not exists for 2-th check, retry later
table force_replicate_table.t5 exists
table force_replicate_table.t6 not exists for 1-th check, retry later
table multi_rocks.finish_mark not exists for 4-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/multi_source/run.sh using Sink-Type: kafka... <<=================
+ set +x
[Sun May  5 13:01:02 CST 2024] <<<<<< START kafka consumer in resourcecontrol case >>>>>>
table mark.finish_mark_3 not exists for 44-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
table common_1.finish_mark not exists for 3-th check, retry later
changefeed count 0 check pass, pd_addr: http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.13547.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-cli-test-31846?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' --tz=Asia/Shanghai -c=custom-changefeed-name
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:01:02 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/693586b8-6e40-4318-b8b0-a367cf34f9e4
	{"id":"693586b8-6e40-4318-b8b0-a367cf34f9e4","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885260}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472041f7bf
	693586b8-6e40-4318-b8b0-a367cf34f9e4

/tidb/cdc/default/default/upstream/7365376060066152888
	{"id":7365376060066152888,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/693586b8-6e40-4318-b8b0-a367cf34f9e4
	{"id":"693586b8-6e40-4318-b8b0-a367cf34f9e4","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885260}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472041f7bf
	693586b8-6e40-4318-b8b0-a367cf34f9e4

/tidb/cdc/default/default/upstream/7365376060066152888
	{"id":7365376060066152888,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/693586b8-6e40-4318-b8b0-a367cf34f9e4
	{"id":"693586b8-6e40-4318-b8b0-a367cf34f9e4","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885260}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472041f7bf
	693586b8-6e40-4318-b8b0-a367cf34f9e4

/tidb/cdc/default/default/upstream/7365376060066152888
	{"id":7365376060066152888,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
table force_replicate_table.t6 exists
check_data_subset force_replicate_table.t0 127.0.0.1 4000 127.0.0.1 3306
[WARN] --tz is deprecated in changefeed settings.
Create changefeed successfully!
ID: f19678af-ccb7-4815-b58c-6b1f5e44787a
Info: {"upstream_id":7365376060066152888,"namespace":"default","id":"f19678af-ccb7-4815-b58c-6b1f5e44787a","sink_uri":"kafka://127.0.0.1:9092/ticdc-autorandom-test-5086?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:01:03.170980186+08:00","start_ts":449546882392522754,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546882392522754,"checkpoint_ts":449546882392522754,"checkpoint_time":"2024-05-05 13:01:03.033"}
[Sun May  5 13:01:03 CST 2024] <<<<<< START kafka consumer in autorandom case >>>>>>
table autorandom_test.table_a not exists for 1-th check, retry later
table multi_rocks.finish_mark exists
check diff successfully
Create changefeed successfully!
ID: custom-changefeed-name
Info: {"upstream_id":7365375936628896892,"namespace":"default","id":"custom-changefeed-name","sink_uri":"kafka://127.0.0.1:9092/ticdc-cli-test-31846?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:01:03.390610636+08:00","start_ts":449546882455961601,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546882455961601,"checkpoint_ts":449546882455961601,"checkpoint_time":"2024-05-05 13:01:03.275"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
run task successfully
check_data_subset force_replicate_table.t1 127.0.0.1 4000 127.0.0.1 3306
wait process cdc.test exit for 1-th time...
The 1 times to try to start tidb cluster...
table mark.finish_mark_3 not exists for 45-th check, retry later
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:01:04 CST 2024] <<<<<< run test case multi_rocks success! >>>>>>
table resourcecontrol.finish_mark not exists for 1-th check, retry later
+ set +x
run task successfully
check_data_subset force_replicate_table.t2 127.0.0.1 4000 127.0.0.1 3306
run task successfully
check_data_subset force_replicate_table.t3 127.0.0.1 4000 127.0.0.1 3306
table common_1.finish_mark not exists for 4-th check, retry later
table autorandom_test.table_a not exists for 2-th check, retry later
table mark.finish_mark_3 not exists for 46-th check, retry later
run task successfully
check_data_subset force_replicate_table.t4 127.0.0.1 4000 127.0.0.1 3306
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/capture_session_done_during_task/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
run task successfully
check_data_subset force_replicate_table.t5 127.0.0.1 4000 127.0.0.1 3306
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_claim_check.cli.19101.out cli changefeed pause -c kafka-simple-claim-check
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
table common_1.finish_mark exists
check diff successfully
table resourcecontrol.finish_mark exists
check diff successfully
run task successfully
check_data_subset force_replicate_table.t6 127.0.0.1 4000 127.0.0.1 3306
id=19,a=NULL doesn't exist in downstream table force_replicate_table.t6
run task failed 1-th time, retry later
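The failing subset check above is not fatal: the harness simply retries the same check until the downstream catches up (it succeeds a few lines further on). A rough sketch of that retry pattern, assuming a hypothetical wrapper around the check_data_subset call shown in the log:

# Sketch only: keep re-running the subset check until it passes.
retry_check_subset() {
    local n=1
    until check_data_subset force_replicate_table.t6 127.0.0.1 4000 127.0.0.1 3306; do
        echo "run task failed ${n}-th time, retry later"
        n=$((n + 1))
        sleep 2
    done
    echo "run task successfully"
}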
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 1-th time...
start tidb cluster in /tmp/tidb_cdc_test/multi_source
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
start tidb cluster in /tmp/tidb_cdc_test/capture_session_done_during_task
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table autorandom_test.table_a exists
check diff successfully
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_claim_check.cli.19134.out cli changefeed update -c kafka-simple-claim-check '--sink-uri=kafka://127.0.0.1:9092/kafka-simple-claim-check-17868?protocol=simple&max-message-bytes=2048' --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_claim_check/conf/changefeed.toml --no-confirm
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=custom-changefeed-name
+ expected_state=normal
+ error_msg=null
+ tls_dir=
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c custom-changefeed-name -s
+ info='{
  "upstream_id": 7365375936628896892,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546883347251204,
  "checkpoint_time": "2024-05-05 13:01:06.675",
  "error": null
}'
+ echo '{
  "upstream_id": 7365375936628896892,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546883347251204,
  "checkpoint_time": "2024-05-05 13:01:06.675",
  "error": null
}'
{
  "upstream_id": 7365375936628896892,
  "namespace": "default",
  "id": "custom-changefeed-name",
  "state": "normal",
  "checkpoint_tso": 449546883347251204,
  "checkpoint_time": "2024-05-05 13:01:06.675",
  "error": null
}
++ echo '{' '"upstream_id":' 7365375936628896892, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"normal",' '"checkpoint_tso":' 449546883347251204, '"checkpoint_time":' '"2024-05-05' '13:01:06.675",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365375936628896892, '"namespace":' '"default",' '"id":' '"custom-changefeed-name",' '"state":' '"normal",' '"checkpoint_tso":' 449546883347251204, '"checkpoint_time":' '"2024-05-05' '13:01:06.675",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
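The block above is the changefeed health check used throughout these tests: query the changefeed over the CLI, then assert on .state and .error.message with jq. A compact sketch of the same check, with the variable names taken from the trace (the function wrapper itself is an assumption):

# Sketch: assert a changefeed reached the expected state with no unexpected error.
check_changefeed_state() {
    local endpoints=$1 changefeed_id=$2 expected_state=$3 error_msg=$4
    info=$(cdc cli changefeed query --pd="$endpoints" -c "$changefeed_id" -s)
    state=$(echo "$info" | jq -r .state)
    if [[ "$state" != "$expected_state" ]]; then
        echo "changefeed state $state does not equal expected $expected_state" >&2
        return 1
    fi
    message=$(echo "$info" | jq -r .error.message)
    if [[ ! "$message" =~ $error_msg ]]; then
        echo "error message $message does not match expected $error_msg" >&2
        return 1
    fi
}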
wait process cdc.test exit for 2-th time...
table mark.finish_mark_3 exists
table mark.finish_mark not exists for 1-th check, retry later
Diff of changefeed config:
{Type:update Path:[SinkURI] From:kafka://127.0.0.1:9092/kafka-simple-claim-check-17868?protocol=simple To:kafka://127.0.0.1:9092/kafka-simple-claim-check-17868?protocol=simple&max-message-bytes=2048}
{Type:update Path:[Config SyncPointInterval] From:<nil> To:0xc000f25768}
{Type:update Path:[Config SyncPointRetention] From:<nil> To:0xc000f25778}
{Type:update Path:[Config Consistent] From:<nil> To:0xc0011440e0}
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:01:08 CST 2024] <<<<<< run test case resourcecontrol success! >>>>>>
Update changefeed config successfully! 
ID: kafka-simple-claim-check
Info: {"upstream_id":7365376015384860658,"namespace":"default","id":"kafka-simple-claim-check","sink_uri":"kafka://127.0.0.1:9092/kafka-simple-claim-check-17868?protocol=simple\u0026max-message-bytes=2048","create_time":"2024-05-05T13:00:58.995262256+08:00","start_ts":449546880832765953,"admin_job_type":1,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_table_monitor":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"simple","encoder_concurrency":32,"terminator":"\r\n","enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"kafka_config":{"large_message_handle":{"large_message_handle_option":"claim-check","large_message_handle_compression":"snappy","claim_check_storage_uri":"file:///tmp/kafka-simple-claim-check"}},"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"stopped","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":0,"checkpoint_ts":449546883008561163,"checkpoint_time":"2024-05-05 13:01:05.383"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:01:08 CST 2024] <<<<<< run test case common_1 success! >>>>>>
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.13661.out cli changefeed create --start-ts=449546874329497601 '--sink-uri=kafka://127.0.0.1:9093/ticdc-cli-test-ssl-3491?protocol=open-protocol&ca=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem&cert=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client.pem&key=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client-key.pem&kafka-version=2.4.1&max-message-bytes=10485760&insecure-skip-verify=true' --tz=Asia/Shanghai
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:01:09 CST 2024] <<<<<< run test case autorandom success! >>>>>>
[WARN] --tz is deprecated in changefeed settings.
Create changefeed successfully!
ID: b9ec4f68-36f8-4922-ab4d-4c102bc44b25
Info: {"upstream_id":7365375936628896892,"namespace":"default","id":"b9ec4f68-36f8-4922-ab4d-4c102bc44b25","sink_uri":"kafka://127.0.0.1:9093/ticdc-cli-test-ssl-3491?protocol=open-protocol\u0026ca=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/ca.pem\u0026cert=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client.pem\u0026key=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_certificates/client-key.pem\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760\u0026insecure-skip-verify=true","create_time":"2024-05-05T13:01:09.261731964+08:00","start_ts":449546874329497601,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546874329497601,"checkpoint_ts":449546874329497601,"checkpoint_time":"2024-05-05 13:00:32.275"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_claim_check.cli.19167.out cli changefeed resume -c kafka-simple-claim-check
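Taken together, the kafka_simple_claim_check traces show the usual way to change the sink URI of a live changefeed: pause it, update it non-interactively, then resume it. Condensed from the commands above (the coverage flags are dropped and the config path is shortened to its repo-relative form for readability):

cdc cli changefeed pause -c kafka-simple-claim-check
cdc cli changefeed update -c kafka-simple-claim-check \
    --sink-uri='kafka://127.0.0.1:9092/kafka-simple-claim-check-17868?protocol=simple&max-message-bytes=2048' \
    --config=tests/integration_tests/kafka_simple_claim_check/conf/changefeed.toml \
    --no-confirm
cdc cli changefeed resume -c kafka-simple-claim-check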
check_data_subset force_replicate_table.t6 127.0.0.1 4000 127.0.0.1 3306
table mark.finish_mark exists
run task successfully
Verifying downstream PD is started...
PASS
check diff successfully
coverage: 2.1% of statements in github.com/pingcap/tiflow/...
wait process cdc.test exit for 1-th time...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.13707.out cli unsafe delete-service-gc-safepoint
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Confirm that you know what this command will do and use it at your own risk [Y/N]
CDC service GC safepoint truncated in PD!
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:01:11 CST 2024] <<<<<< run test case force_replicate_table success! >>>>>>
wait process cdc.test exit for 3-th time...
+ set +x
table test.finish_mark not exists for 1-th check, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.13744.out cli unsafe reset --no-confirm --pd=http://127.0.0.1:2379
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 13:01:12 CST 2024] <<<<<< run test case default_value success! >>>>>>
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
reset and all metadata truncated in PD!
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
TEST FAILED: OUTPUT DOES NOT CONTAIN 'id: 1'
____________________________________
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
check data failed 1-th time, retry later
check data successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:01:11 CST 2024] <<<<<< run test case ddl_puller_lag success! >>>>>>
+ set +x
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 2-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark exists
check diff successfully
wait process cdc.test exit for 1-th time...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
wait process cdc.test exit for 2-th time...
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.13858.out cli unsafe resolve-lock --region=92
wait process cdc.test exit for 3-th time...
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 13:01:18 CST 2024] <<<<<< run test case kafka_simple_claim_check success! >>>>>>
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c82dae00017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85, pid:7460, start at 2024-05-05 13:01:17.40732052 +0800 CST m=+5.091189374	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:17.413 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:17.419 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:17.419 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c82dae00017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85, pid:7460, start at 2024-05-05 13:01:17.40732052 +0800 CST m=+5.091189374	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:17.413 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:17.419 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:17.419 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c82db800015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l0fvq-x3d85, pid:7538, start at 2024-05-05 13:01:17.445255215 +0800 CST m=+5.077003929	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:17.451 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:17.408 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:17.408 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/multi_source/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/multi_source/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/multi_source/tiflash/db/proxy"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/multi_source/tiflash/log/proxy.log"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/multi_source/tiflash-proxy.toml"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c82e4040013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s, pid:21270, start at 2024-05-05 13:01:17.997256055 +0800 CST m=+5.220837496	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:18.003 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:18.002 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:18.002 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cli_with_auth.cli.13888.out cli unsafe resolve-lock --region=92 --ts=449546885365760004
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_source.cli.8918.out cli tso query --pd=http://127.0.0.1:2379
[Pipeline] // dir
<<< Run all test success >>>
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // timeout
[Pipeline] // cache
[Pipeline] }
[Pipeline] }
[Pipeline] // stage
+ set +x
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   218  100   212  100     6   178k   5159 --:--:-- --:--:-- --:--:--  207k
{
    "error_msg": "[CDC:ErrAPIInvalidParam]invalid log level: json: cannot unmarshal string into Go value of type struct { Level string \"json:\\\"log_level\\\"\" }",
    "error_code": "CDC:ErrAPIInvalidParam"
[Pipeline] // dir
[Pipeline] }
[Pipeline] }
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/generate_column/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
[Pipeline] // container
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] }
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c82e4040013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s, pid:21270, start at 2024-05-05 13:01:17.997256055 +0800 CST m=+5.220837496	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:18.003 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:18.002 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:18.002 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c82e63c000d	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s, pid:21361, start at 2024-05-05 13:01:18.108669526 +0800 CST m=+5.278445905	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:18.114 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:18.095 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:18.095 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/capture_session_done_during_task/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/capture_session_done_during_task/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/capture_session_done_during_task/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/capture_session_done_during_task/tiflash/log/proxy.log"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/capture_session_done_during_task/tiflash-proxy.toml"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[Pipeline] // withEnv
[Pipeline] // timeout
[Pipeline] }
[Pipeline] }
[Pipeline] // node
[Pipeline] // stage
[Pipeline] }
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] // container
[Pipeline] }
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] }
[Pipeline] // stage
[Pipeline] // node
[Pipeline] }
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
+ set +x
+ tso='449546887081230337
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546887081230337 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
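The trace above is how each test derives a changefeed start-ts: cli tso query prints the TSO followed by the usual PASS/coverage lines, and awk keeps only the first field. A minimal sketch of the same extraction (the coverage-profile path here is illustrative):

# Sketch: fetch a TSO from PD and keep only the timestamp.
tso_output=$(cdc.test -test.coverprofile=/tmp/cov.cli.out cli tso query --pd=http://127.0.0.1:2379)
start_ts=$(echo $tso_output | awk -F ' ' '{print $1}')   # unquoted echo flattens the lines first
echo "using start-ts ${start_ts}"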
[Sun May  5 13:01:22 CST 2024] <<<<<< START cdc server in multi_source case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_source.89578959.out server --log-file /tmp/tidb_cdc_test/multi_source/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/multi_source/cdc_data --cluster-id default
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.capture_session_done_during_task.cli.22717.out cli tso query --pd=http://127.0.0.1:2379
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/ddl_only_block_related_table/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
}
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   182  100   182    0     0   207k      0 --:--:-- --:--:-- --:--:--  177k
{
 "version": "v8.2.0-alpha-53-g0de8dc3e4",
 "git_hash": "0de8dc3e43ec741eba58047155ce7f3dba8eb4f7",
 "id": "ec989175-8ae9-40d4-81a8-9b601997c667",
 "pid": 12826,
 "is_owner": true
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
}
wait process cdc.test exit for 1-th time...
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
wait process cdc.test exit for 2-th time...
+ set +x
+ tso='449546887837253636
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ awk -F ' ' '{print $1}'
+ echo 449546887837253636 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ set +x
[Sun May  5 13:01:25 CST 2024] <<<<<< START cdc server in capture_session_done_during_task case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/processor/processorManagerHandleNewChangefeedDelay=sleep(2000)'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.capture_session_done_during_task.2276222764.out server --log-file /tmp/tidb_cdc_test/capture_session_done_during_task/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/capture_session_done_during_task/cdc_data --cluster-id default --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
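Unlike the multi_source server above, this one is started with GO_FAILPOINTS set, enabling the processorManagerHandleNewChangefeedDelay failpoint so that handling of a new changefeed is delayed (sleep(2000), roughly two seconds) while the test tears down the capture session mid-task. The general pattern, condensed from the command in the trace (coverage and logging flags dropped):

# Enable a single failpoint for this process via the environment;
# the value uses failpoint term syntax: sleep whenever the failpoint is hit.
GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/processor/processorManagerHandleNewChangefeedDelay=sleep(2000)' \
    cdc.test server --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379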
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:01:25 CST 2024] <<<<<< run test case cli_with_auth success! >>>>>>
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:01:25 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/39159d0e-56ab-4c32-89df-38d2d0a069dc
	{"id":"39159d0e-56ab-4c32-89df-38d2d0a069dc","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885282}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47208f0fd0
	39159d0e-56ab-4c32-89df-38d2d0a069dc

/tidb/cdc/default/default/upstream/7365376149367901045
	{"id":7365376149367901045,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/39159d0e-56ab-4c32-89df-38d2d0a069dc
	{"id":"39159d0e-56ab-4c32-89df-38d2d0a069dc","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885282}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47208f0fd0
	39159d0e-56ab-4c32-89df-38d2d0a069dc

/tidb/cdc/default/default/upstream/7365376149367901045
	{"id":7365376149367901045,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/39159d0e-56ab-4c32-89df-38d2d0a069dc
	{"id":"39159d0e-56ab-4c32-89df-38d2d0a069dc","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885282}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47208f0fd0
	39159d0e-56ab-4c32-89df-38d2d0a069dc

/tidb/cdc/default/default/upstream/7365376149367901045
	{"id":7365376149367901045,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_source.cli.9007.out cli changefeed create --start-ts=449546887081230337 '--sink-uri=kafka://127.0.0.1:9092/ticdc-multi-source-test-20377?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
Create changefeed successfully!
ID: ced66804-73cc-4fb9-9984-49a3004831d5
Info: {"upstream_id":7365376149367901045,"namespace":"default","id":"ced66804-73cc-4fb9-9984-49a3004831d5","sink_uri":"kafka://127.0.0.1:9092/ticdc-multi-source-test-20377?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:01:25.920862578+08:00","start_ts":449546887081230337,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546887081230337,"checkpoint_ts":449546887081230337,"checkpoint_time":"2024-05-05 13:01:20.919"}
PASS
start tidb cluster in /tmp/tidb_cdc_test/ddl_only_block_related_table
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/simple/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
+ set +x
[Sun May  5 13:01:27 CST 2024] <<<<<< START kafka consumer in multi_source case >>>>>>
go: downloading github.com/pingcap/log v1.1.1-0.20240314023424-862ccc32f18d
go: downloading github.com/pingcap/errors v0.11.5-0.20240318064555-6bd07397691f
go: downloading go.uber.org/zap v1.27.0
go: downloading github.com/pingcap/tidb-tools v0.0.0-20240305021104-9f9bea84490b
go: downloading github.com/BurntSushi/toml v1.3.2
go: downloading github.com/pingcap/tidb v1.1.0-beta.0.20240415145106-cd9c676e9ba4
go: downloading go.uber.org/atomic v1.11.0
go: downloading gopkg.in/natefinch/lumberjack.v2 v2.2.1
go: downloading go.uber.org/multierr v1.11.0
go: downloading github.com/pingcap/failpoint v0.0.0-20220801062533-2eaa32854a6c
go: downloading github.com/pingcap/tidb/pkg/parser v0.0.0-20240410110152-5fc42c9be2f5
go: downloading google.golang.org/grpc v1.62.1
go: downloading github.com/go-sql-driver/mysql v1.7.1
go: downloading github.com/coreos/go-semver v0.3.1
start tidb cluster in /tmp/tidb_cdc_test/generate_column
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
go: downloading github.com/golang/protobuf v1.5.4
go: downloading golang.org/x/net v0.24.0
go: downloading google.golang.org/protobuf v1.33.0
go: downloading golang.org/x/sys v0.19.0
go: downloading google.golang.org/genproto/googleapis/rpc v0.0.0-20240401170217-c3f982113cda
go: downloading google.golang.org/genproto v0.0.0-20240401170217-c3f982113cda
go: downloading golang.org/x/text v0.14.0
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:01:28 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/bab3891d-0765-4147-afab-02713c410aa0
	{"id":"bab3891d-0765-4147-afab-02713c410aa0","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885285}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472090bff2
	bab3891d-0765-4147-afab-02713c410aa0

/tidb/cdc/default/default/upstream/7365376144250551570
	{"id":7365376144250551570,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/bab3891d-0765-4147-afab-02713c410aa0
	{"id":"bab3891d-0765-4147-afab-02713c410aa0","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885285}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472090bff2
	bab3891d-0765-4147-afab-02713c410aa0

/tidb/cdc/default/default/upstream/7365376144250551570
	{"id":7365376144250551570,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/bab3891d-0765-4147-afab-02713c410aa0
	{"id":"bab3891d-0765-4147-afab-02713c410aa0","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885285}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472090bff2
	bab3891d-0765-4147-afab-02713c410aa0

/tidb/cdc/default/default/upstream/7365376144250551570
	{"id":7365376144250551570,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
Verifying downstream PD is started...
start tidb cluster in /tmp/tidb_cdc_test/simple
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
[Sun May  5 13:01:29 CST 2024] <<<<<< START kafka consumer in capture_session_done_during_task case >>>>>>
lease 22318f472090bff2 revoked
go: downloading github.com/cznic/mathutil v0.0.0-20181122101859-297441e03548
go: downloading golang.org/x/exp v0.0.0-20240409090435-93d18d7e34b8
go: downloading golang.org/x/sync v0.7.0
go: downloading github.com/pingcap/tipb v0.0.0-20240318032315-55a7867ddd50
go: downloading github.com/influxdata/tdigest v0.0.1
go: downloading github.com/tikv/client-go/v2 v2.0.8-0.20240409022718-714958ccd4d5
go: downloading github.com/tiancaiamao/gp v0.0.0-20221230034425-4025bc8a4d4a
go: downloading github.com/danjacques/gofslock v0.0.0-20240212154529-d899e02bfe22
go: downloading github.com/tikv/pd/client v0.0.0-20240322051414-fb9e2d561b6e
go: downloading go.etcd.io/etcd/client/v3 v3.5.12
go: downloading github.com/pingcap/kvproto v0.0.0-20240227073058-929ab83f9754
go: downloading github.com/docker/go-units v0.5.0
go: downloading gopkg.in/yaml.v2 v2.4.0
go: downloading github.com/pingcap/sysutil v1.0.1-0.20240311050922-ae81ee01f3a5
go: downloading github.com/coocood/freecache v1.2.1
go: downloading github.com/grpc-ecosystem/go-grpc-middleware v1.4.0
go: downloading github.com/jellydator/ttlcache/v3 v3.0.1
go: downloading github.com/ngaut/pools v0.0.0-20180318154953-b7bc8c42aac7
go: downloading github.com/prometheus/client_model v0.6.1
go: downloading github.com/spf13/pflag v1.0.5
go: downloading github.com/prometheus/client_golang v1.19.0
go: downloading github.com/opentracing/opentracing-go v1.2.0
go: downloading github.com/google/uuid v1.6.0
go: downloading github.com/opentracing/basictracer-go v1.1.0
go: downloading github.com/shirou/gopsutil/v3 v3.24.2
go: downloading github.com/cockroachdb/errors v1.11.1
go: downloading github.com/uber/jaeger-client-go v2.30.0+incompatible
go: downloading cloud.google.com/go/storage v1.39.1
go: downloading github.com/gorilla/mux v1.8.0
go: downloading github.com/Azure/azure-sdk-for-go/sdk/azcore v1.9.1
go: downloading github.com/stretchr/testify v1.9.0
go: downloading github.com/yangkeao/ldap/v3 v3.4.5-0.20230421065457-369a3bab1117
go: downloading github.com/Azure/azure-sdk-for-go/sdk/azidentity v1.5.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/storage/azblob v1.0.0
go: downloading github.com/scalalang2/golang-fifo v0.1.5
go: downloading github.com/aliyun/alibaba-cloud-sdk-go v1.61.1581
go: downloading github.com/aws/aws-sdk-go v1.50.0
go: downloading github.com/go-resty/resty/v2 v2.11.0
go: downloading github.com/klauspost/compress v1.17.8
go: downloading github.com/ks3sdklib/aws-sdk-go v1.2.9
go: downloading github.com/tikv/pd v1.1.0-beta.0.20240407022249-7179657d129b
go: downloading golang.org/x/oauth2 v0.18.0
go: downloading google.golang.org/api v0.170.0
go: downloading github.com/twmb/murmur3 v1.1.6
go: downloading github.com/tidwall/btree v1.7.0
go: downloading go.etcd.io/etcd/api/v3 v3.5.12
go: downloading github.com/gogo/protobuf v1.3.2
go: downloading golang.org/x/tools v0.20.0
go: downloading github.com/google/btree v1.1.2
go: downloading github.com/cespare/xxhash/v2 v2.3.0
go: downloading go.uber.org/mock v0.4.0
go: downloading github.com/ngaut/sync2 v0.0.0-20141008032647-7a24ed77b2ef
go: downloading cloud.google.com/go v0.112.2
go: downloading github.com/Azure/go-ntlmssp v0.0.0-20221128193559-754e69321358
go: downloading github.com/go-asn1-ber/asn1-ber v1.5.4
go: downloading github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec
go: downloading github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc
go: downloading github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2
go: downloading gopkg.in/yaml.v3 v3.0.1
go: downloading github.com/Azure/azure-sdk-for-go/sdk/internal v1.5.1
go: downloading github.com/AzureAD/microsoft-authentication-library-for-go v1.2.1
go: downloading golang.org/x/crypto v0.22.0
go: downloading go.etcd.io/etcd/client/pkg/v3 v3.5.12
go: downloading github.com/dgraph-io/ristretto v0.1.1
go: downloading github.com/lestrrat-go/jwx/v2 v2.0.21
go: downloading golang.org/x/time v0.5.0
go: downloading github.com/dolthub/swiss v0.2.1
go: downloading github.com/golang/snappy v0.0.4
go: downloading github.com/joho/sqltocsv v0.0.0-20210428211105-a6d6801d59df
go: downloading github.com/carlmjohnson/flagext v0.21.0
go: downloading github.com/jedib0t/go-pretty/v6 v6.2.2
go: downloading github.com/cockroachdb/pebble v1.1.0
go: downloading github.com/jfcg/sorty/v2 v2.1.0
go: downloading github.com/beorn7/perks v1.0.1
go: downloading github.com/prometheus/common v0.52.2
go: downloading github.com/prometheus/procfs v0.13.0
go: downloading github.com/cloudfoundry/gosigar v1.3.6
go: downloading cloud.google.com/go/compute/metadata v0.2.3
go: downloading github.com/pkg/errors v0.9.1
go: downloading github.com/uber/jaeger-lib v2.4.1+incompatible
go: downloading github.com/cockroachdb/logtags v0.0.0-20230118201751-21c54148d20b
go: downloading github.com/cockroachdb/redact v1.1.5
go: downloading github.com/getsentry/sentry-go v0.27.0
go: downloading github.com/tklauser/go-sysconf v0.3.12
go: downloading cloud.google.com/go/compute v1.25.1
go: downloading google.golang.org/genproto/googleapis/api v0.0.0-20240401170217-c3f982113cda
go: downloading github.com/dgryski/go-farm v0.0.0-20200201041132-a6ae2369ad13
go: downloading github.com/cheggaaa/pb/v3 v3.0.8
go: downloading github.com/otiai10/copy v1.2.0
go: downloading github.com/google/pprof v0.0.0-20240117000934-35fc243c5815
go: downloading github.com/robfig/cron/v3 v3.0.1
go: downloading github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2
go: downloading github.com/spkg/bom v1.0.0
go: downloading github.com/robfig/cron v1.2.0
go: downloading github.com/xitongsys/parquet-go v1.6.0
go: downloading github.com/dolthub/maphash v0.1.0
go: downloading github.com/wangjohn/quickselect v0.0.0-20161129230411-ed8402a42d5f
go: downloading github.com/kr/pretty v0.3.1
go: downloading github.com/jfcg/sixb v1.3.8
go: downloading github.com/coreos/go-systemd/v22 v22.5.0
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
go: downloading cloud.google.com/go/iam v1.1.7
go: downloading github.com/googleapis/gax-go/v2 v2.12.3
go: downloading github.com/VividCortex/ewma v1.2.0
go: downloading github.com/fatih/color v1.16.0
go: downloading github.com/mattn/go-colorable v0.1.13
go: downloading github.com/mattn/go-isatty v0.0.20
go: downloading github.com/mattn/go-runewidth v0.0.15
go: downloading github.com/pingcap/badger v1.5.1-0.20230103063557-828f39b09b6d
go: downloading github.com/pingcap/goleveldb v0.0.0-20191226122134-f82aafb29989
go: downloading github.com/kylelemons/godebug v1.1.0
go: downloading github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c
go: downloading go.opencensus.io v0.23.1-0.20220331163232-052120675fac
go: downloading go.opentelemetry.io/otel v1.24.0
go: downloading go.opentelemetry.io/otel/trace v1.24.0
go: downloading github.com/tklauser/numcpus v0.6.1
go: downloading github.com/kr/text v0.2.0
go: downloading github.com/rogpeppe/go-internal v1.12.0
go: downloading github.com/rivo/uniseg v0.4.7
go: downloading github.com/golang-jwt/jwt/v5 v5.2.0
go: downloading github.com/apache/thrift v0.16.0
go: downloading github.com/dustin/go-humanize v1.0.1
go: downloading github.com/golang/glog v1.2.0
go: downloading github.com/lestrrat-go/blackmagic v1.0.2
go: downloading github.com/lestrrat-go/httprc v1.0.5
go: downloading github.com/lestrrat-go/iter v1.0.2
go: downloading github.com/lestrrat-go/option v1.0.1
go: downloading github.com/lestrrat-go/httpcc v1.0.1
go: downloading github.com/golang-jwt/jwt v3.2.2+incompatible
go: downloading github.com/ncw/directio v1.0.5
go: downloading github.com/klauspost/cpuid v1.3.1
go: downloading github.com/coocood/bbloom v0.0.0-20190830030839-58deb6228d64
go: downloading github.com/coocood/rtutil v0.0.0-20190304133409-c84515f646f2
go: downloading github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
go: downloading go.opentelemetry.io/otel/metric v1.24.0
go: downloading github.com/go-logr/logr v1.4.1
go: downloading github.com/go-logr/stdr v1.2.2
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_claim_check/run.sh: line 1: 19216 Killed                  cdc_kafka_consumer --upstream-uri $SINK_URI --downstream-uri="mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" --upstream-tidb-dsn="root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" --config="$CUR/conf/changefeed.toml" 2>&1
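For readability, the killed cdc_kafka_consumer invocation above, reflowed onto one flag per line (values exactly as logged; $SINK_URI, $UP_TIDB_HOST, $UP_TIDB_PORT and $CUR are variables of the test script and are left unexpanded here):
# Reflow of the logged command only; nothing added or changed.
cdc_kafka_consumer \
  --upstream-uri "$SINK_URI" \
  --downstream-uri "mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" \
  --upstream-tidb-dsn "root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" \
  --config "$CUR/conf/changefeed.toml" 2>&1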
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_claim_check_avro/run.sh using Sink-Type: kafka... <<=================
Attempt 1 to start the tidb cluster...
go: downloading github.com/cockroachdb/tokenbucket v0.0.0-20230807174530-cc333fc44b06
go: downloading github.com/DataDog/zstd v1.5.5
table capture_session_done_during_task.t exists
check diff failed 1-th time, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
start tidb cluster in /tmp/tidb_cdc_test/kafka_simple_claim_check_avro
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
check diff failed 2-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
go: downloading go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.49.0
go: downloading github.com/google/s2a-go v0.1.7
go: downloading github.com/googleapis/enterprise-certificate-proxy v0.3.2
go: downloading go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc v0.49.0
go: downloading github.com/felixge/httpsnoop v1.0.4
go: downloading github.com/jmespath/go-jmespath v0.4.0
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
go: downloading github.com/modern-go/reflect2 v1.0.2
go: downloading github.com/json-iterator/go v1.1.12
go: downloading github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd
Verifying downstream PD is started...
check diff failed 3-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
check diff successfully
check diff failed 1-th time, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/savepoint/run.sh using Sink-Type: kafka... <<=================
Attempt 1 to start the tidb cluster...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8408a40014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km, pid:7068, start at 2024-05-05 13:01:36.71769968 +0800 CST m=+5.099416170	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:36.726 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:36.731 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:36.731 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8408a40014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km, pid:7068, start at 2024-05-05 13:01:36.71769968 +0800 CST m=+5.099416170	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:36.726 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:36.731 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:36.731 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c840a080015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km, pid:7145, start at 2024-05-05 13:01:36.802310595 +0800 CST m=+5.128189034	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:36.810 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:36.770 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:36.770 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
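The VARIABLE_NAME / VARIABLE_VALUE / COMMENT dumps above are printed while the harness confirms that the upstream and downstream TiDB instances have bootstrapped; the rows (bootstrapped, tidb_server_version, tikv_gc_*) come from TiDB's mysql.tidb system table. A minimal, hedged sketch of such a readiness poll, assuming the standard mysql client and an illustrative address that is not taken from this run:
# Hedged sketch: wait until TiDB reports itself bootstrapped in mysql.tidb.
# 127.0.0.1:4000 is an assumed address for illustration only.
while ! mysql -h 127.0.0.1 -P 4000 -u root -e 'SELECT * FROM mysql.tidb;' 2>/dev/null | grep -q '^bootstrapped.*True'; do
  echo "TiDB not ready yet, retrying..."
  sleep 1
done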
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/ddl_only_block_related_table/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/ddl_only_block_related_table/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/ddl_only_block_related_table/tiflash/log/proxy.log"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/ddl_only_block_related_table/tiflash-proxy.toml"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/ddl_only_block_related_table/tiflash/db/proxy"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c840fec0005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7, pid:33309, start at 2024-05-05 13:01:37.148595924 +0800 CST m=+5.031431478	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:37.156 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:37.147 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:37.147 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c840fec0005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7, pid:33309, start at 2024-05-05 13:01:37.148595924 +0800 CST m=+5.031431478	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:37.156 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:37.147 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:37.147 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c841228000d	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-jpkvb-xcql7, pid:33397, start at 2024-05-05 13:01:37.304537039 +0800 CST m=+5.124171604	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:37.313 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:37.290 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:37.290 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/generate_column/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/generate_column/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/generate_column/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/generate_column/tiflash-proxy.toml"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/generate_column/tiflash/log/proxy.log"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff successfully
wait process cdc.test exit for 1-th time...
[Sun May  5 13:01:39 CST 2024] <<<<<< START cdc server in ddl_only_block_related_table case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_only_block_related_table.85978599.out server --log-file /tmp/tidb_cdc_test/ddl_only_block_related_table/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/ddl_only_block_related_table/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.generate_column.cli.34845.out cli tso query --pd=http://127.0.0.1:2379
start tidb cluster in /tmp/tidb_cdc_test/savepoint
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
wait process cdc.test exit for 2-th time...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c842fb00005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:21852, start at 2024-05-05 13:01:39.183356886 +0800 CST m=+5.064863195	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:39.191 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:39.180 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:39.180 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c842fb00005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:21852, start at 2024-05-05 13:01:39.183356886 +0800 CST m=+5.064863195	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:39.191 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:39.180 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:39.180 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8431300014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:21923, start at 2024-05-05 13:01:39.298041111 +0800 CST m=+5.118647260	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:39.304 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:39.276 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:39.276 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/simple/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/simple/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/simple/tiflash-proxy.toml"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/simple/tiflash/log/proxy.log"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/simple/tiflash/db/proxy"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:01:40 CST 2024] <<<<<< run test case capture_session_done_during_task success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
+ tso='449546892279021569
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546892279021569 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
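The set -x trace above shows the harness querying PD for a current TSO and keeping only the first field with awk; that value reappears below as the --start-ts of the generate_column changefeed. A condensed, hedged sketch of the same idea (plain cdc binary instead of the coverage-instrumented cdc.test, placeholder topic name):
# Hedged sketch: take the current TSO as the changefeed's starting point.
start_ts=$(cdc cli tso query --pd=http://127.0.0.1:2379 | awk 'NR==1 {print $1}')
cdc cli changefeed create \
  --pd=http://127.0.0.1:2379 \
  --start-ts="$start_ts" \
  --sink-uri="kafka://127.0.0.1:9092/<topic>?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760"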
[Sun May  5 13:01:42 CST 2024] <<<<<< START cdc server in generate_column case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.generate_column.3488634888.out server --log-file /tmp/tidb_cdc_test/generate_column/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/generate_column/cdc_data --cluster-id default
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.simple.cli.23320.out cli tso query --pd=http://127.0.0.1:2379
[2024/05/05 13:01:33.614 +08:00] [INFO] [case.go:115] ["sync updatePKUK take: 12.316065409s"]
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:01:43 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/786ebb71-5966-45df-b467-f10887201476
	{"id":"786ebb71-5966-45df-b467-f10887201476","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885300}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720d887c5
	786ebb71-5966-45df-b467-f10887201476

/tidb/cdc/default/default/upstream/7365376232154429504
	{"id":7365376232154429504,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/786ebb71-5966-45df-b467-f10887201476
	{"id":"786ebb71-5966-45df-b467-f10887201476","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885300}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720d887c5
	786ebb71-5966-45df-b467-f10887201476

/tidb/cdc/default/default/upstream/7365376232154429504
	{"id":7365376232154429504,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/786ebb71-5966-45df-b467-f10887201476
	{"id":"786ebb71-5966-45df-b467-f10887201476","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885300}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720d887c5
	786ebb71-5966-45df-b467-f10887201476

/tidb/cdc/default/default/upstream/7365376232154429504
	{"id":7365376232154429504,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
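The trace above is the readiness poll the harness runs after launching the cdc server: it keeps curling http://127.0.0.1:8300/debug/info with the ticdc:ticdc_secret basic-auth pair until the response contains 'etcd info' rather than 'failed to get info:'. Condensed into a standalone sketch (same endpoint and match strings as in the trace; the 50-attempt, 3-second cadence mirrors the counters visible above):
# Hedged sketch of the cdc-server readiness poll traced above.
for i in $(seq 0 50); do
  res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret 2>&1)
  if ! echo "$res" | grep -q 'failed to get info:' && echo "$res" | grep -q 'etcd info'; then
    echo "cdc server is ready"
    break
  fi
  sleep 3
done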
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_only_block_related_table.cli.8656.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' -c=ddl-only-block-related-table
Create changefeed successfully!
ID: ddl-only-block-related-table
Info: {"upstream_id":7365376232154429504,"namespace":"default","id":"ddl-only-block-related-table","sink_uri":"kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:01:43.516485982+08:00","start_ts":449546892969508868,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546892969508868,"checkpoint_ts":449546892969508868,"checkpoint_time":"2024-05-05 13:01:43.381"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
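Once a changefeed has been created, as above, its state (the same JSON shown under Info, plus checkpoint progress) can normally be inspected with the cli; a hedged one-liner using the changefeed id from this case, and the plain cdc binary rather than the instrumented cdc.test:
cdc cli changefeed query -c ddl-only-block-related-table --pd=http://127.0.0.1:2379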
+ set +x
+ tso='449546892798853121
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546892798853121 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c847a980017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:20417, start at 2024-05-05 13:01:44.012503236 +0800 CST m=+5.131709654	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:44.018 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:44.023 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:44.023 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c847a980017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:20417, start at 2024-05-05 13:01:44.012503236 +0800 CST m=+5.131709654	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:44.018 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:44.023 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:44.023 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c847bec0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:20497, start at 2024-05-05 13:01:44.084361073 +0800 CST m=+5.156034311	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:44.090 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:44.059 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:44.059 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
+ set +x
[Sun May  5 13:01:44 CST 2024] <<<<<< START kafka consumer in ddl_only_block_related_table case >>>>>>
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/kafka_simple_claim_check_avro/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/kafka_simple_claim_check_avro/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/kafka_simple_claim_check_avro/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/kafka_simple_claim_check_avro/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/kafka_simple_claim_check_avro/tiflash/log/proxy.log"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[Sun May  5 13:01:45 CST 2024] <<<<<< START cdc server in simple case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.simple.2337423376.out server --log-file /tmp/tidb_cdc_test/simple/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/simple/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table ddl_only_block_related_table.finish_mark not exists for 1-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:01:45 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b786a6d6-2608-47a7-8732-881f3432ec66
	{"id":"b786a6d6-2608-47a7-8732-881f3432ec66","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885302}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720de45d0
	b786a6d6-2608-47a7-8732-881f3432ec66

/tidb/cdc/default/default/upstream/7365376226046064331
	{"id":7365376226046064331,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b786a6d6-2608-47a7-8732-881f3432ec66
	{"id":"b786a6d6-2608-47a7-8732-881f3432ec66","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885302}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720de45d0
	b786a6d6-2608-47a7-8732-881f3432ec66

/tidb/cdc/default/default/upstream/7365376226046064331
	{"id":7365376226046064331,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b786a6d6-2608-47a7-8732-881f3432ec66
	{"id":"b786a6d6-2608-47a7-8732-881f3432ec66","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885302}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720de45d0
	b786a6d6-2608-47a7-8732-881f3432ec66

/tidb/cdc/default/default/upstream/7365376226046064331
	{"id":7365376226046064331,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.generate_column.cli.34939.out cli changefeed create --start-ts=449546892279021569 '--sink-uri=kafka://127.0.0.1:9092/ticdc-generate-column-test-31844?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
Create changefeed successfully!
ID: 1509db9e-844a-47f1-bba6-74a57a8c54ea
Info: {"upstream_id":7365376226046064331,"namespace":"default","id":"1509db9e-844a-47f1-bba6-74a57a8c54ea","sink_uri":"kafka://127.0.0.1:9092/ticdc-generate-column-test-31844?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:01:45.757264947+08:00","start_ts":449546892279021569,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546892279021569,"checkpoint_ts":449546892279021569,"checkpoint_time":"2024-05-05 13:01:40.747"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[Sun May  5 13:01:47 CST 2024] <<<<<< START cdc server in kafka_simple_claim_check_avro case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_claim_check_avro.2183521837.out server --log-file /tmp/tidb_cdc_test/kafka_simple_claim_check_avro/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/kafka_simple_claim_check_avro/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ set +x
[Sun May  5 13:01:47 CST 2024] <<<<<< START kafka consumer in generate_column case >>>>>>
table generate_column.t not exists for 1-th check, retry later
table ddl_only_block_related_table.finish_mark not exists for 2-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:01:48 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/efb76ece-9826-4679-bf43-074ba9980162
	{"id":"efb76ece-9826-4679-bf43-074ba9980162","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885305}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720e441f0
	efb76ece-9826-4679-bf43-074ba9980162

/tidb/cdc/default/default/upstream/7365376244924962132
	{"id":7365376244924962132,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/efb76ece-9826-4679-bf43-074ba9980162
	{"id":"efb76ece-9826-4679-bf43-074ba9980162","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885305}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720e441f0
	efb76ece-9826-4679-bf43-074ba9980162

/tidb/cdc/default/default/upstream/7365376244924962132
	{"id":7365376244924962132,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/efb76ece-9826-4679-bf43-074ba9980162
	{"id":"efb76ece-9826-4679-bf43-074ba9980162","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885305}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720e441f0
	efb76ece-9826-4679-bf43-074ba9980162

/tidb/cdc/default/default/upstream/7365376244924962132
	{"id":7365376244924962132,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
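The `Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0` header in these probes is simply the base64 encoding of the ticdc:ticdc_secret credentials supplied through curl's --user flag, which can be confirmed with:

# Decoding the header value recovers the user:password pair used by --user.
echo -n 'dGljZGM6dGljZGNfc2VjcmV0' | base64 -d   # prints: ticdc:ticdc_secret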
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.simple.cli.23437.out cli changefeed create --start-ts=449546892798853121 '--sink-uri=kafka+ssl://127.0.0.1:9092/ticdc-simple-test-11292?protocol=open-protocol&partition-num=4&kafka-client-id=cdc_test_simple&kafka-version=2.4.1&max-message-bytes=10485760'
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Create changefeed successfully!
ID: 34170955-66ba-42a9-965a-aa845cf1be35
Info: {"upstream_id":7365376244924962132,"namespace":"default","id":"34170955-66ba-42a9-965a-aa845cf1be35","sink_uri":"kafka+ssl://127.0.0.1:9092/ticdc-simple-test-11292?protocol=open-protocol\u0026partition-num=4\u0026kafka-client-id=cdc_test_simple\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:01:48.815295787+08:00","start_ts":449546892798853121,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546892798853121,"checkpoint_ts":449546892798853121,"checkpoint_time":"2024-05-05 13:01:42.730"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
table generate_column.t not exists for 2-th check, retry later
table ddl_only_block_related_table.finish_mark not exists for 3-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:01:50 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/f5478032-ad29-4c72-ade5-a64c81700bcd
	{"id":"f5478032-ad29-4c72-ade5-a64c81700bcd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885307}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720f2dfca
	f5478032-ad29-4c72-ade5-a64c81700bcd

/tidb/cdc/default/default/upstream/7365376252425115831
	{"id":7365376252425115831,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/f5478032-ad29-4c72-ade5-a64c81700bcd
	{"id":"f5478032-ad29-4c72-ade5-a64c81700bcd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885307}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720f2dfca
	f5478032-ad29-4c72-ade5-a64c81700bcd

/tidb/cdc/default/default/upstream/7365376252425115831
	{"id":7365376252425115831,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/f5478032-ad29-4c72-ade5-a64c81700bcd
	{"id":"f5478032-ad29-4c72-ade5-a64c81700bcd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885307}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720f2dfca
	f5478032-ad29-4c72-ade5-a64c81700bcd

/tidb/cdc/default/default/upstream/7365376252425115831
	{"id":7365376252425115831,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_claim_check_avro.cli.21894.out cli tso query --pd=http://127.0.0.1:2379
+ set +x
[Sun May  5 13:01:50 CST 2024] <<<<<< START kafka consumer in simple case >>>>>>
succeed to verify meta placement rules
ERROR 1146 (42S02) at line 1: Table 'test.simple1' doesn't exist
check data failed 1-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table ddl_only_block_related_table.finish_mark not exists for 4-th check, retry later
table generate_column.t exists
table generate_column.t1 exists
check diff failed 1-th time, retry later
+ set +x
+ tso='449546894855110660
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546894855110660 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
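As the trace shows, `cli tso query` run through the coverage-instrumented cdc.test binary prints the TSO followed by PASS and coverage lines, so the harness keeps only the first whitespace-separated token. The same idea with the plain cdc binary (a sketch; variable names are illustrative):

# Query a start TSO from PD; keep only the first token in case extra lines follow.
raw=$(cdc cli tso query --pd=http://127.0.0.1:2379)
start_ts=$(echo $raw | awk -F ' ' '{print $1}')   # unquoted echo flattens the newlines
echo "start-ts=$start_ts"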
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_claim_check_avro.cli.21933.out cli changefeed create --start-ts=449546894855110660 '--sink-uri=kafka://127.0.0.1:9092/kafka-simple-claim-check-avro-12467?protocol=simple&encoding-format=avro' -c kafka-simple-claim-check-avro --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_claim_check_avro/conf/changefeed.toml
Create changefeed successfully!
ID: kafka-simple-claim-check-avro
Info: {"upstream_id":7365376252425115831,"namespace":"default","id":"kafka-simple-claim-check-avro","sink_uri":"kafka://127.0.0.1:9092/kafka-simple-claim-check-avro-12467?protocol=simple\u0026encoding-format=avro","create_time":"2024-05-05T13:01:52.462931649+08:00","start_ts":449546894855110660,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"simple","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"kafka_config":{"large_message_handle":{"large_message_handle_option":"claim-check","large_message_handle_compression":"snappy","claim_check_storage_uri":"file:///tmp/kafka-simple-avro-claim-check"}},"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546894855110660,"checkpoint_ts":449546894855110660,"checkpoint_time":"2024-05-05 13:01:50.574"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
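The kafka_simple_claim_check_avro changefeed above is created with --config pointing at the test's conf/changefeed.toml, and the kafka_config.large_message_handle block in the returned Info JSON shows what that file enables: claim-check handling for messages that exceed max-message-bytes, with the payload compressed and parked at a storage URI. A guessed TOML shape inferred from that JSON (the section path and field names are assumptions, not a copy of the actual file):

# Assumed shape of a claim-check changefeed config, written out the way a test might.
cat > changefeed.toml <<'EOF'
[sink.kafka-config.large-message-handle]
large-message-handle-option = "claim-check"
large-message-handle-compression = "snappy"
claim-check-storage-uri = "file:///tmp/kafka-simple-avro-claim-check"
EOF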
ERROR 1146 (42S02) at line 1: Table 'test.simple1' doesn't exist
check data failed 2-th time, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/ddl_attributes/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
table ddl_only_block_related_table.finish_mark exists
check diff successfully
+ set +x
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
TEST FAILED: OUTPUT DOES NOT CONTAIN 'id: 1'
____________________________________
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
check data failed 3-th time, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c84f3040005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:14993, start at 2024-05-05 13:01:51.684670983 +0800 CST m=+5.162097601	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:51.690 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:51.681 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:51.681 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c84f3040005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:14993, start at 2024-05-05 13:01:51.684670983 +0800 CST m=+5.162097601	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:51.690 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:51.681 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:51.681 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c84f3c40014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:15070, start at 2024-05-05 13:01:51.756686411 +0800 CST m=+5.184284515	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:03:51.762 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:01:51.729 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:51:51.729 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
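The VARIABLE_NAME/VARIABLE_VALUE dumps above are the cluster-startup check reading TiDB's bootstrap and GC bookkeeping rows, which live in the mysql.tidb table. The same view can be reproduced against a running cluster (host and port here assume the default TiDB listen address, not the test's ports):

# Inspect the GC bookkeeping rows the startup check prints.
mysql -h 127.0.0.1 -P 4000 -u root -e \
  "SELECT VARIABLE_NAME, VARIABLE_VALUE, COMMENT FROM mysql.tidb WHERE VARIABLE_NAME LIKE 'tikv_gc%'"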
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/savepoint/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/savepoint/tiflash/log/error.log
arg matches is ArgMatches { args: {"config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/savepoint/tiflash-proxy.toml"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/savepoint/tiflash/db/proxy"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/savepoint/tiflash/log/proxy.log"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:01:55 CST 2024] <<<<<< run test case generate_column success! >>>>>>
wait process 8602 exit for 1-th time...
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils/kill_cdc_pid: line 19: kill: (8602) - No such process
wait process 8602 exit for 2-th time...
process 8602 already exit
[Sun May  5 13:01:55 CST 2024] <<<<<< START cdc server in ddl_only_block_related_table case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/ExecuteNotDone=return(true)'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_only_block_related_table.87698771.out server --log-file /tmp/tidb_cdc_test/ddl_only_block_related_table/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/ddl_only_block_related_table/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
check data successfully
wait process cdc.test exit for 1-th time...
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.savepoint.cli.16470.out cli tso query --pd=http://127.0.0.1:2379
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:01:58 CST 2024] <<<<<< run test case simple success! >>>>>>
+ set +x
+ tso='449546896574251009
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546896574251009 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 13:01:58 CST 2024] <<<<<< START cdc server in savepoint case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.savepoint.1650416506.out server --log-file /tmp/tidb_cdc_test/savepoint/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/savepoint/cdc_data --cluster-id default
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:01:58 GMT
< Content-Type: text/plain; charset=utf-8
< Transfer-Encoding: chunked
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:

changefeedID: default/ddl-only-block-related-table
{UpstreamID:7365376232154429504 Namespace:default ID:ddl-only-block-related-table SinkURI:kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 13:01:43.516485982 +0800 CST StartTs:449546892969508868 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc003ca1e60 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546892995985413}
{CheckpointTs:449546896849240071 MinTableBarrierTs:449546896849240071 AdminJobType:noop}
span: {table_id:112,start_key:7480000000000000ff705f720000000000fa,end_key:7480000000000000ff705f730000000000fa}, resolvedTs: 449546896849240071, checkpointTs: 449546896849240071, state: Replicating
span: {table_id:106,start_key:7480000000000000ff6a5f720000000000fa,end_key:7480000000000000ff6a5f730000000000fa}, resolvedTs: 449546896849240071, checkpointTs: 449546896849240071, state: Replicating
span: {table_id:108,start_key:7480000000000000ff6c5f720000000000fa,end_key:7480000000000000ff6c5f730000000000fa}, resolvedTs: 449546896849240071, checkpointTs: 449546896849240071, state: Replicating
span: {table_id:110,start_key:7480000000000000ff6e5f720000000000fa,end_key:7480000000000000ff6e5f730000000000fa}, resolvedTs: 449546896849240071, checkpointTs: 449546896849240071, state: Replicating



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/307fbb5f-71d0-4d4f-bcef-586a583f8c13
	{"id":"307fbb5f-71d0-4d4f-bcef-586a583f8c13","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885316}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720d8889f
	307fbb5f-71d0-4d4f-bcef-586a583f8c13

/tidb/cdc/default/default/changefeed/info/ddl-only-block-related-table
	{"upstream-id":7365376232154429504,"namespace":"default","changefeed-id":"ddl-only-block-related-table","sink-uri":"kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T13:01:43.516485982+08:00","start-ts":449546892969508868,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546892995985413}

/tidb/cdc/default/default/changefeed/status/ddl-only-block-related-table
	{"checkpoint-ts":449546896849240071,"min-table-barrier-ts":449546896849240071,"admin-job-type":0}

/tidb/cdc/default/default/task/position/307fbb5f-71d0-4d4f-bcef-586a583f8c13/ddl-only-block-related-table
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365376232154429504
	{"id":7365376232154429504,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:

changefeedID: default/ddl-only-block-related-table
{UpstreamID:7365376232154429504 Namespace:default ID:ddl-only-block-related-table SinkURI:kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 13:01:43.516485982 +0800 CST StartTs:449546892969508868 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc003ca1e60 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546892995985413}
{CheckpointTs:449546896849240071 MinTableBarrierTs:449546896849240071 AdminJobType:noop}
span: {table_id:112,start_key:7480000000000000ff705f720000000000fa,end_key:7480000000000000ff705f730000000000fa}, resolvedTs: 449546896849240071, checkpointTs: 449546896849240071, state: Replicating
span: {table_id:106,start_key:7480000000000000ff6a5f720000000000fa,end_key:7480000000000000ff6a5f730000000000fa}, resolvedTs: 449546896849240071, checkpointTs: 449546896849240071, state: Replicating
span: {table_id:108,start_key:7480000000000000ff6c5f720000000000fa,end_key:7480000000000000ff6c5f730000000000fa}, resolvedTs: 449546896849240071, checkpointTs: 449546896849240071, state: Replicating
span: {table_id:110,start_key:7480000000000000ff6e5f720000000000fa,end_key:7480000000000000ff6e5f730000000000fa}, resolvedTs: 449546896849240071, checkpointTs: 449546896849240071, state: Replicating



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/307fbb5f-71d0-4d4f-bcef-586a583f8c13
	{"id":"307fbb5f-71d0-4d4f-bcef-586a583f8c13","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885316}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720d8889f
	307fbb5f-71d0-4d4f-bcef-586a583f8c13

/tidb/cdc/default/default/changefeed/info/ddl-only-block-related-table
	{"upstream-id":7365376232154429504,"namespace":"default","changefeed-id":"ddl-only-block-related-table","sink-uri":"kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T13:01:43.516485982+08:00","start-ts":449546892969508868,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546892995985413}

/tidb/cdc/default/default/changefeed/status/ddl-only-block-related-table
	{"checkpoint-ts":449546896849240071,"min-table-barrier-ts":449546896849240071,"admin-job-type":0}

/tidb/cdc/default/default/task/position/307fbb5f-71d0-4d4f-bcef-586a583f8c13/ddl-only-block-related-table
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365376232154429504
	{"id":7365376232154429504,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:

changefeedID: default/ddl-only-block-related-table
{UpstreamID:7365376232154429504 Namespace:default ID:ddl-only-block-related-table SinkURI:kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 13:01:43.516485982 +0800 CST StartTs:449546892969508868 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc003ca1e60 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546892995985413}
{CheckpointTs:449546896849240071 MinTableBarrierTs:449546896849240071 AdminJobType:noop}
span: {table_id:112,start_key:7480000000000000ff705f720000000000fa,end_key:7480000000000000ff705f730000000000fa}, resolvedTs: 449546896849240071, checkpointTs: 449546896849240071, state: Replicating
span: {table_id:106,start_key:7480000000000000ff6a5f720000000000fa,end_key:7480000000000000ff6a5f730000000000fa}, resolvedTs: 449546896849240071, checkpointTs: 449546896849240071, state: Replicating
span: {table_id:108,start_key:7480000000000000ff6c5f720000000000fa,end_key:7480000000000000ff6c5f730000000000fa}, resolvedTs: 449546896849240071, checkpointTs: 449546896849240071, state: Replicating
span: {table_id:110,start_key:7480000000000000ff6e5f720000000000fa,end_key:7480000000000000ff6e5f730000000000fa}, resolvedTs: 449546896849240071, checkpointTs: 449546896849240071, state: Replicating



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/307fbb5f-71d0-4d4f-bcef-586a583f8c13
	{"id":"307fbb5f-71d0-4d4f-bcef-586a583f8c13","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885316}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720d8889f
	307fbb5f-71d0-4d4f-bcef-586a583f8c13

/tidb/cdc/default/default/changefeed/info/ddl-only-block-related-table
	{"upstream-id":7365376232154429504,"namespace":"default","changefeed-id":"ddl-only-block-related-table","sink-uri":"kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T13:01:43.516485982+08:00","start-ts":449546892969508868,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546892995985413}

/tidb/cdc/default/default/changefeed/status/ddl-only-block-related-table
	{"checkpoint-ts":449546896849240071,"min-table-barrier-ts":449546896849240071,"admin-job-type":0}

/tidb/cdc/default/default/task/position/307fbb5f-71d0-4d4f-bcef-586a583f8c13/ddl-only-block-related-table
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365376232154429504
	{"id":7365376232154429504,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
check_ts_not_forward ddl-only-block-related-table
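check_ts_not_forward is the assertion this case is really about: with the ExecuteNotDone failpoint injected at startup, the ddl-only-block-related-table changefeed's checkpoint must not advance while the blocked DDL is pending. The helper lives under tests/integration_tests/_utils; the following is a rough approximation of the idea only (the JSON field name is an assumption, not the helper's actual parsing):

# Sketch: sample the changefeed checkpoint twice and fail if it moved.
ts1=$(cdc cli changefeed query -c ddl-only-block-related-table | jq -r '.checkpoint_tso // .checkpoint_ts')
sleep 10
ts2=$(cdc cli changefeed query -c ddl-only-block-related-table | jq -r '.checkpoint_tso // .checkpoint_ts')
[ "$ts1" = "$ts2" ] || { echo "checkpoint advanced: $ts1 -> $ts2"; exit 1; }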
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_claim_check_avro.cli.21981.out cli changefeed pause -c kafka-simple-claim-check-avro
PASS
coverage: 2.0% of statements in github.com/pingcap/tiflow/...
start tidb cluster in /tmp/tidb_cdc_test/ddl_attributes
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_claim_check_avro.cli.22029.out cli changefeed update -c kafka-simple-claim-check-avro '--sink-uri=kafka://127.0.0.1:9092/kafka-simple-claim-check-avro-12467?protocol=simple&encoding-format=avro&max-message-bytes=2048' --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_claim_check_avro/conf/changefeed.toml --no-confirm
Diff of changefeed config:
{Type:update Path:[SinkURI] From:kafka://127.0.0.1:9092/kafka-simple-claim-check-avro-12467?protocol=simple&encoding-format=avro To:kafka://127.0.0.1:9092/kafka-simple-claim-check-avro-12467?protocol=simple&encoding-format=avro&max-message-bytes=2048}
{Type:update Path:[Config SyncPointInterval] From:<nil> To:0xc001bd35f8}
{Type:update Path:[Config SyncPointRetention] From:<nil> To:0xc001bd3648}
{Type:update Path:[Config Consistent] From:<nil> To:0xc001326460}
Update changefeed config successfully! 
ID: kafka-simple-claim-check-avro
Info: {"upstream_id":7365376252425115831,"namespace":"default","id":"kafka-simple-claim-check-avro","sink_uri":"kafka://127.0.0.1:9092/kafka-simple-claim-check-avro-12467?protocol=simple\u0026encoding-format=avro\u0026max-message-bytes=2048","create_time":"2024-05-05T13:01:52.462931649+08:00","start_ts":449546894855110660,"admin_job_type":1,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_table_monitor":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"simple","encoder_concurrency":32,"terminator":"\r\n","enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"kafka_config":{"large_message_handle":{"large_message_handle_option":"claim-check","large_message_handle_compression":"snappy","claim_check_storage_uri":"file:///tmp/kafka-simple-avro-claim-check"}},"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"stopped","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":0,"checkpoint_ts":449546897044013066,"checkpoint_time":"2024-05-05 13:01:58.924"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:02:01 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/43fced82-0f5a-40cf-a329-ab607d00b686
	{"id":"43fced82-0f5a-40cf-a329-ab607d00b686","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885318}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472110c9d4
	43fced82-0f5a-40cf-a329-ab607d00b686

/tidb/cdc/default/default/upstream/7365376294830310968
	{"id":7365376294830310968,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/43fced82-0f5a-40cf-a329-ab607d00b686
	{"id":"43fced82-0f5a-40cf-a329-ab607d00b686","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885318}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472110c9d4
	43fced82-0f5a-40cf-a329-ab607d00b686

/tidb/cdc/default/default/upstream/7365376294830310968
	{"id":7365376294830310968,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/43fced82-0f5a-40cf-a329-ab607d00b686
	{"id":"43fced82-0f5a-40cf-a329-ab607d00b686","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885318}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472110c9d4
	43fced82-0f5a-40cf-a329-ab607d00b686

/tidb/cdc/default/default/upstream/7365376294830310968
	{"id":7365376294830310968,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.savepoint.cli.16565.out cli changefeed create --start-ts=449546896574251009 '--sink-uri=kafka://127.0.0.1:9092/ticdc-savepoint-test-1666?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760'
Create changefeed successfully!
ID: e56a8402-df54-4107-a4ec-8a6e7e87d26a
Info: {"upstream_id":7365376294830310968,"namespace":"default","id":"e56a8402-df54-4107-a4ec-8a6e7e87d26a","sink_uri":"kafka://127.0.0.1:9092/ticdc-savepoint-test-1666?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:02:02.134059475+08:00","start_ts":449546896574251009,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546896574251009,"checkpoint_ts":449546896574251009,"checkpoint_time":"2024-05-05 13:01:57.132"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.kafka_simple_claim_check_avro.cli.22068.out cli changefeed resume -c kafka-simple-claim-check-avro
+ set +x
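The pause / update / resume trio for kafka-simple-claim-check-avro above follows the usual procedure for changing a sink parameter: the changefeed has to be stopped before `changefeed update` will accept a new sink-uri (here max-message-bytes=2048 is added so that ordinary rows overflow the limit and exercise the claim-check path), and it is then resumed. In outline (<topic> stands in for the test's generated topic name):

# Pause, change the sink-uri, then resume; update is only allowed on a stopped changefeed.
cdc cli changefeed pause  -c kafka-simple-claim-check-avro
cdc cli changefeed update -c kafka-simple-claim-check-avro --no-confirm \
  --sink-uri="kafka://127.0.0.1:9092/<topic>?protocol=simple&encoding-format=avro&max-message-bytes=2048"
cdc cli changefeed resume -c kafka-simple-claim-check-avro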
[Sun May  5 13:02:03 CST 2024] <<<<<< START kafka consumer in savepoint case >>>>>>
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
PASS
coverage: 2.1% of statements in github.com/pingcap/tiflow/...
run task failed 1-th time, retry later
[2024/05/05 13:02:03.099 +08:00] [WARN] [diff.go:182] ["table struct is not equal"] [reason="column num not equal, one is 5 another is 4"]
table savepoint.finish_mark not exists for 1-th check, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
table test.finish_mark not exists for 1-th check, retry later
check_ts_not_forward ddl-only-block-related-table
table savepoint.finish_mark exists
check diff successfully
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 2-th check, retry later
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:02:08 CST 2024] <<<<<< run test case savepoint success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark exists
check diff successfully
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/cdc_server_tips/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:02:11 CST 2024] <<<<<< run test case kafka_simple_claim_check_avro success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
start tidb cluster in /tmp/tidb_cdc_test/cdc_server_tips
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8617480012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s, pid:24187, start at 2024-05-05 13:02:10.411356216 +0800 CST m=+5.184103457	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:10.418 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:10.386 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:10.386 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8617480012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s, pid:24187, start at 2024-05-05 13:02:10.411356216 +0800 CST m=+5.184103457	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:10.418 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:10.386 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:10.386 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8649f00002	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-11vs6-jv95s, pid:24275, start at 2024-05-05 13:02:13.629145095 +0800 CST m=+8.338966732	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:13.636 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:13.628 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:13.628 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/ddl_attributes/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/ddl_attributes/tiflash/log/error.log
arg matches is ArgMatches { args: {"config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/ddl_attributes/tiflash-proxy.toml"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/ddl_attributes/tiflash/log/proxy.log"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/ddl_attributes/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_attributes.cli.25737.out cli tso query --pd=http://127.0.0.1:2379
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   221  100   221    0     0   2586      0 --:--:-- --:--:-- --:--:--  2600
+ synced_status='{"synced":true,"sink_checkpoint_ts":"2024-05-05 13:02:05.985","puller_resolved_ts":"2024-05-05 13:01:59.986","last_synced_ts":"2024-05-05 12:59:50.136","now_ts":"2024-05-05 13:02:07.000","info":"Data syncing is finished"}'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '13:02:05.985","puller_resolved_ts":"2024-05-05' '13:01:59.986","last_synced_ts":"2024-05-05' '12:59:50.136","now_ts":"2024-05-05' '13:02:07.000","info":"Data' syncing is 'finished"}'
++ jq .synced
+ status=true
+ '[' true '!=' true ']'
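For reference, the synced-status check traced above reduces to a few shell lines (a minimal sketch using the endpoint and changefeed ID shown in this run):

    # Query the changefeed "synced" state over the TiCDC open API and assert it is true.
    synced_status=$(curl -s -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced)
    status=$(echo "$synced_status" | jq -r .synced)
    if [ "$status" != "true" ]; then
        echo "changefeed not synced yet: $synced_status"
        exit 1
    fi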
+ kill_pd
++ ps aux
++ grep pd-server
++ grep /tmp/tidb_cdc_test/synced_status_with_redo
+ info='jenkins     9965  8.0  0.0 13768876 144936 ?     Sl   12:59   0:13 pd-server --advertise-client-urls http://127.0.0.1:2379 --client-urls http://0.0.0.0:2379 --advertise-peer-urls http://127.0.0.1:2380 --peer-urls http://0.0.0.0:2380 --config /tmp/tidb_cdc_test/synced_status_with_redo/pd-config.toml --log-file /tmp/tidb_cdc_test/synced_status_with_redo/pd1.log --data-dir /tmp/tidb_cdc_test/synced_status_with_redo/pd1 --name=pd1 --initial-cluster=pd1=http://127.0.0.1:2380
jenkins    10032  5.3  0.0 14030316 140032 ?     Sl   12:59   0:08 pd-server --advertise-client-urls http://127.0.0.1:2479 --client-urls http://0.0.0.0:2479 --advertise-peer-urls http://127.0.0.1:2480 --peer-urls http://0.0.0.0:2480 --config /tmp/tidb_cdc_test/synced_status_with_redo/pd-config.toml --log-file /tmp/tidb_cdc_test/synced_status_with_redo/down_pd.log --data-dir /tmp/tidb_cdc_test/synced_status_with_redo/down_pd'
++ ps aux
++ grep pd-server
++ grep /tmp/tidb_cdc_test/synced_status_with_redo
++ awk '{print $2}'
++ xargs kill -9
+ sleep 20
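The kill_pd step expanded above amounts to this pipeline (a sketch of exactly what the trace shows, using this test's workdir):

    # Kill every pd-server process belonging to this test's workdir,
    # then give the changefeed time to observe the PD outage.
    ps aux | grep pd-server | grep /tmp/tidb_cdc_test/synced_status_with_redo \
        | awk '{print $2}' | xargs kill -9
    sleep 20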
{"level":"warn","ts":1714885333.0401213,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc003e58e00/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":1714885333.0402014,"caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":1714885333.1186633,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000c8dc00/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
{"level":"info","ts":1714885333.1187088,"caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":1714885333.9922001,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc001768a80/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":1714885333.9922507,"caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":"2024-05-05T13:02:17.861152+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0011b9a40/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:17.861311+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a43340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:17.913506+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a5f180/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
+ set +x
+ tso='449546902027894786
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546902027894786 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
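The tso query step keeps only the first whitespace-separated field of the CLI output, discarding the PASS/coverage lines appended by the coverage-instrumented test binary; roughly:

    # Query a fresh TSO through the cdc CLI and strip the trailing test noise,
    # leaving just the numeric timestamp to use as the changefeed start-ts.
    tso=$(cdc.test cli tso query --pd=http://127.0.0.1:2379)
    start_ts=$(echo $tso | awk -F ' ' '{print $1}')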
[Sun May  5 13:02:19 CST 2024] <<<<<< START cdc server in ddl_attributes case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_attributes.2578025782.out server --log-file /tmp/tidb_cdc_test/ddl_attributes/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/ddl_attributes/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
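The loop that follows the server start is a readiness poll: it curls /debug/info with basic auth, treats 'etcd info' in the response as success, and retries up to 50 times. A condensed sketch (simplified; the harness also greps for the 'failed to get info:' error string):

    # Poll the TiCDC debug endpoint until the server dumps its etcd metadata,
    # giving up after 50 rounds (about 150 seconds with the 3s sleep).
    for i in $(seq 1 50); do
        res=$(curl -sL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
        echo "$res" | grep -q 'etcd info' && break
        sleep 3
    done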
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/run.sh using Sink-Type: kafka... <<=================
+++ dirname /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/run.sh
++ cd /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status
++ pwd
+ CUR=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status
+ source /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/../_utils/test_prepare
++ UP_TIDB_HOST=127.0.0.1
++ UP_TIDB_PORT=4000
++ UP_TIDB_OTHER_PORT=4001
++ UP_TIDB_STATUS=10080
++ UP_TIDB_OTHER_STATUS=10081
++ DOWN_TIDB_HOST=127.0.0.1
++ DOWN_TIDB_PORT=3306
++ DOWN_TIDB_STATUS=20080
++ TLS_TIDB_HOST=127.0.0.1
++ TLS_TIDB_PORT=3307
++ TLS_TIDB_STATUS=30080
++ UP_PD_HOST_1=127.0.0.1
++ UP_PD_PORT_1=2379
++ UP_PD_PEER_PORT_1=2380
++ UP_PD_HOST_2=127.0.0.1
++ UP_PD_PORT_2=2679
++ UP_PD_PEER_PORT_2=2680
++ UP_PD_HOST_3=127.0.0.1
++ UP_PD_PORT_3=2779
++ UP_PD_PEER_PORT_3=2780
++ DOWN_PD_HOST=127.0.0.1
++ DOWN_PD_PORT=2479
++ DOWN_PD_PEER_PORT=2480
++ TLS_PD_HOST=127.0.0.1
++ TLS_PD_PORT=2579
++ TLS_PD_PEER_PORT=2580
++ UP_TIKV_HOST_1=127.0.0.1
++ UP_TIKV_PORT_1=20160
++ UP_TIKV_STATUS_PORT_1=20181
++ UP_TIKV_HOST_2=127.0.0.1
++ UP_TIKV_PORT_2=20161
++ UP_TIKV_STATUS_PORT_2=20182
++ UP_TIKV_HOST_3=127.0.0.1
++ UP_TIKV_PORT_3=20162
++ UP_TIKV_STATUS_PORT_3=20183
++ DOWN_TIKV_HOST=127.0.0.1
++ DOWN_TIKV_PORT=21160
++ DOWN_TIKV_STATUS_PORT=21180
++ TLS_TIKV_HOST=127.0.0.1
++ TLS_TIKV_PORT=22160
++ TLS_TIKV_STATUS_PORT=22180
+++ cat /tmp/tidb_cdc_test/KAFKA_VERSION
+++ echo 2.4.1
++ KAFKA_VERSION=2.4.1
+ WORK_DIR=/tmp/tidb_cdc_test/synced_status
+ CDC_BINARY=cdc.test
+ SINK_TYPE=kafka
+ CDC_COUNT=3
+ DB_COUNT=4
+ trap stop_tidb_cluster EXIT
+ run_normal_case_and_unavailable_pd conf/changefeed.toml
+ rm -rf /tmp/tidb_cdc_test/synced_status
+ mkdir -p /tmp/tidb_cdc_test/synced_status
+ start_tidb_cluster --workdir /tmp/tidb_cdc_test/synced_status
Attempt 1 at starting the tidb cluster...
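In outline, the synced_status test body above prepares a clean work directory, registers cleanup, and brings up a fresh cluster before running the case (a sketch of the traced preamble):

    # Standard integration-test preamble: clean workdir, cleanup trap, fresh cluster.
    WORK_DIR=/tmp/tidb_cdc_test/synced_status
    trap stop_tidb_cluster EXIT
    rm -rf "$WORK_DIR"
    mkdir -p "$WORK_DIR"
    start_tidb_cluster --workdir "$WORK_DIR"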
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:02:22 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ae7fff32-13b1-4e66-ba79-b0fcb22e27be
	{"id":"ae7fff32-13b1-4e66-ba79-b0fcb22e27be","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885339}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47215fb6dd
	ae7fff32-13b1-4e66-ba79-b0fcb22e27be

/tidb/cdc/default/default/upstream/7365376375296110233
	{"id":7365376375296110233,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ae7fff32-13b1-4e66-ba79-b0fcb22e27be
	{"id":"ae7fff32-13b1-4e66-ba79-b0fcb22e27be","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885339}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47215fb6dd
	ae7fff32-13b1-4e66-ba79-b0fcb22e27be

/tidb/cdc/default/default/upstream/7365376375296110233
	{"id":7365376375296110233,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ae7fff32-13b1-4e66-ba79-b0fcb22e27be
	{"id":"ae7fff32-13b1-4e66-ba79-b0fcb22e27be","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885339}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47215fb6dd
	ae7fff32-13b1-4e66-ba79-b0fcb22e27be

/tidb/cdc/default/default/upstream/7365376375296110233
	{"id":7365376375296110233,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_attributes.cli.25835.out cli changefeed create --start-ts=449546902027894786 '--sink-uri=kafka://127.0.0.1:9092/ticdc-ddl-attributes-test-25534?protocol=open-protocol&partition-num=4&kafka-version=2.4.1'
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Create changefeed successfully!
ID: bd60529b-73ef-42f2-a746-c4cb05d721b1
Info: {"upstream_id":7365376375296110233,"namespace":"default","id":"bd60529b-73ef-42f2-a746-c4cb05d721b1","sink_uri":"kafka://127.0.0.1:9092/ticdc-ddl-attributes-test-25534?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1","create_time":"2024-05-05T13:02:22.924912215+08:00","start_ts":449546902027894786,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546902027894786,"checkpoint_ts":449546902027894786,"checkpoint_time":"2024-05-05 13:02:17.936"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
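Putting the earlier steps together, the changefeed creation above follows this pattern (a sketch; the topic name and start-ts are the values printed in this run, and the coverage flag is omitted):

    # Create the Kafka changefeed from the TSO captured earlier; protocol, partition count
    # and broker version are carried as query parameters on the sink URI.
    SINK_URI="kafka://127.0.0.1:9092/ticdc-ddl-attributes-test-25534?protocol=open-protocol&partition-num=4&kafka-version=2.4.1"
    cdc.test cli changefeed create --start-ts=449546902027894786 "--sink-uri=${SINK_URI}"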
[2024/05/05 13:02:21.791 +08:00] [INFO] [dailytest.go:68] ["test pass!!!"]
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/kafka_simple_claim_check_avro/run.sh: line 1: 22100 Killed                  cdc_kafka_consumer --upstream-uri $SINK_URI --downstream-uri="mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" --upstream-tidb-dsn="root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" --config="$CUR/conf/changefeed.toml" 2>&1
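For context, the consumer killed above is launched by the harness along these lines (a sketch based on the command echoed in the kill message; SINK_URI, CUR and the TiDB host/port variables are the harness's own, shown here as placeholders):

    # Bridge the Kafka topic back into the downstream TiDB so upstream and downstream can be diffed.
    cdc_kafka_consumer --upstream-uri "$SINK_URI" \
        --downstream-uri "mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" \
        --upstream-tidb-dsn "root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" \
        --config "$CUR/conf/changefeed.toml" 2>&1 &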
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/canal_json_adapter_compatibility/run.sh using Sink-Type: kafka... <<=================
Attempt 1 at starting the tidb cluster...
wait process cdc.test exit for 3-th time...
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 13:02:24 CST 2024] <<<<<< run test case cdc success! >>>>>>
+ set +x
[Sun May  5 13:02:24 CST 2024] <<<<<< START kafka consumer in ddl_attributes case >>>>>>
{"level":"warn","ts":"2024-05-05T13:02:23.862211+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0011b9a40/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:23.862411+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a43340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:23.914427+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a5f180/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c86e1c40013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:24696, start at 2024-05-05 13:02:23.380667078 +0800 CST m=+5.031005091	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:23.386 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:23.345 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:23.345 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c86e1c40013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:24696, start at 2024-05-05 13:02:23.380667078 +0800 CST m=+5.031005091	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:23.386 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:23.345 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:23.345 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c86e4d00004	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:24771, start at 2024-05-05 13:02:23.5434074 +0800 CST m=+5.132696206	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:23.549 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:23.540 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:23.540 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/cdc_server_tips/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/cdc_server_tips/tiflash/log/error.log
arg matches is ArgMatches { args: {"addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/cdc_server_tips/tiflash-proxy.toml"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/cdc_server_tips/tiflash/db/proxy"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/cdc_server_tips/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
start tidb cluster in /tmp/tidb_cdc_test/canal_json_adapter_compatibility
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
start tidb cluster in /tmp/tidb_cdc_test/synced_status
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc_server_tips.cli.26169.out cli tso query --pd=http://127.0.0.1:2379
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
+ set +x
+ tso='449546904402657281
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546904402657281 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
try a VALID cdc server command
[Sun May  5 13:02:28 CST 2024] <<<<<< START cdc server in cdc_server_tips case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc_server_tips.2621626218.out server --log-file /tmp/tidb_cdc_test/cdc_server_tips/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/cdc_server_tips/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:01 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:02 --:--:--     0
{"level":"warn","ts":"2024-05-05T13:02:29.863316+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a43340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:29.864191+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0011b9a40/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:29.915732+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a5f180/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:02:32 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/fb2dc3f5-9631-41b0-9d8a-25d524bba5cf
	{"id":"fb2dc3f5-9631-41b0-9d8a-25d524bba5cf","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885349}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47218f05df
	fb2dc3f5-9631-41b0-9d8a-25d524bba5cf

/tidb/cdc/default/default/upstream/7365376430278356149
	{"id":7365376430278356149,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/fb2dc3f5-9631-41b0-9d8a-25d524bba5cf
	{"id":"fb2dc3f5-9631-41b0-9d8a-25d524bba5cf","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885349}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47218f05df
	fb2dc3f5-9631-41b0-9d8a-25d524bba5cf

/tidb/cdc/default/default/upstream/7365376430278356149
	{"id":7365376430278356149,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/fb2dc3f5-9631-41b0-9d8a-25d524bba5cf
	{"id":"fb2dc3f5-9631-41b0-9d8a-25d524bba5cf","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885349}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47218f05df
	fb2dc3f5-9631-41b0-9d8a-25d524bba5cf

/tidb/cdc/default/default/upstream/7365376430278356149
	{"id":7365376430278356149,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
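The etcd dump above is what the poll greps for; if a later step needed the capture ID rather than a yes/no answer, a hypothetical helper could pull it out of the same output (not part of the harness, which only matches the literal string 'etcd info'):

    # Hypothetical: extract the capture ID from the /debug/info dump held in $res.
    capture_id=$(echo "$res" | grep -o '/tidb/cdc/default/__cdc_meta__/capture/[0-9a-f-]*' | head -n1 | awk -F/ '{print $NF}')
    echo "capture id: $capture_id"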
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
1:02PM INF > Run case=sql/debezium/binary_column_test.sql
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)

  0     0    0     0    0     0      0      0 --:--:--  0:00:03 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:04 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:05 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:06 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:07 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:08 --:--:--     0
{"level":"warn","ts":"2024-05-05T13:02:35.863825+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a43340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:35.86467+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0011b9a40/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:35.917722+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a5f180/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/split_region/run.sh using Sink-Type: kafka... <<=================
Attempt 1 at starting the tidb cluster...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)

  0     0    0     0    0     0      0      0 --:--:--  0:00:09 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:10 --:--:--     0
{"level":"warn","ts":"2024-05-05T13:02:37.851375+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a43340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":"2024-05-05T13:02:37.851433+0800","logger":"etcd-client","caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":"2024-05-05T13:02:37.853381+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0011b9a40/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":"2024-05-05T13:02:37.853416+0800","logger":"etcd-client","caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":"2024-05-05T13:02:37.906467+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a5f180/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
{"level":"info","ts":"2024-05-05T13:02:37.90651+0800","logger":"etcd-client","caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
1:02PM INF > Run case=sql/debezium/binary_mode_test.sql
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c87b618000b	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:23184, start at 2024-05-05 13:02:36.944741204 +0800 CST m=+5.127793925	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:36.953 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:36.934 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:36.934 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c87b618000b	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:23184, start at 2024-05-05 13:02:36.944741204 +0800 CST m=+5.127793925	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:36.953 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:36.934 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:36.934 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c87b5fc0015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:23264, start at 2024-05-05 13:02:36.964922985 +0800 CST m=+5.103025874	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:36.971 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:36.977 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:36.977 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/canal_json_adapter_compatibility/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/canal_json_adapter_compatibility/tiflash/log/error.log
arg matches is ArgMatches { args: {"log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/canal_json_adapter_compatibility/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/canal_json_adapter_compatibility/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/canal_json_adapter_compatibility/tiflash/db/proxy"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
start tidb cluster in /tmp/tidb_cdc_test/split_region
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c87c0f00014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:17757, start at 2024-05-05 13:02:37.663142362 +0800 CST m=+5.092237893	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:37.669 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:37.628 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:37.628 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c87c0f00014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:17757, start at 2024-05-05 13:02:37.663142362 +0800 CST m=+5.092237893	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:37.669 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:37.628 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:37.628 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c87c1a80013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:17837, start at 2024-05-05 13:02:37.699699997 +0800 CST m=+5.077540824	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:37.707 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:37.674 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:37.674 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/synced_status/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/synced_status/tiflash/log/error.log
arg matches is ArgMatches { args: {"data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash/db/proxy"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash/log/proxy.log"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_adapter_compatibility.cli.24696.out cli tso query --pd=http://127.0.0.1:2379
+ cd /tmp/tidb_cdc_test/synced_status
++ run_cdc_cli_tso_query 127.0.0.1 2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.cli.19222.out cli tso query --pd=http://127.0.0.1:2379

  0     0    0     0    0     0      0      0 --:--:--  0:00:11 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:12 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:13 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:14 --:--:--     0
{"level":"warn","ts":"2024-05-05T13:02:41.865234+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a43340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:41.865504+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0011b9a40/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
{"level":"warn","ts":"2024-05-05T13:02:41.918362+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a5f180/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
+ set +x
+ tso='449546907925610497
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546907925610497 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 13:02:41 CST 2024] <<<<<< START cdc server in canal_json_adapter_compatibility case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_adapter_compatibility.2473724739.out server --log-file /tmp/tidb_cdc_test/canal_json_adapter_compatibility/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/canal_json_adapter_compatibility/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
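The trace above is the harness polling for cdc server readiness: it curls the /debug/info endpoint (basic auth ticdc:ticdc_secret) up to 50 times, three seconds apart, and moves on once the response contains "etcd info". A minimal bash sketch of that loop, with the names and limits taken from the trace (the real helper lives in the tiflow integration-test scripts):

    # Sketch of the readiness loop traced above; helper shape is assumed.
    get_info_fail_msg='failed to get info:'
    etcd_info_msg='etcd info'
    curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret'
    for ((i = 0; i <= 50; i++)); do
        res=$($curl_status_cmd)
        # A body mentioning "etcd info" means the server is up and serving its metadata.
        if echo "$res" | grep -q "$etcd_info_msg" && ! echo "$res" | grep -q "$get_info_fail_msg"; then
            break
        fi
        if [ "$i" -eq 50 ]; then
            echo "failed to start cdc server"
            exit 1
        fi
        sleep 3
    done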
table ddl_attributes.attributes_t1_new exists
table ddl_attributes.finish_mark not exists for 1-th check, retry later
table ddl_attributes.finish_mark not exists for 2-th check, retry later
+ set +x
+ tso='449546908120645634
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546908120645634 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ start_ts=449546908120645634
+ run_cdc_server --workdir /tmp/tidb_cdc_test/synced_status --binary cdc.test
[Sun May  5 13:02:42 CST 2024] <<<<<< START cdc server in synced_status case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.1926419266.out server --log-file /tmp/tidb_cdc_test/synced_status/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/synced_status/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
1:02PM INF > Run case=sql/debezium/connector_test.sql
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table ddl_attributes.finish_mark not exists for 3-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:02:44 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/dc11fc76-7752-4ba4-ae43-4b1ec96913c5
	{"id":"dc11fc76-7752-4ba4-ae43-4b1ec96913c5","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885362}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4721c1c1cc
	dc11fc76-7752-4ba4-ae43-4b1ec96913c5

/tidb/cdc/default/default/upstream/7365376488899950380
	{"id":7365376488899950380,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/dc11fc76-7752-4ba4-ae43-4b1ec96913c5
	{"id":"dc11fc76-7752-4ba4-ae43-4b1ec96913c5","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885362}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4721c1c1cc
	dc11fc76-7752-4ba4-ae43-4b1ec96913c5

/tidb/cdc/default/default/upstream/7365376488899950380
	{"id":7365376488899950380,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/dc11fc76-7752-4ba4-ae43-4b1ec96913c5
	{"id":"dc11fc76-7752-4ba4-ae43-4b1ec96913c5","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885362}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4721c1c1cc
	dc11fc76-7752-4ba4-ae43-4b1ec96913c5

/tidb/cdc/default/default/upstream/7365376488899950380
	{"id":7365376488899950380,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_adapter_compatibility.cli.24790.out cli changefeed create --start-ts=449546907925610497 '--sink-uri=kafka://127.0.0.1:9092/test?protocol=canal-json&kafka-version=2.4.1&max-message-bytes=10485760'
Create changefeed successfully!
ID: dc040f59-e7d2-4882-b76e-b659474f5b9f
Info: {"upstream_id":7365376488899950380,"namespace":"default","id":"dc040f59-e7d2-4882-b76e-b659474f5b9f","sink_uri":"kafka://127.0.0.1:9092/test?protocol=canal-json\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:02:45.387066668+08:00","start_ts":449546907925610497,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546907925610497,"checkpoint_ts":449546907925610497,"checkpoint_time":"2024-05-05 13:02:40.434"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:02:45 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/fcff3686-93d4-4231-8aa2-0a51fb6ac4f4
	{"id":"fcff3686-93d4-4231-8aa2-0a51fb6ac4f4","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885363}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4721c4abd0
	fcff3686-93d4-4231-8aa2-0a51fb6ac4f4

/tidb/cdc/default/default/upstream/7365376493909101028
	{"id":7365376493909101028,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/fcff3686-93d4-4231-8aa2-0a51fb6ac4f4
	{"id":"fcff3686-93d4-4231-8aa2-0a51fb6ac4f4","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885363}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4721c4abd0
	fcff3686-93d4-4231-8aa2-0a51fb6ac4f4

/tidb/cdc/default/default/upstream/7365376493909101028
	{"id":7365376493909101028,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/fcff3686-93d4-4231-8aa2-0a51fb6ac4f4
	{"id":"fcff3686-93d4-4231-8aa2-0a51fb6ac4f4","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885363}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4721c4abd0
	fcff3686-93d4-4231-8aa2-0a51fb6ac4f4

/tidb/cdc/default/default/upstream/7365376493909101028
	{"id":7365376493909101028,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ config_path=conf/changefeed.toml
+ SINK_URI='mysql://root@127.0.0.1:3306/?max-txn-row=1'
+ run_cdc_cli changefeed create --start-ts=449546908120645634 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/conf/changefeed.toml
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.cli.19328.out cli changefeed create --start-ts=449546908120645634 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/conf/changefeed.toml
Create changefeed successfully!
ID: test-1
Info: {"upstream_id":7365376493909101028,"namespace":"default","id":"test-1","sink_uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create_time":"2024-05-05T13:02:46.170542415+08:00","start_ts":449546908120645634,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":120,"checkpoint_interval":20}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546908120645634,"checkpoint_ts":449546908120645634,"checkpoint_time":"2024-05-05 13:02:41.178"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
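The changefeed above is created with --config .../synced_status/conf/changefeed.toml, and its Info output shows synced_check_interval 120 and checkpoint_interval 20 instead of the defaults (300/15), so the file presumably overrides just those two settings. A hedged sketch of what it likely contains, written here as a heredoc; the kebab-case key names are assumed from TiCDC's changefeed config conventions, not copied from the file itself:

    # Hypothetical reconstruction of conf/changefeed.toml, inferred from the Info JSON above.
    cat > conf/changefeed.toml <<'EOF'
    [synced-status]
    synced-check-interval = 120
    checkpoint-interval = 20
    EOF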
table ddl_attributes.finish_mark not exists for 4-th check, retry later
+ set +x
+ set +x
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   221  100   221    0     0   2695      0 --:--:-- --:--:-- --:--:--  2728
+ synced_status='{"synced":true,"sink_checkpoint_ts":"2024-05-05 13:02:41.178","puller_resolved_ts":"1970-01-01 08:00:00.000","last_synced_ts":"1970-01-01 08:00:00.000","now_ts":"2024-05-05 13:02:47.000","info":"Data syncing is finished"}'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '13:02:41.178","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '13:02:47.000","info":"Data' syncing is 'finished"}'
++ jq .synced
+ status=true
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '13:02:41.178","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '13:02:47.000","info":"Data' syncing is 'finished"}'
++ jq -r .sink_checkpoint_ts
+ sink_checkpoint_ts='2024-05-05 13:02:41.178'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '13:02:41.178","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '13:02:47.000","info":"Data' syncing is 'finished"}'
++ jq -r .puller_resolved_ts
+ puller_resolved_ts='1970-01-01 08:00:00.000'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '13:02:41.178","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '13:02:47.000","info":"Data' syncing is 'finished"}'
++ jq -r .last_synced_ts
+ last_synced_ts='1970-01-01 08:00:00.000'
+ '[' true '!=' true ']'
+ '[' '1970-01-01 08:00:00.000' '!=' '1970-01-01 08:00:00.000' ']'
+ '[' '1970-01-01 08:00:00.000' '!=' '1970-01-01 08:00:00.000' ']'
++ date '+%Y-%m-%d %H:%M:%S'
+ current='2024-05-05 13:02:47'
+ echo 'sink_checkpoint_ts is 2024-05-05' 13:02:41.178
sink_checkpoint_ts is 2024-05-05 13:02:41.178
++ date -d '2024-05-05 13:02:41.178' +%s
+ checkpoint_timestamp=1714885361
++ date -d '2024-05-05 13:02:47' +%s
+ current_timestamp=1714885367
+ '[' 6 -gt 300 ']'
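The block above is the first synced-status assertion: query /api/v2/changefeeds/test-1/synced, require synced to be true, require puller_resolved_ts and last_synced_ts to still hold the epoch placeholder, and require the sink checkpoint to trail the wall clock by at most 300 seconds. A condensed bash sketch of the same check, using the field names from the response above:

    # Sketch of the initial synced-status assertion traced above.
    synced_status=$(curl -s -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced)
    status=$(echo "$synced_status" | jq .synced)                       # expect: true
    sink_checkpoint_ts=$(echo "$synced_status" | jq -r .sink_checkpoint_ts)
    last_synced_ts=$(echo "$synced_status" | jq -r .last_synced_ts)    # expect the epoch placeholder
    [ "$status" != "true" ] && exit 1
    # The checkpoint may trail the current time, but only within the 300s synced-check window.
    checkpoint_timestamp=$(date -d "$sink_checkpoint_ts" +%s)
    current_timestamp=$(date +%s)
    [ $((current_timestamp - checkpoint_timestamp)) -gt 300 ] && exit 1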
+ run_sql 'USE TEST;Create table t1(a int primary key, b int);insert into t1 values(1,2);insert into t1 values(2,3);'

  0     0    0     0    0     0      0      0 --:--:--  0:00:15 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:16 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:17 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:18 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:19 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:20 --:--:--     0
{"level":"warn","ts":"2024-05-05T13:02:47.866185+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a43340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:47.86658+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0011b9a40/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:47.920272+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a5f180/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
{"level":"warn","ts":1714885368.041645,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc003e58e00/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":1714885368.0416892,"caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":1714885368.11904,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000c8dc00/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
{"level":"info","ts":1714885368.1190813,"caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ check_table_exists test.t1 127.0.0.1 3306
table test.t1 not exists for 1-th check, retry later
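check_table_exists polls the downstream TiDB at 127.0.0.1:3306 until test.t1 becomes visible, printing the "not exists ... retry later" lines seen here. A rough sketch of such a helper, assuming a mysql-client probe (the actual implementation is part of the tiflow test utilities):

    # Assumed shape of check_table_exists: poll the downstream until the table shows up.
    check_table_exists() {
        local table=$1 host=$2 port=$3
        for i in $(seq 1 60); do
            if mysql -h"$host" -P"$port" -uroot -e "DESC $table" >/dev/null 2>&1; then
                echo "table $table exists"
                return 0
            fi
            echo "table $table not exists for $i-th check, retry later"
            sleep 2
        done
        return 1
    }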

  0     0    0     0    0     0      0      0 --:--:--  0:00:21 --:--:--     0
{"level":"warn","ts":1714885368.993235,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc001768a80/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":1714885368.993266,"caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
table ddl_attributes.finish_mark not exists for 5-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8871180002	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv, pid:25765, start at 2024-05-05 13:02:48.90306193 +0800 CST m=+5.161037757	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:48.909 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:48.902 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:48.902 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.t1 exists
+ sleep 5
table ddl_attributes.finish_mark not exists for 6-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8871180002	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv, pid:25765, start at 2024-05-05 13:02:48.90306193 +0800 CST m=+5.161037757	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:48.909 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:48.902 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:48.902 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c88702c0015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-h55pm-s3lfv, pid:25850, start at 2024-05-05 13:02:48.883383591 +0800 CST m=+5.091853853	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:04:48.889 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:02:48.893 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:52:48.893 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/split_region/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/split_region/tiflash/log/error.log
arg matches is ArgMatches { args: {"log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/split_region/tiflash/log/proxy.log"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/split_region/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/split_region/tiflash/db/proxy"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table ddl_attributes.finish_mark exists
check diff successfully
wait process cdc.test exit for 1-th time...

  0     0    0     0    0     0      0      0 --:--:--  0:00:22 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:23 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:24 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:25 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:26 --:--:--     0
{"level":"warn","ts":"2024-05-05T13:02:53.867612+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0011b9a40/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:53.86766+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a43340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:02:53.921509+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000a5f180/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.split_region.cli.27235.out cli tso query --pd=http://127.0.0.1:2379
wait process cdc.test exit for 2-th time...
valid ~~~ running cdc  
Failed to start cdc, the usage tips should be printed
 1st test case cdc_server_tips success! 
try an INVALID cdc server command
[Sun May  5 13:02:52 CST 2024] <<<<<< START cdc server in cdc_server_tips case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.cdc_server_tips.2630026302.out server --log-file /tmp/tidb_cdc_test/cdc_server_tips/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/cdc_server_tips/cdc_data --cluster-id default --pd None
+ [[ true != \n\o ]]
+ set +x
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:02:54 CST 2024] <<<<<< run test case ddl_attributes success! >>>>>>
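Each case tears down the same way: stop cdc.test and loop on the process name until it is gone, producing the "wait process cdc.test exit for N-th time" lines above. A rough bash sketch of that wait, under the assumption that a pkill/pgrep pair is behind it:

    # Sketch of the process-exit wait behind the "wait process cdc.test exit" lines.
    kill_and_wait() {
        local name=$1
        pkill -f "$name" 2>/dev/null || true
        local count=1
        while pgrep -f "$name" >/dev/null 2>&1; do
            echo "wait process $name exit for $count-th time..."
            sleep 1
            count=$((count + 1))
        done
        echo "process $name already exit"
    }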
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   243  100   243    0     0   3613      0 --:--:-- --:--:-- --:--:--  3626
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 13:02:54.628","puller_resolved_ts":"2024-05-05 13:02:47.829","last_synced_ts":"2024-05-05 13:02:48.328","now_ts":"2024-05-05 13:02:55.000","info":"The data syncing is not finished, please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:02:54.628","puller_resolved_ts":"2024-05-05' '13:02:47.829","last_synced_ts":"2024-05-05' '13:02:48.328","now_ts":"2024-05-05' '13:02:55.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:02:54.628","puller_resolved_ts":"2024-05-05' '13:02:47.829","last_synced_ts":"2024-05-05' '13:02:48.328","now_ts":"2024-05-05' '13:02:55.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq -r .info
+ info='The data syncing is not finished, please wait'
+ '[' 'The data syncing is not finished, please wait' '!=' 'The data syncing is not finished, please wait' ']'
+ sleep 130
+ set +x
+ tso='449546911574392833
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546911574392833 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 13:02:56 CST 2024] <<<<<< START cdc server in split_region case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.split_region.2728127283.out server --log-file /tmp/tidb_cdc_test/split_region/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/split_region/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
[2024/05/05 13:02:45.533 +08:00] [INFO] [main.go:86] ["running ddl test: 0 createDropSchemaDDL"]
[2024/05/05 13:02:46.020 +08:00] [INFO] [main.go:220] ["0 insert success: 100"]
[2024/05/05 13:02:46.024 +08:00] [INFO] [main.go:220] ["1 insert success: 100"]
[2024/05/05 13:02:46.468 +08:00] [INFO] [main.go:220] ["0 insert success: 200"]
[2024/05/05 13:02:46.470 +08:00] [INFO] [main.go:220] ["1 insert success: 200"]
[2024/05/05 13:02:46.474 +08:00] [INFO] [main.go:234] ["0 delete success: 100"]
[2024/05/05 13:02:46.476 +08:00] [INFO] [main.go:234] ["1 delete success: 100"]
[2024/05/05 13:02:46.635 +08:00] [INFO] [main.go:220] ["0 insert success: 300"]
[2024/05/05 13:02:46.916 +08:00] [INFO] [main.go:220] ["1 insert success: 300"]
[2024/05/05 13:02:47.081 +08:00] [INFO] [main.go:220] ["0 insert success: 400"]
[2024/05/05 13:02:47.085 +08:00] [INFO] [main.go:220] ["1 insert success: 400"]
[2024/05/05 13:02:47.088 +08:00] [INFO] [main.go:234] ["0 delete success: 200"]
[2024/05/05 13:02:47.374 +08:00] [INFO] [main.go:234] ["1 delete success: 200"]
[2024/05/05 13:02:47.532 +08:00] [INFO] [main.go:220] ["1 insert success: 500"]
[2024/05/05 13:02:47.539 +08:00] [INFO] [main.go:220] ["0 insert success: 500"]
[2024/05/05 13:02:47.978 +08:00] [INFO] [main.go:220] ["1 insert success: 600"]
[2024/05/05 13:02:47.984 +08:00] [INFO] [main.go:234] ["1 delete success: 300"]
[2024/05/05 13:02:48.277 +08:00] [INFO] [main.go:220] ["0 insert success: 600"]
[2024/05/05 13:02:48.286 +08:00] [INFO] [main.go:234] ["0 delete success: 300"]
[2024/05/05 13:02:48.427 +08:00] [INFO] [main.go:220] ["1 insert success: 700"]
[2024/05/05 13:02:48.738 +08:00] [INFO] [main.go:220] ["0 insert success: 700"]
[2024/05/05 13:02:48.873 +08:00] [INFO] [main.go:220] ["1 insert success: 800"]
[2024/05/05 13:02:48.879 +08:00] [INFO] [main.go:234] ["1 delete success: 400"]
[2024/05/05 13:02:49.218 +08:00] [INFO] [main.go:220] ["0 insert success: 800"]
[2024/05/05 13:02:49.235 +08:00] [INFO] [main.go:234] ["0 delete success: 400"]
[2024/05/05 13:02:49.361 +08:00] [INFO] [main.go:220] ["1 insert success: 900"]
[2024/05/05 13:02:49.713 +08:00] [INFO] [main.go:220] ["0 insert success: 900"]
[2024/05/05 13:02:49.841 +08:00] [INFO] [main.go:220] ["1 insert success: 1000"]
[2024/05/05 13:02:49.850 +08:00] [INFO] [main.go:234] ["1 delete success: 500"]
[2024/05/05 13:02:50.187 +08:00] [INFO] [main.go:220] ["0 insert success: 1000"]
[2024/05/05 13:02:50.202 +08:00] [INFO] [main.go:234] ["0 delete success: 500"]
[mysql] 2024/05/05 13:02:50 connection.go:299: invalid connection
[2024/05/05 13:02:55.615 +08:00] [INFO] [main.go:86] ["running ddl test: 1 truncateDDL"]
[2024/05/05 13:02:55.821 +08:00] [INFO] [main.go:220] ["0 insert success: 100"]
[2024/05/05 13:02:55.827 +08:00] [INFO] [main.go:220] ["1 insert success: 100"]
[2024/05/05 13:02:56.008 +08:00] [INFO] [main.go:220] ["0 insert success: 200"]
[2024/05/05 13:02:56.014 +08:00] [INFO] [main.go:234] ["0 delete success: 100"]
[2024/05/05 13:02:56.019 +08:00] [INFO] [main.go:220] ["1 insert success: 200"]
[2024/05/05 13:02:56.025 +08:00] [INFO] [main.go:234] ["1 delete success: 100"]
[2024/05/05 13:02:56.269 +08:00] [INFO] [main.go:220] ["0 insert success: 300"]
[2024/05/05 13:02:56.286 +08:00] [INFO] [main.go:220] ["1 insert success: 300"]
[2024/05/05 13:02:56.454 +08:00] [INFO] [main.go:220] ["0 insert success: 400"]
[2024/05/05 13:02:56.459 +08:00] [INFO] [main.go:234] ["0 delete success: 200"]
[2024/05/05 13:02:56.471 +08:00] [INFO] [main.go:220] ["1 insert success: 400"]
[2024/05/05 13:02:56.477 +08:00] [INFO] [main.go:234] ["1 delete success: 200"]
[2024/05/05 13:02:56.635 +08:00] [INFO] [main.go:220] ["0 insert success: 500"]
[2024/05/05 13:02:56.639 +08:00] [INFO] [main.go:220] ["1 insert success: 500"]
[2024/05/05 13:02:56.817 +08:00] [INFO] [main.go:220] ["1 insert success: 600"]
[2024/05/05 13:02:56.817 +08:00] [INFO] [main.go:220] ["0 insert success: 600"]
[2024/05/05 13:02:56.823 +08:00] [INFO] [main.go:234] ["0 delete success: 300"]
[2024/05/05 13:02:56.826 +08:00] [INFO] [main.go:234] ["1 delete success: 300"]
[2024/05/05 13:02:56.992 +08:00] [INFO] [main.go:220] ["1 insert success: 700"]
[2024/05/05 13:02:57.000 +08:00] [INFO] [main.go:220] ["0 insert success: 700"]
[2024/05/05 13:02:57.188 +08:00] [INFO] [main.go:220] ["0 insert success: 800"]
[2024/05/05 13:02:57.197 +08:00] [INFO] [main.go:234] ["0 delete success: 400"]
[2024/05/05 13:02:57.254 +08:00] [INFO] [main.go:220] ["1 insert success: 800"]
[2024/05/05 13:02:57.274 +08:00] [INFO] [main.go:234] ["1 delete success: 400"]

  0     0    0     0    0     0      0      0 --:--:--  0:00:27 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:28 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:29 --:--:--     0
100   135  100   135    0     0      4      0  0:00:33  0:00:30  0:00:03    27
100   135  100   135    0     0      4      0  0:00:33  0:00:30  0:00:03    33
+ synced_status='{
    "error_msg": "[CDC:ErrPDEtcdAPIError]etcd api call error: context deadline exceeded",
    "error_code": "CDC:ErrPDEtcdAPIError"
}'
++ echo '{' '"error_msg":' '"[CDC:ErrPDEtcdAPIError]etcd' api call error: context deadline 'exceeded",' '"error_code":' '"CDC:ErrPDEtcdAPIError"' '}'
++ jq -r .error_code
[2024/05/05 13:02:57.378 +08:00] [INFO] [main.go:220] ["0 insert success: 900"]
[2024/05/05 13:02:57.431 +08:00] [INFO] [main.go:220] ["1 insert success: 900"]
[2024/05/05 13:02:57.557 +08:00] [INFO] [main.go:220] ["0 insert success: 1000"]
[2024/05/05 13:02:57.566 +08:00] [INFO] [main.go:234] ["0 delete success: 500"]
[2024/05/05 13:02:57.601 +08:00] [INFO] [main.go:220] ["1 insert success: 1000"]
+ error_code=CDC:ErrPDEtcdAPIError
+ cleanup_process cdc.test
[2024/05/05 13:02:57.619 +08:00] [INFO] [main.go:234] ["1 delete success: 500"]
[2024/05/05 13:02:57.739 +08:00] [INFO] [main.go:220] ["0 insert success: 1100"]
[2024/05/05 13:02:57.767 +08:00] [INFO] [main.go:220] ["1 insert success: 1100"]
[2024/05/05 13:02:57.925 +08:00] [INFO] [main.go:220] ["0 insert success: 1200"]
[2024/05/05 13:02:57.930 +08:00] [INFO] [main.go:220] ["1 insert success: 1200"]
[2024/05/05 13:02:57.943 +08:00] [INFO] [main.go:234] ["0 delete success: 600"]
[2024/05/05 13:02:57.951 +08:00] [INFO] [main.go:234] ["1 delete success: 600"]
[2024/05/05 13:02:58.106 +08:00] [INFO] [main.go:220] ["1 insert success: 1300"]
wait process cdc.test exit for 1-th time...
[2024/05/05 13:02:58.115 +08:00] [INFO] [main.go:220] ["0 insert success: 1300"]
[2024/05/05 13:02:58.268 +08:00] [INFO] [main.go:220] ["1 insert success: 1400"]
[2024/05/05 13:02:58.285 +08:00] [INFO] [main.go:234] ["1 delete success: 700"]
[2024/05/05 13:02:58.301 +08:00] [INFO] [main.go:220] ["0 insert success: 1400"]
[2024/05/05 13:02:58.328 +08:00] [INFO] [main.go:234] ["0 delete success: 700"]
wait process cdc.test exit for 2-th time...
[2024/05/05 13:02:58.428 +08:00] [INFO] [main.go:220] ["1 insert success: 1500"]
[2024/05/05 13:02:58.483 +08:00] [INFO] [main.go:220] ["0 insert success: 1500"]
[2024/05/05 13:02:58.586 +08:00] [INFO] [main.go:220] ["1 insert success: 1600"]
[2024/05/05 13:02:58.606 +08:00] [INFO] [main.go:234] ["1 delete success: 800"]
[2024/05/05 13:02:58.662 +08:00] [INFO] [main.go:220] ["0 insert success: 1600"]
[2024/05/05 13:02:58.688 +08:00] [INFO] [main.go:234] ["0 delete success: 800"]
[2024/05/05 13:02:58.813 +08:00] [INFO] [main.go:220] ["1 insert success: 1700"]
[2024/05/05 13:02:58.844 +08:00] [INFO] [main.go:220] ["0 insert success: 1700"]
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
+ stop_tidb_cluster
[2024/05/05 13:02:58.974 +08:00] [INFO] [main.go:220] ["1 insert success: 1800"]
[2024/05/05 13:02:58.999 +08:00] [INFO] [main.go:234] ["1 delete success: 900"]
[2024/05/05 13:02:59.031 +08:00] [INFO] [main.go:220] ["0 insert success: 1800"]
[2024/05/05 13:02:59.060 +08:00] [INFO] [main.go:234] ["0 delete success: 900"]
[2024/05/05 13:02:59.139 +08:00] [INFO] [main.go:220] ["1 insert success: 1900"]
table test.binary_columns not exists for 1-th check, retry later
[2024/05/05 13:02:59.214 +08:00] [INFO] [main.go:220] ["0 insert success: 1900"]
[2024/05/05 13:02:59.307 +08:00] [INFO] [main.go:220] ["1 insert success: 2000"]
[2024/05/05 13:02:59.331 +08:00] [INFO] [main.go:234] ["1 delete success: 1000"]
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:02:59 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/8037c982-8dee-4ee6-8588-6de1611b6414
	{"id":"8037c982-8dee-4ee6-8588-6de1611b6414","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885376}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4721f437fa
	8037c982-8dee-4ee6-8588-6de1611b6414

/tidb/cdc/default/default/upstream/7365376541649199697
	{"id":7365376541649199697,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/8037c982-8dee-4ee6-8588-6de1611b6414
	{"id":"8037c982-8dee-4ee6-8588-6de1611b6414","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885376}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4721f437fa
	8037c982-8dee-4ee6-8588-6de1611b6414

/tidb/cdc/default/default/upstream/7365376541649199697
	{"id":7365376541649199697,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/8037c982-8dee-4ee6-8588-6de1611b6414
	{"id":"8037c982-8dee-4ee6-8588-6de1611b6414","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885376}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4721f437fa
	8037c982-8dee-4ee6-8588-6de1611b6414

/tidb/cdc/default/default/upstream/7365376541649199697
	{"id":7365376541649199697,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.split_region.cli.27335.out cli changefeed create --start-ts=449546911574392833 '--sink-uri=kafka://127.0.0.1:9092/ticdc-split-region-test-4174?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' --config /tmp/tidb_cdc_test/split_region/pulsar_test.toml
[2024/05/05 13:02:59.398 +08:00] [INFO] [main.go:220] ["0 insert success: 2000"]
[2024/05/05 13:02:59.429 +08:00] [INFO] [main.go:234] ["0 delete success: 1000"]
[2024/05/05 13:02:59.468 +08:00] [INFO] [main.go:220] ["1 insert success: 2100"]
[2024/05/05 13:02:59.581 +08:00] [INFO] [main.go:220] ["0 insert success: 2100"]
[2024/05/05 13:02:59.631 +08:00] [INFO] [main.go:220] ["1 insert success: 2200"]
Create changefeed successfully!
ID: 2ac11332-2e8a-4ded-a1bb-e79a5682780d
Info: {"upstream_id":7365376541649199697,"namespace":"default","id":"2ac11332-2e8a-4ded-a1bb-e79a5682780d","sink_uri":"kafka://127.0.0.1:9092/ticdc-split-region-test-4174?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:02:59.613820052+08:00","start_ts":449546911574392833,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546911574392833,"checkpoint_ts":449546911574392833,"checkpoint_time":"2024-05-05 13:02:54.353"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
[2024/05/05 13:02:59.658 +08:00] [INFO] [main.go:234] ["1 delete success: 1100"]
[2024/05/05 13:02:59.754 +08:00] [INFO] [main.go:220] ["0 insert success: 2200"]
[2024/05/05 13:02:59.786 +08:00] [INFO] [main.go:234] ["0 delete success: 1100"]
[2024/05/05 13:02:59.788 +08:00] [INFO] [main.go:220] ["1 insert success: 2300"]
table test.binary_columns not exists for 2-th check, retry later
[2024/05/05 13:02:59.926 +08:00] [INFO] [main.go:220] ["0 insert success: 2300"]
[2024/05/05 13:02:59.944 +08:00] [INFO] [main.go:220] ["1 insert success: 2400"]
[2024/05/05 13:02:59.976 +08:00] [INFO] [main.go:234] ["1 delete success: 1200"]
[2024/05/05 13:03:00.104 +08:00] [INFO] [main.go:220] ["0 insert success: 2400"]
[2024/05/05 13:03:00.107 +08:00] [INFO] [main.go:220] ["1 insert success: 2500"]
[2024/05/05 13:03:00.142 +08:00] [INFO] [main.go:234] ["0 delete success: 1200"]
[2024/05/05 13:03:00.267 +08:00] [INFO] [main.go:220] ["1 insert success: 2600"]
[2024/05/05 13:03:00.282 +08:00] [INFO] [main.go:220] ["0 insert success: 2500"]
[2024/05/05 13:03:00.297 +08:00] [INFO] [main.go:234] ["1 delete success: 1300"]
[2024/05/05 13:03:00.428 +08:00] [INFO] [main.go:220] ["1 insert success: 2700"]
[2024/05/05 13:03:00.460 +08:00] [INFO] [main.go:220] ["0 insert success: 2600"]
[2024/05/05 13:03:00.499 +08:00] [INFO] [main.go:234] ["0 delete success: 1300"]
[2024/05/05 13:03:00.589 +08:00] [INFO] [main.go:220] ["1 insert success: 2800"]
[2024/05/05 13:03:00.617 +08:00] [INFO] [main.go:234] ["1 delete success: 1400"]
+ set +x
[Sun May  5 13:03:01 CST 2024] <<<<<< START kafka consumer in split_region case >>>>>>
table split_region.test1 not exists for 1-th check, retry later
table test.binary_columns not exists for 3-th check, retry later
table split_region.test1 not exists for 2-th check, retry later
table test.binary_columns not exists for 4-th check, retry later
table split_region.test1 exists
table split_region.test2 exists
check diff failed 1-th time, retry later
[2024/05/05 13:03:05.737 +08:00] [INFO] [main.go:86] ["running ddl test: 2 addDropColumnDDL"]
[2024/05/05 13:03:05.922 +08:00] [INFO] [main.go:220] ["1 insert success: 100"]
[2024/05/05 13:03:05.937 +08:00] [INFO] [main.go:220] ["0 insert success: 100"]
[2024/05/05 13:03:06.077 +08:00] [INFO] [main.go:220] ["1 insert success: 200"]
[2024/05/05 13:03:06.079 +08:00] [INFO] [main.go:234] ["1 delete success: 100"]
[2024/05/05 13:03:06.110 +08:00] [INFO] [main.go:220] ["0 insert success: 200"]
[2024/05/05 13:03:06.112 +08:00] [INFO] [main.go:234] ["0 delete success: 100"]
[2024/05/05 13:03:06.230 +08:00] [INFO] [main.go:220] ["1 insert success: 300"]
table test.binary_columns not exists for 5-th check, retry later
[2024/05/05 13:03:06.279 +08:00] [INFO] [main.go:220] ["0 insert success: 300"]
[2024/05/05 13:03:06.405 +08:00] [INFO] [main.go:220] ["1 insert success: 400"]
[2024/05/05 13:03:06.406 +08:00] [INFO] [main.go:234] ["1 delete success: 200"]
[2024/05/05 13:03:06.469 +08:00] [INFO] [main.go:220] ["0 insert success: 400"]
[2024/05/05 13:03:06.470 +08:00] [INFO] [main.go:234] ["0 delete success: 200"]
[2024/05/05 13:03:06.561 +08:00] [INFO] [main.go:220] ["1 insert success: 500"]
[2024/05/05 13:03:06.643 +08:00] [INFO] [main.go:220] ["0 insert success: 500"]
[2024/05/05 13:03:06.716 +08:00] [INFO] [main.go:220] ["1 insert success: 600"]
[2024/05/05 13:03:06.718 +08:00] [INFO] [main.go:234] ["1 delete success: 300"]
[2024/05/05 13:03:06.820 +08:00] [INFO] [main.go:220] ["0 insert success: 600"]
[2024/05/05 13:03:06.821 +08:00] [INFO] [main.go:234] ["0 delete success: 300"]
[2024/05/05 13:03:06.875 +08:00] [INFO] [main.go:220] ["1 insert success: 700"]
[2024/05/05 13:03:06.994 +08:00] [INFO] [main.go:220] ["0 insert success: 700"]
+ run_case_with_unavailable_tikv conf/changefeed-redo.toml
+ rm -rf /tmp/tidb_cdc_test/synced_status_with_redo
+ mkdir -p /tmp/tidb_cdc_test/synced_status_with_redo
+ start_tidb_cluster --workdir /tmp/tidb_cdc_test/synced_status_with_redo
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
The 1 times to try to start tidb cluster...
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
[2024/05/05 13:03:07.038 +08:00] [INFO] [main.go:220] ["1 insert success: 800"]
[2024/05/05 13:03:07.040 +08:00] [INFO] [main.go:234] ["1 delete success: 400"]
[2024/05/05 13:03:07.178 +08:00] [INFO] [main.go:220] ["0 insert success: 800"]
[2024/05/05 13:03:07.180 +08:00] [INFO] [main.go:234] ["0 delete success: 400"]
[2024/05/05 13:03:07.198 +08:00] [INFO] [main.go:220] ["1 insert success: 900"]
check diff successfully
[2024/05/05 13:03:07.358 +08:00] [INFO] [main.go:220] ["1 insert success: 1000"]
[2024/05/05 13:03:07.359 +08:00] [INFO] [main.go:234] ["1 delete success: 500"]
[2024/05/05 13:03:07.364 +08:00] [INFO] [main.go:220] ["0 insert success: 900"]
[2024/05/05 13:03:07.519 +08:00] [INFO] [main.go:220] ["1 insert success: 1100"]
[2024/05/05 13:03:07.541 +08:00] [INFO] [main.go:220] ["0 insert success: 1000"]
[2024/05/05 13:03:07.542 +08:00] [INFO] [main.go:234] ["0 delete success: 500"]
[2024/05/05 13:03:07.675 +08:00] [INFO] [main.go:220] ["1 insert success: 1200"]
[2024/05/05 13:03:07.677 +08:00] [INFO] [main.go:234] ["1 delete success: 600"]
[2024/05/05 13:03:07.708 +08:00] [INFO] [main.go:220] ["0 insert success: 1100"]
[2024/05/05 13:03:07.837 +08:00] [INFO] [main.go:220] ["1 insert success: 1300"]
[2024/05/05 13:03:07.883 +08:00] [INFO] [main.go:220] ["0 insert success: 1200"]
[2024/05/05 13:03:07.885 +08:00] [INFO] [main.go:234] ["0 delete success: 600"]
[2024/05/05 13:03:08.000 +08:00] [INFO] [main.go:220] ["1 insert success: 1400"]
[2024/05/05 13:03:08.002 +08:00] [INFO] [main.go:234] ["1 delete success: 700"]
[2024/05/05 13:03:08.062 +08:00] [INFO] [main.go:220] ["0 insert success: 1300"]
[2024/05/05 13:03:08.155 +08:00] [INFO] [main.go:220] ["1 insert success: 1500"]
[2024/05/05 13:03:08.231 +08:00] [INFO] [main.go:220] ["0 insert success: 1400"]
[2024/05/05 13:03:08.232 +08:00] [INFO] [main.go:234] ["0 delete success: 700"]
table test.binary_columns exists
check diff failed 1-th time, retry later
[2024/05/05 13:03:08.313 +08:00] [INFO] [main.go:220] ["1 insert success: 1600"]
[2024/05/05 13:03:08.315 +08:00] [INFO] [main.go:234] ["1 delete success: 800"]
[2024/05/05 13:03:08.402 +08:00] [INFO] [main.go:220] ["0 insert success: 1500"]
[2024/05/05 13:03:08.469 +08:00] [INFO] [main.go:220] ["1 insert success: 1700"]
[2024/05/05 13:03:08.575 +08:00] [INFO] [main.go:220] ["0 insert success: 1600"]
[2024/05/05 13:03:08.577 +08:00] [INFO] [main.go:234] ["0 delete success: 800"]
[2024/05/05 13:03:08.626 +08:00] [INFO] [main.go:220] ["1 insert success: 1800"]
[2024/05/05 13:03:08.627 +08:00] [INFO] [main.go:234] ["1 delete success: 900"]
[2024/05/05 13:03:08.745 +08:00] [INFO] [main.go:220] ["0 insert success: 1700"]
[2024/05/05 13:03:08.784 +08:00] [INFO] [main.go:220] ["1 insert success: 1900"]
[2024/05/05 13:03:08.919 +08:00] [INFO] [main.go:220] ["0 insert success: 1800"]
[2024/05/05 13:03:08.920 +08:00] [INFO] [main.go:234] ["0 delete success: 900"]
[2024/05/05 13:03:08.940 +08:00] [INFO] [main.go:220] ["1 insert success: 2000"]
[2024/05/05 13:03:08.942 +08:00] [INFO] [main.go:234] ["1 delete success: 1000"]
[2024/05/05 13:03:09.102 +08:00] [INFO] [main.go:220] ["0 insert success: 1900"]
[2024/05/05 13:03:09.110 +08:00] [INFO] [main.go:220] ["1 insert success: 2100"]
[2024/05/05 13:03:09.272 +08:00] [INFO] [main.go:220] ["1 insert success: 2200"]
[2024/05/05 13:03:09.273 +08:00] [INFO] [main.go:234] ["1 delete success: 1100"]
[2024/05/05 13:03:09.278 +08:00] [INFO] [main.go:220] ["0 insert success: 2000"]
[2024/05/05 13:03:09.280 +08:00] [INFO] [main.go:234] ["0 delete success: 1000"]
<<< Run all test success >>>
[2024/05/05 13:03:09.435 +08:00] [INFO] [main.go:220] ["1 insert success: 2300"]
[2024/05/05 13:03:09.457 +08:00] [INFO] [main.go:220] ["0 insert success: 2100"]
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
start tidb cluster in /tmp/tidb_cdc_test/synced_status_with_redo
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[2024/05/05 13:03:09.594 +08:00] [INFO] [main.go:220] ["1 insert success: 2400"]
[2024/05/05 13:03:09.595 +08:00] [INFO] [main.go:234] ["1 delete success: 1200"]
[2024/05/05 13:03:09.629 +08:00] [INFO] [main.go:220] ["0 insert success: 2200"]
[2024/05/05 13:03:09.631 +08:00] [INFO] [main.go:234] ["0 delete success: 1100"]
[2024/05/05 13:03:09.750 +08:00] [INFO] [main.go:220] ["1 insert success: 2500"]
[2024/05/05 13:03:09.799 +08:00] [INFO] [main.go:220] ["0 insert success: 2300"]
[Pipeline] // withCredentials
[Pipeline] }
[2024/05/05 13:03:09.906 +08:00] [INFO] [main.go:220] ["1 insert success: 2600"]
[2024/05/05 13:03:09.908 +08:00] [INFO] [main.go:234] ["1 delete success: 1300"]
[2024/05/05 13:03:09.975 +08:00] [INFO] [main.go:220] ["0 insert success: 2400"]
[2024/05/05 13:03:09.976 +08:00] [INFO] [main.go:234] ["0 delete success: 1200"]
[2024/05/05 13:03:10.072 +08:00] [INFO] [main.go:220] ["1 insert success: 2700"]
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[2024/05/05 13:03:10.144 +08:00] [INFO] [main.go:220] ["0 insert success: 2500"]
[2024/05/05 13:03:10.240 +08:00] [INFO] [main.go:220] ["1 insert success: 2800"]
[2024/05/05 13:03:10.242 +08:00] [INFO] [main.go:234] ["1 delete success: 1400"]
[2024/05/05 13:03:10.320 +08:00] [INFO] [main.go:220] ["0 insert success: 2600"]
[2024/05/05 13:03:10.321 +08:00] [INFO] [main.go:234] ["0 delete success: 1300"]
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[2024/05/05 13:03:10.411 +08:00] [INFO] [main.go:220] ["1 insert success: 2900"]
[2024/05/05 13:03:10.494 +08:00] [INFO] [main.go:220] ["0 insert success: 2700"]
[2024/05/05 13:03:10.579 +08:00] [INFO] [main.go:220] ["1 insert success: 3000"]
[2024/05/05 13:03:10.581 +08:00] [INFO] [main.go:234] ["1 delete success: 1500"]
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
check diff successfully
[2024/05/05 13:03:10.670 +08:00] [INFO] [main.go:220] ["0 insert success: 2800"]
[2024/05/05 13:03:10.672 +08:00] [INFO] [main.go:234] ["0 delete success: 1400"]
Verifying downstream PD is started...
invalid ~~~ running cdc  
Failed to start cdc, the usage tips should be printed
 2nd test case cdc_server_tips success! 
[Sun May  5 13:03:12 CST 2024] <<<<<< run all test cases cdc_server_tips success! >>>>>> 
pass check, checkpoint tso did not advance within 10s
run task successfully
wait process 8774 exit for 1-th time...
wait process 8774 exit for 2-th time...
wait process 8774 exit for 3-th time...
wait process 8774 exit for 4-th time...
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/_utils/kill_cdc_pid: line 19: kill: (8774) - No such process
wait process 8774 exit for 5-th time...
process 8774 already exit
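The kill_cdc_pid helper above polls until the old cdc process is gone before a new server is started. A minimal, hypothetical standalone version of that wait loop is sketched below; the retry count and sleep interval are illustrative, not the script's actual values:

#!/usr/bin/env bash
# wait_pid_exit.sh -- simplified sketch of the "wait process ... exit" loop seen above
pid=$1
for i in $(seq 1 10); do
    # kill -0 only checks whether the process still exists; it sends no signal
    if ! kill -0 "$pid" 2>/dev/null; then
        echo "process $pid already exit"
        exit 0
    fi
    echo "wait process $pid exit for $i-th time..."
    sleep 1
done
echo "process $pid did not exit in time" >&2
exit 1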
[Sun May  5 13:03:07 CST 2024] <<<<<< START cdc server in ddl_only_block_related_table case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.ddl_only_block_related_table.94499451.out server --log-file /tmp/tidb_cdc_test/ddl_only_block_related_table/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/ddl_only_block_related_table/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:03:10 GMT
< Content-Type: text/plain; charset=utf-8
< Transfer-Encoding: chunked
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:

changefeedID: default/ddl-only-block-related-table
{UpstreamID:7365376232154429504 Namespace:default ID:ddl-only-block-related-table SinkURI:kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 13:01:43.516485982 +0800 CST StartTs:449546892969508868 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc001060630 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546892995985413}
{CheckpointTs:449546897072062474 MinTableBarrierTs:449546915749822467 AdminJobType:noop}
span: {table_id:110,start_key:7480000000000000ff6e5f720000000000fa,end_key:7480000000000000ff6e5f730000000000fa}, resolvedTs: 449546897072062474, checkpointTs: 449546897072062474, state: Preparing
span: {table_id:106,start_key:7480000000000000ff6a5f720000000000fa,end_key:7480000000000000ff6a5f730000000000fa}, resolvedTs: 449546897072062474, checkpointTs: 449546897072062474, state: Preparing
span: {table_id:108,start_key:7480000000000000ff6c5f720000000000fa,end_key:7480000000000000ff6c5f730000000000fa}, resolvedTs: 449546897072062474, checkpointTs: 449546897072062474, state: Preparing
span: {table_id:112,start_key:7480000000000000ff705f720000000000fa,end_key:7480000000000000ff705f730000000000fa}, resolvedTs: 449546897072062474, checkpointTs: 449546897072062474, state: Preparing



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/fa0deafa-5ff7-43bf-acce-59a902366c92
	{"id":"fa0deafa-5ff7-43bf-acce-59a902366c92","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885388}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720d88ac7
	fa0deafa-5ff7-43bf-acce-59a902366c92

/tidb/cdc/default/default/changefeed/info/ddl-only-block-related-table
	{"upstream-id":7365376232154429504,"namespace":"default","changefeed-id":"ddl-only-block-related-table","sink-uri":"kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T13:01:43.516485982+08:00","start-ts":449546892969508868,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546892995985413}

/tidb/cdc/default/default/changefeed/status/ddl-only-block-related-table
	{"checkpoint-ts":449546897072062474,"min-table-barrier-ts":449546915749822467,"admin-job-type":0}

/tidb/cdc/default/default/task/position/fa0deafa-5ff7-43bf-acce-59a902366c92/ddl-only-block-related-table
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365376232154429504
	{"id":7365376232154429504,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:

changefeedID: default/ddl-only-block-related-table
{UpstreamID:7365376232154429504 Namespace:default ID:ddl-only-block-related-table SinkURI:kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 13:01:43.516485982 +0800 CST StartTs:449546892969508868 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc001060630 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546892995985413}
{CheckpointTs:449546897072062474 MinTableBarrierTs:449546915749822467 AdminJobType:noop}
span: {table_id:110,start_key:7480000000000000ff6e5f720000000000fa,end_key:7480000000000000ff6e5f730000000000fa}, resolvedTs: 449546897072062474, checkpointTs: 449546897072062474, state: Preparing
span: {table_id:106,start_key:7480000000000000ff6a5f720000000000fa,end_key:7480000000000000ff6a5f730000000000fa}, resolvedTs: 449546897072062474, checkpointTs: 449546897072062474, state: Preparing
span: {table_id:108,start_key:7480000000000000ff6c5f720000000000fa,end_key:7480000000000000ff6c5f730000000000fa}, resolvedTs: 449546897072062474, checkpointTs: 449546897072062474, state: Preparing
span: {table_id:112,start_key:7480000000000000ff705f720000000000fa,end_key:7480000000000000ff705f730000000000fa}, resolvedTs: 449546897072062474, checkpointTs: 449546897072062474, state: Preparing



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/fa0deafa-5ff7-43bf-acce-59a902366c92
	{"id":"fa0deafa-5ff7-43bf-acce-59a902366c92","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885388}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720d88ac7
	fa0deafa-5ff7-43bf-acce-59a902366c92

/tidb/cdc/default/default/changefeed/info/ddl-only-block-related-table
	{"upstream-id":7365376232154429504,"namespace":"default","changefeed-id":"ddl-only-block-related-table","sink-uri":"kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T13:01:43.516485982+08:00","start-ts":449546892969508868,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546892995985413}

/tidb/cdc/default/default/changefeed/status/ddl-only-block-related-table
	{"checkpoint-ts":449546897072062474,"min-table-barrier-ts":449546915749822467,"admin-job-type":0}
+ grep -q 'failed to get info:'

/tidb/cdc/default/default/task/position/fa0deafa-5ff7-43bf-acce-59a902366c92/ddl-only-block-related-table
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365376232154429504
	{"id":7365376232154429504,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:

changefeedID: default/ddl-only-block-related-table
{UpstreamID:7365376232154429504 Namespace:default ID:ddl-only-block-related-table SinkURI:kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760 CreateTime:2024-05-05 13:01:43.516485982 +0800 CST StartTs:449546892969508868 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc001060630 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546892995985413}
{CheckpointTs:449546897072062474 MinTableBarrierTs:449546915749822467 AdminJobType:noop}
span: {table_id:110,start_key:7480000000000000ff6e5f720000000000fa,end_key:7480000000000000ff6e5f730000000000fa}, resolvedTs: 449546897072062474, checkpointTs: 449546897072062474, state: Preparing
span: {table_id:106,start_key:7480000000000000ff6a5f720000000000fa,end_key:7480000000000000ff6a5f730000000000fa}, resolvedTs: 449546897072062474, checkpointTs: 449546897072062474, state: Preparing
span: {table_id:108,start_key:7480000000000000ff6c5f720000000000fa,end_key:7480000000000000ff6c5f730000000000fa}, resolvedTs: 449546897072062474, checkpointTs: 449546897072062474, state: Preparing
span: {table_id:112,start_key:7480000000000000ff705f720000000000fa,end_key:7480000000000000ff705f730000000000fa}, resolvedTs: 449546897072062474, checkpointTs: 449546897072062474, state: Preparing



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/fa0deafa-5ff7-43bf-acce-59a902366c92
	{"id":"fa0deafa-5ff7-43bf-acce-59a902366c92","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885388}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4720d88ac7
	fa0deafa-5ff7-43bf-acce-59a902366c92

/tidb/cdc/default/default/changefeed/info/ddl-only-block-related-table
	{"upstream-id":7365376232154429504,"namespace":"default","changefeed-id":"ddl-only-block-related-table","sink-uri":"kafka://127.0.0.1:9092/ticdc-common-1-test-11054?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create-time":"2024-05-05T13:01:43.516485982+08:00","start-ts":449546892969508868,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546892995985413}

/tidb/cdc/default/default/changefeed/status/ddl-only-block-related-table
	{"checkpoint-ts":449546897072062474,"min-table-barrier-ts":449546915749822467,"admin-job-type":0}

/tidb/cdc/default/default/task/position/fa0deafa-5ff7-43bf-acce-59a902366c92/ddl-only-block-related-table
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365376232154429504
	{"id":7365376232154429504,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
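The trace above is the readiness loop used when starting a cdc server: it curls the /debug/info endpoint up to 50 times, three seconds apart, and only proceeds once the response contains 'etcd info' and does not contain 'failed to get info:'. A condensed sketch of the same idea, with the endpoint, credentials, and markers taken from the trace (the function name is only for illustration):

wait_cdc_ready() {
    local url="http://127.0.0.1:8300/debug/info"
    for ((i = 0; i <= 50; i++)); do
        # --user matches the basic-auth credentials used in the trace above
        res=$(curl -vsL --max-time 20 "$url" --user ticdc:ticdc_secret 2>/dev/null)
        if ! echo "$res" | grep -q 'failed to get info:' && echo "$res" | grep -q 'etcd info'; then
            return 0   # server is up and serving owner/processor/etcd info
        fi
        sleep 3
    done
    echo "cdc server did not become ready" >&2
    return 1
}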
check diff failed 1-th time, retry later
check diff successfully
check_ts_forward ddl-only-block-related-table
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
changefeed is working normally rts: 449546916274110460->449546916798660600 checkpoint: 449546916274110460->449546916798660600
run task successfully
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
wait process cdc.test exit for 1-th time...
[2024/05/05 13:03:15.912 +08:00] [INFO] [main.go:86] ["running ddl test: 3 addDropColumnDDL2"]
[2024/05/05 13:03:16.102 +08:00] [INFO] [main.go:220] ["1 insert success: 100"]
[2024/05/05 13:03:16.112 +08:00] [INFO] [main.go:220] ["0 insert success: 100"]
wait process cdc.test exit for 2-th time...
[2024/05/05 13:03:16.268 +08:00] [INFO] [main.go:220] ["1 insert success: 200"]
[2024/05/05 13:03:16.269 +08:00] [INFO] [main.go:234] ["1 delete success: 100"]
[2024/05/05 13:03:16.290 +08:00] [INFO] [main.go:220] ["0 insert success: 200"]
[2024/05/05 13:03:16.292 +08:00] [INFO] [main.go:234] ["0 delete success: 100"]
[2024/05/05 13:03:16.430 +08:00] [INFO] [main.go:220] ["1 insert success: 300"]
[2024/05/05 13:03:16.472 +08:00] [INFO] [main.go:220] ["0 insert success: 300"]
[2024/05/05 13:03:16.692 +08:00] [INFO] [main.go:220] ["1 insert success: 400"]
[2024/05/05 13:03:16.694 +08:00] [INFO] [main.go:234] ["1 delete success: 200"]
[2024/05/05 13:03:16.744 +08:00] [INFO] [main.go:220] ["0 insert success: 400"]
[2024/05/05 13:03:16.747 +08:00] [INFO] [main.go:234] ["0 delete success: 200"]
[2024/05/05 13:03:16.852 +08:00] [INFO] [main.go:220] ["1 insert success: 500"]
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:03:16 CST 2024] <<<<<< run test case ddl_only_block_related_table success! >>>>>>
[2024/05/05 13:03:16.924 +08:00] [INFO] [main.go:220] ["0 insert success: 500"]
[2024/05/05 13:03:17.015 +08:00] [INFO] [main.go:220] ["1 insert success: 600"]
[2024/05/05 13:03:17.017 +08:00] [INFO] [main.go:234] ["1 delete success: 300"]
[2024/05/05 13:03:17.107 +08:00] [INFO] [main.go:220] ["0 insert success: 600"]
[2024/05/05 13:03:17.109 +08:00] [INFO] [main.go:234] ["0 delete success: 300"]
[2024/05/05 13:03:17.185 +08:00] [INFO] [main.go:220] ["1 insert success: 700"]
[2024/05/05 13:03:17.284 +08:00] [INFO] [main.go:220] ["0 insert success: 700"]
[2024/05/05 13:03:17.346 +08:00] [INFO] [main.go:220] ["1 insert success: 800"]
[2024/05/05 13:03:17.348 +08:00] [INFO] [main.go:234] ["1 delete success: 400"]
[2024/05/05 13:03:17.467 +08:00] [INFO] [main.go:220] ["0 insert success: 800"]
[2024/05/05 13:03:17.469 +08:00] [INFO] [main.go:234] ["0 delete success: 400"]
[2024/05/05 13:03:17.509 +08:00] [INFO] [main.go:220] ["1 insert success: 900"]
[2024/05/05 13:03:17.643 +08:00] [INFO] [main.go:220] ["0 insert success: 900"]
[2024/05/05 13:03:17.675 +08:00] [INFO] [main.go:220] ["1 insert success: 1000"]
[2024/05/05 13:03:17.677 +08:00] [INFO] [main.go:234] ["1 delete success: 500"]
[2024/05/05 13:03:17.815 +08:00] [INFO] [main.go:220] ["0 insert success: 1000"]
[2024/05/05 13:03:17.817 +08:00] [INFO] [main.go:234] ["0 delete success: 500"]
[2024/05/05 13:03:17.829 +08:00] [INFO] [main.go:220] ["1 insert success: 1100"]
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 13:03:17.992 +08:00] [INFO] [main.go:220] ["1 insert success: 1200"]
[2024/05/05 13:03:17.993 +08:00] [INFO] [main.go:234] ["1 delete success: 600"]
[2024/05/05 13:03:17.994 +08:00] [INFO] [main.go:220] ["0 insert success: 1100"]
[2024/05/05 13:03:18.154 +08:00] [INFO] [main.go:220] ["1 insert success: 1300"]
[2024/05/05 13:03:18.176 +08:00] [INFO] [main.go:220] ["0 insert success: 1200"]
[2024/05/05 13:03:18.179 +08:00] [INFO] [main.go:234] ["0 delete success: 600"]
[2024/05/05 13:03:18.314 +08:00] [INFO] [main.go:220] ["1 insert success: 1400"]
[2024/05/05 13:03:18.315 +08:00] [INFO] [main.go:234] ["1 delete success: 700"]
[2024/05/05 13:03:18.355 +08:00] [INFO] [main.go:220] ["0 insert success: 1300"]
[2024/05/05 13:03:18.478 +08:00] [INFO] [main.go:220] ["1 insert success: 1500"]
[2024/05/05 13:03:18.529 +08:00] [INFO] [main.go:220] ["0 insert success: 1400"]
[2024/05/05 13:03:18.531 +08:00] [INFO] [main.go:234] ["0 delete success: 700"]
[2024/05/05 13:03:18.638 +08:00] [INFO] [main.go:220] ["1 insert success: 1600"]
[2024/05/05 13:03:18.640 +08:00] [INFO] [main.go:234] ["1 delete success: 800"]
[2024/05/05 13:03:18.700 +08:00] [INFO] [main.go:220] ["0 insert success: 1500"]
[2024/05/05 13:03:18.799 +08:00] [INFO] [main.go:220] ["1 insert success: 1700"]
[2024/05/05 13:03:18.873 +08:00] [INFO] [main.go:220] ["0 insert success: 1600"]
[2024/05/05 13:03:18.875 +08:00] [INFO] [main.go:234] ["0 delete success: 800"]
[2024/05/05 13:03:18.956 +08:00] [INFO] [main.go:220] ["1 insert success: 1800"]
[2024/05/05 13:03:18.958 +08:00] [INFO] [main.go:234] ["1 delete success: 900"]
[2024/05/05 13:03:19.052 +08:00] [INFO] [main.go:220] ["0 insert success: 1700"]
[2024/05/05 13:03:19.120 +08:00] [INFO] [main.go:220] ["1 insert success: 1900"]
[2024/05/05 13:03:19.223 +08:00] [INFO] [main.go:220] ["0 insert success: 1800"]
[2024/05/05 13:03:19.224 +08:00] [INFO] [main.go:234] ["0 delete success: 900"]
[2024/05/05 13:03:19.275 +08:00] [INFO] [main.go:220] ["1 insert success: 2000"]
[2024/05/05 13:03:19.276 +08:00] [INFO] [main.go:234] ["1 delete success: 1000"]
[2024/05/05 13:03:19.393 +08:00] [INFO] [main.go:220] ["0 insert success: 1900"]
[2024/05/05 13:03:19.431 +08:00] [INFO] [main.go:220] ["1 insert success: 2100"]
check diff failed 1-th time, retry later
[2024/05/05 13:03:19.570 +08:00] [INFO] [main.go:220] ["0 insert success: 2000"]
[2024/05/05 13:03:19.572 +08:00] [INFO] [main.go:234] ["0 delete success: 1000"]
[2024/05/05 13:03:19.594 +08:00] [INFO] [main.go:220] ["1 insert success: 2200"]
[2024/05/05 13:03:19.596 +08:00] [INFO] [main.go:234] ["1 delete success: 1100"]
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 13:03:19.745 +08:00] [INFO] [main.go:220] ["0 insert success: 2100"]
[2024/05/05 13:03:19.751 +08:00] [INFO] [main.go:220] ["1 insert success: 2300"]
[2024/05/05 13:03:19.912 +08:00] [INFO] [main.go:220] ["1 insert success: 2400"]
[2024/05/05 13:03:19.913 +08:00] [INFO] [main.go:234] ["1 delete success: 1200"]
[2024/05/05 13:03:19.917 +08:00] [INFO] [main.go:220] ["0 insert success: 2200"]
[2024/05/05 13:03:19.919 +08:00] [INFO] [main.go:234] ["0 delete success: 1100"]
[2024/05/05 13:03:20.074 +08:00] [INFO] [main.go:220] ["1 insert success: 2500"]
[2024/05/05 13:03:20.083 +08:00] [INFO] [main.go:220] ["0 insert success: 2300"]
[2024/05/05 13:03:20.235 +08:00] [INFO] [main.go:220] ["1 insert success: 2600"]
[2024/05/05 13:03:20.237 +08:00] [INFO] [main.go:234] ["1 delete success: 1300"]
[2024/05/05 13:03:20.250 +08:00] [INFO] [main.go:220] ["0 insert success: 2400"]
[2024/05/05 13:03:20.252 +08:00] [INFO] [main.go:234] ["0 delete success: 1200"]
[2024/05/05 13:03:20.399 +08:00] [INFO] [main.go:220] ["1 insert success: 2700"]
[2024/05/05 13:03:20.424 +08:00] [INFO] [main.go:220] ["0 insert success: 2500"]
[2024/05/05 13:03:20.565 +08:00] [INFO] [main.go:220] ["1 insert success: 2800"]
[2024/05/05 13:03:20.566 +08:00] [INFO] [main.go:234] ["1 delete success: 1400"]
[2024/05/05 13:03:20.601 +08:00] [INFO] [main.go:220] ["0 insert success: 2600"]
[2024/05/05 13:03:20.603 +08:00] [INFO] [main.go:234] ["0 delete success: 1300"]
[2024/05/05 13:03:20.725 +08:00] [INFO] [main.go:220] ["1 insert success: 2900"]
[2024/05/05 13:03:20.779 +08:00] [INFO] [main.go:220] ["0 insert success: 2700"]
[2024/05/05 13:03:20.897 +08:00] [INFO] [main.go:220] ["1 insert success: 3000"]
[2024/05/05 13:03:20.899 +08:00] [INFO] [main.go:234] ["1 delete success: 1500"]
check diff failed 2-th time, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8a5dcc0009	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:13655, start at 2024-05-05 13:03:20.443608894 +0800 CST m=+5.144140278	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:05:20.450 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:03:20.435 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:53:20.435 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8a5dcc0009	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:13655, start at 2024-05-05 13:03:20.443608894 +0800 CST m=+5.144140278	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:05:20.450 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:03:20.435 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:53:20.435 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8a5e6c0010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:13734, start at 2024-05-05 13:03:20.492139729 +0800 CST m=+5.147388408	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:05:20.497 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:03:20.475 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:53:20.475 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
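The two VARIABLE_NAME / VARIABLE_VALUE / COMMENT listings above (one per TiDB instance) are the bootstrap and GC-worker settings stored in the mysql.tidb system table; dumping them is how the script confirms each TiDB is bootstrapped and has elected a GC leader. A hedged example of reading the same rows by hand, assuming the upstream TiDB listens on 127.0.0.1:4000 with a passwordless root user (neither is shown in this section):

mysql -h 127.0.0.1 -P 4000 -u root -e \
  "SELECT VARIABLE_NAME, VARIABLE_VALUE, COMMENT FROM mysql.tidb;"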
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash/db/proxy"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash-proxy.toml"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/proxy.log"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
1:03PM INF > Run case=sql/debezium/connector_test_ro.sql
+ cd /tmp/tidb_cdc_test/synced_status_with_redo
++ run_cdc_cli_tso_query 127.0.0.1 2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.cli.15111.out cli tso query --pd=http://127.0.0.1:2379
check diff successfully
wait process cdc.test exit for 1-th time...
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/event_filter/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:03:25 CST 2024] <<<<<< run test case split_region success! >>>>>>
+ set +x
+ tso='449546919315767298
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546919315767298 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ start_ts=449546919315767298
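As the trace shows, the changefeed's start-ts comes from `cdc cli tso query --pd=...`, whose output mixes the TSO with the test binary's PASS/coverage lines, so only the first field is kept. A small sketch of that extraction (PD address copied from the trace; the coverage-profile path here is illustrative):

# query the current TSO from PD and strip the trailing PASS/coverage noise
raw=$(cdc.test -test.coverprofile=/tmp/cov.out cli tso query --pd=http://127.0.0.1:2379)
start_ts=$(echo $raw | awk -F ' ' '{print $1}')   # e.g. 449546919315767298
echo "using start_ts=$start_ts"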
+ run_cdc_server --workdir /tmp/tidb_cdc_test/synced_status_with_redo --binary cdc.test
[Sun May  5 13:03:25 CST 2024] <<<<<< START cdc server in synced_status_with_redo case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.1514515147.out server --log-file /tmp/tidb_cdc_test/synced_status_with_redo/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/synced_status_with_redo/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table test.binary_columns exists
check diff failed 1-th time, retry later
[2024/05/05 13:03:26.140 +08:00] [INFO] [main.go:86] ["running ddl test: 4 modifyColumnDDL"]
[2024/05/05 13:03:26.321 +08:00] [INFO] [main.go:220] ["1 insert success: 100"]
[2024/05/05 13:03:26.330 +08:00] [INFO] [main.go:220] ["0 insert success: 100"]
[2024/05/05 13:03:26.482 +08:00] [INFO] [main.go:220] ["1 insert success: 200"]
[2024/05/05 13:03:26.483 +08:00] [INFO] [main.go:234] ["1 delete success: 100"]
[2024/05/05 13:03:26.497 +08:00] [INFO] [main.go:234] ["0 delete success: 100"]
[2024/05/05 13:03:26.497 +08:00] [INFO] [main.go:220] ["0 insert success: 200"]
check diff failed 2-th time, retry later
[2024/05/05 13:03:26.636 +08:00] [INFO] [main.go:220] ["1 insert success: 300"]
[2024/05/05 13:03:26.662 +08:00] [INFO] [main.go:220] ["0 insert success: 300"]
start tidb cluster in /tmp/tidb_cdc_test/event_filter
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
[2024/05/05 13:03:26.796 +08:00] [INFO] [main.go:220] ["1 insert success: 400"]
[2024/05/05 13:03:26.798 +08:00] [INFO] [main.go:234] ["1 delete success: 200"]
[2024/05/05 13:03:26.954 +08:00] [INFO] [main.go:234] ["0 delete success: 200"]
[2024/05/05 13:03:26.955 +08:00] [INFO] [main.go:220] ["0 insert success: 400"]
[2024/05/05 13:03:27.084 +08:00] [INFO] [main.go:220] ["1 insert success: 500"]
[2024/05/05 13:03:27.137 +08:00] [INFO] [main.go:220] ["0 insert success: 500"]
[2024/05/05 13:03:27.239 +08:00] [INFO] [main.go:220] ["1 insert success: 600"]
[2024/05/05 13:03:27.241 +08:00] [INFO] [main.go:234] ["1 delete success: 300"]
[2024/05/05 13:03:27.301 +08:00] [INFO] [main.go:234] ["0 delete success: 300"]
[2024/05/05 13:03:27.301 +08:00] [INFO] [main.go:220] ["0 insert success: 600"]
[2024/05/05 13:03:27.394 +08:00] [INFO] [main.go:220] ["1 insert success: 700"]
[2024/05/05 13:03:27.473 +08:00] [INFO] [main.go:220] ["0 insert success: 700"]
[2024/05/05 13:03:27.547 +08:00] [INFO] [main.go:220] ["1 insert success: 800"]
[2024/05/05 13:03:27.549 +08:00] [INFO] [main.go:234] ["1 delete success: 400"]
[2024/05/05 13:03:27.635 +08:00] [INFO] [main.go:234] ["0 delete success: 400"]
[2024/05/05 13:03:27.636 +08:00] [INFO] [main.go:220] ["0 insert success: 800"]
[2024/05/05 13:03:27.700 +08:00] [INFO] [main.go:220] ["1 insert success: 900"]
[2024/05/05 13:03:27.804 +08:00] [INFO] [main.go:220] ["0 insert success: 900"]
[2024/05/05 13:03:27.865 +08:00] [INFO] [main.go:220] ["1 insert success: 1000"]
[2024/05/05 13:03:27.867 +08:00] [INFO] [main.go:234] ["1 delete success: 500"]
[2024/05/05 13:03:27.977 +08:00] [INFO] [main.go:234] ["0 delete success: 500"]
[2024/05/05 13:03:27.978 +08:00] [INFO] [main.go:220] ["0 insert success: 1000"]
[2024/05/05 13:03:28.036 +08:00] [INFO] [main.go:220] ["1 insert success: 1100"]
[2024/05/05 13:03:28.153 +08:00] [INFO] [main.go:220] ["0 insert success: 1100"]
[2024/05/05 13:03:28.195 +08:00] [INFO] [main.go:220] ["1 insert success: 1200"]
[2024/05/05 13:03:28.197 +08:00] [INFO] [main.go:234] ["1 delete success: 600"]
check diff failed 3-th time, retry later
[2024/05/05 13:03:28.322 +08:00] [INFO] [main.go:234] ["0 delete success: 600"]
[2024/05/05 13:03:28.323 +08:00] [INFO] [main.go:220] ["0 insert success: 1200"]
[2024/05/05 13:03:28.355 +08:00] [INFO] [main.go:220] ["1 insert success: 1300"]
[2024/05/05 13:03:28.489 +08:00] [INFO] [main.go:220] ["0 insert success: 1300"]
[2024/05/05 13:03:28.522 +08:00] [INFO] [main.go:220] ["1 insert success: 1400"]
[2024/05/05 13:03:28.523 +08:00] [INFO] [main.go:234] ["1 delete success: 700"]
Verifying downstream PD is started...
[2024/05/05 13:03:28.655 +08:00] [INFO] [main.go:234] ["0 delete success: 700"]
[2024/05/05 13:03:28.656 +08:00] [INFO] [main.go:220] ["0 insert success: 1400"]
[2024/05/05 13:03:28.689 +08:00] [INFO] [main.go:220] ["1 insert success: 1500"]
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:03:28 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/1ca18c25-b0fc-49bc-b07c-2e1e6134f015
	{"id":"1ca18c25-b0fc-49bc-b07c-2e1e6134f015","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885405}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47226d6fd3
	1ca18c25-b0fc-49bc-b07c-2e1e6134f015

/tidb/cdc/default/default/upstream/7365376670324031201
	{"id":7365376670324031201,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/1ca18c25-b0fc-49bc-b07c-2e1e6134f015
	{"id":"1ca18c25-b0fc-49bc-b07c-2e1e6134f015","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885405}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47226d6fd3
	1ca18c25-b0fc-49bc-b07c-2e1e6134f015

/tidb/cdc/default/default/upstream/7365376670324031201
	{"id":7365376670324031201,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/1ca18c25-b0fc-49bc-b07c-2e1e6134f015
	{"id":"1ca18c25-b0fc-49bc-b07c-2e1e6134f015","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885405}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47226d6fd3
	1ca18c25-b0fc-49bc-b07c-2e1e6134f015

/tidb/cdc/default/default/upstream/7365376670324031201
	{"id":7365376670324031201,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ config_path=conf/changefeed-redo.toml
+ SINK_URI='mysql://root@127.0.0.1:3306/?max-txn-row=1'
+ run_cdc_cli changefeed create --start-ts=449546919315767298 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/conf/changefeed-redo.toml
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.cli.15199.out cli changefeed create --start-ts=449546919315767298 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/conf/changefeed-redo.toml
[2024/05/05 13:03:28.830 +08:00] [INFO] [main.go:220] ["0 insert success: 1500"]
[2024/05/05 13:03:28.855 +08:00] [INFO] [main.go:220] ["1 insert success: 1600"]
[2024/05/05 13:03:28.857 +08:00] [INFO] [main.go:234] ["1 delete success: 800"]
[2024/05/05 13:03:28.994 +08:00] [INFO] [main.go:234] ["0 delete success: 800"]
[2024/05/05 13:03:28.995 +08:00] [INFO] [main.go:220] ["0 insert success: 1600"]
[2024/05/05 13:03:29.016 +08:00] [INFO] [main.go:220] ["1 insert success: 1700"]
Create changefeed successfully!
ID: test-1
Info: {"upstream_id":7365376670324031201,"namespace":"default","id":"test-1","sink_uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create_time":"2024-05-05T13:03:28.850315192+08:00","start_ts":449546919315767298,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"eventual","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"storage":"file:///tmp/tidb_cdc_test/synced_status/redo","use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":120,"checkpoint_interval":20}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546919315767298,"checkpoint_ts":449546919315767298,"checkpoint_time":"2024-05-05 13:03:23.884"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
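The changefeed is created from conf/changefeed-redo.toml; judging from the `consistent` and `synced_status` fields echoed back in the Info JSON above, the file is roughly equivalent to the sketch below (reconstructed from that output, not copied from the repository):

cat > changefeed-redo.example.toml <<'EOF'
# redo (consistent replication) settings as reflected in the created changefeed
[consistent]
level = "eventual"
storage = "file:///tmp/tidb_cdc_test/synced_status/redo"

# thresholds used by the /synced API checked later in this test
[synced-status]
synced-check-interval = 120
checkpoint-interval = 20
EOF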
[2024/05/05 13:03:29.166 +08:00] [INFO] [main.go:220] ["0 insert success: 1700"]
[2024/05/05 13:03:29.180 +08:00] [INFO] [main.go:220] ["1 insert success: 1800"]
[2024/05/05 13:03:29.182 +08:00] [INFO] [main.go:234] ["1 delete success: 900"]
[2024/05/05 13:03:29.327 +08:00] [INFO] [main.go:234] ["0 delete success: 900"]
[2024/05/05 13:03:29.328 +08:00] [INFO] [main.go:220] ["0 insert success: 1800"]
[2024/05/05 13:03:29.345 +08:00] [INFO] [main.go:220] ["1 insert success: 1900"]
[2024/05/05 13:03:29.492 +08:00] [INFO] [main.go:220] ["0 insert success: 1900"]
[2024/05/05 13:03:29.507 +08:00] [INFO] [main.go:220] ["1 insert success: 2000"]
[2024/05/05 13:03:29.509 +08:00] [INFO] [main.go:234] ["1 delete success: 1000"]
[2024/05/05 13:03:29.656 +08:00] [INFO] [main.go:234] ["0 delete success: 1000"]
[2024/05/05 13:03:29.657 +08:00] [INFO] [main.go:220] ["0 insert success: 2000"]
[2024/05/05 13:03:29.676 +08:00] [INFO] [main.go:220] ["1 insert success: 2100"]
[2024/05/05 13:03:29.820 +08:00] [INFO] [main.go:220] ["0 insert success: 2100"]
[2024/05/05 13:03:29.835 +08:00] [INFO] [main.go:220] ["1 insert success: 2200"]
[2024/05/05 13:03:29.837 +08:00] [INFO] [main.go:234] ["1 delete success: 1100"]
[2024/05/05 13:03:29.985 +08:00] [INFO] [main.go:234] ["0 delete success: 1100"]
[2024/05/05 13:03:29.986 +08:00] [INFO] [main.go:220] ["0 insert success: 2200"]
[2024/05/05 13:03:29.992 +08:00] [INFO] [main.go:220] ["1 insert success: 2300"]
check diff failed 4-th time, retry later
[2024/05/05 13:03:30.158 +08:00] [INFO] [main.go:220] ["0 insert success: 2300"]
[2024/05/05 13:03:30.158 +08:00] [INFO] [main.go:220] ["1 insert success: 2400"]
[2024/05/05 13:03:30.160 +08:00] [INFO] [main.go:234] ["1 delete success: 1200"]
[2024/05/05 13:03:30.319 +08:00] [INFO] [main.go:220] ["1 insert success: 2500"]
[2024/05/05 13:03:30.327 +08:00] [INFO] [main.go:234] ["0 delete success: 1200"]
[2024/05/05 13:03:30.328 +08:00] [INFO] [main.go:220] ["0 insert success: 2400"]
+ set +x
+ run_sql 'USE TEST;Create table t1(a int primary key, b int);insert into t1 values(1,2);insert into t1 values(2,3);'
[2024/05/05 13:03:30.483 +08:00] [INFO] [main.go:220] ["1 insert success: 2600"]
[2024/05/05 13:03:30.485 +08:00] [INFO] [main.go:234] ["1 delete success: 1300"]
[2024/05/05 13:03:30.506 +08:00] [INFO] [main.go:220] ["0 insert success: 2500"]
[2024/05/05 13:03:30.642 +08:00] [INFO] [main.go:220] ["1 insert success: 2700"]
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
[2024/05/05 13:03:30.674 +08:00] [INFO] [main.go:234] ["0 delete success: 1300"]
[2024/05/05 13:03:30.675 +08:00] [INFO] [main.go:220] ["0 insert success: 2600"]
[2024/05/05 13:03:30.807 +08:00] [INFO] [main.go:220] ["1 insert success: 2800"]
[2024/05/05 13:03:30.809 +08:00] [INFO] [main.go:234] ["1 delete success: 1400"]
[2024/05/05 13:03:30.846 +08:00] [INFO] [main.go:220] ["0 insert success: 2700"]
+ check_table_exists test.t1 127.0.0.1 3306
table test.t1 exists
+ sleep 5
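check_table_exists above simply keeps querying the downstream endpoint (127.0.0.1:3306 here) until the replicated table becomes visible. A minimal sketch of that kind of poll, assuming a passwordless root user on the downstream (the helper name and retry cadence are illustrative):

table_exists() {
    # $1 = schema, $2 = table, $3 = host, $4 = port
    mysql -h "$3" -P "$4" -u root -N -e \
      "SELECT COUNT(*) FROM information_schema.tables WHERE table_schema='$1' AND table_name='$2';" \
      | grep -q '^1$'
}
until table_exists test t1 127.0.0.1 3306; do sleep 1; done
echo "table test.t1 exists"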
[2024/05/05 13:03:30.963 +08:00] [INFO] [main.go:220] ["1 insert success: 2900"]
[2024/05/05 13:03:31.012 +08:00] [INFO] [main.go:234] ["0 delete success: 1400"]
[2024/05/05 13:03:31.013 +08:00] [INFO] [main.go:220] ["0 insert success: 2800"]
[2024/05/05 13:03:31.134 +08:00] [INFO] [main.go:220] ["1 insert success: 3000"]
[2024/05/05 13:03:31.135 +08:00] [INFO] [main.go:234] ["1 delete success: 1500"]
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/changefeed_auto_stop/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
check diff failed 5-th time, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
check diff successfully
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
start tidb cluster in /tmp/tidb_cdc_test/changefeed_auto_stop
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:03:36 CST 2024] <<<<<< run test case canal_json_adapter_compatibility success! >>>>>>
+ kill_tikv
++ ps aux
++ grep tikv-server
++ grep /tmp/tidb_cdc_test/synced_status_with_redo
+ info='jenkins    12952 25.1  0.5 4695592 2221348 ?     Sl   13:03   0:05 tikv-server --pd 127.0.0.1:2379 -A 127.0.0.1:20160 --status-addr 127.0.0.1:20181 --log-file /tmp/tidb_cdc_test/synced_status_with_redo/tikv1.log --log-level debug -C /tmp/tidb_cdc_test/synced_status_with_redo/tikv-config.toml -s /tmp/tidb_cdc_test/synced_status_with_redo/tikv1
jenkins    12953 33.0  0.5 4723752 2276824 ?     Sl   13:03   0:07 tikv-server --pd 127.0.0.1:2379 -A 127.0.0.1:20161 --status-addr 127.0.0.1:20182 --log-file /tmp/tidb_cdc_test/synced_status_with_redo/tikv2.log --log-level debug -C /tmp/tidb_cdc_test/synced_status_with_redo/tikv-config.toml -s /tmp/tidb_cdc_test/synced_status_with_redo/tikv2
jenkins    12954 25.0  0.5 4694056 2199824 ?     Sl   13:03   0:05 tikv-server --pd 127.0.0.1:2379 -A 127.0.0.1:20162 --status-addr 127.0.0.1:20183 --log-file /tmp/tidb_cdc_test/synced_status_with_redo/tikv3.log --log-level debug -C /tmp/tidb_cdc_test/synced_status_with_redo/tikv-config.toml -s /tmp/tidb_cdc_test/synced_status_with_redo/tikv3
jenkins    12956 32.1  0.5 4719144 2263684 ?     Sl   13:03   0:07 tikv-server --pd 127.0.0.1:2479 -A 127.0.0.1:21160 --status-addr 127.0.0.1:21180 --log-file /tmp/tidb_cdc_test/synced_status_with_redo/tikv_down.log --log-level debug -C /tmp/tidb_cdc_test/synced_status_with_redo/tikv-config.toml -s /tmp/tidb_cdc_test/synced_status_with_redo/tikv_down'
++ ps aux
++ grep tikv-server
++ grep /tmp/tidb_cdc_test/synced_status_with_redo
++ awk '{print $2}'
++ xargs kill -9
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   243  100   243    0     0   3558      0 --:--:-- --:--:-- --:--:--  3573
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 13:03:34.335","puller_resolved_ts":"2024-05-05 13:03:30.284","last_synced_ts":"2024-05-05 13:03:30.784","now_ts":"2024-05-05 13:03:35.000","info":"The data syncing is not finished, please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:03:34.335","puller_resolved_ts":"2024-05-05' '13:03:30.284","last_synced_ts":"2024-05-05' '13:03:30.784","now_ts":"2024-05-05' '13:03:35.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:03:34.335","puller_resolved_ts":"2024-05-05' '13:03:30.284","last_synced_ts":"2024-05-05' '13:03:30.784","now_ts":"2024-05-05' '13:03:35.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq -r .info
+ info='The data syncing is not finished, please wait'
+ target_message='The data syncing is not finished, please wait'
+ '[' 'The data syncing is not finished, please wait' '!=' 'The data syncing is not finished, please wait' ']'
+ sleep 130
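For reference, the synced_status check traced above reduces to the following shell pattern (a condensed sketch of what the xtrace shows, not additional harness output; the workdir, the changefeed id test-1 and the port 8300 are specific to this run):

    # stop every tikv-server that belongs to this test's workdir
    ps aux | grep tikv-server | grep /tmp/tidb_cdc_test/synced_status_with_redo | awk '{print $2}' | xargs kill -9
    # query the synced-status API and assert on its fields
    synced_status=$(curl -s -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced)
    status=$(echo "$synced_status" | jq .synced)
    info=$(echo "$synced_status" | jq -r .info)
    [ "$status" = "false" ] || exit 1
    [ "$info" = "The data syncing is not finished, please wait" ] || exit 1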
[2024/05/05 13:03:36.269 +08:00] [INFO] [main.go:86] ["running ddl test: 5 addDropIndexDDL"]
[2024/05/05 13:03:36.442 +08:00] [INFO] [main.go:220] ["1 insert success: 100"]
[2024/05/05 13:03:36.450 +08:00] [INFO] [main.go:220] ["0 insert success: 100"]
[2024/05/05 13:03:36.584 +08:00] [INFO] [main.go:220] ["1 insert success: 200"]
[2024/05/05 13:03:36.586 +08:00] [INFO] [main.go:234] ["1 delete success: 100"]
[2024/05/05 13:03:36.597 +08:00] [INFO] [main.go:234] ["0 delete success: 100"]
[2024/05/05 13:03:36.598 +08:00] [INFO] [main.go:220] ["0 insert success: 200"]
[2024/05/05 13:03:36.733 +08:00] [INFO] [main.go:220] ["1 insert success: 300"]
[2024/05/05 13:03:36.753 +08:00] [INFO] [main.go:220] ["0 insert success: 300"]
[2024/05/05 13:03:36.927 +08:00] [INFO] [main.go:220] ["1 insert success: 400"]
[2024/05/05 13:03:36.929 +08:00] [INFO] [main.go:234] ["1 delete success: 200"]
[2024/05/05 13:03:36.958 +08:00] [INFO] [main.go:234] ["0 delete success: 200"]
[2024/05/05 13:03:36.960 +08:00] [INFO] [main.go:220] ["0 insert success: 400"]
[2024/05/05 13:03:37.138 +08:00] [INFO] [main.go:220] ["1 insert success: 500"]
[2024/05/05 13:03:37.183 +08:00] [INFO] [main.go:220] ["0 insert success: 500"]
[2024/05/05 13:03:37.343 +08:00] [INFO] [main.go:220] ["1 insert success: 600"]
[2024/05/05 13:03:37.344 +08:00] [INFO] [main.go:234] ["1 delete success: 300"]
[2024/05/05 13:03:37.399 +08:00] [INFO] [main.go:234] ["0 delete success: 300"]
[2024/05/05 13:03:37.400 +08:00] [INFO] [main.go:220] ["0 insert success: 600"]
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[2024/05/05 13:03:37.558 +08:00] [INFO] [main.go:220] ["1 insert success: 700"]
[2024/05/05 13:03:37.636 +08:00] [INFO] [main.go:220] ["0 insert success: 700"]
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[2024/05/05 13:03:37.776 +08:00] [INFO] [main.go:220] ["1 insert success: 800"]
[2024/05/05 13:03:37.778 +08:00] [INFO] [main.go:234] ["1 delete success: 400"]
[2024/05/05 13:03:37.871 +08:00] [INFO] [main.go:234] ["0 delete success: 400"]
[2024/05/05 13:03:37.872 +08:00] [INFO] [main.go:220] ["0 insert success: 800"]
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[2024/05/05 13:03:38.015 +08:00] [INFO] [main.go:220] ["1 insert success: 900"]
[2024/05/05 13:03:38.207 +08:00] [INFO] [main.go:220] ["0 insert success: 900"]
[Pipeline] // node
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
Verifying downstream PD is started...
[2024/05/05 13:03:38.288 +08:00] [INFO] [main.go:220] ["1 insert success: 1000"]
[2024/05/05 13:03:38.292 +08:00] [INFO] [main.go:234] ["1 delete success: 500"]
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
[2024/05/05 13:03:38.534 +08:00] [INFO] [main.go:234] ["0 delete success: 500"]
[2024/05/05 13:03:38.535 +08:00] [INFO] [main.go:220] ["0 insert success: 1000"]
[2024/05/05 13:03:38.571 +08:00] [INFO] [main.go:220] ["1 insert success: 1100"]
[2024/05/05 13:03:38.870 +08:00] [INFO] [main.go:220] ["1 insert success: 1200"]
[2024/05/05 13:03:38.873 +08:00] [INFO] [main.go:234] ["1 delete success: 600"]
[2024/05/05 13:03:38.876 +08:00] [INFO] [main.go:220] ["0 insert success: 1100"]
[2024/05/05 13:03:39.086 +08:00] [INFO] [main.go:220] ["1 insert success: 1300"]
[2024/05/05 13:03:39.153 +08:00] [INFO] [main.go:234] ["0 delete success: 600"]
[2024/05/05 13:03:39.154 +08:00] [INFO] [main.go:220] ["0 insert success: 1200"]
[2024/05/05 13:03:39.352 +08:00] [INFO] [main.go:220] ["1 insert success: 1400"]
[2024/05/05 13:03:39.355 +08:00] [INFO] [main.go:234] ["1 delete success: 700"]
[2024/05/05 13:03:39.471 +08:00] [INFO] [main.go:220] ["0 insert success: 1300"]
[2024/05/05 13:03:39.677 +08:00] [INFO] [main.go:220] ["1 insert success: 1500"]
[2024/05/05 13:03:39.823 +08:00] [INFO] [main.go:234] ["0 delete success: 700"]
[2024/05/05 13:03:39.825 +08:00] [INFO] [main.go:220] ["0 insert success: 1400"]
[2024/05/05 13:03:39.989 +08:00] [INFO] [main.go:220] ["1 insert success: 1600"]
[2024/05/05 13:03:39.995 +08:00] [INFO] [main.go:234] ["1 delete success: 800"]
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8b6e740013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:27346, start at 2024-05-05 13:03:37.909599454 +0800 CST m=+5.194129437	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:05:37.916 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:03:37.885 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:53:37.885 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8b6e740013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:27346, start at 2024-05-05 13:03:37.909599454 +0800 CST m=+5.194129437	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:05:37.916 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:03:37.885 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:53:37.885 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8b6e6c000f	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:27423, start at 2024-05-05 13:03:37.89758497 +0800 CST m=+5.133646297	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:05:37.903 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:03:37.883 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:53:37.883 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
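The VARIABLE_NAME dumps above come from the harness polling each TiDB server until it answers; the underlying check is presumably something like the following, using the 127.0.0.1:4000 root connection this run's workloads also use:

    mysql -h 127.0.0.1 -P 4000 -u root -e 'SELECT VARIABLE_NAME, VARIABLE_VALUE, COMMENT FROM mysql.tidb'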
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/event_filter/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/event_filter/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/event_filter/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/event_filter/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/event_filter/tiflash/log/proxy.log"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[2024/05/05 13:03:40.203 +08:00] [INFO] [main.go:220] ["0 insert success: 1500"]
[2024/05/05 13:03:40.327 +08:00] [INFO] [main.go:220] ["1 insert success: 1700"]
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
[2024/05/05 13:03:40.616 +08:00] [INFO] [main.go:234] ["0 delete success: 800"]
[2024/05/05 13:03:40.617 +08:00] [INFO] [main.go:220] ["0 insert success: 1600"]
[2024/05/05 13:03:40.707 +08:00] [INFO] [main.go:220] ["1 insert success: 1800"]
[2024/05/05 13:03:40.711 +08:00] [INFO] [main.go:234] ["1 delete success: 900"]
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[2024/05/05 13:03:41.060 +08:00] [INFO] [main.go:220] ["0 insert success: 1700"]
[2024/05/05 13:03:41.090 +08:00] [INFO] [main.go:220] ["1 insert success: 1900"]
[Sun May  5 13:03:41 CST 2024] <<<<<< START cdc server in event_filter case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.event_filter.2887628878.out server --log-file /tmp/tidb_cdc_test/event_filter/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/event_filter/cdc_data --cluster-id default --addr 127.0.0.1:8300 --pd http://127.0.0.1:2379
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:03:44 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4ed4c57a-bdfb-4404-91ee-a1608eb925cf
	{"id":"4ed4c57a-bdfb-4404-91ee-a1608eb925cf","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885422}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4722af76c9
	4ed4c57a-bdfb-4404-91ee-a1608eb925cf

/tidb/cdc/default/default/upstream/7365376741686448409
	{"id":7365376741686448409,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4ed4c57a-bdfb-4404-91ee-a1608eb925cf
	{"id":"4ed4c57a-bdfb-4404-91ee-a1608eb925cf","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885422}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4722af76c9
	4ed4c57a-bdfb-4404-91ee-a1608eb925cf

/tidb/cdc/default/default/upstream/7365376741686448409
	{"id":7365376741686448409,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4ed4c57a-bdfb-4404-91ee-a1608eb925cf
	{"id":"4ed4c57a-bdfb-4404-91ee-a1608eb925cf","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885422}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4722af76c9
	4ed4c57a-bdfb-4404-91ee-a1608eb925cf

/tidb/cdc/default/default/upstream/7365376741686448409
	{"id":7365376741686448409,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
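The readiness probe that the xtrace above loops through can be written out as a compact sketch (assuming the same 127.0.0.1:8300 address and ticdc:ticdc_secret debug credentials used in this run):

    i=0
    while [ "$i" -le 50 ]; do
      res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
      # the server is considered up once it serves its owner/etcd metadata
      echo "$res" | grep -q 'etcd info' && break
      [ "$i" -eq 50 ] && exit 1
      sleep 3
      i=$((i + 1))
    done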
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.event_filter.cli.28933.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-event-filter-12661?protocol=open-protocol&partition-num=4&kafka-version=2.4.1&max-message-bytes=10485760' --server=127.0.0.1:8300 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/event_filter/conf/cf.toml
Create changefeed successfully!
ID: 99a0e7ca-e987-43f4-8de8-9f11ecb6d243
Info: {"upstream_id":7365376741686448409,"namespace":"default","id":"99a0e7ca-e987-43f4-8de8-9f11ecb6d243","sink_uri":"kafka://127.0.0.1:9092/ticdc-event-filter-12661?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:03:45.451820691+08:00","start_ts":449546924926173189,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["event_filter.*"],"event_filters":[{"matcher":["event_filter.t1"],"ignore_event":["drop table","delete"],"ignore_sql":null,"ignore_insert_value_expr":"id = 2 or city = 'tokyo'","ignore_update_new_value_expr":"","ignore_update_old_value_expr":"","ignore_delete_value_expr":""},{"matcher":["event_filter.t_truncate"],"ignore_event":["truncate table"],"ignore_sql":null,"ignore_insert_value_expr":"","ignore_update_new_value_expr":"","ignore_update_old_value_expr":"","ignore_delete_value_expr":""},{"matcher":["event_filter.t_alter"],"ignore_event":["alter table"],"ignore_sql":null,"ignore_insert_value_expr":"","ignore_update_new_value_expr":"","ignore_update_old_value_expr":"","ignore_delete_value_expr":""}]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546924926173189,"checkpoint_ts":449546924926173189,"checkpoint_time":"2024-05-05 13:03:45.286"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
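The --config file passed above is not printed in the log, but the filter section of the changefeed Info JSON implies a cf.toml roughly of this shape (a reconstruction for readability, not the verbatim file):

    # reconstructed from the changefeed Info JSON above; hypothetical, not the actual conf/cf.toml
    [filter]
    rules = ["event_filter.*"]

    [[filter.event-filters]]
    matcher = ["event_filter.t1"]
    ignore-event = ["drop table", "delete"]
    ignore-insert-value-expr = "id = 2 or city = 'tokyo'"

    [[filter.event-filters]]
    matcher = ["event_filter.t_truncate"]
    ignore-event = ["truncate table"]

    [[filter.event-filters]]
    matcher = ["event_filter.t_alter"]
    ignore-event = ["alter table"]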
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/canal_json_basic/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
[2024/05/05 13:03:46.356 +08:00] [INFO] [main.go:78] ["runDDLTest take 1m0.822943532s"]
table mark.finish_mark_0 exists
table mark.finish_mark_1 not exists for 1-th check, retry later
+ set +x
[Sun May  5 13:03:46 CST 2024] <<<<<< START kafka consumer in event_filter case >>>>>>
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8be544000c	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km, pid:10812, start at 2024-05-05 13:03:45.499872873 +0800 CST m=+4.972313374	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:05:45.505 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:03:45.489 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:53:45.489 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8be544000c	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km, pid:10812, start at 2024-05-05 13:03:45.499872873 +0800 CST m=+4.972313374	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:05:45.505 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:03:45.489 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:53:45.489 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8be8fc0013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-n0psn-7r3km, pid:10901, start at 2024-05-05 13:03:45.749264898 +0800 CST m=+5.158228036	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:05:45.755 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:03:45.727 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:53:45.727 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/changefeed_auto_stop/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/changefeed_auto_stop/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/changefeed_auto_stop/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/changefeed_auto_stop/tiflash/db/proxy"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/changefeed_auto_stop/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table mark.finish_mark_1 not exists for 2-th check, retry later
start tidb cluster in /tmp/tidb_cdc_test/canal_json_basic
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
***************** properties *****************
"readproportion"="0"
"mysql.port"="4000"
"threadcount"="4"
"scanproportion"="0"
"recordcount"="20"
"requestdistribution"="uniform"
"mysql.user"="root"
"dotransactions"="false"
"workload"="core"
"insertproportion"="0"
"mysql.host"="127.0.0.1"
"operationcount"="0"
"readallfields"="true"
"mysql.db"="changefeed_auto_stop_1"
"updateproportion"="0"
**********************************************
Run finished, takes 8.452811ms
INSERT - Takes(s): 0.0, Count: 20, OPS: 3928.4, Avg(us): 1599, Min(us): 909, Max(us): 3399, 95th(us): 4000, 99th(us): 4000
***************** properties *****************
"mysql.user"="root"
"mysql.db"="changefeed_auto_stop_2"
"operationcount"="0"
"mysql.host"="127.0.0.1"
"requestdistribution"="uniform"
"recordcount"="20"
"workload"="core"
"readproportion"="0"
"scanproportion"="0"
"insertproportion"="0"
"dotransactions"="false"
"mysql.port"="4000"
"threadcount"="4"
"readallfields"="true"
"updateproportion"="0"
**********************************************
Run finished, takes 7.56868ms
INSERT - Takes(s): 0.0, Count: 20, OPS: 4548.4, Avg(us): 1425, Min(us): 856, Max(us): 3277, 95th(us): 4000, 99th(us): 4000
table event_filter.t1 does not exists
table event_filter.t1 exists
table event_filter.t_normal not exists for 1-th check, retry later
***************** properties *****************
"requestdistribution"="uniform"
"dotransactions"="false"
"readproportion"="0"
"insertproportion"="0"
"mysql.port"="4000"
"mysql.host"="127.0.0.1"
"mysql.db"="changefeed_auto_stop_3"
"scanproportion"="0"
"operationcount"="0"
"recordcount"="20"
"mysql.user"="root"
"readallfields"="true"
"workload"="core"
"updateproportion"="0"
"threadcount"="4"
**********************************************
Run finished, takes 8.125304ms
INSERT - Takes(s): 0.0, Count: 20, OPS: 4102.8, Avg(us): 1524, Min(us): 898, Max(us): 3244, 95th(us): 4000, 99th(us): 4000
***************** properties *****************
"insertproportion"="0"
"requestdistribution"="uniform"
"mysql.port"="4000"
"readproportion"="0"
"updateproportion"="0"
"mysql.user"="root"
"readallfields"="true"
"dotransactions"="false"
"workload"="core"
"mysql.host"="127.0.0.1"
"mysql.db"="changefeed_auto_stop_4"
"threadcount"="4"
"recordcount"="20"
"operationcount"="0"
"scanproportion"="0"
**********************************************
Run finished, takes 7.673752ms
INSERT - Takes(s): 0.0, Count: 20, OPS: 4725.5, Avg(us): 1448, Min(us): 861, Max(us): 3417, 95th(us): 4000, 99th(us): 4000
[Sun May  5 13:03:50 CST 2024] <<<<<< START cdc server in changefeed_auto_stop case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8301/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_auto_stop.1244412446.out server --log-file /tmp/tidb_cdc_test/changefeed_auto_stop/cdc1.log --log-level debug --data-dir /tmp/tidb_cdc_test/changefeed_auto_stop/cdc_data1 --cluster-id default --addr 127.0.0.1:8301 --pd http://127.0.0.1:2379
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8301/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8301 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8301; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table mark.finish_mark_1 not exists for 3-th check, retry later
table event_filter.t_normal exists
table event_filter.t_truncate not exists for 1-th check, retry later
table mark.finish_mark_1 not exists for 4-th check, retry later
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8301/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8301 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8301 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8301
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:03:53 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/29dd714f-3865-476a-af66-f4fa31318849
	{"id":"29dd714f-3865-476a-af66-f4fa31318849","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885430}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4722d20a31
	29dd714f-3865-476a-af66-f4fa31318849

/tidb/cdc/default/default/upstream/7365376783338559389
	{"id":7365376783338559389,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/29dd714f-3865-476a-af66-f4fa31318849
	{"id":"29dd714f-3865-476a-af66-f4fa31318849","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885430}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4722d20a31
	29dd714f-3865-476a-af66-f4fa31318849

/tidb/cdc/default/default/upstream/7365376783338559389
	{"id":7365376783338559389,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/29dd714f-3865-476a-af66-f4fa31318849
	{"id":"29dd714f-3865-476a-af66-f4fa31318849","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885430}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4722d20a31
	29dd714f-3865-476a-af66-f4fa31318849

/tidb/cdc/default/default/upstream/7365376783338559389
	{"id":7365376783338559389,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
[Sun May  5 13:03:53 CST 2024] <<<<<< START cdc server in changefeed_auto_stop case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8302/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/processor/pipeline/ProcessorSyncResolvedError=1*return(true);github.com/pingcap/tiflow/cdc/processor/ProcessorUpdatePositionDelaying=sleep(1000)'
+ (( i = 0 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.changefeed_auto_stop.1250712509.out server --log-file /tmp/tidb_cdc_test/changefeed_auto_stop/cdc2.log --log-level debug --data-dir /tmp/tidb_cdc_test/changefeed_auto_stop/cdc_data2 --cluster-id default --addr 127.0.0.1:8302 --pd http://127.0.0.1:2379
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8302/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8302 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8302; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table event_filter.t_truncate exists
table event_filter.t_alter not exists for 1-th check, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table mark.finish_mark_1 not exists for 5-th check, retry later
table event_filter.t_alter exists
table mark.finish_mark_1 not exists for 6-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8302/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8302 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8302 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8302
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:03:56 GMT
< Content-Length: 1271
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/29dd714f-3865-476a-af66-f4fa31318849
	{"id":"29dd714f-3865-476a-af66-f4fa31318849","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885430}

/tidb/cdc/default/__cdc_meta__/capture/96ebd405-dbbc-4934-8b84-30b5f0135cbd
	{"id":"96ebd405-dbbc-4934-8b84-30b5f0135cbd","address":"127.0.0.1:8302","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885433}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4722d20a31
	29dd714f-3865-476a-af66-f4fa31318849

/tidb/cdc/default/__cdc_meta__/owner/22318f4722d20a65
	96ebd405-dbbc-4934-8b84-30b5f0135cbd

/tidb/cdc/default/default/upstream/7365376783338559389
	{"id":7365376783338559389,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/29dd714f-3865-476a-af66-f4fa31318849
	{"id":"29dd714f-3865-476a-af66-f4fa31318849","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885430}

/tidb/cdc/default/__cdc_meta__/capture/96ebd405-dbbc-4934-8b84-30b5f0135cbd
	{"id":"96ebd405-dbbc-4934-8b84-30b5f0135cbd","address":"127.0.0.1:8302","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885433}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4722d20a31
	29dd714f-3865-476a-af66-f4fa31318849

/tidb/cdc/default/__cdc_meta__/owner/22318f4722d20a65
	96ebd405-dbbc-4934-8b84-30b5f0135cbd

/tidb/cdc/default/default/upstream/7365376783338559389
	{"id":7365376783338559389,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/29dd714f-3865-476a-af66-f4fa31318849
	{"id":"29dd714f-3865-476a-af66-f4fa31318849","address":"127.0.0.1:8301","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885430}

/tidb/cdc/default/__cdc_meta__/capture/96ebd405-dbbc-4934-8b84-30b5f0135cbd
	{"id":"96ebd405-dbbc-4934-8b84-30b5f0135cbd","address":"127.0.0.1:8302","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885433}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4722d20a31
	29dd714f-3865-476a-af66-f4fa31318849

/tidb/cdc/default/__cdc_meta__/owner/22318f4722d20a65
	96ebd405-dbbc-4934-8b84-30b5f0135cbd

/tidb/cdc/default/default/upstream/7365376783338559389
	{"id":7365376783338559389,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
[Sun May  5 13:03:56 CST 2024] <<<<<< START kafka consumer in changefeed_auto_stop case >>>>>>
check_changefeed_state http://127.0.0.1:2379 c8ea02dd-3cf1-4148-919d-3d03c58f11c2 normal null
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=c8ea02dd-3cf1-4148-919d-3d03c58f11c2
+ expected_state=normal
+ error_msg=null
+ tls_dir=null
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c c8ea02dd-3cf1-4148-919d-3d03c58f11c2 -s
+ info='{
  "upstream_id": 7365376783338559389,
  "namespace": "default",
  "id": "c8ea02dd-3cf1-4148-919d-3d03c58f11c2",
  "state": "normal",
  "checkpoint_tso": 449546926041071617,
  "checkpoint_time": "2024-05-05 13:03:49.539",
  "error": null
}'
+ echo '{
  "upstream_id": 7365376783338559389,
  "namespace": "default",
  "id": "c8ea02dd-3cf1-4148-919d-3d03c58f11c2",
  "state": "normal",
  "checkpoint_tso": 449546926041071617,
  "checkpoint_time": "2024-05-05 13:03:49.539",
  "error": null
}'
{
  "upstream_id": 7365376783338559389,
  "namespace": "default",
  "id": "c8ea02dd-3cf1-4148-919d-3d03c58f11c2",
  "state": "normal",
  "checkpoint_tso": 449546926041071617,
  "checkpoint_time": "2024-05-05 13:03:49.539",
  "error": null
}
++ echo '{' '"upstream_id":' 7365376783338559389, '"namespace":' '"default",' '"id":' '"c8ea02dd-3cf1-4148-919d-3d03c58f11c2",' '"state":' '"normal",' '"checkpoint_tso":' 449546926041071617, '"checkpoint_time":' '"2024-05-05' '13:03:49.539",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365376783338559389, '"namespace":' '"default",' '"id":' '"c8ea02dd-3cf1-4148-919d-3d03c58f11c2",' '"state":' '"normal",' '"checkpoint_tso":' 449546926041071617, '"checkpoint_time":' '"2024-05-05' '13:03:49.539",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
run task successfully
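Stripped of the xtrace noise, the check_changefeed_state step above amounts to this pattern (a sketch using the PD endpoint and changefeed id of this run):

    info=$(cdc cli changefeed query --pd=http://127.0.0.1:2379 -c c8ea02dd-3cf1-4148-919d-3d03c58f11c2 -s)
    state=$(echo "$info" | jq -r .state)
    message=$(echo "$info" | jq -r .error.message)
    [ "$state" = "normal" ] || exit 1
    # a null error is what a healthy changefeed reports
    [ "$message" = "null" ] || exit 1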
table changefeed_auto_stop_1.usertable not exists for 1-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table event_filter.finish_mark exists
check diff failed 1-th time, retry later
table mark.finish_mark_1 not exists for 7-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table changefeed_auto_stop_1.usertable not exists for 2-th check, retry later
check diff failed 2-th time, retry later
1:03PM INF > Run case=sql/debezium/datetime_key_test.sql
1:04PM INF > Run case=sql/debezium/db_default_charset.sql
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8cc6d00013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:26266, start at 2024-05-05 13:03:59.946285635 +0800 CST m=+5.086759681	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:05:59.954 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:03:59.924 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:53:59.924 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8cc6d00013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:26266, start at 2024-05-05 13:03:59.946285635 +0800 CST m=+5.086759681	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:05:59.954 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:03:59.924 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:53:59.924 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8cc76c0015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:26342, start at 2024-05-05 13:03:59.986795982 +0800 CST m=+5.082286798	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:05:59.993 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:03:59.963 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:53:59.963 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/canal_json_basic/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/canal_json_basic/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/canal_json_basic/tiflash-proxy.toml"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/canal_json_basic/tiflash/db/proxy"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/canal_json_basic/tiflash/log/proxy.log"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table mark.finish_mark_1 not exists for 8-th check, retry later
table changefeed_auto_stop_1.usertable exists
table changefeed_auto_stop_2.usertable not exists for 1-th check, retry later
check diff successfully
wait process cdc.test exit for 1-th time...
table mark.finish_mark_1 not exists for 9-th check, retry later
wait process cdc.test exit for 2-th time...
table changefeed_auto_stop_2.usertable exists
table changefeed_auto_stop_3.usertable not exists for 1-th check, retry later
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:04:03 CST 2024] <<<<<< run test case event_filter success! >>>>>>
[Sun May  5 13:04:03 CST 2024] <<<<<< START cdc server in canal_json_basic case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_basic.2772127723.out server --log-file /tmp/tidb_cdc_test/canal_json_basic/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/canal_json_basic/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table mark.finish_mark_1 not exists for 10-th check, retry later
table changefeed_auto_stop_3.usertable exists
table changefeed_auto_stop_4.usertable not exists for 1-th check, retry later
1:04PM INF > Run case=sql/debezium/db_default_charset_noutf.sql
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:04:06 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b0a22fde-2967-4cb4-88dd-85343f887c53
	{"id":"b0a22fde-2967-4cb4-88dd-85343f887c53","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885443}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47230622c3
	b0a22fde-2967-4cb4-88dd-85343f887c53

/tidb/cdc/default/default/upstream/7365376844063090003
	{"id":7365376844063090003,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b0a22fde-2967-4cb4-88dd-85343f887c53
	{"id":"b0a22fde-2967-4cb4-88dd-85343f887c53","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885443}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47230622c3
	b0a22fde-2967-4cb4-88dd-85343f887c53

/tidb/cdc/default/default/upstream/7365376844063090003
	{"id":7365376844063090003,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b0a22fde-2967-4cb4-88dd-85343f887c53
	{"id":"b0a22fde-2967-4cb4-88dd-85343f887c53","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885443}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47230622c3
	b0a22fde-2967-4cb4-88dd-85343f887c53

/tidb/cdc/default/default/upstream/7365376844063090003
	{"id":7365376844063090003,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
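The readiness check traced above boils down to the sketch below: poll the CDC server's /debug/info endpoint until its dump contains the 'etcd info' section, sleeping 3s between attempts and giving up after 50 tries (endpoint, credentials and retry budget as recorded in the trace; the real _utils helper also greps for the 'failed to get info:' marker, which this condensed version only notes in a comment).

ensure_cdc_ready() {
    local url=http://127.0.0.1:8300/debug/info
    local i res
    for ((i = 0; i <= 50; i++)); do
        # -vsL matches the trace; stderr is captured too, so the verbose curl
        # output lands in $res alongside the response body.
        res=$(curl -vsL --max-time 20 "$url" --user ticdc:ticdc_secret 2>&1)
        # The helper also checks $res for 'failed to get info:' before this point.
        if echo "$res" | grep -q 'etcd info'; then
            return 0    # capture registered itself in etcd, server is ready
        fi
        [ "$i" -eq 50 ] && return 1
        sleep 3
    done
}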
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_basic.cli.27781.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-canal-json-basic?protocol=canal-json&enable-tidb-extension=true'
Create changefeed successfully!
ID: d71bec41-3c44-4ec4-b20a-6ee517005be6
Info: {"upstream_id":7365376844063090003,"namespace":"default","id":"d71bec41-3c44-4ec4-b20a-6ee517005be6","sink_uri":"kafka://127.0.0.1:9092/ticdc-canal-json-basic?protocol=canal-json\u0026enable-tidb-extension=true","create_time":"2024-05-05T13:04:06.673156499+08:00","start_ts":449546930493587461,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546930493587461,"checkpoint_ts":449546930493587461,"checkpoint_time":"2024-05-05 13:04:06.524"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
table mark.finish_mark_1 not exists for 11-th check, retry later
table changefeed_auto_stop_4.usertable exists
check diff failed 1-th time, retry later
+ set +x
table mark.finish_mark_1 exists
table mark.finish_mark_2 not exists for 1-th check, retry later
check diff failed 2-th time, retry later
1:04PM INF > Run case=sql/debezium/decimal_column_test.sql
table mark.finish_mark_2 not exists for 2-th check, retry later
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:04:12 CST 2024] <<<<<< run test case changefeed_auto_stop success! >>>>>>
table mark.finish_mark_2 not exists for 3-th check, retry later
[Sun May  5 13:04:13 CST 2024] <<<<<< START kafka consumer in canal_json_basic case >>>>>>
table mark.finish_mark_2 not exists for 4-th check, retry later
1:04PM INF > Run case=sql/debezium/enum_column_test.sql
table mark.finish_mark_2 not exists for 5-th check, retry later
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/sql_mode/run.sh using Sink-Type: kafka... <<=================
+++ dirname /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/sql_mode/run.sh
++ cd /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/sql_mode
++ pwd
+ CUR=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/sql_mode
+ source /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/sql_mode/../_utils/test_prepare
++ UP_TIDB_HOST=127.0.0.1
++ UP_TIDB_PORT=4000
++ UP_TIDB_OTHER_PORT=4001
++ UP_TIDB_STATUS=10080
++ UP_TIDB_OTHER_STATUS=10081
++ DOWN_TIDB_HOST=127.0.0.1
++ DOWN_TIDB_PORT=3306
++ DOWN_TIDB_STATUS=20080
++ TLS_TIDB_HOST=127.0.0.1
++ TLS_TIDB_PORT=3307
++ TLS_TIDB_STATUS=30080
++ UP_PD_HOST_1=127.0.0.1
++ UP_PD_PORT_1=2379
++ UP_PD_PEER_PORT_1=2380
++ UP_PD_HOST_2=127.0.0.1
++ UP_PD_PORT_2=2679
++ UP_PD_PEER_PORT_2=2680
++ UP_PD_HOST_3=127.0.0.1
++ UP_PD_PORT_3=2779
++ UP_PD_PEER_PORT_3=2780
++ DOWN_PD_HOST=127.0.0.1
++ DOWN_PD_PORT=2479
++ DOWN_PD_PEER_PORT=2480
++ TLS_PD_HOST=127.0.0.1
++ TLS_PD_PORT=2579
++ TLS_PD_PEER_PORT=2580
++ UP_TIKV_HOST_1=127.0.0.1
++ UP_TIKV_PORT_1=20160
++ UP_TIKV_STATUS_PORT_1=20181
++ UP_TIKV_HOST_2=127.0.0.1
++ UP_TIKV_PORT_2=20161
++ UP_TIKV_STATUS_PORT_2=20182
++ UP_TIKV_HOST_3=127.0.0.1
++ UP_TIKV_PORT_3=20162
++ UP_TIKV_STATUS_PORT_3=20183
++ DOWN_TIKV_HOST=127.0.0.1
++ DOWN_TIKV_PORT=21160
++ DOWN_TIKV_STATUS_PORT=21180
++ TLS_TIKV_HOST=127.0.0.1
++ TLS_TIKV_PORT=22160
++ TLS_TIKV_STATUS_PORT=22180
+++ cat /tmp/tidb_cdc_test/KAFKA_VERSION
+++ echo 2.4.1
++ KAFKA_VERSION=2.4.1
+ WORK_DIR=/tmp/tidb_cdc_test/sql_mode
+ CDC_BINARY=cdc.test
+ SINK_TYPE=kafka
+ CDC_COUNT=3
+ DB_COUNT=4
+ rm -rf /tmp/tidb_cdc_test/sql_mode
+ mkdir -p /tmp/tidb_cdc_test/sql_mode
+ start_tidb_cluster --workdir /tmp/tidb_cdc_test/sql_mode
The 1 times to try to start tidb cluster...
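Condensed, the per-case preamble traced above amounts to the sketch below (paths and variable names as recorded; test_prepare supplies the UP_/DOWN_/TLS_ host and port map echoed in the ++ lines, and the KAFKA_VERSION fallback is an assumption about how the recorded file is read).

CUR=$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)
source "$CUR/../_utils/test_prepare"     # UP_/DOWN_/TLS_ hosts and ports
WORK_DIR=/tmp/tidb_cdc_test/sql_mode
CDC_BINARY=cdc.test
SINK_TYPE=kafka
CDC_COUNT=3
DB_COUNT=4
# Assumed fallback: use 2.4.1 when the version file is missing.
KAFKA_VERSION=$(cat /tmp/tidb_cdc_test/KAFKA_VERSION 2>/dev/null || echo 2.4.1)
rm -rf "$WORK_DIR"
mkdir -p "$WORK_DIR"
start_tidb_cluster --workdir "$WORK_DIR"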
table mark.finish_mark_2 exists
table mark.finish_mark_3 not exists for 1-th check, retry later
table mark.finish_mark_3 not exists for 2-th check, retry later
table mark.finish_mark_3 not exists for 3-th check, retry later
1:04PM INF > Run case=sql/debezium/multitable_dbz_871.sql
start tidb cluster in /tmp/tidb_cdc_test/sql_mode
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table mark.finish_mark_3 not exists for 4-th check, retry later
table mark.finish_mark_3 not exists for 5-th check, retry later
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
1:04PM INF > Run case=sql/debezium/mysql_dbz_6533.sql
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
table mark.finish_mark_3 not exists for 6-th check, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 1-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
table mark.finish_mark_3 exists
table mark.finish_mark_4 not exists for 1-th check, retry later
table mark.finish_mark_4 not exists for 2-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 3-th check, retry later
table mark.finish_mark_4 not exists for 3-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8ee0a00013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:30321, start at 2024-05-05 13:04:34.366368738 +0800 CST m=+5.031299933	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:06:34.372 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:04:34.344 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:54:34.344 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8ee0a00013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:30321, start at 2024-05-05 13:04:34.366368738 +0800 CST m=+5.031299933	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:06:34.372 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:04:34.344 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:54:34.344 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8ee2e4000a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:30404, start at 2024-05-05 13:04:34.497788452 +0800 CST m=+5.111831189	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:06:34.506 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:04:34.489 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:54:34.489 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/sql_mode/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/sql_mode/tiflash/log/error.log
arg matches is ArgMatches { args: {"addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/sql_mode/tiflash/db/proxy"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/sql_mode/tiflash-proxy.toml"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/sql_mode/tiflash/log/proxy.log"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table test.finish_mark not exists for 4-th check, retry later
table mark.finish_mark_4 not exists for 4-th check, retry later
+ trap stop_tidb_cluster EXIT
+ run_sql 'set global sql_mode='\''NO_BACKSLASH_ESCAPES'\'';' 127.0.0.1 4000
+ run_sql 'set global sql_mode='\''NO_BACKSLASH_ESCAPES'\'';' 127.0.0.1 3306
+ cd /tmp/tidb_cdc_test/sql_mode
++ run_cdc_cli_tso_query 127.0.0.1 2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.sql_mode.cli.31804.out cli tso query --pd=http://127.0.0.1:2379
table test.finish_mark not exists for 5-th check, retry later
table mark.finish_mark_4 not exists for 5-th check, retry later
+ set +x
+ tso='449546938743521281
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546938743521281 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ start_ts=449546938743521281
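The tso-query step traced above reduces to the sketch below: ask PD for a TSO through the cdc cli and keep only the first token, since the coverage-instrumented binary appends its PASS/coverage summary to stdout (the coverprofile path here is illustrative, not the one recorded).

pd_addr=http://127.0.0.1:2379
tso_output=$(cdc.test -test.coverprofile=/tmp/cov.tso.out cli tso query --pd="$pd_addr")
# Unquoted expansion flattens the multi-line output, so awk sees
# "<tso> PASS coverage: ..." and $1 is the timestamp.
start_ts=$(echo $tso_output | awk -F ' ' '{print $1}')
echo "start_ts=$start_ts"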
+ run_cdc_server --workdir /tmp/tidb_cdc_test/sql_mode --binary cdc.test
[Sun May  5 13:04:39 CST 2024] <<<<<< START cdc server in sql_mode case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ GO_FAILPOINTS=
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.sql_mode.3184131843.out server --log-file /tmp/tidb_cdc_test/sql_mode/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/sql_mode/cdc_data --cluster-id default
+ [[ no != \n\o ]]
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table test.finish_mark not exists for 6-th check, retry later
table mark.finish_mark_4 not exists for 6-th check, retry later
table test.finish_mark not exists for 7-th check, retry later
1:04PM INF > Run case=sql/debezium/nationalized_character_test.sql
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:04:42 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b5514219-29a8-447b-8694-896acdb6f4c3
	{"id":"b5514219-29a8-447b-8694-896acdb6f4c3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885479}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472390afce
	b5514219-29a8-447b-8694-896acdb6f4c3

/tidb/cdc/default/default/upstream/7365376996821268329
	{"id":7365376996821268329,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b5514219-29a8-447b-8694-896acdb6f4c3
	{"id":"b5514219-29a8-447b-8694-896acdb6f4c3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885479}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472390afce
	b5514219-29a8-447b-8694-896acdb6f4c3

/tidb/cdc/default/default/upstream/7365376996821268329
	{"id":7365376996821268329,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b5514219-29a8-447b-8694-896acdb6f4c3
	{"id":"b5514219-29a8-447b-8694-896acdb6f4c3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885479}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472390afce
	b5514219-29a8-447b-8694-896acdb6f4c3

/tidb/cdc/default/default/upstream/7365376996821268329
	{"id":7365376996821268329,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ SINK_URI='mysql://root@127.0.0.1:3306/?max-txn-row=1'
+ run_cdc_cli changefeed create --start-ts=449546938743521281 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.sql_mode.cli.31901.out cli changefeed create --start-ts=449546938743521281 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1
Create changefeed successfully!
ID: test-1
Info: {"upstream_id":7365376996821268329,"namespace":"default","id":"test-1","sink_uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create_time":"2024-05-05T13:04:42.952189588+08:00","start_ts":449546938743521281,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546938743521281,"checkpoint_ts":449546938743521281,"checkpoint_time":"2024-05-05 13:04:37.995"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
table mark.finish_mark_4 not exists for 7-th check, retry later
table test.finish_mark not exists for 8-th check, retry later
+ set +x
+ run_sql 'use test; create table t1(id bigint primary key, a text, b text as ((regexp_replace(a, '\''^[1-9]\d{9,29}$'\'', '\''aaaaa'\''))), c text); insert into t1 (id, a, c) values(1,123456, '\''ab\\\\c'\''); insert into t1 (id, a, c) values(2,1234567890123, '\''ab\\c'\'');' 127.0.0.1 4000
+ '[' kafka == mysql ']'
+ stop_tidb_cluster
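For readability, the SQL wrapped in the run_sql call above unescapes to the statements below; the mysql invocation is only an assumption standing in for the run_sql helper (which targets the upstream TiDB at 127.0.0.1:4000 here). With NO_BACKSLASH_ESCAPES set earlier in this case, the backslashes in the inserted values are stored literally.

mysql --host 127.0.0.1 --port 4000 -u root <<'SQL'
use test;
create table t1(
    id bigint primary key,
    a text,
    b text as ((regexp_replace(a, '^[1-9]\d{9,29}$', 'aaaaa'))),
    c text
);
insert into t1 (id, a, c) values (1, 123456, 'ab\\\\c');
insert into t1 (id, a, c) values (2, 1234567890123, 'ab\\c');
SQL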
table mark.finish_mark_4 not exists for 8-th check, retry later
table test.finish_mark not exists for 9-th check, retry later
table mark.finish_mark_4 not exists for 9-th check, retry later
1:04PM INF > Run case=sql/debezium/numeric_column_test.sql
table test.finish_mark not exists for 10-th check, retry later
table mark.finish_mark_4 exists
table mark.finish_mark not exists for 1-th check, retry later
table test.finish_mark not exists for 11-th check, retry later
table mark.finish_mark not exists for 2-th check, retry later
table test.finish_mark exists
check diff successfully
1:04PM INF > Run case=sql/debezium/readbinlog_test.sql
table mark.finish_mark not exists for 3-th check, retry later
1:04PM INF > Run case=sql/debezium/real_test.sql
table mark.finish_mark not exists for 4-th check, retry later
table mark.finish_mark not exists for 5-th check, retry later
table mark.finish_mark exists
check diff successfully
wait process cdc.test exit for 1-th time...
+ start_tidb_cluster --workdir /tmp/tidb_cdc_test/sql_mode
The 1 times to try to start tidb cluster...
wait process cdc.test exit for 2-th time...
wait process cdc.test exit for 3-th time...
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 13:05:01 CST 2024] <<<<<< run test case multi_source success! >>>>>>
table test.finish_mark not exists for 1-th check, retry later
1:05PM INF > Run case=sql/debezium/regression_test.sql
table test.finish_mark not exists for 2-th check, retry later
start tidb cluster in /tmp/tidb_cdc_test/sql_mode
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table test.finish_mark not exists for 3-th check, retry later
table test.finish_mark not exists for 4-th check, retry later
table test.finish_mark exists
check diff successfully
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:05:12 CST 2024] <<<<<< run test case canal_json_basic success! >>>>>>
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8ee0a00013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:30321, start at 2024-05-05 13:04:34.366368738 +0800 CST m=+5.031299933	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:06:34.372 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	60m	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:04:34.344 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:54:34.344 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8ee0a00013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:30321, start at 2024-05-05 13:04:34.366368738 +0800 CST m=+5.031299933	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:06:34.372 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	60m	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:04:34.344 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:54:34.344 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8ee2e4000a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:30404, start at 2024-05-05 13:04:34.497788452 +0800 CST m=+5.111831189	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:06:34.506 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	60m	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:04:34.489 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:54:34.489 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
tidb_disable_column_tracking_time	2024-05-05 05:04:35 UTC	Record the last time tidb_enable_column_tracking is set off
ERROR 1396 (HY000) at line 1: Operation CREATE USER failed for 'normal'@'%'
start tidb cluster failed
The 2 times to try to start tidb cluster...
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   221  100   221    0     0   2615      0 --:--:-- --:--:-- --:--:--  2600
100   221  100   221    0     0   2612      0 --:--:-- --:--:-- --:--:--  2600
+ synced_status='{"synced":true,"sink_checkpoint_ts":"2024-05-05 13:05:04.729","puller_resolved_ts":"2024-05-05 13:04:56.729","last_synced_ts":"2024-05-05 13:02:48.328","now_ts":"2024-05-05 13:05:05.000","info":"Data syncing is finished"}'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '13:05:04.729","puller_resolved_ts":"2024-05-05' '13:04:56.729","last_synced_ts":"2024-05-05' '13:02:48.328","now_ts":"2024-05-05' '13:05:05.000","info":"Data' syncing is 'finished"}'
++ jq .synced
+ status=true
+ '[' true '!=' true ']'
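Condensed, the synced-status probe traced above is the sketch below (endpoint and changefeed id as recorded for this run; jq extracts the boolean the test branches on).

api=http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
synced_status=$(curl -s -X GET "$api")        # JSON: synced flag plus checkpoint/resolved timestamps
status=$(echo "$synced_status" | jq .synced)  # prints true or false
if [ "$status" != "true" ]; then
    echo "changefeed test-1 not synced yet: $synced_status"
    exit 1
fi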
+ kill_pd
++ ps aux
++ grep pd-server
++ grep /tmp/tidb_cdc_test/synced_status
+ info='jenkins    16924  7.2  0.0 13793048 141848 ?     Sl   13:02   0:11 pd-server --advertise-client-urls http://127.0.0.1:2379 --client-urls http://0.0.0.0:2379 --advertise-peer-urls http://127.0.0.1:2380 --peer-urls http://0.0.0.0:2380 --config /tmp/tidb_cdc_test/synced_status/pd-config.toml --log-file /tmp/tidb_cdc_test/synced_status/pd1.log --data-dir /tmp/tidb_cdc_test/synced_status/pd1 --name=pd1 --initial-cluster=pd1=http://127.0.0.1:2380
jenkins    16985  4.8  0.0 13391060 137908 ?     Sl   13:02   0:07 pd-server --advertise-client-urls http://127.0.0.1:2479 --client-urls http://0.0.0.0:2479 --advertise-peer-urls http://127.0.0.1:2480 --peer-urls http://0.0.0.0:2480 --config /tmp/tidb_cdc_test/synced_status/pd-config.toml --log-file /tmp/tidb_cdc_test/synced_status/down_pd.log --data-dir /tmp/tidb_cdc_test/synced_status/down_pd'
++ ps aux
++ grep pd-server
++ grep /tmp/tidb_cdc_test/synced_status
++ awk '{print $2}'
++ xargs kill -9
+ sleep 20
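The kill_pd step traced above amounts to the sketch below: select only the pd-server processes that belong to this case's workdir, kill them without a graceful shutdown, and wait long enough for the rest of the cluster to notice (pattern and workdir copied from the trace).

WORK_DIR=/tmp/tidb_cdc_test/synced_status
# Filtering on the workdir keeps PD instances of concurrently running cases alive.
ps aux | grep pd-server | grep "$WORK_DIR" | awk '{print $2}' | xargs kill -9
sleep 20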
{"level":"warn","ts":1714885512.697925,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0041fbdc0/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":1714885512.6979825,"caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":1714885512.737895,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0047fce00/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
{"level":"info","ts":1714885512.7379453,"caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":1714885512.7778869,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc002232540/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":1714885512.7779334,"caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":"2024-05-05T13:05:17.63142+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000ea3340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:05:17.631492+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000aa0700/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:05:17.681359+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000cd1880/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:05:23.632417+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000aa0700/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:05:23.633091+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000ea3340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:05:23.682595+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000cd1880/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
start tidb cluster in /tmp/tidb_cdc_test/sql_mode
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/canal_json_content_compatible/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
start tidb cluster in /tmp/tidb_cdc_test/canal_json_content_compatible
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:01 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:02 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:03 --:--:--     0{"level":"warn","ts":"2024-05-05T13:05:29.632913+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000aa0700/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:05:29.633741+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000ea3340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}

  0     0    0     0    0     0      0      0 --:--:--  0:00:04 --:--:--     0{"level":"warn","ts":"2024-05-05T13:05:29.683547+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000cd1880/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8ee0a00013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:30321, start at 2024-05-05 13:04:34.366368738 +0800 CST m=+5.031299933	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:06:34.372 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	60m	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:04:34.344 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:54:34.344 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8ee0a00013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:30321, start at 2024-05-05 13:04:34.366368738 +0800 CST m=+5.031299933	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:06:34.372 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	60m	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:04:34.344 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:54:34.344 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8ee2e4000a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:30404, start at 2024-05-05 13:04:34.497788452 +0800 CST m=+5.111831189	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:06:34.506 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	60m	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:04:34.489 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:54:34.489 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
tidb_disable_column_tracking_time	2024-05-05 05:04:35 UTC	Record the last time tidb_enable_column_tracking is set off
ERROR 1396 (HY000) at line 1: Operation CREATE USER failed for 'normal'@'%'
start tidb cluster failed
The 3 times to try to start tidb cluster...
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)

  0     0    0     0    0     0      0      0 --:--:--  0:00:05 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:06 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:07 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:08 --:--:--     0
  0     0    0     0    0     0      0      0 --:--:--  0:00:09 --:--:--     0{"level":"warn","ts":"2024-05-05T13:05:35.633418+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000aa0700/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:05:35.634655+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000ea3340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}

  0     0    0     0    0     0      0      0 --:--:--  0:00:10 --:--:--     0{"level":"warn","ts":"2024-05-05T13:05:35.685007+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000cd1880/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}

  0     0    0     0    0     0      0      0 --:--:--  0:00:11 --:--:--     0{"level":"warn","ts":"2024-05-05T13:05:37.620948+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000aa0700/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":"2024-05-05T13:05:37.62101+0800","logger":"etcd-client","caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":"2024-05-05T13:05:37.622062+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000ea3340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":"2024-05-05T13:05:37.622123+0800","logger":"etcd-client","caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}

  0     0    0     0    0     0      0      0 --:--:--  0:00:12 --:--:--     0{"level":"warn","ts":"2024-05-05T13:05:37.675343+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000cd1880/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
{"level":"info","ts":"2024-05-05T13:05:37.675406+0800","logger":"etcd-client","caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c92ca800013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:29077, start at 2024-05-05 13:05:38.486589544 +0800 CST m=+5.140954198	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:07:38.493 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:05:38.464 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:55:38.464 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c92ca800013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:29077, start at 2024-05-05 13:05:38.486589544 +0800 CST m=+5.140954198	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:07:38.493 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:05:38.464 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:55:38.464 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c92cb380006	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:29167, start at 2024-05-05 13:05:38.51657341 +0800 CST m=+5.123387091	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:07:38.524 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:05:38.510 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:55:38.510 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/canal_json_content_compatible/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/canal_json_content_compatible/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/canal_json_content_compatible/tiflash/db/proxy"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/canal_json_content_compatible/tiflash-proxy.toml"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/canal_json_content_compatible/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[Sun May  5 13:05:41 CST 2024] <<<<<< START cdc server in canal_json_content_compatible case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_content_compatible.3059130593.out server --log-file /tmp/tidb_cdc_test/canal_json_content_compatible/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/canal_json_content_compatible/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
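The trace above is one iteration of the harness's readiness poll: it curls the cdc server's /debug/info endpoint and retries until the dump contains the 'etcd info' marker. A minimal sketch of that loop, reconstructed from the trace (the endpoint, credentials, marker string and 50-attempt bound are taken from the log; the exact helper in the test scripts may differ):

  i=0
  while [ "$i" -le 50 ]; do
      # same probe as in the trace: basic-auth curl against the cdc server's debug endpoint
      res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
      # success marker used above: the dump contains "etcd info"
      echo "$res" | grep -q 'etcd info' && break
      if [ "$i" -eq 50 ]; then
          echo "cdc server still not ready after 50 attempts" >&2
          exit 1
      fi
      sleep 3
      i=$((i + 1))
  done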

{"level":"warn","ts":"2024-05-05T13:05:41.634545+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000aa0700/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:05:41.635495+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000ea3340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}

{"level":"warn","ts":"2024-05-05T13:05:41.686295+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000cd1880/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
start tidb cluster in /tmp/tidb_cdc_test/sql_mode
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:05:44 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/033e64c3-a106-4849-9911-317e06b16e1d
	{"id":"033e64c3-a106-4849-9911-317e06b16e1d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885541}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472486dcbe
	033e64c3-a106-4849-9911-317e06b16e1d

/tidb/cdc/default/default/upstream/7365377269763013429
	{"id":7365377269763013429,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/033e64c3-a106-4849-9911-317e06b16e1d
	{"id":"033e64c3-a106-4849-9911-317e06b16e1d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885541}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472486dcbe
	033e64c3-a106-4849-9911-317e06b16e1d

/tidb/cdc/default/default/upstream/7365377269763013429
	{"id":7365377269763013429,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/033e64c3-a106-4849-9911-317e06b16e1d
	{"id":"033e64c3-a106-4849-9911-317e06b16e1d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885541}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472486dcbe
	033e64c3-a106-4849-9911-317e06b16e1d

/tidb/cdc/default/default/upstream/7365377269763013429
	{"id":7365377269763013429,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_content_compatible.cli.30653.out cli changefeed create '--sink-uri=kafka://127.0.0.1:9092/ticdc-canal-json-content-compatible?protocol=canal-json&enable-tidb-extension=true&content-compatible=true'
Create changefeed successfully!
ID: f73a1ef5-7c51-4488-80c1-43c4e8638ad8
Info: {"upstream_id":7365377269763013429,"namespace":"default","id":"f73a1ef5-7c51-4488-80c1-43c4e8638ad8","sink_uri":"kafka://127.0.0.1:9092/ticdc-canal-json-content-compatible?protocol=canal-json\u0026enable-tidb-extension=true\u0026content-compatible=true","create_time":"2024-05-05T13:05:45.175359405+08:00","start_ts":449546956325519362,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":true,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546956325519362,"checkpoint_ts":449546956325519362,"checkpoint_time":"2024-05-05 13:05:45.065"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ set +x
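The changefeed created above enables canal-json content-compatible output purely through sink-URI query parameters; with the regular binary the same invocation is simply cdc cli changefeed create (a hedged recap of the command seen in the trace, with the test's own topic and broker address):

  cdc cli changefeed create \
      --sink-uri='kafka://127.0.0.1:9092/ticdc-canal-json-content-compatible?protocol=canal-json&enable-tidb-extension=true&content-compatible=true'

The Info JSON printed above confirms the parameters took effect: the effective sink config carries "protocol":"canal-json" and "content_compatible":true.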

{"level":"warn","ts":"2024-05-05T13:05:47.63633+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000ea3340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:05:47.636652+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000aa0700/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}

{"level":"warn","ts":"2024-05-05T13:05:47.687376+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000cd1880/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
{"level":"warn","ts":1714885547.6988363,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0041fbdc0/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":1714885547.6988714,"caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":1714885547.7382953,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc0047fce00/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
{"level":"info","ts":1714885547.7383265,"caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
{"level":"warn","ts":1714885547.7790418,"caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc002232540/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"info","ts":1714885547.7790754,"caller":"v3@v3.5.12/client.go:210","msg":"Auto sync endpoints failed.","error":"context deadline exceeded"}
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
[Sun May  5 13:05:51 CST 2024] <<<<<< START kafka consumer in canal_json_content_compatible case >>>>>>
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8ee0a00013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:30321, start at 2024-05-05 13:04:34.366368738 +0800 CST m=+5.031299933	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:06:34.372 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	60m	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:04:34.344 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:54:34.344 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8ee0a00013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:30321, start at 2024-05-05 13:04:34.366368738 +0800 CST m=+5.031299933	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:06:34.372 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	60m	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:04:34.344 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:54:34.344 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c8ee2e4000a	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-bxr1t-f9p6k, pid:30404, start at 2024-05-05 13:04:34.497788452 +0800 CST m=+5.111831189	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:06:34.506 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	60m	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:04:34.489 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:54:34.489 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
tidb_disable_column_tracking_time	2024-05-05 05:04:35 UTC	Record the last time tidb_enable_column_tracking is set off
ERROR 1396 (HY000) at line 1: Operation CREATE USER failed for 'normal'@'%'
start tidb cluster failed
+ run_sql 'set global sql_mode='\''ANSI_QUOTES'\'';' 127.0.0.1 4000
+ run_sql 'set global sql_mode='\''ANSI_QUOTES'\'';' 127.0.0.1 3306
++ run_cdc_cli_tso_query 127.0.0.1 2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.sql_mode.cli.36025.out cli tso query --pd=http://127.0.0.1:2379

{"level":"warn","ts":"2024-05-05T13:05:53.637306+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000aa0700/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}
{"level":"warn","ts":"2024-05-05T13:05:53.637362+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000ea3340/127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2379: connect: connection refused\""}

{"level":"warn","ts":"2024-05-05T13:05:53.688444+0800","logger":"etcd-client","caller":"v3@v3.5.12/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"etcd-endpoints://0xc000cd1880/127.0.0.1:2479","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = latest balancer error: last connection error: connection error: desc = \"transport: Error while dialing: dial tcp 127.0.0.1:2479: connect: connection refused\""}
+ set +x
+ tso='449546958593064963
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546958593064963 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ start_ts=449546958593064963
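start_ts is derived from the cli output captured above, which mixes the TSO with the PASS/coverage lines appended by the instrumented cdc.test binary; only the first field is kept. A minimal sketch of that extraction, using the same commands shown in the trace:

  # query a TSO from PD and strip the coverage noise appended by the test binary
  tso=$(cdc cli tso query --pd=http://127.0.0.1:2379)
  start_ts=$(echo "$tso" | head -n1 | awk -F ' ' '{print $1}')
  echo "using start_ts=$start_ts"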
+ run_cdc_server --workdir /tmp/tidb_cdc_test/sql_mode --binary cdc.test
[Sun May  5 13:05:55 CST 2024] <<<<<< START cdc server in sql_mode case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.sql_mode.3606136063.out server --log-file /tmp/tidb_cdc_test/sql_mode/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/sql_mode/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3

+ synced_status='{
    "error_msg": "[CDC:ErrPDEtcdAPIError]etcd api call error: context deadline exceeded",
    "error_code": "CDC:ErrPDEtcdAPIError"
}'
++ echo '{' '"error_msg":' '"[CDC:ErrPDEtcdAPIError]etcd' api call error: context deadline 'exceeded",' '"error_code":' '"CDC:ErrPDEtcdAPIError"' '}'
++ jq -r .error_code
+ error_code=CDC:ErrPDEtcdAPIError
+ cleanup_process cdc.test
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
table test.finish_mark not exists for 1-th check, retry later
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
+ stop_tidb_cluster
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 13:03:35.334","puller_resolved_ts":"2024-05-05 13:03:35.334","last_synced_ts":"2024-05-05 13:03:30.784","now_ts":"2024-05-05 13:05:46.000","info":"Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' \u003e '\''Resolved-Ts'\'' \u003e '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:03:35.334","puller_resolved_ts":"2024-05-05' '13:03:35.334","last_synced_ts":"2024-05-05' '13:03:30.784","now_ts":"2024-05-05' '13:05:46.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:03:35.334","puller_resolved_ts":"2024-05-05' '13:03:35.334","last_synced_ts":"2024-05-05' '13:03:30.784","now_ts":"2024-05-05' '13:05:46.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq -r .info
+ info='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ target_message='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ '[' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' '!=' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' ']'
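Both status checks in this case follow the same pattern: query the changefeed's v2 synced endpoint and pull individual fields out of the JSON with jq, then compare them against the expected values. A condensed sketch using the endpoint and fields visible in the trace (changefeed id test-1 and the 8300 address are the test's own):

  status=$(curl -s -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced)
  synced=$(echo "$status" | jq -r .synced)          # "false" here, since PD/TiKV have been taken down
  info=$(echo "$status" | jq -r .info)              # explanation text compared against the expected message
  error_code=$(echo "$status" | jq -r .error_code)  # e.g. CDC:ErrPDEtcdAPIError when the etcd API call times out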
+ cleanup_process cdc.test
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
wait process cdc.test exit for 3-th time...
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
+ stop_tidb_cluster
+ run_case_with_unavailable_tidb conf/changefeed-redo.toml
+ rm -rf /tmp/tidb_cdc_test/synced_status_with_redo
+ mkdir -p /tmp/tidb_cdc_test/synced_status_with_redo
+ start_tidb_cluster --workdir /tmp/tidb_cdc_test/synced_status_with_redo
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
The 1 times to try to start tidb cluster...
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
start tidb cluster in /tmp/tidb_cdc_test/synced_status_with_redo
Starting Upstream PD...
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:05:58 GMT
< Content-Type: text/plain; charset=utf-8
< Transfer-Encoding: chunked
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** processors info ***:

changefeedID: default/test-1
{UpstreamID:7365376996821268329 Namespace:default ID:test-1 SinkURI:mysql://root@127.0.0.1:3306/?max-txn-row=1 CreateTime:2024-05-05 13:04:42.952189588 +0800 CST StartTs:449546938743521281 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc00411b4d0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546940041134082}
{CheckpointTs:449546940421242888 MinTableBarrierTs:449546940421242888 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b5514219-29a8-447b-8694-896acdb6f4c3
	{"id":"b5514219-29a8-447b-8694-896acdb6f4c3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885479}

/tidb/cdc/default/__cdc_meta__/capture/c4300765-9991-46c6-abf1-b1e06c718ea6
	{"id":"c4300765-9991-46c6-abf1-b1e06c718ea6","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885555}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472390afce
	b5514219-29a8-447b-8694-896acdb6f4c3

/tidb/cdc/default/__cdc_meta__/owner/22318f4724c2ced2
	c4300765-9991-46c6-abf1-b1e06c718ea6

/tidb/cdc/default/default/changefeed/info/test-1
	{"upstream-id":7365376996821268329,"namespace":"default","changefeed-id":"test-1","sink-uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create-time":"2024-05-05T13:04:42.952189588+08:00","start-ts":449546938743521281,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546940041134082}

/tidb/cdc/default/default/changefeed/status/test-1
	{"checkpoint-ts":449546940421242888,"min-table-barrier-ts":449546940421242888,"admin-job-type":0}

/tidb/cdc/default/default/task/position/b5514219-29a8-447b-8694-896acdb6f4c3/test-1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/task/position/c4300765-9991-46c6-abf1-b1e06c718ea6/test-1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365376996821268329
	{"id":7365376996821268329,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** processors info ***:

changefeedID: default/test-1
{UpstreamID:7365376996821268329 Namespace:default ID:test-1 SinkURI:mysql://root@127.0.0.1:3306/?max-txn-row=1 CreateTime:2024-05-05 13:04:42.952189588 +0800 CST StartTs:449546938743521281 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc00411b4d0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546940041134082}
{CheckpointTs:449546940421242888 MinTableBarrierTs:449546940421242888 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b5514219-29a8-447b-8694-896acdb6f4c3
	{"id":"b5514219-29a8-447b-8694-896acdb6f4c3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885479}

/tidb/cdc/default/__cdc_meta__/capture/c4300765-9991-46c6-abf1-b1e06c718ea6
	{"id":"c4300765-9991-46c6-abf1-b1e06c718ea6","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885555}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472390afce
	b5514219-29a8-447b-8694-896acdb6f4c3

/tidb/cdc/default/__cdc_meta__/owner/22318f4724c2ced2
	c4300765-9991-46c6-abf1-b1e06c718ea6

/tidb/cdc/default/default/changefeed/info/test-1
	{"upstream-id":7365376996821268329,"namespace":"default","changefeed-id":"test-1","sink-uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create-time":"2024-05-05T13:04:42.952189588+08:00","start-ts":449546938743521281,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546940041134082}

/tidb/cdc/default/default/changefeed/status/test-1
	{"checkpoint-ts":449546940421242888,"min-table-barrier-ts":449546940421242888,"admin-job-type":0}

/tidb/cdc/default/default/task/position/b5514219-29a8-447b-8694-896acdb6f4c3/test-1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/task/position/c4300765-9991-46c6-abf1-b1e06c718ea6/test-1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null+ grep -q 'failed to get info:'
}

/tidb/cdc/default/default/upstream/7365376996821268329
	{"id":7365376996821268329,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** processors info ***:

changefeedID: default/test-1
{UpstreamID:7365376996821268329 Namespace:default ID:test-1 SinkURI:mysql://root@127.0.0.1:3306/?max-txn-row=1 CreateTime:2024-05-05 13:04:42.952189588 +0800 CST StartTs:449546938743521281 TargetTs:0 AdminJobType:noop Engine:unified SortDir: Config:0xc00411b4d0 State:normal Error:<nil> Warning:<nil> CreatorVersion:v8.2.0-alpha-53-g0de8dc3e4 Epoch:449546940041134082}
{CheckpointTs:449546940421242888 MinTableBarrierTs:449546940421242888 AdminJobType:noop}



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/b5514219-29a8-447b-8694-896acdb6f4c3
	{"id":"b5514219-29a8-447b-8694-896acdb6f4c3","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885479}

/tidb/cdc/default/__cdc_meta__/capture/c4300765-9991-46c6-abf1-b1e06c718ea6
	{"id":"c4300765-9991-46c6-abf1-b1e06c718ea6","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885555}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472390afce
	b5514219-29a8-447b-8694-896acdb6f4c3

/tidb/cdc/default/__cdc_meta__/owner/22318f4724c2ced2
	c4300765-9991-46c6-abf1-b1e06c718ea6

/tidb/cdc/default/default/changefeed/info/test-1
	{"upstream-id":7365376996821268329,"namespace":"default","changefeed-id":"test-1","sink-uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create-time":"2024-05-05T13:04:42.952189588+08:00","start-ts":449546938743521281,"target-ts":0,"admin-job-type":0,"sort-engine":"","sort-dir":"","config":{"memory-quota":1073741824,"case-sensitive":false,"force-replicate":false,"check-gc-safe-point":true,"enable-sync-point":false,"enable-table-monitor":false,"ignore-ineligible-table":false,"bdr-mode":false,"sync-point-interval":600000000000,"sync-point-retention":86400000000000,"filter":{"rules":["*.*"],"ignore-txn-start-ts":null,"event-filters":null},"mounter":{"worker-num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include-commit-ts":false,"binary-encoding-method":"base64","output-old-value":false,"output-handle-key":false},"encoder-concurrency":32,"terminator":"\r\n","date-separator":"day","enable-partition-separator":true,"enable-kafka-sink-v2":false,"only-output-updated-columns":false,"delete-only-output-handle-key-columns":false,"content-compatible":false,"advance-timeout-in-sec":150,"send-bootstrap-interval-in-sec":120,"send-bootstrap-in-msg-count":10000,"send-bootstrap-to-all-partition":true,"debezium-disable-schema":false,"open":{"output-old-value":true},"debezium":{"output-old-value":true}},"consistent":{"level":"none","max-log-size":64,"flush-interval":2000,"meta-flush-interval":200,"encoding-worker-num":16,"flush-worker-num":8,"storage":"","use-file-backend":false,"compression":"","memory-usage":{"memory-quota-percentage":50}},"scheduler":{"enable-table-across-nodes":false,"region-threshold":100000,"write-key-threshold":0,"region-per-span":0},"integrity":{"integrity-check-level":"none","corruption-handle-level":"warn"},"changefeed-error-stuck-duration":1800000000000,"synced-status":{"synced-check-interval":300,"checkpoint-interval":15},"sql-mode":""},"state":"normal","error":null,"warning":null,"creator-version":"v8.2.0-alpha-53-g0de8dc3e4","epoch":449546940041134082}

/tidb/cdc/default/default/changefeed/status/test-1
	{"checkpoint-ts":449546940421242888,"min-table-barrier-ts":449546940421242888,"admin-job-type":0}

/tidb/cdc/default/default/task/position/b5514219-29a8-447b-8694-896acdb6f4c3/test-1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/task/position/c4300765-9991-46c6-abf1-b1e06c718ea6/test-1
	{"checkpoint-ts":0,"resolved-ts":0,"count":0,"error":null,"warning":null}

/tidb/cdc/default/default/upstream/7365376996821268329
	{"id":7365376996821268329,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ SINK_URI='mysql://root@127.0.0.1:3306/?max-txn-row=1'
+ run_cdc_cli changefeed create --start-ts=449546958593064963 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-2
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.sql_mode.cli.36126.out cli changefeed create --start-ts=449546958593064963 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-2
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table test.finish_mark not exists for 2-th check, retry later
table test.finish_mark not exists for 3-th check, retry later
Create changefeed successfully!
ID: test-2
Info: {"upstream_id":7365376996821268329,"namespace":"default","id":"test-2","sink_uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create_time":"2024-05-05T13:06:01.22074738+08:00","start_ts":449546958593064963,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546958593064963,"checkpoint_ts":449546958593064963,"checkpoint_time":"2024-05-05 13:05:53.715"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ set +x
+ run_sql 'use test; create table t2(id bigint primary key, a date); insert into t2 values(1, '\''2023-02-08'\'');' 127.0.0.1 4000
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
table test.finish_mark not exists for 4-th check, retry later
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 5-th check, retry later
+ run_case_with_unavailable_tikv conf/changefeed.toml
+ rm -rf /tmp/tidb_cdc_test/synced_status
+ mkdir -p /tmp/tidb_cdc_test/synced_status
+ start_tidb_cluster --workdir /tmp/tidb_cdc_test/synced_status
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
The 1 times to try to start tidb cluster...
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 6-th check, retry later
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
start tidb cluster in /tmp/tidb_cdc_test/synced_status
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
table test.finish_mark not exists for 7-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Verifying downstream PD is started...
table test.finish_mark not exists for 8-th check, retry later
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c94b578001b	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:16452, start at 2024-05-05 13:06:09.928512619 +0800 CST m=+5.155682645	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:08:09.937 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:06:09.937 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:56:09.937 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c94b578001b	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:16452, start at 2024-05-05 13:06:09.928512619 +0800 CST m=+5.155682645	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:08:09.937 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:06:09.937 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:56:09.937 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c94b5400014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:16540, start at 2024-05-05 13:06:09.908946666 +0800 CST m=+5.089108365	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:08:09.914 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:06:09.872 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:56:09.872 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash-proxy.toml"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/proxy.log"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
1:06PM INF > Run case=sql/debezium/skip_messages_test.sql
1:06PM INF > Run case=sql/debezium/strategy_test.sql
table test.finish_mark not exists for 9-th check, retry later
+ cd /tmp/tidb_cdc_test/synced_status_with_redo
++ run_cdc_cli_tso_query 127.0.0.1 2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.cli.17930.out cli tso query --pd=http://127.0.0.1:2379
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
+ tso='449546963749699585
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546963749699585 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ start_ts=449546963749699585
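(The tso-query trace above strips the trailing PASS/coverage lines to recover the changefeed start-ts. A minimal sketch of the same extraction, assuming the cdc binary and the PD endpoint shown in this log, would be:)
    # First line of the tso-query output is the TSO; later lines are test-binary noise.
    start_ts=$(cdc cli tso query --pd=http://127.0.0.1:2379 | head -n1 | awk '{print $1}')
    echo "start_ts=${start_ts}"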
+ run_cdc_server --workdir /tmp/tidb_cdc_test/synced_status_with_redo --binary cdc.test
[Sun May  5 13:06:14 CST 2024] <<<<<< START cdc server in synced_status_with_redo case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.1796617968.out server --log-file /tmp/tidb_cdc_test/synced_status_with_redo/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/synced_status_with_redo/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table test.finish_mark not exists for 10-th check, retry later
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 11-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:06:17 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/91bed9d6-4874-42de-938b-046d0c35982e
	{"id":"91bed9d6-4874-42de-938b-046d0c35982e","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885575}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4725019acf
	91bed9d6-4874-42de-938b-046d0c35982e

/tidb/cdc/default/default/upstream/7365377401794059898
	{"id":7365377401794059898,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/91bed9d6-4874-42de-938b-046d0c35982e
	{"id":"91bed9d6-4874-42de-938b-046d0c35982e","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885575}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4725019acf
	91bed9d6-4874-42de-938b-046d0c35982e

/tidb/cdc/default/default/upstream/7365377401794059898
	{"id":7365377401794059898,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/91bed9d6-4874-42de-938b-046d0c35982e
	{"id":"91bed9d6-4874-42de-938b-046d0c35982e","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885575}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4725019acf
	91bed9d6-4874-42de-938b-046d0c35982e

/tidb/cdc/default/default/upstream/7365377401794059898
	{"id":7365377401794059898,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
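(The loop above polls the CDC debug endpoint until the 'etcd info' marker appears. A condensed sketch of that readiness wait, assuming the same address and basic-auth credentials as in the trace, might look like:)
    # Wait for the CDC server to expose its debug info (up to 50 tries, 3s apart).
    for i in $(seq 1 50); do
        res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret || true)
        echo "$res" | grep -q 'etcd info' && break
        sleep 3
    done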
+ config_path=conf/changefeed-redo.toml
+ SINK_URI='mysql://root@127.0.0.1:3306/?max-txn-row=1'
+ run_cdc_cli changefeed create --start-ts=449546963749699585 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/conf/changefeed-redo.toml
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.cli.18020.out cli changefeed create --start-ts=449546963749699585 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/conf/changefeed-redo.toml
Create changefeed successfully!
ID: test-1
Info: {"upstream_id":7365377401794059898,"namespace":"default","id":"test-1","sink_uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create_time":"2024-05-05T13:06:18.356342029+08:00","start_ts":449546963749699585,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"eventual","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"storage":"file:///tmp/tidb_cdc_test/synced_status/redo","use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":120,"checkpoint_interval":20}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546963749699585,"checkpoint_ts":449546963749699585,"checkpoint_time":"2024-05-05 13:06:13.386"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
table test.finish_mark not exists for 12-th check, retry later
+ set +x
+ run_sql 'USE TEST;Create table t1(a int primary key, b int);insert into t1 values(1,2);insert into t1 values(2,3);'
+ check_table_exists test.t1 127.0.0.1 3306
table test.t1 not exists for 1-th check, retry later
1:06PM INF > Run case=sql/debezium/table_column_comment_test.sql
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c9543f40017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:20561, start at 2024-05-05 13:06:19.041524808 +0800 CST m=+5.052593574	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:08:19.047 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:06:19.005 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:56:19.005 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c9543f40017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:20561, start at 2024-05-05 13:06:19.041524808 +0800 CST m=+5.052593574	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:08:19.047 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:06:19.005 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:56:19.005 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c9546300014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:20646, start at 2024-05-05 13:06:19.175879059 +0800 CST m=+5.135872975	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:08:19.182 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:06:19.148 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:56:19.148 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/synced_status/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/synced_status/tiflash/log/error.log
arg matches is ArgMatches { args: {"config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash-proxy.toml"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash/db/proxy"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash/log/proxy.log"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
table test.finish_mark not exists for 13-th check, retry later
+ cd /tmp/tidb_cdc_test/synced_status
++ run_cdc_cli_tso_query 127.0.0.1 2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.cli.22097.out cli tso query --pd=http://127.0.0.1:2379
table test.t1 exists
+ sleep 5
table test.finish_mark not exists for 14-th check, retry later
+ set +x
+ tso='449546966166405121
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546966166405121 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ start_ts=449546966166405121
+ run_cdc_server --workdir /tmp/tidb_cdc_test/synced_status --binary cdc.test
[Sun May  5 13:06:24 CST 2024] <<<<<< START cdc server in synced_status case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS=
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.2213422136.out server --log-file /tmp/tidb_cdc_test/synced_status/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/synced_status/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
table test.finish_mark not exists for 15-th check, retry later
1:06PM INF > Run case=sql/debezium/timestamp_column_test.sql
table test.finish_mark not exists for 16-th check, retry later
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:06:27 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ef81b05f-ac11-48ab-bbc3-2693a1e2b320
	{"id":"ef81b05f-ac11-48ab-bbc3-2693a1e2b320","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885584}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47252569ce
	ef81b05f-ac11-48ab-bbc3-2693a1e2b320

/tidb/cdc/default/default/upstream/7365377432529670929
	{"id":7365377432529670929,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ef81b05f-ac11-48ab-bbc3-2693a1e2b320
	{"id":"ef81b05f-ac11-48ab-bbc3-2693a1e2b320","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885584}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47252569ce
	ef81b05f-ac11-48ab-bbc3-2693a1e2b320

/tidb/cdc/default/default/upstream/7365377432529670929
	{"id":7365377432529670929,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/ef81b05f-ac11-48ab-bbc3-2693a1e2b320
	{"id":"ef81b05f-ac11-48ab-bbc3-2693a1e2b320","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885584}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47252569ce
	ef81b05f-ac11-48ab-bbc3-2693a1e2b320

/tidb/cdc/default/default/upstream/7365377432529670929
	{"id":7365377432529670929,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ config_path=conf/changefeed.toml
+ SINK_URI='mysql://root@127.0.0.1:3306/?max-txn-row=1'
+ run_cdc_cli changefeed create --start-ts=449546966166405121 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/conf/changefeed.toml
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.cli.22196.out cli changefeed create --start-ts=449546966166405121 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/conf/changefeed.toml
Create changefeed successfully!
ID: test-1
Info: {"upstream_id":7365377432529670929,"namespace":"default","id":"test-1","sink_uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create_time":"2024-05-05T13:06:27.591428353+08:00","start_ts":449546966166405121,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":120,"checkpoint_interval":20}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546966166405121,"checkpoint_ts":449546966166405121,"checkpoint_time":"2024-05-05 13:06:22.605"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ kill_tidb
++ ps aux
++ grep tidb-server
++ grep /tmp/tidb_cdc_test/synced_status_with_redo
+ info='jenkins    16452 14.3  0.0 2713792 246660 ?      Sl   13:06   0:03 tidb-server -P 4000 -config /tmp/tidb_cdc_test/synced_status_with_redo/tidb-config-1714885564766964300.toml --store tikv --path 127.0.0.1:2379 --status=10080 --log-file /tmp/tidb_cdc_test/synced_status_with_redo/tidb.log
jenkins    16456  4.0  0.0 2670728 202584 ?      Sl   13:06   0:00 tidb-server -P 4001 -config /tmp/tidb_cdc_test/synced_status_with_redo/tidb-config-1714885564769770525.toml --store tikv --path 127.0.0.1:2379 --status=10081 --log-file /tmp/tidb_cdc_test/synced_status_with_redo/tidb_other.log
jenkins    16540 14.2  0.0 2813136 254116 ?      Sl   13:06   0:03 tidb-server -P 3306 -config /tmp/tidb_cdc_test/synced_status_with_redo/tidb-config-1714885564813735693.toml --store tikv --path 127.0.0.1:2479 --status=20080 --log-file /tmp/tidb_cdc_test/synced_status_with_redo/tidb_down.log'
++ ps aux
++ grep tidb-server
++ grep /tmp/tidb_cdc_test/synced_status_with_redo
++ awk '{print $2}'
++ xargs kill -9
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   243  100   243    0     0   2887      0 --:--:-- --:--:-- --:--:--  2892
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 13:06:25.837","puller_resolved_ts":"2024-05-05 13:06:19.837","last_synced_ts":"2024-05-05 13:06:19.887","now_ts":"2024-05-05 13:06:26.000","info":"The data syncing is not finished, please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:06:25.837","puller_resolved_ts":"2024-05-05' '13:06:19.837","last_synced_ts":"2024-05-05' '13:06:19.887","now_ts":"2024-05-05' '13:06:26.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:06:25.837","puller_resolved_ts":"2024-05-05' '13:06:19.837","last_synced_ts":"2024-05-05' '13:06:19.887","now_ts":"2024-05-05' '13:06:26.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq -r .info
+ info='The data syncing is not finished, please wait'
+ target_message='The data syncing is not finished, please wait'
+ '[' 'The data syncing is not finished, please wait' '!=' 'The data syncing is not finished, please wait' ']'
+ sleep 130
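(The block above asserts that the changefeed is still unsynced immediately after the upstream TiDB processes are killed. A compact form of the same check, assuming curl and jq exactly as used in the trace, could be:)
    # Query the per-changefeed synced status and fail fast on an unexpected state.
    synced_status=$(curl -s -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced)
    status=$(echo "$synced_status" | jq .synced)
    info=$(echo "$synced_status" | jq -r .info)
    if [ "$status" != "false" ] || [ "$info" != 'The data syncing is not finished, please wait' ]; then
        echo "unexpected synced status: $synced_status"
        exit 1
    fi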
+ set +x
+ run_sql 'USE TEST;Create table t1(a int primary key, b int);insert into t1 values(1,2);insert into t1 values(2,3);'
+ check_table_exists test.t1 127.0.0.1 3306
table test.finish_mark exists
check diff successfully
table test.t1 not exists for 1-th check, retry later
table test.t1 exists
+ sleep 5
1:06PM INF > Run case=sql/debezium/tinyint_test.sql
table test.finish_mark not exists for 1-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
+ kill_tikv
++ ps aux
++ grep tikv-server
++ grep /tmp/tidb_cdc_test/synced_status
+ info='jenkins    19919 23.6  0.5 4699172 2229620 ?     Sl   13:06   0:05 tikv-server --pd 127.0.0.1:2379 -A 127.0.0.1:20160 --status-addr 127.0.0.1:20181 --log-file /tmp/tidb_cdc_test/synced_status/tikv1.log --log-level debug -C /tmp/tidb_cdc_test/synced_status/tikv-config.toml -s /tmp/tidb_cdc_test/synced_status/tikv1
jenkins    19920 30.3  0.5 4730408 2275204 ?     Sl   13:06   0:07 tikv-server --pd 127.0.0.1:2379 -A 127.0.0.1:20161 --status-addr 127.0.0.1:20182 --log-file /tmp/tidb_cdc_test/synced_status/tikv2.log --log-level debug -C /tmp/tidb_cdc_test/synced_status/tikv-config.toml -s /tmp/tidb_cdc_test/synced_status/tikv2
jenkins    19921 23.5  0.5 4699176 2228960 ?     Sl   13:06   0:05 tikv-server --pd 127.0.0.1:2379 -A 127.0.0.1:20162 --status-addr 127.0.0.1:20183 --log-file /tmp/tidb_cdc_test/synced_status/tikv3.log --log-level debug -C /tmp/tidb_cdc_test/synced_status/tikv-config.toml -s /tmp/tidb_cdc_test/synced_status/tikv3
jenkins    19923 29.9  0.5 4723752 2263660 ?     Sl   13:06   0:07 tikv-server --pd 127.0.0.1:2479 -A 127.0.0.1:21160 --status-addr 127.0.0.1:21180 --log-file /tmp/tidb_cdc_test/synced_status/tikv_down.log --log-level debug -C /tmp/tidb_cdc_test/synced_status/tikv-config.toml -s /tmp/tidb_cdc_test/synced_status/tikv_down'
++ ps aux
++ grep tikv-server
++ grep /tmp/tidb_cdc_test/synced_status
++ awk '{print $2}'
++ xargs kill -9
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   243  100   243    0     0   2796      0 --:--:-- --:--:-- --:--:--  2825
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 13:06:35.005","puller_resolved_ts":"2024-05-05 13:06:29.055","last_synced_ts":"2024-05-05 13:06:29.105","now_ts":"2024-05-05 13:06:36.000","info":"The data syncing is not finished, please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:06:35.005","puller_resolved_ts":"2024-05-05' '13:06:29.055","last_synced_ts":"2024-05-05' '13:06:29.105","now_ts":"2024-05-05' '13:06:36.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:06:35.005","puller_resolved_ts":"2024-05-05' '13:06:29.055","last_synced_ts":"2024-05-05' '13:06:29.105","now_ts":"2024-05-05' '13:06:36.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq -r .info
+ info='The data syncing is not finished, please wait'
+ target_message='The data syncing is not finished, please wait'
+ '[' 'The data syncing is not finished, please wait' '!=' 'The data syncing is not finished, please wait' ']'
+ sleep 130
1:06PM INF > Run case=sql/debezium/topic_name_sanitization_test.sql
table test.finish_mark not exists for 3-th check, retry later
table test.finish_mark not exists for 4-th check, retry later
table test.finish_mark not exists for 5-th check, retry later
table test.finish_mark not exists for 6-th check, retry later
table test.finish_mark not exists for 7-th check, retry later
1:06PM INF > Run case=sql/debezium/unsigned_integer_test.sql
table test.finish_mark not exists for 8-th check, retry later
table test.finish_mark exists
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
wait process cdc.test exit for 3-th time...
cdc.test: no process found
wait process cdc.test exit for 4-th time...
process cdc.test already exit
[Sun May  5 13:06:51 CST 2024] <<<<<< run test case canal_json_content_compatible success! >>>>>>
+ '[' kafka == mysql ']'
+ stop_tidb_cluster
+ stop_tidb_cluster
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/multi_topics/run.sh using Sink-Type: kafka... <<=================
The 1 times to try to start tidb cluster...
start tidb cluster in /tmp/tidb_cdc_test/multi_topics
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
<<< Run all test success >>>

[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c98c2bc0012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:32058, start at 2024-05-05 13:07:16.294969796 +0800 CST m=+5.106722751	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:09:16.301 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:07:16.271 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:57:16.271 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c98c2bc0012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:32058, start at 2024-05-05 13:07:16.294969796 +0800 CST m=+5.106722751	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:09:16.301 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:07:16.271 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:57:16.271 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c98c3640005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:32143, start at 2024-05-05 13:07:16.317083499 +0800 CST m=+5.077089154	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:09:16.323 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:07:16.313 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:57:16.313 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/multi_topics/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/multi_topics/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/multi_topics/tiflash/db/proxy"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/multi_topics/tiflash/log/proxy.log"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/multi_topics/tiflash-proxy.toml"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_topics.cli.33528.out cli tso query --pd=http://127.0.0.1:2379
+ set +x
+ tso='449546981165236225
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449546981165236225 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 13:07:21 CST 2024] <<<<<< START cdc server in multi_topics case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_topics.3356733569.out server --log-file /tmp/tidb_cdc_test/multi_topics/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/multi_topics/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:07:24 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/a0d9c031-cb97-49b2-8cd1-f5db4ed71f9b
	{"id":"a0d9c031-cb97-49b2-8cd1-f5db4ed71f9b","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885641}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472608d3d1
	a0d9c031-cb97-49b2-8cd1-f5db4ed71f9b

/tidb/cdc/default/default/upstream/7365377683737648840
	{"id":7365377683737648840,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/a0d9c031-cb97-49b2-8cd1-f5db4ed71f9b
	{"id":"a0d9c031-cb97-49b2-8cd1-f5db4ed71f9b","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885641}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472608d3d1
	a0d9c031-cb97-49b2-8cd1-f5db4ed71f9b

/tidb/cdc/default/default/upstream/7365377683737648840
	{"id":7365377683737648840,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/a0d9c031-cb97-49b2-8cd1-f5db4ed71f9b
	{"id":"a0d9c031-cb97-49b2-8cd1-f5db4ed71f9b","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885641}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472608d3d1
	a0d9c031-cb97-49b2-8cd1-f5db4ed71f9b

/tidb/cdc/default/default/upstream/7365377683737648840
	{"id":7365377683737648840,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_topics.cli.33628.out cli changefeed create --start-ts=449546981165236225 '--sink-uri=kafka://127.0.0.1:9092/multi_topics?protocol=canal-json&enable-tidb-extension=true&kafka-version=2.4.1' --config /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/multi_topics/conf/changefeed.toml
Create changefeed successfully!
ID: 34b22df4-e74f-4f5d-9869-18b397c06e71
Info: {"upstream_id":7365377683737648840,"namespace":"default","id":"34b22df4-e74f-4f5d-9869-18b397c06e71","sink_uri":"kafka://127.0.0.1:9092/multi_topics?protocol=canal-json\u0026enable-tidb-extension=true\u0026kafka-version=2.4.1","create_time":"2024-05-05T13:07:24.795090381+08:00","start_ts":449546981165236225,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"dispatchers":[{"matcher":["workload.*"],"topic":"workload"},{"matcher":["test.*"],"topic":"{schema}_{table}"}],"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449546981165236225,"checkpoint_ts":449546981165236225,"checkpoint_time":"2024-05-05 13:07:19.821"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
+ set +x
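(The changefeed above routes workload.* tables to a single "workload" topic and test.* tables to per-table {schema}_{table} topics. A rough, hypothetical sanity check against the local broker, assuming kafka-topics.sh is available inside the Kafka container, might be:)
    # Topics matching the dispatcher rules should appear once the test writes data:
    # one shared "workload" topic plus per-table topics such as test_<table>.
    kafka-topics.sh --bootstrap-server 127.0.0.1:9092 --list | grep -E '^(workload$|test_)'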
1:07PM INF > Run case=sql/dml.sql
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   221  100   221    0     0   2583      0 --:--:-- --:--:-- --:--:--  2600
+ synced_status='{"synced":true,"sink_checkpoint_ts":"2024-05-05 13:08:35.937","puller_resolved_ts":"2024-05-05 13:08:29.937","last_synced_ts":"2024-05-05 13:06:19.887","now_ts":"2024-05-05 13:08:37.000","info":"Data syncing is finished"}'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '13:08:35.937","puller_resolved_ts":"2024-05-05' '13:08:29.937","last_synced_ts":"2024-05-05' '13:06:19.887","now_ts":"2024-05-05' '13:08:37.000","info":"Data' syncing is 'finished"}'
++ jq .synced
+ status=true
+ '[' true '!=' true ']'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '13:08:35.937","puller_resolved_ts":"2024-05-05' '13:08:29.937","last_synced_ts":"2024-05-05' '13:06:19.887","now_ts":"2024-05-05' '13:08:37.000","info":"Data' syncing is 'finished"}'
++ jq -r .info
+ info='Data syncing is finished'
+ target_message='Data syncing is finished'
+ '[' 'Data syncing is finished' '!=' 'Data syncing is finished' ']'
+ cleanup_process cdc.test
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
wait process cdc.test exit for 3-th time...
process cdc.test already exit
+ stop_tidb_cluster
+ run_case_with_failpoint conf/changefeed-redo.toml
+ rm -rf /tmp/tidb_cdc_test/synced_status_with_redo
+ mkdir -p /tmp/tidb_cdc_test/synced_status_with_redo
+ start_tidb_cluster --workdir /tmp/tidb_cdc_test/synced_status_with_redo
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
The 1 times to try to start tidb cluster...
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
start tidb cluster in /tmp/tidb_cdc_test/synced_status_with_redo
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
1:08PM INF > All tests pass failed=0 passed=219
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   723  100   723    0     0   8718      0 --:--:-- --:--:-- --:--:--  8817
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 13:06:36.006","puller_resolved_ts":"2024-05-05 13:06:36.006","last_synced_ts":"2024-05-05 13:06:29.105","now_ts":"2024-05-05 13:08:46.000","info":"Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' \u003e '\''Resolved-Ts'\'' \u003e '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:06:36.006","puller_resolved_ts":"2024-05-05' '13:06:36.006","last_synced_ts":"2024-05-05' '13:06:29.105","now_ts":"2024-05-05' '13:08:46.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:06:36.006","puller_resolved_ts":"2024-05-05' '13:06:36.006","last_synced_ts":"2024-05-05' '13:06:29.105","now_ts":"2024-05-05' '13:08:46.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq -r .info
+ info='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ target_message='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ '[' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' '!=' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' ']'
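Note: the trace above is this test's synced-status assertion. It queries the TiCDC v2 API for the changefeed's synced state, then compares both the boolean synced flag and the human-readable info message against the expected values. A minimal sketch of that check, assuming the same endpoint and jq fields shown in the trace (target_message holds the expected text for this step):

    # sketch only -- mirrors the curl/jq/compare steps traced above
    synced_status=$(curl -s -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced)
    status=$(echo "$synced_status" | jq .synced)
    info=$(echo "$synced_status" | jq -r .info)
    if [ "$status" != false ] || [ "$info" != "$target_message" ]; then
        echo "synced status check failed" && exit 1
    fi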
+ cleanup_process cdc.test
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
+ stop_tidb_cluster
+ run_case_with_unavailable_tidb conf/changefeed.toml
+ rm -rf /tmp/tidb_cdc_test/synced_status
+ mkdir -p /tmp/tidb_cdc_test/synced_status
+ start_tidb_cluster --workdir /tmp/tidb_cdc_test/synced_status
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
The 1 times to try to start tidb cluster...
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
[Sun May  5 13:08:57 CST 2024] <<<<<< run test case debezium success! >>>>>>
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
start tidb cluster in /tmp/tidb_cdc_test/synced_status
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c9f18a00010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:19219, start at 2024-05-05 13:09:00.091293484 +0800 CST m=+5.169502183	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:11:00.097 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:09:00.072 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:59:00.072 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c9f18a00010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:19219, start at 2024-05-05 13:09:00.091293484 +0800 CST m=+5.169502183	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:11:00.097 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:09:00.072 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:59:00.072 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c9f18840014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-l25q9-6mpjx, pid:19298, start at 2024-05-05 13:09:00.08847432 +0800 CST m=+5.117874778	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:11:00.095 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:09:00.065 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:59:00.065 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash-proxy.toml"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash/log/proxy.log"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/synced_status_with_redo/tiflash/db/proxy"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
Verifying downstream PD is started...
+ cd /tmp/tidb_cdc_test/synced_status_with_redo
+ export 'GO_FAILPOINTS=github.com/pingcap/tiflow/cdc/owner/ChangefeedOwnerNotUpdateCheckpoint=return(true)'
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/ChangefeedOwnerNotUpdateCheckpoint=return(true)'
++ run_cdc_cli_tso_query 127.0.0.1 2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.cli.20747.out cli tso query --pd=http://127.0.0.1:2379
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
+ set +x
+ tso='449547008349831169
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449547008349831169 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ start_ts=449547008349831169
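Note: cdc cli tso query is run through the coverage-instrumented cdc.test binary, so its stdout bundles the TSO with the PASS/coverage lines; the script keeps only the first whitespace-separated field as start_ts. A minimal sketch of that extraction, assuming the same output shape:

    # sketch: keep the TSO, drop the trailing PASS/coverage text
    tso=$(cdc.test cli tso query --pd=http://127.0.0.1:2379)
    start_ts=$(echo $tso | awk -F ' ' '{print $1}')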
+ run_cdc_server --workdir /tmp/tidb_cdc_test/synced_status_with_redo --binary cdc.test
[Sun May  5 13:09:04 CST 2024] <<<<<< START cdc server in synced_status_with_redo case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/ChangefeedOwnerNotUpdateCheckpoint=return(true)'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.2079120793.out server --log-file /tmp/tidb_cdc_test/synced_status_with_redo/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/synced_status_with_redo/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:09:08 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/f2e38baf-0f12-45b8-8f07-56c6df46a64f
	{"id":"f2e38baf-0f12-45b8-8f07-56c6df46a64f","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885745}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47279c26ce
	f2e38baf-0f12-45b8-8f07-56c6df46a64f

/tidb/cdc/default/default/upstream/7365378131572866120
	{"id":7365378131572866120,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/f2e38baf-0f12-45b8-8f07-56c6df46a64f
	{"id":"f2e38baf-0f12-45b8-8f07-56c6df46a64f","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885745}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47279c26ce
	f2e38baf-0f12-45b8-8f07-56c6df46a64f

/tidb/cdc/default/default/upstream/7365378131572866120
	{"id":7365378131572866120,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/f2e38baf-0f12-45b8-8f07-56c6df46a64f
	{"id":"f2e38baf-0f12-45b8-8f07-56c6df46a64f","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885745}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47279c26ce
	f2e38baf-0f12-45b8-8f07-56c6df46a64f

/tidb/cdc/default/default/upstream/7365378131572866120
	{"id":7365378131572866120,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
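Note: the block above is the wait-for-cdc-server loop: it polls http://127.0.0.1:8300/debug/info with HTTP basic auth (ticdc:ticdc_secret) up to 50 times, sleeping 3 seconds between attempts, and breaks once the response contains 'etcd info'. A condensed sketch under those same assumptions:

    # sketch of the readiness loop traced above (same endpoint, credentials, and marker)
    for i in $(seq 0 50); do
        res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
        echo "$res" | grep -q 'etcd info' && break
        [ "$i" -eq 50 ] && { echo 'cdc server failed to start'; exit 1; }
        sleep 3
    done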
+ config_path=conf/changefeed-redo.toml
+ SINK_URI='mysql://root@127.0.0.1:3306/?max-txn-row=1'
+ run_cdc_cli changefeed create --start-ts=449547008349831169 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/conf/changefeed-redo.toml
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status_with_redo.cli.20843.out cli changefeed create --start-ts=449547008349831169 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status_with_redo/conf/changefeed-redo.toml
Create changefeed successfully!
ID: test-1
Info: {"upstream_id":7365378131572866120,"namespace":"default","id":"test-1","sink_uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create_time":"2024-05-05T13:09:08.48184173+08:00","start_ts":449547008349831169,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"eventual","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"storage":"file:///tmp/tidb_cdc_test/synced_status/redo","use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":120,"checkpoint_interval":20}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449547008349831169,"checkpoint_ts":449547008349831169,"checkpoint_time":"2024-05-05 13:09:03.522"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ set +x
+ sleep 20
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/lossy_ddl/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 13:09:09 CST 2024] <<<<<< run test case lossy_ddl success! >>>>>>
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c9fbeb00012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:23516, start at 2024-05-05 13:09:10.72405483 +0800 CST m=+5.045347290	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:11:10.729 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:09:10.700 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:59:10.700 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c9fbeb00012	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:23516, start at 2024-05-05 13:09:10.72405483 +0800 CST m=+5.045347290	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:11:10.729 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:09:10.700 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:59:10.700 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1c9fc0340015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:23605, start at 2024-05-05 13:09:10.839491668 +0800 CST m=+5.106681654	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:11:10.845 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:09:10.847 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-12:59:10.847 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/synced_status/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/synced_status/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash/db/proxy"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash-proxy.toml"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash/log/proxy.log"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/storage_csv_update/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 13:09:12 CST 2024] <<<<<< run test case storage_csv_update success! >>>>>>
+ cd /tmp/tidb_cdc_test/synced_status
++ run_cdc_cli_tso_query 127.0.0.1 2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.cli.24933.out cli tso query --pd=http://127.0.0.1:2379
+ set +x
+ tso='449547011175481345
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449547011175481345 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ start_ts=449547011175481345
+ run_cdc_server --workdir /tmp/tidb_cdc_test/synced_status --binary cdc.test
[Sun May  5 13:09:15 CST 2024] <<<<<< START cdc server in synced_status case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.2497124973.out server --log-file /tmp/tidb_cdc_test/synced_status/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/synced_status/cdc_data --cluster-id default
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:09:18 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/48b4d179-e09c-4b5f-96bc-17ff800b86bd
	{"id":"48b4d179-e09c-4b5f-96bc-17ff800b86bd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885756}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4727c60ed2
	48b4d179-e09c-4b5f-96bc-17ff800b86bd

/tidb/cdc/default/default/upstream/7365378171086810899
	{"id":7365378171086810899,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/48b4d179-e09c-4b5f-96bc-17ff800b86bd
	{"id":"48b4d179-e09c-4b5f-96bc-17ff800b86bd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885756}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4727c60ed2
	48b4d179-e09c-4b5f-96bc-17ff800b86bd

/tidb/cdc/default/default/upstream/7365378171086810899
	{"id":7365378171086810899,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/48b4d179-e09c-4b5f-96bc-17ff800b86bd
	{"id":"48b4d179-e09c-4b5f-96bc-17ff800b86bd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885756}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4727c60ed2
	48b4d179-e09c-4b5f-96bc-17ff800b86bd

/tidb/cdc/default/default/upstream/7365378171086810899
	{"id":7365378171086810899,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ config_path=conf/changefeed.toml
+ SINK_URI='mysql://root@127.0.0.1:3306/?max-txn-row=1'
+ run_cdc_cli changefeed create --start-ts=449547011175481345 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/conf/changefeed.toml
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.cli.25036.out cli changefeed create --start-ts=449547011175481345 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/conf/changefeed.toml
Create changefeed successfully!
ID: test-1
Info: {"upstream_id":7365378171086810899,"namespace":"default","id":"test-1","sink_uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create_time":"2024-05-05T13:09:19.27245605+08:00","start_ts":449547011175481345,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":120,"checkpoint_interval":20}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449547011175481345,"checkpoint_ts":449547011175481345,"checkpoint_time":"2024-05-05 13:09:14.301"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ set +x
+ run_sql 'USE TEST;Create table t1(a int primary key, b int);insert into t1 values(1,2);insert into t1 values(2,3);'
+ check_table_exists test.t1 127.0.0.1 3306
table test.t1 not exists for 1-th check, retry later
table test.t1 exists
+ sleep 5
+ kill_tidb
++ ps aux
++ grep tidb-server
++ grep /tmp/tidb_cdc_test/synced_status
+ info='jenkins    23516 12.4  0.0 2656644 243536 ?      Sl   13:09   0:02 tidb-server -P 4000 -config /tmp/tidb_cdc_test/synced_status/tidb-config-1714885745672123195.toml --store tikv --path 127.0.0.1:2379 --status=10080 --log-file /tmp/tidb_cdc_test/synced_status/tidb.log
jenkins    23520  3.5  0.0 2679692 201732 ?      Sl   13:09   0:00 tidb-server -P 4001 -config /tmp/tidb_cdc_test/synced_status/tidb-config-1714885745675145272.toml --store tikv --path 127.0.0.1:2379 --status=10081 --log-file /tmp/tidb_cdc_test/synced_status/tidb_other.log
jenkins    23605 14.1  0.0 2845376 257568 ?      Sl   13:09   0:03 tidb-server -P 3306 -config /tmp/tidb_cdc_test/synced_status/tidb-config-1714885745725876720.toml --store tikv --path 127.0.0.1:2479 --status=20080 --log-file /tmp/tidb_cdc_test/synced_status/tidb_down.log'
++ ps aux
++ grep tidb-server
++ grep /tmp/tidb_cdc_test/synced_status
++ awk '{print $2}'
++ xargs kill -9
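Note: kill_tidb above locates every tidb-server process whose command line references this test's workdir and sends SIGKILL, taking down both upstream TiDB servers and the downstream one at once. A minimal sketch of the same pipeline:

    # sketch: kill all tidb-server processes belonging to /tmp/tidb_cdc_test/synced_status
    ps aux | grep tidb-server | grep /tmp/tidb_cdc_test/synced_status \
        | awk '{print $2}' | xargs kill -9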
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   243  100   243    0     0   2493      0 --:--:-- --:--:-- --:--:--  2505
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 13:09:26.700","puller_resolved_ts":"2024-05-05 13:09:20.700","last_synced_ts":"2024-05-05 13:09:20.800","now_ts":"2024-05-05 13:09:27.000","info":"The data syncing is not finished, please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:09:26.700","puller_resolved_ts":"2024-05-05' '13:09:20.700","last_synced_ts":"2024-05-05' '13:09:20.800","now_ts":"2024-05-05' '13:09:27.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:09:26.700","puller_resolved_ts":"2024-05-05' '13:09:20.700","last_synced_ts":"2024-05-05' '13:09:20.800","now_ts":"2024-05-05' '13:09:27.000","info":"The' data syncing is not finished, please 'wait"}'
++ jq -r .info
+ info='The data syncing is not finished, please wait'
+ target_message='The data syncing is not finished, please wait'
+ '[' 'The data syncing is not finished, please wait' '!=' 'The data syncing is not finished, please wait' ']'
+ sleep 130
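Note: the 130-second sleep is just longer than the changefeed's synced_check_interval of 120 seconds (see the config in the changefeed created above), so the next query of the synced API reflects a full check window elapsing while TiDB is down.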
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   723  100   723    0     0   9284      0 --:--:-- --:--:-- --:--:--  9389
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 13:09:03.522","puller_resolved_ts":"1970-01-01 08:00:00.000","last_synced_ts":"1970-01-01 08:00:00.000","now_ts":"2024-05-05 13:09:29.000","info":"Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' \u003e '\''Resolved-Ts'\'' \u003e '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:09:03.522","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '13:09:29.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:09:03.522","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '13:09:29.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq -r .info
+ info='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ target_message='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ '[' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' '!=' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' ']'
+ export GO_FAILPOINTS=
+ GO_FAILPOINTS=
+ cleanup_process cdc.test
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
+ stop_tidb_cluster
[Sun May  5 13:09:26 CST 2024] <<<<<< START kafka consumer in multi_topics case >>>>>>
schema registry uri found: 1
[Sun May  5 13:09:26 CST 2024] <<<<<< START kafka consumer in multi_topics case >>>>>>
schema registry uri found: 2
[Sun May  5 13:09:26 CST 2024] <<<<<< START kafka consumer in multi_topics case >>>>>>
schema registry uri found: 3
[Sun May  5 13:09:26 CST 2024] <<<<<< START kafka consumer in multi_topics case >>>>>>
table test.table1 not exists for 1-th check, retry later
table test.table1 not exists for 2-th check, retry later
table test.table1 exists
table test.table2 exists
table test.table3 exists
check diff successfully
table test.table10 not exists for 1-th check, retry later
table test.table10 exists
table test.table20 exists
check diff successfully
+ check_logs /tmp/tidb_cdc_test/synced_status_with_redo
++ date
+ echo '[Sun May  5 13:09:40 CST 2024] <<<<<< run test case synced_status_with_redo success! >>>>>>'
[Sun May  5 13:09:40 CST 2024] <<<<<< run test case synced_status_with_redo success! >>>>>>
+ stop_tidb_cluster
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Sun May  5 13:10:03 CST 2024] <<<<<< START kafka consumer in multi_topics case >>>>>>
schema registry uri found: 10
[Sun May  5 13:10:03 CST 2024] <<<<<< START kafka consumer in multi_topics case >>>>>>
schema registry uri found: 20
[Sun May  5 13:10:03 CST 2024] <<<<<< START kafka consumer in multi_topics case >>>>>>
schema registry uri found: finish
table test.finish not exists for 1-th check, retry later
table test.finish not exists for 2-th check, retry later
table test.finish exists
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:10:09 CST 2024] <<<<<< run test case multi_topics success! >>>>>>
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/avro_basic/run.sh using Sink-Type: kafka... <<=================
Starting schema registry...
* About to connect() to 127.0.0.1 port 8088 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8088; Connection refused
* Closing connection 0
* About to connect() to 127.0.0.1 port 8088 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8088; Connection refused
* Closing connection 0
* About to connect() to 127.0.0.1 port 8088 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8088 (#0)
> GET / HTTP/1.1
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8088
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:10:24 GMT
< Content-Type: application/vnd.schemaregistry.v1+json
< Vary: Accept-Encoding, User-Agent
< Content-Length: 2
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
The 1 times to try to start tidb cluster...
start tidb cluster in /tmp/tidb_cdc_test/avro_basic
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1ca520d00014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:35386, start at 2024-05-05 13:10:38.948067705 +0800 CST m=+5.113490804	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:12:38.954 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:10:38.951 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:00:38.951 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1ca520d00014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:35386, start at 2024-05-05 13:10:38.948067705 +0800 CST m=+5.113490804	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:12:38.954 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:10:38.951 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:00:38.951 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1ca5216c0011	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:35471, start at 2024-05-05 13:10:38.958249735 +0800 CST m=+5.070275211	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:12:38.964 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:10:38.939 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:00:38.939 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/avro_basic/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/avro_basic/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/avro_basic/tiflash/log/proxy.log"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/avro_basic/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/avro_basic/tiflash-proxy.toml"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
 51    49    0     0  100    25      0    235 --:--:-- --:--:-- --:--:--   233
100    49  100    24  100    25    225    234 --:--:-- --:--:-- --:--:--   233
{"compatibility":"NONE"}+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.avro_basic.cli.36797.out cli tso query --pd=http://127.0.0.1:2379
+ set +x
+ tso='449547034322534401
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449547034322534401 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 13:10:44 CST 2024] <<<<<< START cdc server in avro_basic case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.avro_basic.3683336835.out server --log-file /tmp/tidb_cdc_test/avro_basic/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/avro_basic/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:10:47 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/9b9b18a9-9623-4a20-becf-225a2680b30d
	{"id":"9b9b18a9-9623-4a20-becf-225a2680b30d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885844}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47291e97d3
	9b9b18a9-9623-4a20-becf-225a2680b30d

/tidb/cdc/default/default/upstream/7365378556511099995
	{"id":7365378556511099995,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/9b9b18a9-9623-4a20-becf-225a2680b30d
	{"id":"9b9b18a9-9623-4a20-becf-225a2680b30d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885844}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47291e97d3
	9b9b18a9-9623-4a20-becf-225a2680b30d

/tidb/cdc/default/default/upstream/7365378556511099995
	{"id":7365378556511099995,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/9b9b18a9-9623-4a20-becf-225a2680b30d
	{"id":"9b9b18a9-9623-4a20-becf-225a2680b30d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885844}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f47291e97d3
	9b9b18a9-9623-4a20-becf-225a2680b30d

/tidb/cdc/default/default/upstream/7365378556511099995
	{"id":7365378556511099995,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
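The curl/grep/sleep trace above is the harness's readiness probe for the freshly started cdc server. A sketch of the same loop, assuming the default 127.0.0.1:8300 listen address and the ticdc:ticdc_secret basic-auth credentials shown in the trace:
# Poll the CDC /debug/info endpoint until the capture has registered in etcd,
# giving up after 50 attempts roughly 3 seconds apart.
for i in $(seq 0 50); do
    res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
    echo "$res" | grep -q 'failed to get info:' && { echo "cdc server returned an error"; exit 1; }
    echo "$res" | grep -q 'etcd info' && break        # capture/owner keys are present, server is ready
    [ "$i" -eq 50 ] && { echo "cdc server did not become ready in time"; exit 1; }
    sleep 3
done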
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.avro_basic.cli.36885.out cli changefeed create --start-ts=449547034322534401 '--sink-uri=kafka://127.0.0.1:9092/ticdc-avro-test?protocol=avro&enable-tidb-extension=true&avro-enable-watermark=true&avro-decimal-handling-mode=string&avro-bigint-unsigned-handling-mode=string' --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/avro_basic/conf/changefeed.toml --schema-registry=http://127.0.0.1:8088
Create changefeed successfully!
ID: ac27c165-48e5-44ad-9d99-6fdc5f55c931
Info: {"upstream_id":7365378556511099995,"namespace":"default","id":"ac27c165-48e5-44ad-9d99-6fdc5f55c931","sink_uri":"kafka://127.0.0.1:9092/ticdc-avro-test?protocol=avro\u0026enable-tidb-extension=true\u0026avro-enable-watermark=true\u0026avro-decimal-handling-mode=string\u0026avro-bigint-unsigned-handling-mode=string","create_time":"2024-05-05T13:10:47.556573094+08:00","start_ts":449547034322534401,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"avro","schema_registry":"http://127.0.0.1:8088","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"correctness","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449547034322534401,"checkpoint_ts":449547034322534401,"checkpoint_time":"2024-05-05 13:10:42.600"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
+ set +x
[Sun May  5 13:10:48 CST 2024] <<<<<< START kafka consumer in avro_basic case >>>>>>
schema registry uri found: http://127.0.0.1:8088
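The consumer launch itself is not echoed here, but the "Killed" cleanup lines later in this log show the command the harness uses. A sketch of that launch, assuming SINK_URI, UP_TIDB_HOST, UP_TIDB_PORT and CUR are already exported by run.sh:
# Replay messages from the Kafka topic into the downstream MySQL-compatible
# endpoint so the later "check diff" step can compare upstream and downstream.
cdc_kafka_consumer --upstream-uri "$SINK_URI" \
    --downstream-uri="mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" \
    --upstream-tidb-dsn="root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" \
    --config="$CUR/conf/changefeed.toml" 2>&1 &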
table test.finish_mark not exists for 1-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
table test.finish_mark exists
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:10:56 CST 2024] <<<<<< run test case avro_basic success! >>>>>>
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/canal_json_handle_key_only/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
start tidb cluster in /tmp/tidb_cdc_test/canal_json_handle_key_only
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1ca7ae600015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:38097, start at 2024-05-05 13:11:20.761597207 +0800 CST m=+5.115714815	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:13:20.767 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:11:20.728 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:01:20.728 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1ca7ae600015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:38097, start at 2024-05-05 13:11:20.761597207 +0800 CST m=+5.115714815	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:13:20.767 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:11:20.728 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:01:20.728 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1ca7afd40017	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:38183, start at 2024-05-05 13:11:20.859663126 +0800 CST m=+5.166712635	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:13:20.869 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:11:20.872 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:01:20.872 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/canal_json_handle_key_only/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/canal_json_handle_key_only/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/canal_json_handle_key_only/tiflash-proxy.toml"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/canal_json_handle_key_only/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/canal_json_handle_key_only/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_handle_key_only.cli.39561.out cli tso query --pd=http://127.0.0.1:2379
+ set +x
+ tso='449547045235326977
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449547045235326977 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 13:11:25 CST 2024] <<<<<< START cdc server in canal_json_handle_key_only case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_handle_key_only.3959739599.out server --log-file /tmp/tidb_cdc_test/canal_json_handle_key_only/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/canal_json_handle_key_only/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:11:28 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/e91bb065-d02e-4f34-acff-72fa59d3c3fd
	{"id":"e91bb065-d02e-4f34-acff-72fa59d3c3fd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885886}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4729c5b1d0
	e91bb065-d02e-4f34-acff-72fa59d3c3fd

/tidb/cdc/default/default/upstream/7365378740184408947
	{"id":7365378740184408947,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/e91bb065-d02e-4f34-acff-72fa59d3c3fd
	{"id":"e91bb065-d02e-4f34-acff-72fa59d3c3fd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885886}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4729c5b1d0
	e91bb065-d02e-4f34-acff-72fa59d3c3fd

/tidb/cdc/default/default/upstream/7365378740184408947
	{"id":7365378740184408947,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/e91bb065-d02e-4f34-acff-72fa59d3c3fd
	{"id":"e91bb065-d02e-4f34-acff-72fa59d3c3fd","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885886}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f4729c5b1d0
	e91bb065-d02e-4f34-acff-72fa59d3c3fd

/tidb/cdc/default/default/upstream/7365378740184408947
	{"id":7365378740184408947,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_handle_key_only.cli.39657.out cli changefeed create --start-ts=449547045235326977 '--sink-uri=kafka://127.0.0.1:9092/canal-json-handle-key-only?protocol=canal-json&enable-tidb-extension=true&max-message-bytes=1000&kafka-version=2.4.1' --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/canal_json_handle_key_only/conf/changefeed.toml
Create changefeed successfully!
ID: 14840d24-a965-4c48-979a-1098a4a7c34c
Info: {"upstream_id":7365378740184408947,"namespace":"default","id":"14840d24-a965-4c48-979a-1098a4a7c34c","sink_uri":"kafka://127.0.0.1:9092/canal-json-handle-key-only?protocol=canal-json\u0026enable-tidb-extension=true\u0026max-message-bytes=1000\u0026kafka-version=2.4.1","create_time":"2024-05-05T13:11:29.229659798+08:00","start_ts":449547045235326977,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"kafka_config":{"large_message_handle":{"large_message_handle_option":"handle-key-only","large_message_handle_compression":"snappy","claim_check_storage_uri":""}},"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449547045235326977,"checkpoint_ts":449547045235326977,"checkpoint_time":"2024-05-05 13:11:24.229"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
+ set +x
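With max-message-bytes=1000 in the sink URI, rows above that size are routed through the handle-key-only path; the kafka_config block in the Info JSON above implies a changefeed config along the following lines (a hedged reconstruction; the TOML key spelling is assumed to follow TiCDC's dashed convention):
# Hypothetical conf/changefeed.toml fragment matching the large_message_handle
# settings printed in the changefeed info above.
cat > conf/changefeed.toml <<'EOF'
[sink.kafka-config.large-message-handle]
large-message-handle-option = "handle-key-only"
large-message-handle-compression = "snappy"
EOF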
table test.finish_mark not exists for 1-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
table test.finish_mark exists
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:11:38 CST 2024] <<<<<< run test case canal_json_handle_key_only success! >>>>>>
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   221  100   221    0     0   2567      0 --:--:-- --:--:-- --:--:--  2600
+ synced_status='{"synced":true,"sink_checkpoint_ts":"2024-05-05 13:11:36.850","puller_resolved_ts":"2024-05-05 13:11:29.850","last_synced_ts":"2024-05-05 13:09:20.800","now_ts":"2024-05-05 13:11:38.000","info":"Data syncing is finished"}'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '13:11:36.850","puller_resolved_ts":"2024-05-05' '13:11:29.850","last_synced_ts":"2024-05-05' '13:09:20.800","now_ts":"2024-05-05' '13:11:38.000","info":"Data' syncing is 'finished"}'
++ jq .synced
+ status=true
+ '[' true '!=' true ']'
++ echo '{"synced":true,"sink_checkpoint_ts":"2024-05-05' '13:11:36.850","puller_resolved_ts":"2024-05-05' '13:11:29.850","last_synced_ts":"2024-05-05' '13:09:20.800","now_ts":"2024-05-05' '13:11:38.000","info":"Data' syncing is 'finished"}'
++ jq -r .info
+ info='Data syncing is finished'
+ target_message='Data syncing is finished'
+ '[' 'Data syncing is finished' '!=' 'Data syncing is finished' ']'
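The block above queries the v2 synced-status API and asserts on two fields with jq. A condensed sketch of the same check, assuming changefeed id test-1 on the default 8300 port:
# Fetch the synced status and fail fast if the changefeed is not fully synced.
synced_status=$(curl -s -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced)
status=$(echo "$synced_status" | jq .synced)
info=$(echo "$synced_status" | jq -r .info)
if [ "$status" != "true" ] || [ "$info" != "Data syncing is finished" ]; then
    echo "unexpected synced status: $synced_status"
    exit 1
fi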
+ cleanup_process cdc.test
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
+ stop_tidb_cluster
+ run_case_with_failpoint conf/changefeed.toml
+ rm -rf /tmp/tidb_cdc_test/synced_status
+ mkdir -p /tmp/tidb_cdc_test/synced_status
+ start_tidb_cluster --workdir /tmp/tidb_cdc_test/synced_status
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
The 1st attempt to start the tidb cluster...
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/canal_json_handle_key_only/run.sh: line 1: 39693 Killed                  cdc_kafka_consumer --upstream-uri $SINK_URI --downstream-uri="mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" --upstream-tidb-dsn="root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" --config="$CUR/conf/changefeed.toml" 2>&1
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/open_protocol_handle_key_only/run.sh using Sink-Type: kafka... <<=================
The 1st attempt to start the tidb cluster...
chdir: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory
start tidb cluster in /tmp/tidb_cdc_test/synced_status
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
start tidb cluster in /tmp/tidb_cdc_test/open_protocol_handle_key_only
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1caa41f00010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:26224, start at 2024-05-05 13:12:02.953287196 +0800 CST m=+5.121370931	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:14:02.959 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:12:02.940 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:02:02.940 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1caa41f00010	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:26224, start at 2024-05-05 13:12:02.953287196 +0800 CST m=+5.121370931	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:14:02.959 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:12:02.940 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:02:02.940 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1caa436c0014	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-g1835-cmdmf, pid:26308, start at 2024-05-05 13:12:03.055067565 +0800 CST m=+5.170940554	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:14:03.061 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:12:03.035 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:02:03.035 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/synced_status/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/synced_status/tiflash/log/error.log
arg matches is ArgMatches { args: {"data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash/db/proxy"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash-proxy.toml"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/synced_status/tiflash/log/proxy.log"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1caa41600021	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:40794, start at 2024-05-05 13:12:02.952859707 +0800 CST m=+5.068541865	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:14:02.959 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:12:02.954 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:02:02.954 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1caa41600021	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:40794, start at 2024-05-05 13:12:02.952859707 +0800 CST m=+5.068541865	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:14:02.959 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:12:02.954 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:02:02.954 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1caa41f40015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:40876, start at 2024-05-05 13:12:02.983810788 +0800 CST m=+5.051222763	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:14:02.991 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:12:02.991 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:02:02.991 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/open_protocol_handle_key_only/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/open_protocol_handle_key_only/tiflash/log/error.log
arg matches is ArgMatches { args: {"config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/open_protocol_handle_key_only/tiflash-proxy.toml"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/open_protocol_handle_key_only/tiflash/db/proxy"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/open_protocol_handle_key_only/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.open_protocol_handle_key_only.cli.42304.out cli tso query --pd=http://127.0.0.1:2379
+ cd /tmp/tidb_cdc_test/synced_status
+ export 'GO_FAILPOINTS=github.com/pingcap/tiflow/cdc/owner/ChangefeedOwnerNotUpdateCheckpoint=return(true)'
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/ChangefeedOwnerNotUpdateCheckpoint=return(true)'
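run_case_with_failpoint drives the next scenario by exporting a failpoint before the server starts; the failpoint-enabled cdc.test binary reads the variable at startup. A minimal sketch mirroring the export above:
# Keep the owner from advancing the checkpoint so the /synced API later reports
# an unsynced state; clear the variable afterwards so later processes do not
# inherit the failpoint.
export GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/ChangefeedOwnerNotUpdateCheckpoint=return(true)'
cdc.test server --log-file ./cdc.log --log-level debug --data-dir ./cdc_data --cluster-id default &
export GO_FAILPOINTS=''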
++ run_cdc_cli_tso_query 127.0.0.1 2379
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.cli.27750.out cli tso query --pd=http://127.0.0.1:2379
+ set +x
+ tso='449547056300687361
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449547056300687361 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ start_ts=449547056300687361
+ run_cdc_server --workdir /tmp/tidb_cdc_test/synced_status --binary cdc.test
[Sun May  5 13:12:07 CST 2024] <<<<<< START cdc server in synced_status case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ GO_FAILPOINTS='github.com/pingcap/tiflow/cdc/owner/ChangefeedOwnerNotUpdateCheckpoint=return(true)'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.2779127793.out server --log-file /tmp/tidb_cdc_test/synced_status/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/synced_status/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ set +x
+ tso='449547056317464577
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449547056317464577 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 13:12:07 CST 2024] <<<<<< START cdc server in open_protocol_handle_key_only case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.open_protocol_handle_key_only.4235042352.out server --log-file /tmp/tidb_cdc_test/open_protocol_handle_key_only/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/open_protocol_handle_key_only/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:12:10 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/62b1b569-0155-4676-98bc-29dce987ac9d
	{"id":"62b1b569-0155-4676-98bc-29dce987ac9d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885928}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472a6487cd
	62b1b569-0155-4676-98bc-29dce987ac9d

/tidb/cdc/default/default/upstream/7365378913255274657
	{"id":7365378913255274657,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/62b1b569-0155-4676-98bc-29dce987ac9d
	{"id":"62b1b569-0155-4676-98bc-29dce987ac9d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885928}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472a6487cd
	62b1b569-0155-4676-98bc-29dce987ac9d

/tidb/cdc/default/default/upstream/7365378913255274657
	{"id":7365378913255274657,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/62b1b569-0155-4676-98bc-29dce987ac9d
	{"id":"62b1b569-0155-4676-98bc-29dce987ac9d","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885928}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472a6487cd
	62b1b569-0155-4676-98bc-29dce987ac9d

/tidb/cdc/default/default/upstream/7365378913255274657
	{"id":7365378913255274657,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ config_path=conf/changefeed.toml
+ SINK_URI='mysql://root@127.0.0.1:3306/?max-txn-row=1'
+ run_cdc_cli changefeed create --start-ts=449547056300687361 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/conf/changefeed.toml
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.synced_status.cli.27845.out cli changefeed create --start-ts=449547056300687361 '--sink-uri=mysql://root@127.0.0.1:3306/?max-txn-row=1' --changefeed-id=test-1 --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/synced_status/conf/changefeed.toml
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:12:11 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/d999a5e2-1eef-45e6-b588-cd7e9d4551ba
	{"id":"d999a5e2-1eef-45e6-b588-cd7e9d4551ba","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885928}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472a6aa3ce
	d999a5e2-1eef-45e6-b588-cd7e9d4551ba

/tidb/cdc/default/default/upstream/7365378915437054021
	{"id":7365378915437054021,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/d999a5e2-1eef-45e6-b588-cd7e9d4551ba
	{"id":"d999a5e2-1eef-45e6-b588-cd7e9d4551ba","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885928}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472a6aa3ce
	d999a5e2-1eef-45e6-b588-cd7e9d4551ba

/tidb/cdc/default/default/upstream/7365378915437054021
	{"id":7365378915437054021,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/d999a5e2-1eef-45e6-b588-cd7e9d4551ba
	{"id":"d999a5e2-1eef-45e6-b588-cd7e9d4551ba","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885928}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472a6aa3ce
	d999a5e2-1eef-45e6-b588-cd7e9d4551ba

/tidb/cdc/default/default/upstream/7365378915437054021
	{"id":7365378915437054021,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.open_protocol_handle_key_only.cli.42407.out cli changefeed create --start-ts=449547056317464577 '--sink-uri=kafka://127.0.0.1:9092/open-protocol-handle-key-only?protocol=open-protocol&max-message-bytes=800&kafka-version=2.4.1' --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/open_protocol_handle_key_only/conf/changefeed.toml
Create changefeed successfully!
ID: 87cc7772-ff4f-411f-a39f-31333349716a
Info: {"upstream_id":7365378915437054021,"namespace":"default","id":"87cc7772-ff4f-411f-a39f-31333349716a","sink_uri":"kafka://127.0.0.1:9092/open-protocol-handle-key-only?protocol=open-protocol\u0026max-message-bytes=800\u0026kafka-version=2.4.1","create_time":"2024-05-05T13:12:11.463588086+08:00","start_ts":449547056317464577,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"kafka_config":{"large_message_handle":{"large_message_handle_option":"handle-key-only","large_message_handle_compression":"lz4","claim_check_storage_uri":""}},"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449547056317464577,"checkpoint_ts":449547056317464577,"checkpoint_time":"2024-05-05 13:12:06.504"}
PASS
Create changefeed successfully!
ID: test-1
Info: {"upstream_id":7365378913255274657,"namespace":"default","id":"test-1","sink_uri":"mysql://root@127.0.0.1:3306/?max-txn-row=1","create_time":"2024-05-05T13:12:11.420134427+08:00","start_ts":449547056300687361,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":120,"checkpoint_interval":20}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449547056300687361,"checkpoint_ts":449547056300687361,"checkpoint_time":"2024-05-05 13:12:06.440"}
PASS
coverage: 2.4% of statements in github.com/pingcap/tiflow/...
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
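The first changefeed above (open-protocol-handle-key-only) enables the handle-key-only large-message strategy: with max-message-bytes=800 in its sink URI, oversized events are reduced to their handle-key columns, as the kafka_config block in its Info JSON shows. As a quick way to inspect that block, the printed JSON can be piped through jq; info.json below is only a placeholder name for wherever the text after "Info:" is saved:

# Extract the large-message handling settings from the changefeed Info JSON
# (info.json is a hypothetical file holding the JSON printed after "Info:").
jq '.config.sink.kafka_config.large_message_handle' info.json
# Expected output for the changefeed above:
# {
#   "large_message_handle_option": "handle-key-only",
#   "large_message_handle_compression": "lz4",
#   "claim_check_storage_uri": ""
# }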
+ set +x
+ set +x
+ sleep 20
table test.finish_mark not exists for 1-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
table test.finish_mark exists
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:12:19 CST 2024] <<<<<< run test case open_protocol_handle_key_only success! >>>>>>
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/open_protocol_handle_key_only/run.sh: line 1: 42450 Killed                  cdc_kafka_consumer --upstream-uri $SINK_URI --downstream-uri="mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" --upstream-tidb-dsn="root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" --config="$CUR/conf/changefeed.toml" 2>&1
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/canal_json_claim_check/run.sh using Sink-Type: kafka... <<=================
Attempt 1 to start the tidb cluster...
start tidb cluster in /tmp/tidb_cdc_test/canal_json_claim_check
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
++ curl -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   723  100   723    0     0  10315      0 --:--:-- --:--:-- --:--:-- 10328
+ synced_status='{"synced":false,"sink_checkpoint_ts":"2024-05-05 13:12:06.440","puller_resolved_ts":"1970-01-01 08:00:00.000","last_synced_ts":"1970-01-01 08:00:00.000","now_ts":"2024-05-05 13:12:32.000","info":"Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' \u003e '\''Resolved-Ts'\'' \u003e '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait"}'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:12:06.440","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '13:12:32.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq .synced
+ status=false
+ '[' false '!=' false ']'
++ echo '{"synced":false,"sink_checkpoint_ts":"2024-05-05' '13:12:06.440","puller_resolved_ts":"1970-01-01' '08:00:00.000","last_synced_ts":"1970-01-01' '08:00:00.000","now_ts":"2024-05-05' '13:12:32.000","info":"Please' check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view ''\''TiKV-Details'\''' '\u003e' ''\''Resolved-Ts'\''' '\u003e' ''\''Max' Leader Resolved TS 'gap'\''' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please 'wait"}'
++ jq -r .info
+ info='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ target_message='Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait'
+ '[' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' '!=' 'Please check whether PD is online and TiKV Regions are all available. If PD is offline or some TiKV regions are not available, it means that the data syncing process is complete. To check whether TiKV regions are all available, you can view '\''TiKV-Details'\'' > '\''Resolved-Ts'\'' > '\''Max Leader Resolved TS gap'\'' on Grafana. If the gap is large, such as a few minutes, it means that some regions in TiKV are unavailable. Otherwise, if the gap is small and PD is online, it means the data syncing is incomplete, so please wait' ']'
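The trace above is the synced-status assertion of the synced_status case: it queries the v2 API for changefeed test-1, extracts the synced flag and the info message with jq, and compares both against the expected values. Condensed into a minimal sketch using the same endpoint and filters as in the trace:

# Query the synced status of changefeed "test-1" and verify the expected state.
synced_status=$(curl -s -X GET http://127.0.0.1:8300/api/v2/changefeeds/test-1/synced)
status=$(echo "$synced_status" | jq .synced)      # expected: false at this point in the test
info=$(echo "$synced_status" | jq -r .info)       # human-readable explanation of the state
if [ "$status" != "false" ]; then
  echo "unexpected synced flag: $status" && exit 1
fi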
+ export GO_FAILPOINTS=
+ GO_FAILPOINTS=
+ cleanup_process cdc.test
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
+ stop_tidb_cluster
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
+ check_logs /tmp/tidb_cdc_test/synced_status
++ date
+ echo '[Sun May  5 13:12:43 CST 2024] <<<<<< run test case synced_status success! >>>>>>'
[Sun May  5 13:12:43 CST 2024] <<<<<< run test case synced_status success! >>>>>>
+ stop_tidb_cluster
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1caccb740004	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:43568, start at 2024-05-05 13:12:44.511552498 +0800 CST m=+5.089547451	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:14:44.518 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:12:44.509 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:02:44.509 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1caccb740004	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:43568, start at 2024-05-05 13:12:44.511552498 +0800 CST m=+5.089547451	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:14:44.518 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:12:44.509 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:02:44.509 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1cacca800015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:43645, start at 2024-05-05 13:12:44.488946999 +0800 CST m=+5.018674508	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:14:44.497 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:12:44.498 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:02:44.498 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
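The VARIABLE_NAME/VARIABLE_VALUE/COMMENT dumps above come from the bootstrap and GC bookkeeping table mysql.tidb, queried once against the upstream and once against the downstream cluster while verifying that TiDB is up. A minimal sketch of reproducing the upstream dump, assuming the upstream TiDB listens on the usual test port 4000:

# Dump the bootstrap and GC worker variables shown above from the upstream TiDB.
# Port 4000 is an assumption (the default TiDB port used by these integration tests).
mysql -h 127.0.0.1 -P 4000 -u root -e 'SELECT * FROM mysql.tidb;'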
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/canal_json_claim_check/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/canal_json_claim_check/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/canal_json_claim_check/tiflash-proxy.toml"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/canal_json_claim_check/tiflash/log/proxy.log"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/canal_json_claim_check/tiflash/db/proxy"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_claim_check.cli.45077.out cli tso query --pd=http://127.0.0.1:2379
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
+ set +x
+ tso='449547067198013441
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449547067198013441 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
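The tso query step above captures the CLI output (the TSO, a PASS line, and a coverage line) in one variable and keeps only the first whitespace-separated field as the changefeed start-ts. A minimal standalone sketch of the same extraction:

# Query a TSO from PD via the cdc CLI; the output also carries PASS and coverage lines.
tso=$(run_cdc_cli tso query --pd=http://127.0.0.1:2379)
# Unquoted echo collapses the newlines, so awk can take the first field: the TSO itself.
start_ts=$(echo $tso | awk -F ' ' '{print $1}')
echo "start-ts=$start_ts"    # e.g. 449547067198013441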
[Sun May  5 13:12:49 CST 2024] <<<<<< START cdc server in canal_json_claim_check case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ GO_FAILPOINTS=
+ [[ no != \n\o ]]
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_claim_check.4511745119.out server --log-file /tmp/tidb_cdc_test/canal_json_claim_check/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/canal_json_claim_check/cdc_data --cluster-id default
+ (( i = 0 ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:12:52 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4207fda0-f65e-458d-8270-89a95c2e8d37
	{"id":"4207fda0-f65e-458d-8270-89a95c2e8d37","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885969}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472b090bd2
	4207fda0-f65e-458d-8270-89a95c2e8d37

/tidb/cdc/default/default/upstream/7365379090266196926
	{"id":7365379090266196926,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4207fda0-f65e-458d-8270-89a95c2e8d37
	{"id":"4207fda0-f65e-458d-8270-89a95c2e8d37","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885969}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472b090bd2
	4207fda0-f65e-458d-8270-89a95c2e8d37

/tidb/cdc/default/default/upstream/7365379090266196926
	{"id":7365379090266196926,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/4207fda0-f65e-458d-8270-89a95c2e8d37
	{"id":"4207fda0-f65e-458d-8270-89a95c2e8d37","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714885969}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472b090bd2
	4207fda0-f65e-458d-8270-89a95c2e8d37

/tidb/cdc/default/default/upstream/7365379090266196926
	{"id":7365379090266196926,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
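The loop traced above is the standard readiness check after starting the cdc server: it polls /debug/info with HTTP basic auth until the response contains 'etcd info', retrying up to 50 times with a 3-second sleep (the Authorization header seen earlier, dGljZGM6dGljZGNfc2VjcmV0, is simply the base64 encoding of ticdc:ticdc_secret). A condensed sketch of that wait:

# Wait until the cdc server at 127.0.0.1:8300 has registered its capture in etcd.
for i in $(seq 1 50); do
  res=$(curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret)
  if echo "$res" | grep -q 'failed to get info:'; then
    echo "cdc server returned an error" && exit 1
  fi
  if echo "$res" | grep -q 'etcd info'; then
    break        # capture, owner and upstream keys are visible, so the server is ready
  fi
  sleep 3
done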
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.canal_json_claim_check.cli.45165.out cli changefeed create --start-ts=449547067198013441 '--sink-uri=kafka://127.0.0.1:9092/canal-json-claim-check?protocol=canal-json&enable-tidb-extension=true&max-message-bytes=1000&kafka-version=2.4.1' --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/canal_json_claim_check/conf/changefeed.toml
Create changefeed successfully!
ID: e3db1860-76c1-48b7-9317-9d42b9d5db37
Info: {"upstream_id":7365379090266196926,"namespace":"default","id":"e3db1860-76c1-48b7-9317-9d42b9d5db37","sink_uri":"kafka://127.0.0.1:9092/canal-json-claim-check?protocol=canal-json\u0026enable-tidb-extension=true\u0026max-message-bytes=1000\u0026kafka-version=2.4.1","create_time":"2024-05-05T13:12:53.002760723+08:00","start_ts":449547067198013441,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"canal-json","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"kafka_config":{"large_message_handle":{"large_message_handle_option":"claim-check","large_message_handle_compression":"snappy","claim_check_storage_uri":"file:///tmp/canal-json-claim-check"}},"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449547067198013441,"checkpoint_ts":449547067198013441,"checkpoint_time":"2024-05-05 13:12:48.010"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
+ set +x
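The changefeed just created uses claim-check handling for oversized messages: with max-message-bytes=1000 in the sink URI, events larger than that are written (snappy-compressed) to file:///tmp/canal-json-claim-check and only a reference is sent to Kafka, as the kafka_config block in its Info JSON shows. The changefeed.toml passed via --config is not printed in this log; the sketch below is a plausible reconstruction of that section only, with the TOML table and key names inferred from the JSON fields rather than taken from the actual file:

# Hypothetical reconstruction of the large-message section of changefeed.toml
# (key names inferred from the Info JSON; the real file is not shown in this log).
cat > /tmp/changefeed-claim-check.toml <<'EOF'
[sink.kafka-config.large-message-handle]
large-message-handle-option = "claim-check"
large-message-handle-compression = "snappy"
claim-check-storage-uri = "file:///tmp/canal-json-claim-check"
EOF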
table test.finish_mark not exists for 1-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
table test.finish_mark exists
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:13:00 CST 2024] <<<<<< run test case canal_json_claim_check success! >>>>>>
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/canal_json_claim_check/run.sh: line 1: 45210 Killed                  cdc_kafka_consumer --upstream-uri $SINK_URI --downstream-uri="mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" --upstream-tidb-dsn="root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" --config="$CUR/conf/changefeed.toml" 2>&1
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/open_protocol_claim_check/run.sh using Sink-Type: kafka... <<=================
Attempt 1 to start the tidb cluster...
start tidb cluster in /tmp/tidb_cdc_test/open_protocol_claim_check
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1caf54c80013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:46366, start at 2024-05-05 13:13:26.084287549 +0800 CST m=+5.091604009	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:15:26.090 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:13:26.066 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:03:26.066 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1caf54c80013	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:46366, start at 2024-05-05 13:13:26.084287549 +0800 CST m=+5.091604009	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:15:26.090 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:13:26.066 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:03:26.066 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1caf556c0015	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:46451, start at 2024-05-05 13:13:26.140550376 +0800 CST m=+5.089679164	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:15:26.146 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:13:26.107 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:03:26.107 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/open_protocol_claim_check/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/open_protocol_claim_check/tiflash/log/error.log
arg matches is ArgMatches { args: {"engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/open_protocol_claim_check/tiflash-proxy.toml"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/open_protocol_claim_check/tiflash/log/proxy.log"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/open_protocol_claim_check/tiflash/db/proxy"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.open_protocol_claim_check.cli.47844.out cli tso query --pd=http://127.0.0.1:2379
+ set +x
+ tso='449547078105038849
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449547078105038849 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
+ pd_host=127.0.0.1
+ pd_port=2379
+ is_tls=false
+ '[' false == true ']'
++ run_cdc_cli tso query --pd=http://127.0.0.1:2379
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.open_protocol_claim_check.cli.47898.out cli tso query --pd=http://127.0.0.1:2379
+ set +x
+ tso='449547079467925505
PASS
coverage: 1.8% of statements in github.com/pingcap/tiflow/...'
+ echo 449547079467925505 PASS coverage: 1.8% of statements in github.com/pingcap/tiflow/...
+ awk -F ' ' '{print $1}'
+ set +x
[Sun May  5 13:13:36 CST 2024] <<<<<< START cdc server in open_protocol_claim_check case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ (( i <= 50 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.open_protocol_claim_check.4793447936.out server --log-file /tmp/tidb_cdc_test/open_protocol_claim_check/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/open_protocol_claim_check/cdc_data --cluster-id default
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:13:39 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/bc772afe-ff85-427e-bcd0-b27fa6ac27c8
	{"id":"bc772afe-ff85-427e-bcd0-b27fa6ac27c8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714886016}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472bab632e
	bc772afe-ff85-427e-bcd0-b27fa6ac27c8

/tidb/cdc/default/default/upstream/7365379266645012790
	{"id":7365379266645012790,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/bc772afe-ff85-427e-bcd0-b27fa6ac27c8
	{"id":"bc772afe-ff85-427e-bcd0-b27fa6ac27c8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714886016}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472bab632e
	bc772afe-ff85-427e-bcd0-b27fa6ac27c8

/tidb/cdc/default/default/upstream/7365379266645012790
	{"id":7365379266645012790,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/bc772afe-ff85-427e-bcd0-b27fa6ac27c8
	{"id":"bc772afe-ff85-427e-bcd0-b27fa6ac27c8","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714886016}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472bab632e
	bc772afe-ff85-427e-bcd0-b27fa6ac27c8

/tidb/cdc/default/default/upstream/7365379266645012790
	{"id":7365379266645012790,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.open_protocol_claim_check.cli.47988.out cli changefeed create --start-ts=449547078105038849 --target-ts=449547079467925505 '--sink-uri=kafka://127.0.0.1:9092/open-protocol-claim-check?protocol=open-protocol&max-message-bytes=800&kafka-version=2.4.1' --config=/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/open_protocol_claim_check/conf/changefeed.toml
Create changefeed successfully!
ID: 3dfd3cb5-2c3e-4061-89cd-cdb821ac0675
Info: {"upstream_id":7365379266645012790,"namespace":"default","id":"3dfd3cb5-2c3e-4061-89cd-cdb821ac0675","sink_uri":"kafka://127.0.0.1:9092/open-protocol-claim-check?protocol=open-protocol\u0026max-message-bytes=800\u0026kafka-version=2.4.1","create_time":"2024-05-05T13:13:39.811190243+08:00","start_ts":449547078105038849,"target_ts":449547079467925505,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["*.*"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":false,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"kafka_config":{"large_message_handle":{"large_message_handle_option":"claim-check","large_message_handle_compression":"lz4","claim_check_storage_uri":"file:///tmp/open-protocol-claim-check"}},"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449547078105038849,"checkpoint_ts":449547078105038849,"checkpoint_time":"2024-05-05 13:13:29.617"}
PASS
coverage: 2.5% of statements in github.com/pingcap/tiflow/...
+ set +x
table test.finish_mark not exists for 1-th check, retry later
table test.finish_mark not exists for 2-th check, retry later
table test.finish_mark exists
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:13:46 CST 2024] <<<<<< run test case open_protocol_claim_check success! >>>>>>
/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/open_protocol_claim_check/run.sh: line 1: 48030 Killed                  cdc_kafka_consumer --upstream-uri $SINK_URI --downstream-uri="mysql://root@127.0.0.1:3306/?safe-mode=true&batch-dml-enable=false" --upstream-tidb-dsn="root@tcp(${UP_TIDB_HOST}:${UP_TIDB_PORT})/?" --config="$CUR/conf/changefeed.toml" 2>&1
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/canal_json_storage_basic/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 13:13:58 CST 2024] <<<<<< run test case canal_json_storage_basic success! >>>>>>
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/canal_json_storage_partition_table/run.sh using Sink-Type: kafka... <<=================
[Sun May  5 13:14:01 CST 2024] <<<<<< run test case canal_json_storage_partition_table success! >>>>>>
=================>> Running test /home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/tests/integration_tests/multi_tables_ddl/run.sh using Sink-Type: kafka... <<=================
* About to connect() to 127.0.0.1 port 24927 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:24927; Connection refused
* Closing connection 0

 You are running an older version of MinIO released 3 years ago 
 Update: Run `mc admin update` 


Attempting encryption of all config, IAM users and policies on MinIO backend
Endpoint:  http://127.0.0.1:24927

Object API (Amazon S3 compatible):
   Go:         https://docs.min.io/docs/golang-client-quickstart-guide
   Java:       https://docs.min.io/docs/java-client-quickstart-guide
   Python:     https://docs.min.io/docs/python-client-quickstart-guide
   JavaScript: https://docs.min.io/docs/javascript-client-quickstart-guide
   .NET:       https://docs.min.io/docs/dotnet-client-quickstart-guide
* About to connect() to 127.0.0.1 port 24927 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 24927 (#0)
> GET / HTTP/1.1
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:24927
> Accept: */*
> 
< HTTP/1.1 403 Forbidden
< Accept-Ranges: bytes
< Content-Length: 226
< Content-Security-Policy: block-all-mixed-content
< Content-Type: application/xml
< Server: MinIO/RELEASE.2020-07-27T18-37-02Z
< Vary: Origin
< X-Amz-Request-Id: 17CC7FC621D9E8D8
< X-Xss-Protection: 1; mode=block
< Date: Sun, 05 May 2024 05:14:07 GMT
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
Bucket 's3://logbucket/' created
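The multi_tables_ddl case first brings up a local MinIO server on port 24927 and then creates the s3://logbucket/ bucket; the two curl exchanges above show the usual pattern of waiting until the endpoint stops refusing connections (any HTTP answer, even the anonymous 403 Forbidden, means the server is up). A minimal sketch of that wait:

# Wait for the local MinIO endpoint to accept connections
# (a 403 for the anonymous request is fine; only "connection refused" means not ready).
while ! curl -o /dev/null -s http://127.0.0.1:24927/; do
  sleep 1
done
echo "MinIO is reachable at http://127.0.0.1:24927"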
Attempt 1 to start the tidb cluster...
start tidb cluster in /tmp/tidb_cdc_test/multi_tables_ddl
Starting Upstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Starting Downstream PD...
Release Version: v8.2.0-alpha-14-g1679dbca2
Edition: Community
Git Commit Hash: 1679dbca25b3483d1375c7e747da27e99ad77360
Git Branch: master
UTC Build Time:  2024-04-30 08:09:12
Verifying upstream PD is started...
Verifying downstream PD is started...
Starting Upstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Downstream TiKV...
TiKV 
Release Version:   8.2.0-alpha
Edition:           Community
Git Commit Hash:   72a0fd5b00235a7c56014b77ddd933e2a0d33c88
Git Commit Branch: master
UTC Build Time:    2024-04-30 02:23:51
Rust Version:      rustc 1.77.0-nightly (89e2160c4 2023-12-27)
Enable Features:   memory-engine pprof-fp jemalloc mem-profiling portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine trace-async-tasks openssl-vendored
Profile:           dist_release
Starting Upstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Starting Downstream TiDB...
Release Version: v8.2.0-alpha-79-g600b2ed4bf
Edition: Community
Git Commit Hash: 600b2ed4bf0aa38224a1c4c4c68831820735515c
Git Branch: master
UTC Build Time: 2024-05-01 02:56:48
GoVersion: go1.21.6
Race Enabled: false
Check Table Before Drop: false
Store: unistore
Verifying Upstream TiDB is started...
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
ERROR 2003 (HY000): Can't connect to MySQL server on '127.0.0.1' (111)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1cb29d0c0006	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:49513, start at 2024-05-05 13:14:19.845896911 +0800 CST m=+5.114774319	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:16:19.852 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:14:19.843 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:04:19.843 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1cb29d0c0006	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:49513, start at 2024-05-05 13:14:19.845896911 +0800 CST m=+5.114774319	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:16:19.852 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:14:19.843 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:04:19.843 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Verifying Downstream TiDB is started...
VARIABLE_NAME	VARIABLE_VALUE	COMMENT
bootstrapped	True	Bootstrap flag. Do not delete.
tidb_server_version	196	Bootstrap version. Do not delete.
system_tz	Asia/Shanghai	TiDB Global System Timezone.
new_collation_enabled	True	If the new collations are enabled. Do not edit it.
ddl_table_version	3	DDL Table Version. Do not delete.
tikv_gc_leader_uuid	63d1cb29db40005	Current GC worker leader UUID. (DO NOT EDIT)
tikv_gc_leader_desc	host:pingcap-tiflow-pull-cdc-integration-kafka-test-1856-nm9bc-lnj5s, pid:49593, start at 2024-05-05 13:14:19.889354005 +0800 CST m=+5.111243259	Host name and pid of current GC leader. (DO NOT EDIT)
tikv_gc_leader_lease	20240505-13:16:19.899 +0800	Current GC worker leader lease. (DO NOT EDIT)
tikv_gc_auto_concurrency	true	Let TiDB pick the concurrency automatically. If set false, tikv_gc_concurrency will be used
tikv_gc_enable	true	Current GC enable status
tikv_gc_run_interval	10m0s	GC run interval, at least 10m, in Go format.
tikv_gc_life_time	10m0s	All versions within life time will not be collected by GC, at least 10m, in Go format.
tikv_gc_last_run_time	20240505-13:14:19.885 +0800	The time when last GC starts. (DO NOT EDIT)
tikv_gc_safe_point	20240505-13:04:19.885 +0800	All versions after safe point can be accessed. (DO NOT EDIT)
Starting Upstream TiFlash...
TiFlash
Release Version: v8.2.0-alpha-16-g8e170090f
Edition:         Community
Git Commit Hash: 8e170090fad91c94bef8d908e21c195c1d145b02
Git Branch:      HEAD
UTC Build Time:  2024-04-30 02:34:21
Enable Features: jemalloc sm4(GmSSL) avx2 avx512 unwind thinlto
Profile:         RELWITHDEBINFO
Compiler:        clang++ 13.0.0

Raft Proxy
Git Commit Hash:   7dc50b4eb06124e31f03adb06c20ff7ab61c5f79
Git Commit Branch: HEAD
UTC Build Time:    2024-04-30 02:38:45
Rust Version:      rustc 1.67.0-nightly (96ddd32c4 2022-11-14)
Storage Engine:    tiflash
Prometheus Prefix: tiflash_proxy_
Profile:           release
Enable Features:   external-jemalloc portable sse test-engine-kv-rocksdb test-engine-raft-raft-engine cloud-aws cloud-gcp cloud-azure openssl-vendored
Verifying Upstream TiFlash is started...
Logging trace to /tmp/tidb_cdc_test/multi_tables_ddl/tiflash/log/server.log
Logging errors to /tmp/tidb_cdc_test/multi_tables_ddl/tiflash/log/error.log
arg matches is ArgMatches { args: {"addr": MatchedArg { occurs: 1, indices: [20], vals: ["127.0.0.1:9000"] }, "advertise-addr": MatchedArg { occurs: 1, indices: [4], vals: ["127.0.0.1:9000"] }, "data-dir": MatchedArg { occurs: 1, indices: [6], vals: ["/tmp/tidb_cdc_test/multi_tables_ddl/tiflash/db/proxy"] }, "engine-version": MatchedArg { occurs: 1, indices: [12], vals: ["v8.2.0-alpha-16-g8e170090f"] }, "engine-label": MatchedArg { occurs: 1, indices: [14], vals: ["tiflash"] }, "engine-git-hash": MatchedArg { occurs: 1, indices: [10], vals: ["8e170090fad91c94bef8d908e21c195c1d145b02"] }, "config": MatchedArg { occurs: 1, indices: [8], vals: ["/tmp/tidb_cdc_test/multi_tables_ddl/tiflash-proxy.toml"] }, "engine-addr": MatchedArg { occurs: 1, indices: [2], vals: ["127.0.0.1:9500"] }, "pd-endpoints": MatchedArg { occurs: 1, indices: [16], vals: ["127.0.0.1:2379"] }, "log-file": MatchedArg { occurs: 1, indices: [18], vals: ["/tmp/tidb_cdc_test/multi_tables_ddl/tiflash/log/proxy.log"] }}, subcommand: None, usage: Some("USAGE:\n    TiFlash Proxy [FLAGS] [OPTIONS] --engine-git-hash <engine-git-hash> --engine-label <engine-label> --engine-version <engine-version>") }
[Sun May  5 13:14:23 CST 2024] <<<<<< START cdc server in multi_tables_ddl case >>>>>>
+ [[ '' == \t\r\u\e ]]
+ set +e
+ get_info_fail_msg='failed to get info:'
+ etcd_info_msg='etcd info'
+ '[' -z '' ']'
+ curl_status_cmd='curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL'
+ [[ no != \n\o ]]
+ GO_FAILPOINTS=
+ (( i = 0 ))
+ cdc.test -test.coverprofile=/tmp/tidb_cdc_test/cov.multi_tables_ddl.5096150963.out server --log-file /tmp/tidb_cdc_test/multi_tables_ddl/cdc.log --log-level debug --data-dir /tmp/tidb_cdc_test/multi_tables_ddl/cdc_data --cluster-id default
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connection refused
* Failed connect to 127.0.0.1:8300; Connection refused
* Closing connection 0
+ res=
+ echo ''
+ grep -q 'failed to get info:'
+ echo ''
+ grep -q 'etcd info'
+ '[' 0 -eq 50 ']'
+ sleep 3
+ (( i++ ))
+ (( i <= 50 ))
++ curl -vsL --max-time 20 http://127.0.0.1:8300/debug/info --user ticdc:ticdc_secret -vsL
* About to connect() to 127.0.0.1 port 8300 (#0)
*   Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 8300 (#0)
* Server auth using Basic with user 'ticdc'
> GET /debug/info HTTP/1.1
> Authorization: Basic dGljZGM6dGljZGNfc2VjcmV0
> User-Agent: curl/7.29.0
> Host: 127.0.0.1:8300
> Accept: */*
> 
< HTTP/1.1 200 OK
< Date: Sun, 05 May 2024 05:14:26 GMT
< Content-Length: 815
< Content-Type: text/plain; charset=utf-8
< 
{ [data not shown]
* Connection #0 to host 127.0.0.1 left intact
+ res='

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/a9348296-2a54-4ae6-b4b9-718cd0d9b99f
	{"id":"a9348296-2a54-4ae6-b4b9-718cd0d9b99f","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714886063}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472c834fc9
	a9348296-2a54-4ae6-b4b9-718cd0d9b99f

/tidb/cdc/default/default/upstream/7365379508944909231
	{"id":7365379508944909231,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/a9348296-2a54-4ae6-b4b9-718cd0d9b99f
	{"id":"a9348296-2a54-4ae6-b4b9-718cd0d9b99f","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714886063}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472c834fc9
	a9348296-2a54-4ae6-b4b9-718cd0d9b99f

/tidb/cdc/default/default/upstream/7365379508944909231
	{"id":7365379508944909231,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'failed to get info:'
+ echo '

*** owner info ***:



*** processors info ***:



*** etcd info ***:

/tidb/cdc/default/__cdc_meta__/capture/a9348296-2a54-4ae6-b4b9-718cd0d9b99f
	{"id":"a9348296-2a54-4ae6-b4b9-718cd0d9b99f","address":"127.0.0.1:8300","version":"v8.2.0-alpha-53-g0de8dc3e4","git-hash":"0de8dc3e43ec741eba58047155ce7f3dba8eb4f7","deploy-path":"/home/jenkins/agent/workspace/pingcap/tiflow/pull_cdc_integration_kafka_test/tiflow/bin/cdc.test","start-timestamp":1714886063}

/tidb/cdc/default/__cdc_meta__/meta/meta-version
	1

/tidb/cdc/default/__cdc_meta__/owner/22318f472c834fc9
	a9348296-2a54-4ae6-b4b9-718cd0d9b99f

/tidb/cdc/default/default/upstream/7365379508944909231
	{"id":7365379508944909231,"pd-endpoints":"http://127.0.0.1:2379,http://127.0.0.1:2379","key-path":"","cert-path":"","ca-path":"","cert-allowed-cn":null}'
+ grep -q 'etcd info'
+ break
+ set +x
Create changefeed successfully!
ID: test-normal
Info: {"upstream_id":7365379508944909231,"namespace":"default","id":"test-normal","sink_uri":"kafka://127.0.0.1:9092/ticdc-multi-tables-ddl-test-normal-3682?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:14:26.394621+08:00","start_ts":449547092123451393,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["multi_tables_ddl_test.t1","multi_tables_ddl_test.t2","multi_tables_ddl_test.t3","multi_tables_ddl_test.t4","multi_tables_ddl_test.t1_7","multi_tables_ddl_test.t2_7","multi_tables_ddl_test.finish_mark"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":true,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449547092123451393,"checkpoint_ts":449547092123451393,"checkpoint_time":"2024-05-05 13:14:23.093"}
Create changefeed successfully!
ID: test-error-1
Info: {"upstream_id":7365379508944909231,"namespace":"default","id":"test-error-1","sink_uri":"kafka://127.0.0.1:9092/ticdc-multi-tables-ddl-test-error-1-25566?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:14:26.594833405+08:00","start_ts":449547092123451393,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["multi_tables_ddl_test.t5","multi_tables_ddl_test.t6","multi_tables_ddl_test.t7","multi_tables_ddl_test.t8"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":true,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449547092123451393,"checkpoint_ts":449547092123451393,"checkpoint_time":"2024-05-05 13:14:23.093"}
Create changefeed successfully!
ID: test-error-2
Info: {"upstream_id":7365379508944909231,"namespace":"default","id":"test-error-2","sink_uri":"kafka://127.0.0.1:9092/ticdc-multi-tables-ddl-test-error-2-10501?protocol=open-protocol\u0026partition-num=4\u0026kafka-version=2.4.1\u0026max-message-bytes=10485760","create_time":"2024-05-05T13:14:26.794228208+08:00","start_ts":449547092123451393,"config":{"memory_quota":1073741824,"case_sensitive":false,"force_replicate":false,"ignore_ineligible_table":false,"check_gc_safe_point":true,"enable_sync_point":false,"enable_table_monitor":false,"bdr_mode":false,"sync_point_interval":600000000000,"sync_point_retention":86400000000000,"filter":{"rules":["multi_tables_ddl_test.t9","multi_tables_ddl_test.t10"]},"mounter":{"worker_num":16},"sink":{"protocol":"open-protocol","csv":{"delimiter":",","quote":"\"","null":"\\N","include_commit_ts":true,"binary_encoding_method":"base64","output_old_value":false,"output_handle_key":false},"encoder_concurrency":32,"terminator":"\r\n","date_separator":"day","enable_partition_separator":true,"enable_kafka_sink_v2":false,"only_output_updated_columns":false,"delete_only_output_handle_key_columns":false,"content_compatible":false,"advance_timeout":150,"send_bootstrap_interval_in_sec":120,"send_bootstrap_in_msg_count":10000,"send_bootstrap_to_all_partition":true,"debezium_disable_schema":false,"debezium":{"output_old_value":true},"open":{"output_old_value":true}},"consistent":{"level":"none","max_log_size":64,"flush_interval":2000,"meta_flush_interval":200,"encoding_worker_num":16,"flush_worker_num":8,"use_file_backend":false,"memory_usage":{"memory_quota_percentage":50}},"scheduler":{"enable_table_across_nodes":false,"region_threshold":100000,"write_key_threshold":0},"integrity":{"integrity_check_level":"none","corruption_handle_level":"warn"},"changefeed_error_stuck_duration":1800000000000,"synced_status":{"synced_check_interval":300,"checkpoint_interval":15}},"state":"normal","creator_version":"v8.2.0-alpha-53-g0de8dc3e4","resolved_ts":449547092123451393,"checkpoint_ts":449547092123451393,"checkpoint_time":"2024-05-05 13:14:23.093"}
[Sun May  5 13:14:26 CST 2024] <<<<<< START kafka consumer in multi_tables_ddl case >>>>>>
[Sun May  5 13:14:26 CST 2024] <<<<<< START kafka consumer in multi_tables_ddl case >>>>>>
[Sun May  5 13:14:26 CST 2024] <<<<<< START kafka consumer in multi_tables_ddl case >>>>>>
table multi_tables_ddl_test.t55 exists
table multi_tables_ddl_test.t66 exists
table multi_tables_ddl_test.t7 exists
table multi_tables_ddl_test.t88 exists
table multi_tables_ddl_test.finish_mark not exists for 1-th check, retry later
table multi_tables_ddl_test.finish_mark exists
check table exists success
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=test-normal
+ expected_state=normal
+ error_msg=null
+ tls_dir=
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c test-normal -s
+ info='{
  "upstream_id": 7365379508944909231,
  "namespace": "default",
  "id": "test-normal",
  "state": "normal",
  "checkpoint_tso": 449547096593268757,
  "checkpoint_time": "2024-05-05 13:14:40.144",
  "error": null
}'
+ echo '{
  "upstream_id": 7365379508944909231,
  "namespace": "default",
  "id": "test-normal",
  "state": "normal",
  "checkpoint_tso": 449547096593268757,
  "checkpoint_time": "2024-05-05 13:14:40.144",
  "error": null
}'
{
  "upstream_id": 7365379508944909231,
  "namespace": "default",
  "id": "test-normal",
  "state": "normal",
  "checkpoint_tso": 449547096593268757,
  "checkpoint_time": "2024-05-05 13:14:40.144",
  "error": null
}
++ echo '{' '"upstream_id":' 7365379508944909231, '"namespace":' '"default",' '"id":' '"test-normal",' '"state":' '"normal",' '"checkpoint_tso":' 449547096593268757, '"checkpoint_time":' '"2024-05-05' '13:14:40.144",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365379508944909231, '"namespace":' '"default",' '"id":' '"test-normal",' '"state":' '"normal",' '"checkpoint_tso":' 449547096593268757, '"checkpoint_time":' '"2024-05-05' '13:14:40.144",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=test-error-1
+ expected_state=normal
+ error_msg=null
+ tls_dir=
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c test-error-1 -s
+ info='{
  "upstream_id": 7365379508944909231,
  "namespace": "default",
  "id": "test-error-1",
  "state": "normal",
  "checkpoint_tso": 449547097287688203,
  "checkpoint_time": "2024-05-05 13:14:42.793",
  "error": null
}'
+ echo '{
  "upstream_id": 7365379508944909231,
  "namespace": "default",
  "id": "test-error-1",
  "state": "normal",
  "checkpoint_tso": 449547097287688203,
  "checkpoint_time": "2024-05-05 13:14:42.793",
  "error": null
}'
{
  "upstream_id": 7365379508944909231,
  "namespace": "default",
  "id": "test-error-1",
  "state": "normal",
  "checkpoint_tso": 449547097287688203,
  "checkpoint_time": "2024-05-05 13:14:42.793",
  "error": null
}
++ echo '{' '"upstream_id":' 7365379508944909231, '"namespace":' '"default",' '"id":' '"test-error-1",' '"state":' '"normal",' '"checkpoint_tso":' 449547097287688203, '"checkpoint_time":' '"2024-05-05' '13:14:42.793",' '"error":' null '}'
++ jq -r .state
+ state=normal
+ [[ ! normal == \n\o\r\m\a\l ]]
++ echo '{' '"upstream_id":' 7365379508944909231, '"namespace":' '"default",' '"id":' '"test-error-1",' '"state":' '"normal",' '"checkpoint_tso":' 449547097287688203, '"checkpoint_time":' '"2024-05-05' '13:14:42.793",' '"error":' null '}'
++ jq -r .error.message
+ message=null
+ [[ ! null =~ null ]]
+ endpoints=http://127.0.0.1:2379
+ changefeed_id=test-error-2
+ expected_state=failed
+ error_msg=ErrSyncRenameTableFailed
+ tls_dir=
+ [[ http://127.0.0.1:2379 =~ https ]]
++ cdc cli changefeed query --pd=http://127.0.0.1:2379 -c test-error-2 -s
+ info='{
  "upstream_id": 7365379508944909231,
  "namespace": "default",
  "id": "test-error-2",
  "state": "failed",
  "checkpoint_tso": 449547096239112196,
  "checkpoint_time": "2024-05-05 13:14:38.793",
  "error": {
    "time": "2024-05-05T13:14:39.845438828+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrSyncRenameTableFailed",
    "message": "[CDC:ErrSyncRenameTableFailed]table'\''s old name is not in filter rule, and its new name in filter rule table id '\''130'\'', ddl query: [rename table t11 to t9], it'\''s an unexpected behavior, if you want to replicate this table, please add its old name to filter rule."
  }
}'
+ echo '{
  "upstream_id": 7365379508944909231,
  "namespace": "default",
  "id": "test-error-2",
  "state": "failed",
  "checkpoint_tso": 449547096239112196,
  "checkpoint_time": "2024-05-05 13:14:38.793",
  "error": {
    "time": "2024-05-05T13:14:39.845438828+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrSyncRenameTableFailed",
    "message": "[CDC:ErrSyncRenameTableFailed]table'\''s old name is not in filter rule, and its new name in filter rule table id '\''130'\'', ddl query: [rename table t11 to t9], it'\''s an unexpected behavior, if you want to replicate this table, please add its old name to filter rule."
  }
}'
{
  "upstream_id": 7365379508944909231,
  "namespace": "default",
  "id": "test-error-2",
  "state": "failed",
  "checkpoint_tso": 449547096239112196,
  "checkpoint_time": "2024-05-05 13:14:38.793",
  "error": {
    "time": "2024-05-05T13:14:39.845438828+08:00",
    "addr": "127.0.0.1:8300",
    "code": "CDC:ErrSyncRenameTableFailed",
    "message": "[CDC:ErrSyncRenameTableFailed]table's old name is not in filter rule, and its new name in filter rule table id '130', ddl query: [rename table t11 to t9], it's an unexpected behavior, if you want to replicate this table, please add its old name to filter rule."
  }
}
++ jq -r .state
++ echo '{' '"upstream_id":' 7365379508944909231, '"namespace":' '"default",' '"id":' '"test-error-2",' '"state":' '"failed",' '"checkpoint_tso":' 449547096239112196, '"checkpoint_time":' '"2024-05-05' '13:14:38.793",' '"error":' '{' '"time":' '"2024-05-05T13:14:39.845438828+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrSyncRenameTableFailed",' '"message":' '"[CDC:ErrSyncRenameTableFailed]table'\''s' old name is not in filter rule, and its new name in filter rule table id ''\''130'\'',' ddl query: '[rename' table t11 to 't9],' 'it'\''s' an unexpected behavior, if you want to replicate this table, please add its old name to filter 'rule."' '}' '}'
+ state=failed
+ [[ ! failed == \f\a\i\l\e\d ]]
++ jq -r .error.message
++ echo '{' '"upstream_id":' 7365379508944909231, '"namespace":' '"default",' '"id":' '"test-error-2",' '"state":' '"failed",' '"checkpoint_tso":' 449547096239112196, '"checkpoint_time":' '"2024-05-05' '13:14:38.793",' '"error":' '{' '"time":' '"2024-05-05T13:14:39.845438828+08:00",' '"addr":' '"127.0.0.1:8300",' '"code":' '"CDC:ErrSyncRenameTableFailed",' '"message":' '"[CDC:ErrSyncRenameTableFailed]table'\''s' old name is not in filter rule, and its new name in filter rule table id ''\''130'\'',' ddl query: '[rename' table t11 to 't9],' 'it'\''s' an unexpected behavior, if you want to replicate this table, please add its old name to filter 'rule."' '}' '}'
+ message='[CDC:ErrSyncRenameTableFailed]table'\''s old name is not in filter rule, and its new name in filter rule table id '\''130'\'', ddl query: [rename table t11 to t9], it'\''s an unexpected behavior, if you want to replicate this table, please add its old name to filter rule.'
+ [[ ! [CDC:ErrSyncRenameTableFailed]table's old name is not in filter rule, and its new name in filter rule table id '130', ddl query: [rename table t11 to t9], it's an unexpected behavior, if you want to replicate this table, please add its old name to filter rule. =~ ErrSyncRenameTableFailed ]]
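
The three traces above run the same verification against test-normal, test-error-1 and test-error-2: query the changefeed, compare its state with the expected value, and match its error message against the expected pattern. A minimal sketch of that check, assuming a bash helper; the function name is hypothetical, and the query and jq steps mirror the trace (TLS flags for https PD endpoints are omitted).

check_changefeed_state() {  # hypothetical name for the helper traced above
    local endpoints=$1 changefeed_id=$2 expected_state=$3 error_msg=$4
    local info state message
    info=$(cdc cli changefeed query --pd="$endpoints" -c "$changefeed_id" -s)
    echo "$info"
    state=$(echo "$info" | jq -r .state)
    if [[ ! "$state" == "$expected_state" ]]; then
        echo "changefeed $changefeed_id state is $state, expected $expected_state"
        exit 1
    fi
    message=$(echo "$info" | jq -r .error.message)
    if [[ ! "$message" =~ $error_msg ]]; then
        echo "changefeed $changefeed_id error is '$message', expected to match '$error_msg'"
        exit 1
    fi
}

# As traced above:
check_changefeed_state http://127.0.0.1:2379 test-normal normal null
check_changefeed_state http://127.0.0.1:2379 test-error-1 normal null
check_changefeed_state http://127.0.0.1:2379 test-error-2 failed ErrSyncRenameTableFailed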
check diff successfully
wait process cdc.test exit for 1-th time...
wait process cdc.test exit for 2-th time...
cdc.test: no process found
wait process cdc.test exit for 3-th time...
process cdc.test already exit
[Sun May  5 13:14:45 CST 2024] <<<<<< run test case multi_tables_ddl success! >>>>>>
Exiting on signal: INTERRUPT
<<< Run all test success >>>
[Pipeline] }
Cache not saved (ws/jenkins-pingcap-tiflow-pull_cdc_integration_kafka_test-1856/tiflow-cdc already exists)
[Pipeline] // cache
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
Finished: SUCCESS