Console Output

Skipping 641 KB..
INFO: Reading rc options for 'run' from /data/..2024_05_06_13_27_46.2492278083/bazel:
  Inherited 'build' options: --remote_cache=http://ats.apps.svc/brc/tidb --remote_timeout=30s
INFO: Reading rc options for 'run' from /home/jenkins/.bazelrc:
  Inherited 'build' options: --local_ram_resources=29491 --local_cpu_resources=8 --jobs=8 --repository_cache=/share/.cache/bazel-repository-cache
INFO: Found applicable config definition build:ci in file /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb/.bazelrc: --color=yes --experimental_remote_cache_compression
INFO: Found applicable config definition run:ci in file /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb/.bazelrc: --color=yes

Bazel caught terminate signal; shutting down.


Bazel caught terminate signal; shutting down.


Could not interrupt server: (14) Connection reset by peer


Could not interrupt server: (14) failed to connect to all addresses


Server terminated abruptly (error code: 14, error message: 'Connection reset by peer', log file: '/home/jenkins/.tidb/tmp/903aa8c667333396b92cebbde26882dc/server/jvm.out')

make: *** [bazel_ci_simple_prepare] Error 37
script returned exit code 143
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
make: *** [bazel_ci_simple_prepare] Terminated
script returned exit code 143
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
INFO: Invocation ID: c357a08a-6a60-4523-a02b-57cd6deca71c
INFO: Reading 'startup' options from /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb/.bazelrc: --host_jvm_args=-Xmx4g, --unlimit_coredumps
INFO: Options provided by the client:
  Inherited 'common' options: --isatty=0 --terminal_columns=80
INFO: Reading rc options for 'run' from /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb/.bazelrc:
  Inherited 'build' options: --announce_rc --experimental_guard_against_concurrent_changes --experimental_remote_merkle_tree_cache --java_language_version=17 --java_runtime_version=17 --tool_java_language_version=17 --tool_java_runtime_version=17 --incompatible_strict_action_env --incompatible_enable_cc_toolchain_resolution
INFO: Reading rc options for 'run' from /data/..2024_05_06_13_27_46.3223338557/bazel:
  Inherited 'build' options: --remote_cache=http://ats.apps.svc/brc/tidb --remote_timeout=30s
INFO: Reading rc options for 'run' from /home/jenkins/.bazelrc:
  Inherited 'build' options: --local_ram_resources=29491 --local_cpu_resources=8 --jobs=8 --repository_cache=/share/.cache/bazel-repository-cache
INFO: Found applicable config definition build:ci in file /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb/.bazelrc: --color=yes --experimental_remote_cache_compression
INFO: Found applicable config definition run:ci in file /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb/.bazelrc: --color=yes
bazel --output_user_root=/home/jenkins/.tidb/tmp run --config=ci --repository_cache=/share/.cache/bazel-repository-cache //:gazelle
Extracting Bazel installation...
Starting local Bazel server and connecting to it...
bazel --output_user_root=/home/jenkins/.tidb/tmp run --config=ci --repository_cache=/share/.cache/bazel-repository-cache //:gazelle
Extracting Bazel installation...
make: *** [bazel_ci_simple_prepare] Terminated
script returned exit code 143
INFO: Invocation ID: 3e1c2b34-7b5e-416e-97ca-2e8fe470a914
INFO: Reading 'startup' options from /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb/.bazelrc: --host_jvm_args=-Xmx4g, --unlimit_coredumps
INFO: Options provided by the client:
  Inherited 'common' options: --isatty=0 --terminal_columns=80
INFO: Reading rc options for 'run' from /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb/.bazelrc:
  Inherited 'build' options: --announce_rc --experimental_guard_against_concurrent_changes --experimental_remote_merkle_tree_cache --java_language_version=17 --java_runtime_version=17 --tool_java_language_version=17 --tool_java_runtime_version=17 --incompatible_strict_action_env --incompatible_enable_cc_toolchain_resolution
INFO: Reading rc options for 'run' from /data/..2024_05_06_13_27_46.947484816/bazel:
  Inherited 'build' options: --remote_cache=http://ats.apps.svc/brc/tidb --remote_timeout=30s
INFO: Reading rc options for 'run' from /home/jenkins/.bazelrc:
  Inherited 'build' options: --local_ram_resources=29491 --local_cpu_resources=8 --jobs=8 --repository_cache=/share/.cache/bazel-repository-cache
INFO: Found applicable config definition build:ci in file /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb/.bazelrc: --color=yes --experimental_remote_cache_compression
INFO: Found applicable config definition run:ci in file /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb/.bazelrc: --color=yes
Loading: 
Loading: 
kill finished with exit code 0
Sending interrupt signal to process
Killing processes

Bazel caught terminate signal; shutting down.


Bazel caught terminate signal; shutting down.

script returned exit code 143
kill finished with exit code 0
Sending interrupt signal to process
Killing processes
Loading: 
Loading: 

Bazel caught terminate signal; shutting down.


Bazel caught terminate signal; shutting down.

script returned exit code 143
kill finished with exit code 0

Bazel caught terminate signal; shutting down.


Bazel caught terminate signal; shutting down.

script returned exit code 143
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_statisticstest'
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
Post stage
Post stage
Post stage
Post stage
Post stage
Post stage
Post stage
Post stage
Post stage
Post stage
Post stage
Post stage
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] junit
Recording test results
[Pipeline] junit
Recording test results
[Pipeline] junit
Recording test results
[Pipeline] junit
Recording test results
[Pipeline] junit
Recording test results
[Pipeline] junit
Recording test results
[Pipeline] junit
Recording test results
[Pipeline] junit
Recording test results
[Pipeline] junit
Recording test results
[Pipeline] junit
Recording test results
[Pipeline] junit
Recording test results
[Pipeline] junit
Recording test results
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
None of the test reports contained any result
[Checks API] No suitable checks publisher found.
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_txntest'
None of the test reports contained any result
[Checks API] No suitable checks publisher found.
None of the test reports contained any result
[Checks API] No suitable checks publisher found.
None of the test reports contained any result
[Checks API] No suitable checks publisher found.
None of the test reports contained any result
[Checks API] No suitable checks publisher found.
None of the test reports contained any result
[Checks API] No suitable checks publisher found.
None of the test reports contained any result
[Checks API] No suitable checks publisher found.
None of the test reports contained any result
[Checks API] No suitable checks publisher found.
None of the test reports contained any result
[Checks API] No suitable checks publisher found.
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
None of the test reports contained any result
[Checks API] No suitable checks publisher found.
[Pipeline] // dir
[Pipeline] }
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
None of the test reports contained any result
[Checks API] No suitable checks publisher found.
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
No test report files were found. Configuration error?
[Pipeline] {
No test report files were found. Configuration error?
No test report files were found. Configuration error?
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
No test report files were found. Configuration error?
No test report files were found. Configuration error?
No test report files were found. Configuration error?
[Pipeline] sh
No test report files were found. Configuration error?
No test report files were found. Configuration error?
No test report files were found. Configuration error?
No test report files were found. Configuration error?
[Pipeline] sh
No test report files were found. Configuration error?
None of the test reports contained any result
[Checks API] No suitable checks publisher found.
+ str='run_real_tikv_tests.sh bazel_addindextest1'
+ logs_dir=logs_run_real_tikv_tests.sh_bazel_addindextest1
+ mkdir -p logs_run_real_tikv_tests.sh_bazel_addindextest1
+ mv pd1.log pd2.log pd3.log logs_run_real_tikv_tests.sh_bazel_addindextest1
+ mv tikv1.log tikv2.log tikv3.log logs_run_real_tikv_tests.sh_bazel_addindextest1
+ mv tests/integrationtest/integration-test.out logs_run_real_tikv_tests.sh_bazel_addindextest1
mv: cannot stat 'tests/integrationtest/integration-test.out': No such file or directory
+ true
+ tar -czvf logs_run_real_tikv_tests.sh_bazel_addindextest1.tar.gz logs_run_real_tikv_tests.sh_bazel_addindextest1
logs_run_real_tikv_tests.sh_bazel_addindextest1/
logs_run_real_tikv_tests.sh_bazel_addindextest1/tikv2.log
logs_run_real_tikv_tests.sh_bazel_addindextest1/pd1.log
logs_run_real_tikv_tests.sh_bazel_addindextest1/pd3.log
logs_run_real_tikv_tests.sh_bazel_addindextest1/tikv1.log
logs_run_real_tikv_tests.sh_bazel_addindextest1/tikv3.log
logs_run_real_tikv_tests.sh_bazel_addindextest1/pd2.log
[Pipeline] sh
[Pipeline] sh
+ str='run_real_tikv_tests.sh bazel_addindextest3'
+ logs_dir=logs_run_real_tikv_tests.sh_bazel_addindextest3
+ mkdir -p logs_run_real_tikv_tests.sh_bazel_addindextest3
+ mv pd1.log pd2.log pd3.log logs_run_real_tikv_tests.sh_bazel_addindextest3
+ mv tikv1.log tikv2.log tikv3.log logs_run_real_tikv_tests.sh_bazel_addindextest3
+ mv tests/integrationtest/integration-test.out logs_run_real_tikv_tests.sh_bazel_addindextest3
mv: cannot stat 'tests/integrationtest/integration-test.out': No such file or directory
+ true
+ tar -czvf logs_run_real_tikv_tests.sh_bazel_addindextest3.tar.gz logs_run_real_tikv_tests.sh_bazel_addindextest3
logs_run_real_tikv_tests.sh_bazel_addindextest3/
logs_run_real_tikv_tests.sh_bazel_addindextest3/tikv2.log
logs_run_real_tikv_tests.sh_bazel_addindextest3/pd3.log
logs_run_real_tikv_tests.sh_bazel_addindextest3/tikv3.log
logs_run_real_tikv_tests.sh_bazel_addindextest3/pd2.log
logs_run_real_tikv_tests.sh_bazel_addindextest3/tikv1.log
logs_run_real_tikv_tests.sh_bazel_addindextest3/pd1.log
[Pipeline] sh
No test report files were found. Configuration error?
[Pipeline] sh
+ str='run_real_tikv_tests.sh bazel_addindextest'
+ logs_dir=logs_run_real_tikv_tests.sh_bazel_addindextest
+ mkdir -p logs_run_real_tikv_tests.sh_bazel_addindextest
+ mv pd1.log pd2.log pd3.log logs_run_real_tikv_tests.sh_bazel_addindextest
+ mv tikv1.log tikv2.log tikv3.log logs_run_real_tikv_tests.sh_bazel_addindextest
+ mv tests/integrationtest/integration-test.out logs_run_real_tikv_tests.sh_bazel_addindextest
mv: cannot stat 'tests/integrationtest/integration-test.out': No such file or directory
+ true
+ tar -czvf logs_run_real_tikv_tests.sh_bazel_addindextest.tar.gz logs_run_real_tikv_tests.sh_bazel_addindextest
logs_run_real_tikv_tests.sh_bazel_addindextest/
logs_run_real_tikv_tests.sh_bazel_addindextest/tikv3.log
logs_run_real_tikv_tests.sh_bazel_addindextest/pd1.log
logs_run_real_tikv_tests.sh_bazel_addindextest/tikv2.log
logs_run_real_tikv_tests.sh_bazel_addindextest/tikv1.log
logs_run_real_tikv_tests.sh_bazel_addindextest/pd2.log
logs_run_real_tikv_tests.sh_bazel_addindextest/pd3.log
+ str='run_real_tikv_tests.sh bazel_importintotest'
+ logs_dir=logs_run_real_tikv_tests.sh_bazel_importintotest
+ mkdir -p logs_run_real_tikv_tests.sh_bazel_importintotest
+ mv pd1.log pd2.log pd3.log logs_run_real_tikv_tests.sh_bazel_importintotest
+ mv tikv1.log tikv2.log tikv3.log logs_run_real_tikv_tests.sh_bazel_importintotest
+ mv tests/integrationtest/integration-test.out logs_run_real_tikv_tests.sh_bazel_importintotest
mv: cannot stat 'tests/integrationtest/integration-test.out': No such file or directory
+ true
+ tar -czvf logs_run_real_tikv_tests.sh_bazel_importintotest.tar.gz logs_run_real_tikv_tests.sh_bazel_importintotest
logs_run_real_tikv_tests.sh_bazel_importintotest/
logs_run_real_tikv_tests.sh_bazel_importintotest/tikv3.log
logs_run_real_tikv_tests.sh_bazel_importintotest/pd1.log
logs_run_real_tikv_tests.sh_bazel_importintotest/pd2.log
logs_run_real_tikv_tests.sh_bazel_importintotest/tikv1.log
logs_run_real_tikv_tests.sh_bazel_importintotest/tikv2.log
logs_run_real_tikv_tests.sh_bazel_importintotest/pd3.log
[Pipeline] sh
+ str='run_real_tikv_tests.sh bazel_importintotest2'
+ logs_dir=logs_run_real_tikv_tests.sh_bazel_importintotest2
+ mkdir -p logs_run_real_tikv_tests.sh_bazel_importintotest2
+ mv pd1.log pd2.log pd3.log logs_run_real_tikv_tests.sh_bazel_importintotest2
+ mv tikv1.log tikv2.log tikv3.log logs_run_real_tikv_tests.sh_bazel_importintotest2
+ mv tests/integrationtest/integration-test.out logs_run_real_tikv_tests.sh_bazel_importintotest2
mv: cannot stat 'tests/integrationtest/integration-test.out': No such file or directory
+ true
+ tar -czvf logs_run_real_tikv_tests.sh_bazel_importintotest2.tar.gz logs_run_real_tikv_tests.sh_bazel_importintotest2
logs_run_real_tikv_tests.sh_bazel_importintotest2/
logs_run_real_tikv_tests.sh_bazel_importintotest2/pd2.log
logs_run_real_tikv_tests.sh_bazel_importintotest2/tikv3.log
logs_run_real_tikv_tests.sh_bazel_importintotest2/pd3.log
logs_run_real_tikv_tests.sh_bazel_importintotest2/tikv1.log
logs_run_real_tikv_tests.sh_bazel_importintotest2/tikv2.log
logs_run_real_tikv_tests.sh_bazel_importintotest2/pd1.log
[Pipeline] sh
[Pipeline] sh
+ str='run_real_tikv_tests.sh bazel_addindextest4'
+ logs_dir=logs_run_real_tikv_tests.sh_bazel_addindextest4
+ mkdir -p logs_run_real_tikv_tests.sh_bazel_addindextest4
+ mv pd1.log pd2.log pd3.log logs_run_real_tikv_tests.sh_bazel_addindextest4
+ mv tikv1.log tikv2.log tikv3.log logs_run_real_tikv_tests.sh_bazel_addindextest4
+ mv tests/integrationtest/integration-test.out logs_run_real_tikv_tests.sh_bazel_addindextest4
mv: cannot stat 'tests/integrationtest/integration-test.out': No such file or directory
+ true
+ tar -czvf logs_run_real_tikv_tests.sh_bazel_addindextest4.tar.gz logs_run_real_tikv_tests.sh_bazel_addindextest4
logs_run_real_tikv_tests.sh_bazel_addindextest4/
logs_run_real_tikv_tests.sh_bazel_addindextest4/tikv2.log
logs_run_real_tikv_tests.sh_bazel_addindextest4/tikv1.log
logs_run_real_tikv_tests.sh_bazel_addindextest4/pd2.log
logs_run_real_tikv_tests.sh_bazel_addindextest4/tikv3.log
logs_run_real_tikv_tests.sh_bazel_addindextest4/pd3.log
logs_run_real_tikv_tests.sh_bazel_addindextest4/pd1.log
+ str='run_real_tikv_tests.sh bazel_pipelineddmltest'
+ logs_dir=logs_run_real_tikv_tests.sh_bazel_pipelineddmltest
+ mkdir -p logs_run_real_tikv_tests.sh_bazel_pipelineddmltest
+ mv pd1.log pd2.log pd3.log logs_run_real_tikv_tests.sh_bazel_pipelineddmltest
+ mv tikv1.log tikv2.log tikv3.log logs_run_real_tikv_tests.sh_bazel_pipelineddmltest
+ mv tests/integrationtest/integration-test.out logs_run_real_tikv_tests.sh_bazel_pipelineddmltest
mv: cannot stat 'tests/integrationtest/integration-test.out': No such file or directory
+ true
+ tar -czvf logs_run_real_tikv_tests.sh_bazel_pipelineddmltest.tar.gz logs_run_real_tikv_tests.sh_bazel_pipelineddmltest
logs_run_real_tikv_tests.sh_bazel_pipelineddmltest/
logs_run_real_tikv_tests.sh_bazel_pipelineddmltest/tikv2.log
logs_run_real_tikv_tests.sh_bazel_pipelineddmltest/tikv3.log
logs_run_real_tikv_tests.sh_bazel_pipelineddmltest/pd3.log
logs_run_real_tikv_tests.sh_bazel_pipelineddmltest/pd2.log
logs_run_real_tikv_tests.sh_bazel_pipelineddmltest/tikv1.log
logs_run_real_tikv_tests.sh_bazel_pipelineddmltest/pd1.log
[Pipeline] // dir
[Pipeline] }
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
+ str='run_real_tikv_tests.sh bazel_importintotest4'
+ logs_dir=logs_run_real_tikv_tests.sh_bazel_importintotest4
+ mkdir -p logs_run_real_tikv_tests.sh_bazel_importintotest4
+ mv pd1.log pd2.log pd3.log logs_run_real_tikv_tests.sh_bazel_importintotest4
+ mv tikv1.log tikv2.log tikv3.log logs_run_real_tikv_tests.sh_bazel_importintotest4
+ mv tests/integrationtest/integration-test.out logs_run_real_tikv_tests.sh_bazel_importintotest4
mv: cannot stat 'tests/integrationtest/integration-test.out': No such file or directory
+ true
+ tar -czvf logs_run_real_tikv_tests.sh_bazel_importintotest4.tar.gz logs_run_real_tikv_tests.sh_bazel_importintotest4
logs_run_real_tikv_tests.sh_bazel_importintotest4/
logs_run_real_tikv_tests.sh_bazel_importintotest4/pd1.log
logs_run_real_tikv_tests.sh_bazel_importintotest4/pd2.log
logs_run_real_tikv_tests.sh_bazel_importintotest4/tikv2.log
logs_run_real_tikv_tests.sh_bazel_importintotest4/tikv3.log
logs_run_real_tikv_tests.sh_bazel_importintotest4/tikv1.log
logs_run_real_tikv_tests.sh_bazel_importintotest4/pd3.log
[Pipeline] }
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] sh
[Pipeline] // dir
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] sh
+ str='run_real_tikv_tests.sh bazel_addindextest2'
+ logs_dir=logs_run_real_tikv_tests.sh_bazel_addindextest2
+ mkdir -p logs_run_real_tikv_tests.sh_bazel_addindextest2
+ mv pd1.log pd2.log pd3.log logs_run_real_tikv_tests.sh_bazel_addindextest2
+ mv tikv1.log tikv2.log tikv3.log logs_run_real_tikv_tests.sh_bazel_addindextest2
+ mv tests/integrationtest/integration-test.out logs_run_real_tikv_tests.sh_bazel_addindextest2
mv: cannot stat 'tests/integrationtest/integration-test.out': No such file or directory
+ true
+ tar -czvf logs_run_real_tikv_tests.sh_bazel_addindextest2.tar.gz logs_run_real_tikv_tests.sh_bazel_addindextest2
logs_run_real_tikv_tests.sh_bazel_addindextest2/
logs_run_real_tikv_tests.sh_bazel_addindextest2/pd1.log
logs_run_real_tikv_tests.sh_bazel_addindextest2/pd3.log
logs_run_real_tikv_tests.sh_bazel_addindextest2/pd2.log
logs_run_real_tikv_tests.sh_bazel_addindextest2/tikv3.log
logs_run_real_tikv_tests.sh_bazel_addindextest2/tikv2.log
logs_run_real_tikv_tests.sh_bazel_addindextest2/tikv1.log
[Pipeline] // dir
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] dir
Running in /home/jenkins/agent/workspace/pingcap/tidb/ghpr_check2/tidb
[Pipeline] {
[Pipeline] sh
[Pipeline] }
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] // dir
[Pipeline] // dir
+ str='run_real_tikv_tests.sh bazel_importintotest3'
+ logs_dir=logs_run_real_tikv_tests.sh_bazel_importintotest3
+ mkdir -p logs_run_real_tikv_tests.sh_bazel_importintotest3
+ mv pd1.log pd2.log pd3.log logs_run_real_tikv_tests.sh_bazel_importintotest3
+ mv tikv1.log tikv2.log tikv3.log logs_run_real_tikv_tests.sh_bazel_importintotest3
+ mv tests/integrationtest/integration-test.out logs_run_real_tikv_tests.sh_bazel_importintotest3
mv: cannot stat 'tests/integrationtest/integration-test.out': No such file or directory
+ true
+ tar -czvf logs_run_real_tikv_tests.sh_bazel_importintotest3.tar.gz logs_run_real_tikv_tests.sh_bazel_importintotest3
logs_run_real_tikv_tests.sh_bazel_importintotest3/
logs_run_real_tikv_tests.sh_bazel_importintotest3/tikv3.log
logs_run_real_tikv_tests.sh_bazel_importintotest3/pd1.log
logs_run_real_tikv_tests.sh_bazel_importintotest3/tikv2.log
logs_run_real_tikv_tests.sh_bazel_importintotest3/pd3.log
logs_run_real_tikv_tests.sh_bazel_importintotest3/pd2.log
logs_run_real_tikv_tests.sh_bazel_importintotest3/tikv1.log
[Pipeline] // dir
+ str='run_real_tikv_tests.sh bazel_flashbacktest'
+ logs_dir=logs_run_real_tikv_tests.sh_bazel_flashbacktest
+ mkdir -p logs_run_real_tikv_tests.sh_bazel_flashbacktest
+ mv pd1.log pd2.log pd3.log logs_run_real_tikv_tests.sh_bazel_flashbacktest
+ mv tikv1.log tikv2.log tikv3.log logs_run_real_tikv_tests.sh_bazel_flashbacktest
+ mv tests/integrationtest/integration-test.out logs_run_real_tikv_tests.sh_bazel_flashbacktest
mv: cannot stat 'tests/integrationtest/integration-test.out': No such file or directory
+ true
+ tar -czvf logs_run_real_tikv_tests.sh_bazel_flashbacktest.tar.gz logs_run_real_tikv_tests.sh_bazel_flashbacktest
logs_run_real_tikv_tests.sh_bazel_flashbacktest/
logs_run_real_tikv_tests.sh_bazel_flashbacktest/pd2.log
logs_run_real_tikv_tests.sh_bazel_flashbacktest/pd3.log
logs_run_real_tikv_tests.sh_bazel_flashbacktest/tikv1.log
logs_run_real_tikv_tests.sh_bazel_flashbacktest/pd1.log
logs_run_real_tikv_tests.sh_bazel_flashbacktest/tikv2.log
logs_run_real_tikv_tests.sh_bazel_flashbacktest/tikv3.log
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] // dir
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] }
+ str='integrationtest_with_tikv.sh y'
+ logs_dir=logs_integrationtest_with_tikv.sh_y
+ mkdir -p logs_integrationtest_with_tikv.sh_y
+ mv pd1.log pd2.log pd3.log logs_integrationtest_with_tikv.sh_y
+ mv tikv1.log tikv2.log tikv3.log logs_integrationtest_with_tikv.sh_y
+ mv tests/integrationtest/integration-test.out logs_integrationtest_with_tikv.sh_y
+ tar -czvf logs_integrationtest_with_tikv.sh_y.tar.gz logs_integrationtest_with_tikv.sh_y
logs_integrationtest_with_tikv.sh_y/
logs_integrationtest_with_tikv.sh_y/pd2.log
logs_integrationtest_with_tikv.sh_y/pd3.log
logs_integrationtest_with_tikv.sh_y/tikv1.log
logs_integrationtest_with_tikv.sh_y/integration-test.out
logs_integrationtest_with_tikv.sh_y/pd1.log
logs_integrationtest_with_tikv.sh_y/tikv2.log
logs_integrationtest_with_tikv.sh_y/tikv3.log
[Pipeline] // withCredentials
[Pipeline] // withCredentials
[Pipeline] // withCredentials
[Pipeline] // withCredentials
[Pipeline] // withCredentials
[Pipeline] // withCredentials
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] }
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] // timeout
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // dir
[Pipeline] // timeout
[Pipeline] // dir
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // dir
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] }
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] // container
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // node
[Pipeline] // node
[Pipeline] // node
[Pipeline] // node
[Pipeline] // node
[Pipeline] // node
[Pipeline] // node
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_addindextest'
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_addindextest1'
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_addindextest3'
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_addindextest4'
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_importintotest'
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_importintotest2'
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_pipelineddmltest'
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_importintotest4'
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_addindextest2'
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_importintotest3'
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'run_real_tikv_tests.sh bazel_flashbacktest'
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
Failed in branch Matrix - SCRIPT_AND_ARGS = 'integrationtest_with_tikv.sh y'
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] container
[Pipeline] {
[Pipeline] sh
+ bash scripts/plugins/report_job_result.sh FAILURE result.json
http://fileserver.pingcap.net
--2024-05-06 13:31:54--  http://fileserver.pingcap.net/download/rd-atom-agent/agent_upload_verifyci_metadata.py
Resolving fileserver.pingcap.net (fileserver.pingcap.net)... 10.2.12.82
Connecting to fileserver.pingcap.net (fileserver.pingcap.net)|10.2.12.82|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 4181 (4.1K) [application/octet-stream]
Saving to: ‘agent_upload_verifyci_metadata.py’

     0K ....                                                  100%  612M=0s

2024-05-06 13:31:54 (612 MB/s) - ‘agent_upload_verifyci_metadata.py’ saved [4181/4181]

No junit report file
parse result file result.json success
upload data successfully.
[Pipeline] }
[Pipeline] // container
[Pipeline] archiveArtifacts
Archiving artifacts
Recording fingerprints
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
Finished: FAILURE