Started by upstream project "pipeline-nightly" build number 24
originally caused by:
 Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on slave-openstack-gz1_1 (openstack) in workspace /home/jenkins/workspace/collect-logs-and-cleanup@2
[WS-CLEANUP] Deleting project workspace...
[WS-CLEANUP] Deferred wipeout is disabled by the job configuration...
[WS-CLEANUP] Done
The recommended git tool is: git
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/opensdn-io/tf-jenkins.git
 > git init /home/jenkins/workspace/collect-logs-and-cleanup@2/src/opensdn-io/tf-jenkins # timeout=10
Fetching upstream changes from https://github.com/opensdn-io/tf-jenkins.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/opensdn-io/tf-jenkins.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/opensdn-io/tf-jenkins.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
 > git rev-parse refs/remotes/origin/master^{commit} # timeout=10
Checking out Revision a254c0f0da78c12e0a8ee66ae1dc5418a1d1c014 (refs/remotes/origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a254c0f0da78c12e0a8ee66ae1dc5418a1d1c014 # timeout=10
Commit message: "Merge "add tf- images""
 > git rev-list --no-walk a254c0f0da78c12e0a8ee66ae1dc5418a1d1c014 # timeout=10
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/opensdn-io/tf-devstack.git
 > git init /home/jenkins/workspace/collect-logs-and-cleanup@2/src/opensdn-io/tf-devstack # timeout=10
Fetching upstream changes from https://github.com/opensdn-io/tf-devstack.git
 > git --version # timeout=10
 > git --version # 'git version 2.25.1'
 > git fetch --tags --force --progress -- https://github.com/opensdn-io/tf-devstack.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/opensdn-io/tf-devstack.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
 > git rev-parse refs/remotes/origin/master^{commit} # timeout=10
Checking out Revision d9a1892692e9132932141ac1f6602965341ed565 (refs/remotes/origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d9a1892692e9132932141ac1f6602965341ed565 # timeout=10
Commit message: "Merge "add description of scale testing""
 > git rev-list --no-walk d9a1892692e9132932141ac1f6602965341ed565 # timeout=10
Copied 3 artifacts from "pipeline-nightly" build number 24
[collect-logs-and-cleanup@2] $ /bin/bash -xe /tmp/jenkins12652358277764734811.sh
+ source /home/jenkins/workspace/collect-logs-and-cleanup@2/global.env
++ export PIPELINE_BUILD_TAG=jenkins-pipeline-nightly-24
++ PIPELINE_BUILD_TAG=jenkins-pipeline-nightly-24
++ export SLAVE=openstack
++ SLAVE=openstack
++ export SLAVE_REGION=gz1
++ SLAVE_REGION=gz1
++ export LOGS_HOST=nexus.gz1.opensdn.io
++ LOGS_HOST=nexus.gz1.opensdn.io
++ export LOGS_PATH=/var/www/logs/jenkins_logs/nightly/pipeline_24
++ LOGS_PATH=/var/www/logs/jenkins_logs/nightly/pipeline_24
++ export LOGS_URL=http://nexus.gz1.opensdn.io:8082/jenkins_logs/nightly/pipeline_24
++ LOGS_URL=http://nexus.gz1.opensdn.io:8082/jenkins_logs/nightly/pipeline_24
++ export SITE_MIRROR=http://nexus.gz1.opensdn.io/repository
++ SITE_MIRROR=http://nexus.gz1.opensdn.io/repository
++ export CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
++ CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
++ export DEPLOYER_CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
++ DEPLOYER_CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
++ export CONTRAIL_CONTAINER_TAG=nightly
++ CONTRAIL_CONTAINER_TAG=nightly
++ export CONTRAIL_DEPLOYER_CONTAINER_TAG=nightly
++ CONTRAIL_DEPLOYER_CONTAINER_TAG=nightly
++ export CONTAINER_REGISTRY_ORIGINAL=nexus.gz1.opensdn.io:5102
++ CONTAINER_REGISTRY_ORIGINAL=nexus.gz1.opensdn.io:5102
++ export DEPLOYER_CONTAINER_REGISTRY_ORIGINAL=nexus.gz1.opensdn.io:5102
++ DEPLOYER_CONTAINER_REGISTRY_ORIGINAL=nexus.gz1.opensdn.io:5102
++ export CONTRAIL_CONTAINER_TAG_ORIGINAL=nightly
++ CONTRAIL_CONTAINER_TAG_ORIGINAL=nightly
++ export CONTRAIL_DEPLOYER_CONTAINER_TAG_ORIGINAL=nightly
++ CONTRAIL_DEPLOYER_CONTAINER_TAG_ORIGINAL=nightly
++ export GERRIT_PIPELINE=nightly
++ GERRIT_PIPELINE=nightly
++ export GERRIT_BRANCH=master
++ GERRIT_BRANCH=master
++ export REPOS_CHANNEL=latest
++ REPOS_CHANNEL=latest
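The "++" lines above are bash xtrace output produced while sourcing global.env under /bin/bash -xe; they define where logs are published (LOGS_HOST, LOGS_PATH, LOGS_URL), which registry and tag the containers come from, and the Gerrit pipeline context. As a rough sketch only, an environment file defining the same variables would look like the block below; the values are taken from the trace, but the real global.env generated by the pipeline is not shown in this log and may differ in layout or content.

    # Sketch of a global.env-style file (values copied from the trace above; illustrative only)
    export PIPELINE_BUILD_TAG=jenkins-pipeline-nightly-24
    export SLAVE=openstack
    export SLAVE_REGION=gz1
    export LOGS_HOST=nexus.gz1.opensdn.io
    export LOGS_PATH=/var/www/logs/jenkins_logs/nightly/pipeline_24
    export LOGS_URL=http://nexus.gz1.opensdn.io:8082/jenkins_logs/nightly/pipeline_24
    export SITE_MIRROR=http://nexus.gz1.opensdn.io/repository
    export CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
    export DEPLOYER_CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
    export CONTRAIL_CONTAINER_TAG=nightly
    export CONTRAIL_DEPLOYER_CONTAINER_TAG=nightly
    export GERRIT_PIPELINE=nightly
    export GERRIT_BRANCH=master
    export REPOS_CHANNEL=latest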
+ desc='Pipeline: pipeline-nightly-24 Random: 58340 Stream: ansible-os-vanilla'
+ desc+='
Job logs: http://nexus.gz1.opensdn.io:8082/jenkins_logs/nightly/pipeline_24/ansible-os-vanilla'
+ echo 'DESCRIPTION Pipeline: pipeline-nightly-24 Random: 58340 Stream: ansible-os-vanilla
Job logs: http://nexus.gz1.opensdn.io:8082/jenkins_logs/nightly/pipeline_24/ansible-os-vanilla'
DESCRIPTION Pipeline: pipeline-nightly-24 Random: 58340 Stream: ansible-os-vanilla
Job logs: http://nexus.gz1.opensdn.io:8082/jenkins_logs/nightly/pipeline_24/ansible-os-vanilla
[description-setter] Description set: Pipeline: pipeline-nightly-24 Random: 58340 Stream: ansible-os-vanilla
Job logs: http://nexus.gz1.opensdn.io:8082/jenkins_logs/nightly/pipeline_24/ansible-os-vanilla
[collect-logs-and-cleanup@2] $ /bin/bash -xe /tmp/jenkins14468808066145464770.sh
+ source /home/jenkins/workspace/collect-logs-and-cleanup@2/global.env
++ export PIPELINE_BUILD_TAG=jenkins-pipeline-nightly-24
++ PIPELINE_BUILD_TAG=jenkins-pipeline-nightly-24
++ export SLAVE=openstack
++ SLAVE=openstack
++ export SLAVE_REGION=gz1
++ SLAVE_REGION=gz1
++ export LOGS_HOST=nexus.gz1.opensdn.io
++ LOGS_HOST=nexus.gz1.opensdn.io
++ export LOGS_PATH=/var/www/logs/jenkins_logs/nightly/pipeline_24
++ LOGS_PATH=/var/www/logs/jenkins_logs/nightly/pipeline_24
++ export LOGS_URL=http://nexus.gz1.opensdn.io:8082/jenkins_logs/nightly/pipeline_24
++ LOGS_URL=http://nexus.gz1.opensdn.io:8082/jenkins_logs/nightly/pipeline_24
++ export SITE_MIRROR=http://nexus.gz1.opensdn.io/repository
++ SITE_MIRROR=http://nexus.gz1.opensdn.io/repository
++ export CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
++ CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
++ export DEPLOYER_CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
++ DEPLOYER_CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
++ export CONTRAIL_CONTAINER_TAG=nightly
++ CONTRAIL_CONTAINER_TAG=nightly
++ export CONTRAIL_DEPLOYER_CONTAINER_TAG=nightly
++ CONTRAIL_DEPLOYER_CONTAINER_TAG=nightly
++ export CONTAINER_REGISTRY_ORIGINAL=nexus.gz1.opensdn.io:5102
++ CONTAINER_REGISTRY_ORIGINAL=nexus.gz1.opensdn.io:5102
++ export DEPLOYER_CONTAINER_REGISTRY_ORIGINAL=nexus.gz1.opensdn.io:5102
++ DEPLOYER_CONTAINER_REGISTRY_ORIGINAL=nexus.gz1.opensdn.io:5102
++ export CONTRAIL_CONTAINER_TAG_ORIGINAL=nightly
++ CONTRAIL_CONTAINER_TAG_ORIGINAL=nightly
++ export CONTRAIL_DEPLOYER_CONTAINER_TAG_ORIGINAL=nightly
++ CONTRAIL_DEPLOYER_CONTAINER_TAG_ORIGINAL=nightly
++ export GERRIT_PIPELINE=nightly
++ GERRIT_PIPELINE=nightly
++ export GERRIT_BRANCH=master
++ GERRIT_BRANCH=master
++ export REPOS_CHANNEL=latest
++ REPOS_CHANNEL=latest
+ ./src/opensdn-io/tf-jenkins/infra/gerrit/apply_patchsets.sh ./src opensdn-io/tf-jenkins ./patchsets-info.json
+ ./src/opensdn-io/tf-jenkins/infra/gerrit/apply_patchsets.sh ./src opensdn-io/tf-devstack ./patchsets-info.json
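apply_patchsets.sh is invoked twice above to apply any Gerrit patchsets listed in patchsets-info.json on top of the freshly cloned tf-jenkins and tf-devstack sources. Its actual implementation is not visible in this log; purely as a hypothetical illustration of the usual technique (fetch a Gerrit change ref and cherry-pick it), such a step could look like the sketch below. The Gerrit host, the "ref"/"project" JSON fields, and the use of jq are all assumptions.

    # Hypothetical sketch only -- not the real apply_patchsets.sh.
    # Usage assumption: <src dir> <project> <patchsets json>, entries carrying a Gerrit "ref".
    src_dir="$1"; project="$2"; info="$3"
    cd "${src_dir}/${project}"
    for ref in $(jq -r ".[] | select(.project == \"${project}\") | .ref" "${info}"); do
        git fetch "https://gerrit.example.org/${project}" "${ref}"   # Gerrit URL is a placeholder
        git cherry-pick FETCH_HEAD                                   # apply the fetched patchset
    done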
[collect-logs-and-cleanup@2] $ /bin/bash -xe /tmp/jenkins8322333222187868798.sh
+ source /home/jenkins/workspace/collect-logs-and-cleanup@2/global.env
++ export PIPELINE_BUILD_TAG=jenkins-pipeline-nightly-24
++ PIPELINE_BUILD_TAG=jenkins-pipeline-nightly-24
++ export SLAVE=openstack
++ SLAVE=openstack
++ export SLAVE_REGION=gz1
++ SLAVE_REGION=gz1
++ export LOGS_HOST=nexus.gz1.opensdn.io
++ LOGS_HOST=nexus.gz1.opensdn.io
++ export LOGS_PATH=/var/www/logs/jenkins_logs/nightly/pipeline_24
++ LOGS_PATH=/var/www/logs/jenkins_logs/nightly/pipeline_24
++ export LOGS_URL=http://nexus.gz1.opensdn.io:8082/jenkins_logs/nightly/pipeline_24
++ LOGS_URL=http://nexus.gz1.opensdn.io:8082/jenkins_logs/nightly/pipeline_24
++ export SITE_MIRROR=http://nexus.gz1.opensdn.io/repository
++ SITE_MIRROR=http://nexus.gz1.opensdn.io/repository
++ export CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
++ CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
++ export DEPLOYER_CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
++ DEPLOYER_CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
++ export CONTRAIL_CONTAINER_TAG=nightly
++ CONTRAIL_CONTAINER_TAG=nightly
++ export CONTRAIL_DEPLOYER_CONTAINER_TAG=nightly
++ CONTRAIL_DEPLOYER_CONTAINER_TAG=nightly
++ export CONTAINER_REGISTRY_ORIGINAL=nexus.gz1.opensdn.io:5102
++ CONTAINER_REGISTRY_ORIGINAL=nexus.gz1.opensdn.io:5102
++ export DEPLOYER_CONTAINER_REGISTRY_ORIGINAL=nexus.gz1.opensdn.io:5102
++ DEPLOYER_CONTAINER_REGISTRY_ORIGINAL=nexus.gz1.opensdn.io:5102
++ export CONTRAIL_CONTAINER_TAG_ORIGINAL=nightly
++ CONTRAIL_CONTAINER_TAG_ORIGINAL=nightly
++ export CONTRAIL_DEPLOYER_CONTAINER_TAG_ORIGINAL=nightly
++ CONTRAIL_DEPLOYER_CONTAINER_TAG_ORIGINAL=nightly
++ export GERRIT_PIPELINE=nightly
++ GERRIT_PIPELINE=nightly
++ export GERRIT_BRANCH=master
++ GERRIT_BRANCH=master
++ export REPOS_CHANNEL=latest
++ REPOS_CHANNEL=latest
+ source /home/jenkins/workspace/collect-logs-and-cleanup@2/deps.collect-logs-and-cleanup.58340.env
++ export PROVIDER=openstack
++ PROVIDER=openstack
++ export ENVIRONMENT_OS=ubuntu22
++ ENVIRONMENT_OS=ubuntu22
++ export DATA_NETWORK=10.20.0.0/24
++ DATA_NETWORK=10.20.0.0/24
++ head -1
++ export VROUTER_GATEWAY=10.20.0.1
++ VROUTER_GATEWAY=10.20.0.1
++ export IMAGE=4744ca8a-852c-4f31-8cf0-48b97ea797c5
++ IMAGE=4744ca8a-852c-4f31-8cf0-48b97ea797c5
++ export IMAGE_SSH_USER=ubuntu
++ IMAGE_SSH_USER=ubuntu
++ export INSTANCE_IDS=ea018499-4c54-4e03-9e00-b94711d3d9ed,
++ INSTANCE_IDS=ea018499-4c54-4e03-9e00-b94711d3d9ed,
++ export instance_ip=10.0.0.17
++ instance_ip=10.0.0.17
++ export CONTROLLER_NODES=10.0.0.17,
++ CONTROLLER_NODES=10.0.0.17,
++ export CONTROL_NODES=10.20.0.23,
++ CONTROL_NODES=10.20.0.23,
++ export ORCHESTRATOR=openstack
++ ORCHESTRATOR=openstack
++ export DEPLOYER=ansible
++ DEPLOYER=ansible
++ export JOB_LOGS_PATH=ansible-os-vanilla
++ JOB_LOGS_PATH=ansible-os-vanilla
+ source /home/jenkins/workspace/collect-logs-and-cleanup@2/vars.collect-logs-and-cleanup.58340.env
++ export USE_DATAPLANE_NETWORK=true
++ USE_DATAPLANE_NETWORK=true
++ export KOLLA_MODE=vanilla
++ KOLLA_MODE=vanilla
+ export FULL_LOGS_PATH=/var/www/logs/jenkins_logs/nightly/pipeline_24/ansible-os-vanilla
+ FULL_LOGS_PATH=/var/www/logs/jenkins_logs/nightly/pipeline_24/ansible-os-vanilla
+ [[ -n 10.0.0.17 ]]
+ /home/jenkins/workspace/collect-logs-and-cleanup@2/src/opensdn-io/tf-jenkins/jobs/devstack/ansible/collect_logs.sh
INFO: wait for host
Linux cn-jenkins-deploy-platform-ansible-os-2155-1 5.15.0-100-generic #110-Ubuntu SMP Wed Feb 7 13:27:48 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
INFO: collect logs
INFO: Deploy ansible/logs (collect-logs-and-cleanup)
Warning: Permanently added '10.0.0.17' (ECDSA) to the list of known hosts.
Warning: Permanently added '10.0.0.17' (ECDSA) to the list of known hosts.
INFO: =================== Mon Oct 28 02:25:47 UTC 2024 ===================
[there is no tf devenv configuration to load]
INFO: Applying stages logs
INFO: Running stage logs at Mon Oct 28 02:25:47 UTC 2024
INFO: collecting logs...
chmod: cannot access '/home/ubuntu/contrail-kolla-ansible/etc/kolla/': No such file or directory
cp: cannot stat '/home/ubuntu/contrail-kolla-ansible/etc/kolla/*': No such file or directory
cp: cannot stat '/home/ubuntu/.tf/*.yml': No such file or directory
Warning: Permanently added '10.0.0.17' (ED25519) to the list of known hosts.
Warning: Permanently added '10.0.0.17' (ED25519) to the list of known hosts.
Warning: Permanently added '10.0.0.17' (ED25519) to the list of known hosts.
Warning: Permanently added '10.0.0.17' (ED25519) to the list of known hosts.
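The "INFO: wait for host" step earlier in this stage blocks until the worker VM answers over SSH; the uname line that follows it is the first successful remote command, and the host-key warnings indicate non-interactive SSH with strict host-key checking relaxed. A minimal sketch of such a wait loop, assuming key-based SSH as the ubuntu user to the instance_ip from deps.*.env (the real collect_logs.sh may implement this differently), could be:

    # Illustrative wait-for-host loop (assumed behaviour, not the actual script)
    host="10.0.0.17"; user="ubuntu"
    for attempt in $(seq 1 30); do
        # succeed as soon as a trivial remote command works
        if ssh -o StrictHostKeyChecking=no -o ConnectTimeout=5 "${user}@${host}" uname -a; then
            break
        fi
        sleep 10
    done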
INFO: Collecting contrail-status
INFO: Collecting system statistics for logs
INFO: Collecting docker logs
INFO: Collecting TF logs
INFO: Collecting tf logs: /etc/contrail
INFO: Collecting tf logs: /var/log/contrail/analytics-alarm-alarm-gen
INFO: Collecting tf logs: /var/log/contrail/analytics-alarm-nodemgr
INFO: Collecting tf logs: /var/log/contrail/analytics-alarm-provisioner
INFO: Collecting tf logs: /var/log/contrail/analytics-api
INFO: Collecting tf logs: /var/log/contrail/analytics-collector
INFO: Collecting tf logs: /var/log/contrail/analytics-nodemgr
INFO: Collecting tf logs: /var/log/contrail/analytics-provisioner
INFO: Collecting tf logs: /var/log/contrail/analytics-snmp-nodemgr
INFO: Collecting tf logs: /var/log/contrail/analytics-snmp-provisioner
INFO: Collecting tf logs: /var/log/contrail/analytics-snmp-snmp-collector
INFO: Collecting tf logs: /var/log/contrail/analytics-snmp-topology
INFO: Collecting tf logs: /var/log/contrail/config-api
INFO: Collecting tf logs: /var/log/contrail/config-database
INFO: Collecting tf logs: /var/log/contrail/config-database-nodemgr
INFO: Collecting tf logs: /var/log/contrail/config-database-provisioner
INFO: Collecting tf logs: /var/log/contrail/config-database-rabbitmq
INFO: Collecting tf logs: /var/log/contrail/config-device-manager
INFO: Collecting tf logs: /var/log/contrail/config-nodemgr
INFO: Collecting tf logs: /var/log/contrail/config-provisioner
INFO: Collecting tf logs: /var/log/contrail/config-schema
INFO: Collecting tf logs: /var/log/contrail/config-svc-monitor
INFO: Collecting tf logs: /var/log/contrail/contrail-lbaas-haproxy-stdout.log
INFO: Collecting tf logs: /var/log/contrail/control-control
INFO: Collecting tf logs: /var/log/contrail/control-dns
INFO: Collecting tf logs: /var/log/contrail/control-named
INFO: Collecting tf logs: /var/log/contrail/control-nodemgr
INFO: Collecting tf logs: /var/log/contrail/control-provisioner
INFO: Collecting tf logs: /var/log/contrail/database
INFO: Collecting tf logs: /var/log/contrail/database-nodemgr
INFO: Collecting tf logs: /var/log/contrail/database-provisioner
INFO: Collecting tf logs: /var/log/contrail/database-query-engine
INFO: Collecting tf logs: /var/log/contrail/device-manager-dnsmasq
INFO: Collecting tf logs: /var/log/contrail/kafka
INFO: Collecting tf logs: /var/log/contrail/node-init
INFO: Collecting tf logs: /var/log/contrail/rabbitmq
INFO: Collecting tf logs: /var/log/contrail/rsyslogd
INFO: Collecting tf logs: /var/log/contrail/vrouter-agent
INFO: Collecting tf logs: /var/log/contrail/vrouter-kernel-build-init
INFO: Collecting tf logs: /var/log/contrail/vrouter-nodemgr
INFO: Collecting tf logs: /var/log/contrail/vrouter-provisioner
INFO: Collecting tf logs: /var/log/contrail/webui-job
INFO: Collecting tf logs: /var/log/contrail/webui-web
INFO: Collecting tf logs: /var/log/contrail/zookeeper
INFO: Collecting tf logs: save_introspect_info
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8100/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8101/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8102/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8103/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8104/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8112/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8113/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8114/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8083/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8084/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8085/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8087/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8088/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8096/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8089/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8090/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8091/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:8092/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:5995/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:5920/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting tf logs: introspection request: curl -s http://cn-jenkins-deploy-platform-ansible-os-2155-1.:5921/Snh_SandeshUVECacheReq?x=NodeStatus
INFO: Collecting core dumps
INFO: content of /var/crash
/var/crash:
total 12
drwxrwxrwt  2 root root 4096 Oct 28 01:46 .
drwxr-xr-x 14 root root 4096 Oct 28 02:09 ..
-rw-r--r--  1 root root    0 Oct 28 01:46 kdump_lock
-rw-r--r--  1 root root  287 Oct 28 01:46 kexec_cmd
INFO: content of /var/crashes
/var/crashes:
total 8
drwxrwxrwx  2 root root 4096 Oct 28 02:09 .
drwxr-xr-x 14 root root 4096 Oct 28 02:09 ..
INFO: Collecting statuses from cassandra, zookeeper and rabbitmq services
nodetool: Failed to connect to '127.0.0.1:7201' - SecurityException: 'Authentication failed! Credentials required'.
nodetool: Failed to connect to '127.0.0.1:7201' - SecurityException: 'Authentication failed! Credentials required'.
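The introspection requests above pull the NodeStatus UVE from each service's Sandesh HTTP introspect port on the controller node. The same snapshot can be reproduced with a simple loop over the ports that appear in the log; the output file naming below is illustrative.

    # Fetch NodeStatus from each introspect port seen in the log (illustrative loop)
    node="cn-jenkins-deploy-platform-ansible-os-2155-1."
    for port in 8100 8101 8102 8103 8104 8112 8113 8114 8083 8084 8085 8087 8088 8096 8089 8090 8091 8092 5995 5920 5921; do
        curl -s "http://${node}:${port}/Snh_SandeshUVECacheReq?x=NodeStatus" > "introspect_${port}.xml"
    done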
INFO: Collecting kolla logs
admin-openrc.sh barbican-api barbican-keystone-listener barbican-worker cron fluentd glance-api heat-api heat-api-cfn heat-engine horizon keystone keystone-fernet keystone-ssh kolla-toolbox mariadb memcached neutron-dhcp-agent neutron-l3-agent neutron-metadata-agent neutron-server nova-api nova-api-bootstrap nova-cell-bootstrap nova-compute nova-conductor nova-libvirt nova-novncproxy nova-scheduler nova-ssh placement-api rabbitmq
ansible.log barbican fluentd glance heat horizon keystone libvirt mariadb neutron nova placement rabbitmq
/tmp/ansible-logs/logs /tmp/ansible-logs
/tmp/ansible-logs
Warning: Permanently added '10.0.0.17' (ED25519) to the list of known hosts.
~/.tf/logs/10.0.0.17 ~
~
INFO: Stage logs was run successfully Mon Oct 28 02:26:14 UTC 2024
[update tf stack configuration]
chmod: cannot access '/home/ubuntu/contrail-kolla-ansible/etc/kolla/': No such file or directory
cp: cannot stat '/home/ubuntu/contrail-kolla-ansible/etc/kolla/*': No such file or directory
tf setup profile /home/ubuntu/.tf/stack.env
DEPLOYER=ansible
CONTRAIL_CONTAINER_TAG=nightly
CONTRAIL_DEPLOYER_CONTAINER_TAG=nightly
CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
DEPLOYER_CONTAINER_REGISTRY=nexus.gz1.opensdn.io:5102
ORCHESTRATOR=openstack
OPENSTACK_VERSION="yoga"
CONTROLLER_NODES="10.0.0.17 "
AGENT_NODES="10.0.0.17"
CONTROL_NODES="10.20.0.23 "
SSL_ENABLE="false"
LEGACY_ANALYTICS_ENABLE="true"
HUGE_PAGES_1G=
CONTAINER_RUNTIME=docker
K8S_CA=
DEPLOY_IPA_SERVER=
IPA_PASSWORD=
OPENSTACK_CONTROLLER_NODES='10.0.0.17'
OS_AUTH_URL='http://10.0.0.17:5000/v3'
AUTH_PASSWORD='contrail123'
AUTH_URL=''
INFO: Successful deployment Mon Oct 28 02:26:14 UTC 2024
DEBUG: kill running child jobs:
INFO: Deploy logs finished
INFO: Copy logs from host to workspace
Warning: Permanently added '10.0.0.17' (ECDSA) to the list of known hosts.
~/workspace/collect-logs-and-cleanup@2 ~/workspace/collect-logs-and-cleanup@2
Warning: Permanently added 'nexus.gz1.opensdn.io,212.233.90.199' (ECDSA) to the list of known hosts.
Warning: Permanently added 'nexus.gz1.opensdn.io,212.233.90.199' (ECDSA) to the list of known hosts.
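Once the logs stage finishes, the collected files are copied from the worker into the Jenkins workspace and then published to the log host; the nexus.gz1.opensdn.io host-key warnings above come from that transfer, and the result is the LOGS_URL reported on the next line. The exact transfer command is not visible in the log; a hedged sketch of such a publish step over SSH, using the LOGS_HOST and FULL_LOGS_PATH values defined earlier, might look like this (the remote user and the local ./logs directory are assumptions).

    # Illustrative publish step only -- the real upload command is not shown in this log.
    rsync -a -e "ssh -o StrictHostKeyChecking=no" \
        ./logs/ "jenkins@${LOGS_HOST}:${FULL_LOGS_PATH}/"   # remote user "jenkins" is assumed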
INFO: Logs collected at http://nexus.gz1.opensdn.io:8082/jenkins_logs/nightly/pipeline_24/ansible-os-vanilla
~/workspace/collect-logs-and-cleanup@2
+ /home/jenkins/workspace/collect-logs-and-cleanup@2/src/opensdn-io/tf-jenkins/jobs/devstack/ansible/remove_workers.sh
nova CLI is deprecated and will be removed in a future release
| locked | False |
INFO: do down nodes
+-----------------------------+----------------------------------------------------------+
| Field                       | Value                                                    |
+-----------------------------+----------------------------------------------------------+
| OS-DCF:diskConfig           | MANUAL                                                   |
| OS-EXT-AZ:availability_zone | GZ1                                                      |
| OS-EXT-STS:power_state      | Running                                                  |
| OS-EXT-STS:task_state       | None                                                     |
| OS-EXT-STS:vm_state         | active                                                   |
| OS-SRV-USG:launched_at      | 2024-10-28T01:46:16.000000                               |
| OS-SRV-USG:terminated_at    | None                                                     |
| accessIPv4                  |                                                          |
| accessIPv6                  |                                                          |
| addresses                   | data=10.20.0.23; management=10.0.0.17                    |
| config_drive                |                                                          |
| created                     | 2024-10-28T01:45:52Z                                     |
| flavor                      | STD3-8-32 (e72b2823-6f7e-433e-a18c-942ca2bb41aa)         |
| hostId                      | 45e366a071c59927090583a340149fa5843f75e9276b0a0b5fbb8cd3 |
| id                          | ea018499-4c54-4e03-9e00-b94711d3d9ed                     |
| image                       | N/A (booted from volume)                                 |
| key_name                    | worker                                                   |
| name                        | cn_jenkins-deploy-platform-ansible-os-2155_1             |
| progress                    | 0                                                        |
| project_id                  | ****                                                     |
| properties                  |                                                          |
| status                      | ACTIVE                                                   |
| updated                     | 2024-10-28T01:46:16Z                                     |
| user_id                     | 25d930adb802408885c71e341481886e                         |
| volumes_attached            | id='2bad2b23-5795-48f5-a841-4863d6478100'                |
+-----------------------------+----------------------------------------------------------+
Archiving artifacts
Finished: SUCCESS
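remove_workers.sh tears down the OpenStack worker recorded in INSTANCE_IDS (the server detailed in the table above). The log only shows the server being queried and "INFO: do down nodes"; the actual script may stop, shelve, or delete the instance. As a hedged sketch of a generic teardown with the standard OpenStack CLI:

    # Illustrative teardown of the pipeline workers (the real remove_workers.sh may differ)
    INSTANCE_IDS=ea018499-4c54-4e03-9e00-b94711d3d9ed,
    for id in ${INSTANCE_IDS//,/ }; do          # split the comma-separated ID list
        openstack server delete --wait "${id}"  # assumes credentials are already sourced
    done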