Started by upstream project "test_pull_request_origin_aggregated_logging" build number 74
originally caused by:
 Started by remote host 50.17.198.52
[EnvInject] - Loading node environment variables.
Building in workspace /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
OS_ROOT=/data/src/github.com/openshift/origin
INSTANCE_TYPE=c4.xlarge
GITHUB_REPO=openshift
OS=rhel7
TESTNAME=logging

[EnvInject] - Variables injected successfully.
[workspace@2] $ /bin/sh -xe /tmp/hudson7662290820197538277.sh
+ false
+ unset GOPATH
+ REPO_NAME=origin-aggregated-logging
+ rm -rf origin-aggregated-logging
+ vagrant origin-local-checkout --replace --repo origin-aggregated-logging -b master
You don't seem to have the GOPATH environment variable set on your system.
See: 'go help gopath' for more details about GOPATH.
Waiting for the cloning process to finish
Cloning origin-aggregated-logging ...
Submodule 'deployer/common' (https://github.com/openshift/origin-integration-common) registered for path 'deployer/common'
Submodule 'kibana-proxy' (https://github.com/fabric8io/openshift-auth-proxy.git) registered for path 'kibana-proxy'
Cloning into 'deployer/common'...
Submodule path 'deployer/common': checked out '45bf993212cdcbab5cbce3b3fab74a72b851402e'
Cloning into 'kibana-proxy'...
Submodule path 'kibana-proxy': checked out '118dfb40f7a8082d370ba7f4805255c9ec7c8178'
Origin repositories cloned into /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2
+ pushd origin-aggregated-logging
~/jobs/test-origin-aggregated-logging/workspace@2/origin-aggregated-logging ~/jobs/test-origin-aggregated-logging/workspace@2
+ git checkout master
Already on 'master'
+ popd
~/jobs/test-origin-aggregated-logging/workspace@2
+ '[' -n '' ']'
+ vagrant origin-local-checkout --replace
You don't seem to have the GOPATH environment variable set on your system.
See: 'go help gopath' for more details about GOPATH.
Waiting for the cloning process to finish
Checking repo integrity for /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin
~/jobs/test-origin-aggregated-logging/workspace@2/origin ~/jobs/test-origin-aggregated-logging/workspace@2
# On branch master
# Untracked files:
#   (use "git add <file>..." to include in what will be committed)
#
#	artifacts/
nothing added to commit but untracked files present (use "git add" to track)
~/jobs/test-origin-aggregated-logging/workspace@2
Replacing: /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin
~/jobs/test-origin-aggregated-logging/workspace@2/origin ~/jobs/test-origin-aggregated-logging/workspace@2
From https://github.com/openshift/origin
   e935d8e..ba62cde  master     -> origin/master
 * [new tag]         v3.6.0-alpha.2 -> v3.6.0-alpha.2
Already on 'master'
Your branch is behind 'origin/master' by 13 commits, and can be fast-forwarded.
  (use "git pull" to update your local branch)
HEAD is now at ba62cde Merge pull request #14474 from deads2k/client-ca-retry
Removing .vagrant-openshift.json
Removing .vagrant/
Removing artifacts/
fatal: branch name required
~/jobs/test-origin-aggregated-logging/workspace@2
Origin repositories cloned into /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2
+ pushd origin
~/jobs/test-origin-aggregated-logging/workspace@2/origin ~/jobs/test-origin-aggregated-logging/workspace@2
+ INSTANCE_NAME=origin_logging-rhel7-1627
+ GIT_URL=https://github.com/openshift/origin-aggregated-logging
++ echo https://github.com/openshift/origin-aggregated-logging
++ sed s,https://,,
+ OAL_LOCAL_PATH=github.com/openshift/origin-aggregated-logging
+ OS_O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging
+ env
+ sort
_=/bin/env
BRANCH=master
BUILD_CAUSE=UPSTREAMTRIGGER
BUILD_CAUSE_UPSTREAMTRIGGER=true
BUILD_DISPLAY_NAME=#1627
BUILD_ID=1627
BUILD_NUMBER=1627
BUILD_TAG=jenkins-test-origin-aggregated-logging-1627
BUILD_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/1627/
EXECUTOR_NUMBER=49
GITHUB_REPO=openshift
HOME=/var/lib/jenkins
HUDSON_COOKIE=4920a961-2633-406c-8f16-40012b06b1e5
HUDSON_HOME=/var/lib/jenkins
HUDSON_SERVER_COOKIE=ec11f8b2841c966f
HUDSON_URL=https://ci.openshift.redhat.com/jenkins/
INSTANCE_TYPE=c4.xlarge
JENKINS_HOME=/var/lib/jenkins
JENKINS_SERVER_COOKIE=ec11f8b2841c966f
JENKINS_URL=https://ci.openshift.redhat.com/jenkins/
JOB_BASE_NAME=test-origin-aggregated-logging
JOB_DISPLAY_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/display/redirect
JOB_NAME=test-origin-aggregated-logging
JOB_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/
LANG=en_US.UTF-8
LOGNAME=jenkins
MERGE=false
MERGE_SEVERITY=none
NLSPATH=/usr/dt/lib/nls/msg/%L/%N.cat
NODE_LABELS=master
NODE_NAME=master
OLDPWD=/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2
OPENSHIFT_ANSIBLE_TARGET_BRANCH=master
ORIGIN_AGGREGATED_LOGGING_PULL_ID=421
ORIGIN_AGGREGATED_LOGGING_TARGET_BRANCH=master
OS_ANSIBLE_BRANCH=master
OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible
OS=rhel7
OS_ROOT=/data/src/github.com/openshift/origin
PATH=/sbin:/usr/sbin:/bin:/usr/bin
PWD=/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin
ROOT_BUILD_CAUSE=REMOTECAUSE
ROOT_BUILD_CAUSE_REMOTECAUSE=true
RUN_CHANGES_DISPLAY_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/1627/display/redirect?page=changes
RUN_DISPLAY_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/1627/display/redirect
SHELL=/bin/bash
SHLVL=3
TESTNAME=logging
TEST_PERF=false
USER=jenkins
WORKSPACE=/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2
XFILESEARCHPATH=/usr/dt/app-defaults/%L/Dt
+ vagrant origin-init --stage inst --os rhel7 --instance-type c4.xlarge origin_logging-rhel7-1627
Reading AWS credentials from /var/lib/jenkins/.awscred
Searching devenv-rhel7_* for latest base AMI (required_name_tag=)
Found: ami-83a1fc95 (devenv-rhel7_6323)
++ seq 0 2
+ for i in '$(seq 0 2)'
+ vagrant up --provider aws
Bringing machine 'openshiftdev' up with 'aws' provider...
==> openshiftdev: Warning! The AWS provider doesn't support any of the Vagrant
==> openshiftdev: high-level network configurations (`config.vm.network`). They
==> openshiftdev: will be silently ignored.
==> openshiftdev: Warning! You're launching this instance into a VPC without an
==> openshiftdev: elastic IP. Please verify you're properly connected to a VPN so
==> openshiftdev: you can access this machine, otherwise Vagrant will not be able
==> openshiftdev: to SSH into it.
==> openshiftdev: Launching an instance with the following settings...
==> openshiftdev:  -- Type: c4.xlarge
==> openshiftdev:  -- AMI: ami-83a1fc95
==> openshiftdev:  -- Region: us-east-1
==> openshiftdev:  -- Keypair: libra
==> openshiftdev:  -- Subnet ID: subnet-cf57c596
==> openshiftdev:  -- User Data: yes
==> openshiftdev:  -- User Data:
==> openshiftdev: # cloud-config
==> openshiftdev:
==> openshiftdev: growpart:
==> openshiftdev:   mode: auto
==> openshiftdev:   devices: ['/']
==> openshiftdev: runcmd:
==> openshiftdev: - [ sh, -xc, "sed -i s/^Defaults.*requiretty/#Defaults requiretty/g /etc/sudoers"]
==> openshiftdev:
==> openshiftdev:  -- Block Device Mapping: [{"DeviceName"=>"/dev/sda1", "Ebs.VolumeSize"=>25, "Ebs.VolumeType"=>"gp2"}, {"DeviceName"=>"/dev/sdb", "Ebs.VolumeSize"=>35, "Ebs.VolumeType"=>"gp2"}]
==> openshiftdev:  -- Terminate On Shutdown: false
==> openshiftdev:  -- Monitoring: false
==> openshiftdev:  -- EBS optimized: false
==> openshiftdev:  -- Assigning a public IP address in a VPC: false
/var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/middlewares/expects.rb:6:in `response_call': The instance ID 'i-0e2879806f1ee5b7a' does not exist (Fog::Compute::AWS::NotFound)
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/middlewares/response_parser.rb:8:in `response_call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/connection.rb:389:in `response'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/connection.rb:253:in `request'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/middlewares/idempotent.rb:26:in `error_call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/middlewares/base.rb:10:in `error_call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/middlewares/base.rb:10:in `error_call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/connection.rb:273:in `rescue in request'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/connection.rb:221:in `request'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/middlewares/idempotent.rb:26:in `error_call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/middlewares/base.rb:10:in `error_call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/middlewares/base.rb:10:in `error_call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/connection.rb:273:in `rescue in request'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/connection.rb:221:in `request'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/middlewares/idempotent.rb:26:in `error_call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/middlewares/base.rb:10:in `error_call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/middlewares/base.rb:10:in `error_call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/connection.rb:273:in `rescue in request'
	from /var/lib/jenkins/.vagrant.d/gems/gems/excon-0.49.0/lib/excon/connection.rb:221:in `request'
	from /var/lib/jenkins/.vagrant.d/gems/gems/fog-xml-0.1.1/lib/fog/xml/sax_parser_connection.rb:37:in `request'
	from /var/lib/jenkins/.vagrant.d/gems/gems/fog-xml-0.1.1/lib/fog/xml/connection.rb:7:in `request'
	from /var/lib/jenkins/.vagrant.d/gems/gems/fog-1.26.0/lib/fog/aws/compute.rb:522:in `_request'
	from /var/lib/jenkins/.vagrant.d/gems/gems/fog-1.26.0/lib/fog/aws/compute.rb:517:in `request'
	from /var/lib/jenkins/.vagrant.d/gems/gems/fog-1.26.0/lib/fog/aws/requests/compute/create_tags.rb:31:in `create_tags'
	from /var/lib/jenkins/.vagrant.d/gems/gems/fog-1.26.0/lib/fog/aws/models/compute/servers.rb:167:in `block in save_many'
	from /var/lib/jenkins/.vagrant.d/gems/gems/fog-1.26.0/lib/fog/aws/models/compute/servers.rb:161:in `map'
	from /var/lib/jenkins/.vagrant.d/gems/gems/fog-1.26.0/lib/fog/aws/models/compute/servers.rb:161:in `save_many'
	from /var/lib/jenkins/.vagrant.d/gems/gems/fog-1.26.0/lib/fog/aws/models/compute/server.rb:201:in `save'
	from /var/lib/jenkins/.vagrant.d/gems/gems/fog-core-1.43.0/lib/fog/core/collection.rb:51:in `create'
	from /var/lib/jenkins/.vagrant.d/gems/gems/vagrant-aws-0.6.0/lib/vagrant-aws/action/run_instance.rb:102:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/vagrant-openshift-3.0.9/lib/vagrant-openshift/hooks/find_ami.rb:37:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/vagrant-aws-0.6.0/lib/vagrant-aws/action/elb_register_instance.rb:16:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/vagrant-aws-0.6.0/lib/vagrant-aws/action/warn_networks.rb:14:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/synced_folders.rb:86:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/provision.rb:80:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:95:in `block in finalize_action'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builder.rb:116:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in `block in run'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/util/busy.rb:19:in `busy'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in `run'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/call.rb:53:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
	from /var/lib/jenkins/.vagrant.d/gems/gems/vagrant-aws-0.6.0/lib/vagrant-aws/action/connect_aws.rb:43:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/config_validate.rb:25:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builtin/handle_box.rb:56:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/warden.rb:34:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/builder.rb:116:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in `block in run'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/util/busy.rb:19:in `busy'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/action/runner.rb:66:in `run'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/machine.rb:214:in `action_raw'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/machine.rb:191:in `block in action'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/environment.rb:516:in `lock'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/machine.rb:178:in `call'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/machine.rb:178:in `action'
	from /opt/vagrant/embedded/gems/gems/vagrant-1.7.4/lib/vagrant/batch_action.rb:82:in `block (2 levels) in run'
+ echo ''\''vagrant up'\'' failed - retrying'
'vagrant up' failed - retrying
+ vagrant destroy -f
==> openshiftdev: Instance is not created. Please run `vagrant up` first.
+ for i in '$(seq 0 2)'
+ vagrant up --provider aws
Bringing machine 'openshiftdev' up with 'aws' provider...
==> openshiftdev: Warning! The AWS provider doesn't support any of the Vagrant
==> openshiftdev: high-level network configurations (`config.vm.network`). They
==> openshiftdev: will be silently ignored.
==> openshiftdev: Warning! You're launching this instance into a VPC without an
==> openshiftdev: elastic IP. Please verify you're properly connected to a VPN so
==> openshiftdev: you can access this machine, otherwise Vagrant will not be able
==> openshiftdev: to SSH into it.
==> openshiftdev: Launching an instance with the following settings...
==> openshiftdev:  -- Type: c4.xlarge
==> openshiftdev:  -- AMI: ami-83a1fc95
==> openshiftdev:  -- Region: us-east-1
==> openshiftdev:  -- Keypair: libra
==> openshiftdev:  -- Subnet ID: subnet-cf57c596
==> openshiftdev:  -- User Data: yes
==> openshiftdev:  -- User Data:
==> openshiftdev: # cloud-config
==> openshiftdev:
==> openshiftdev: growpart:
==> openshiftdev:   mode: auto
==> openshiftdev:   devices: ['/']
==> openshiftdev: runcmd:
==> openshiftdev: - [ sh, -xc, "sed -i s/^Defaults.*requiretty/#Defaults requiretty/g /etc/sudoers"]
==> openshiftdev:
==> openshiftdev:  -- Block Device Mapping: [{"DeviceName"=>"/dev/sda1", "Ebs.VolumeSize"=>25, "Ebs.VolumeType"=>"gp2"}, {"DeviceName"=>"/dev/sdb", "Ebs.VolumeSize"=>35, "Ebs.VolumeType"=>"gp2"}]
==> openshiftdev:  -- Terminate On Shutdown: false
==> openshiftdev:  -- Monitoring: false
==> openshiftdev:  -- EBS optimized: false
==> openshiftdev:  -- Assigning a public IP address in a VPC: false
==> openshiftdev: Waiting for instance to become "ready"...
==> openshiftdev: Waiting for SSH to become available...
==> openshiftdev: Machine is booted and ready for use!
==> openshiftdev: Running provisioner: setup (shell)...
    openshiftdev: Running: /tmp/vagrant-shell20170608-28638-1bvl5ei.sh
==> openshiftdev: Host: ec2-54-144-35-174.compute-1.amazonaws.com
+ break
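Note: the `+ for i in '$(seq 0 2)'` / `+ break` xtrace above is the job's retry wrapper around `vagrant up`, which absorbed the transient Fog::Compute::AWS::NotFound error on the first attempt. A minimal sketch of that retry pattern, reconstructed from the trace (the actual driver script is not shown in the log):

    # Hypothetical reconstruction: try `vagrant up` up to three times,
    # tearing down any half-created instance between attempts.
    rc=1
    for i in $(seq 0 2); do
        if vagrant up --provider aws; then
            rc=0
            break                      # instance is up; stop retrying
        fi
        echo "'vagrant up' failed - retrying"
        vagrant destroy -f             # clean up the failed attempt
    done
    [ "$rc" -eq 0 ] || exit 1          # give up after three failures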
+ vagrant sync-origin-aggregated-logging -c -s
Running ssh/sudo command 'rm -rf /data/src/github.com/openshift/origin-aggregated-logging-bare; ' with timeout 14400. Attempt #0
Running ssh/sudo command 'mkdir -p /ec2-user/.ssh;
mv /tmp/file20170608-29456-1h2rztc /ec2-user/.ssh/config &&
chown ec2-user:ec2-user /ec2-user/.ssh/config &&
chmod 0600 /ec2-user/.ssh/config' with timeout 14400. Attempt #0
Running ssh/sudo command 'mkdir -p /data/src/github.com/openshift/' with timeout 14400. Attempt #0
Running ssh/sudo command 'mkdir -p /data/src/github.com/openshift/builder && chown -R ec2-user:ec2-user /data/src/github.com/openshift/' with timeout 14400. Attempt #0
Running ssh/sudo command 'set -e
rm -fr /data/src/github.com/openshift/origin-aggregated-logging-bare;
if [ ! -d /data/src/github.com/openshift/origin-aggregated-logging-bare ]; then
git clone --quiet --bare https://github.com/openshift/origin-aggregated-logging.git /data/src/github.com/openshift/origin-aggregated-logging-bare >/dev/null
fi
' with timeout 14400. Attempt #0
Synchronizing local sources
Synchronizing [origin-aggregated-logging@master] from origin-aggregated-logging...
Warning: Permanently added '54.144.35.174' (ECDSA) to the list of known hosts.
Running ssh/sudo command 'set -e
if [ -d /data/src/github.com/openshift/origin-aggregated-logging-bare ]; then
rm -rf /data/src/github.com/openshift/origin-aggregated-logging
echo 'Cloning origin-aggregated-logging ...'
git clone --quiet --recurse-submodules /data/src/github.com/openshift/origin-aggregated-logging-bare /data/src/github.com/openshift/origin-aggregated-logging
else
MISSING_REPO+='origin-aggregated-logging-bare'
fi
if [ -n "$MISSING_REPO" ]; then
echo 'Missing required upstream repositories:'
echo $MISSING_REPO
echo 'To fix, execute command: vagrant clone-upstream-repos'
fi
' with timeout 14400. Attempt #0
Cloning origin-aggregated-logging ...
Submodule 'deployer/common' (https://github.com/openshift/origin-integration-common) registered for path 'deployer/common'
Submodule 'kibana-proxy' (https://github.com/fabric8io/openshift-auth-proxy.git) registered for path 'kibana-proxy'
Cloning into 'deployer/common'...
Submodule path 'deployer/common': checked out '45bf993212cdcbab5cbce3b3fab74a72b851402e'
Cloning into 'kibana-proxy'...
Submodule path 'kibana-proxy': checked out '118dfb40f7a8082d370ba7f4805255c9ec7c8178'
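Note: the two ssh/sudo scripts above implement the source sync. A bare mirror of the GitHub repository is kept on the instance, and each run re-clones the working tree (with submodules) from that local mirror, so only the first clone pays network cost. A minimal sketch of the same pattern:

    # Sync pattern used above: clone GitHub once into a bare mirror, then
    # make cheap local clones (including submodules) on every run.
    BARE=/data/src/github.com/openshift/origin-aggregated-logging-bare
    WORK=/data/src/github.com/openshift/origin-aggregated-logging
    git clone --quiet --bare https://github.com/openshift/origin-aggregated-logging.git "$BARE"
    rm -rf "$WORK"
    git clone --quiet --recurse-submodules "$BARE" "$WORK"   # fast local copy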
+ vagrant ssh -c 'if [ ! -d /tmp/openshift ] ; then mkdir /tmp/openshift ; fi ; sudo chmod 777 /tmp/openshift'
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/base-centos7 ...
pulling image openshift/base-centos7 ...
+ vagrant ssh -c 'docker pull openshift/base-centos7' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/base-centos7 ...
latest: Pulling from docker.io/openshift/base-centos7
45a2e645736c: Pulling fs layer
734fb161cf89: Pulling fs layer
78efc9e155c4: Pulling fs layer
8a3400b7e31a: Pulling fs layer
8a3400b7e31a: Waiting
734fb161cf89: Verifying Checksum
734fb161cf89: Download complete
8a3400b7e31a: Verifying Checksum
8a3400b7e31a: Download complete
78efc9e155c4: Verifying Checksum
78efc9e155c4: Download complete
45a2e645736c: Verifying Checksum
45a2e645736c: Download complete
45a2e645736c: Pull complete
734fb161cf89: Pull complete
78efc9e155c4: Pull complete
8a3400b7e31a: Pull complete
Digest: sha256:aea292a3bddba020cde0ee83e6a45807931eb607c164ec6a3674f67039d8cd7c
+ echo done with openshift/base-centos7
done with openshift/base-centos7
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image centos:centos7 ...
pulling image centos:centos7 ...
+ vagrant ssh -c 'docker pull centos:centos7' -- -n
Trying to pull repository docker.io/library/centos ...
centos7: Pulling from docker.io/library/centos
Digest: sha256:aebf12af704307dfa0079b3babdca8d7e8ff6564696882bcb5d11f1d461f9ee9
+ echo done with centos:centos7
done with centos:centos7
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-elasticsearch ...
pulling image openshift/origin-logging-elasticsearch ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-elasticsearch' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-elasticsearch ...
latest: Pulling from docker.io/openshift/origin-logging-elasticsearch
d5e46245fe40: Already exists
a7f338c4f8f1: Pulling fs layer
9e2e7a74201a: Pulling fs layer
fef68d5538a8: Pulling fs layer
8d01a96d29f1: Pulling fs layer
dbc1ff1ecc57: Pulling fs layer
1bd6b3975e11: Pulling fs layer
50f97f247f0a: Pulling fs layer
1661b1dc3fa9: Pulling fs layer
33ff5ec495e5: Pulling fs layer
239942808138: Pulling fs layer
dbc1ff1ecc57: Waiting
1bd6b3975e11: Waiting
50f97f247f0a: Waiting
1661b1dc3fa9: Waiting
33ff5ec495e5: Waiting
239942808138: Waiting
8d01a96d29f1: Waiting
fef68d5538a8: Verifying Checksum
fef68d5538a8: Download complete
a7f338c4f8f1: Verifying Checksum
8d01a96d29f1: Verifying Checksum
8d01a96d29f1: Download complete
dbc1ff1ecc57: Verifying Checksum
dbc1ff1ecc57: Download complete
1bd6b3975e11: Verifying Checksum
1bd6b3975e11: Download complete
50f97f247f0a: Verifying Checksum
50f97f247f0a: Download complete
33ff5ec495e5: Download complete
239942808138: Verifying Checksum
239942808138: Download complete
1661b1dc3fa9: Verifying Checksum
1661b1dc3fa9: Download complete
9e2e7a74201a: Download complete
a7f338c4f8f1: Pull complete
9e2e7a74201a: Pull complete
fef68d5538a8: Pull complete
8d01a96d29f1: Pull complete
dbc1ff1ecc57: Pull complete
1bd6b3975e11: Pull complete
50f97f247f0a: Pull complete
1661b1dc3fa9: Pull complete
33ff5ec495e5: Pull complete
239942808138: Pull complete
Digest: sha256:1e72563ad0551f5c15fc6aa8057a64cc9d0c21b2c40bca7efabdd1b55a4fc2e4
+ echo done with openshift/origin-logging-elasticsearch
done with openshift/origin-logging-elasticsearch
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-fluentd ...
pulling image openshift/origin-logging-fluentd ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-fluentd' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-fluentd ...
latest: Pulling from docker.io/openshift/origin-logging-fluentd
d5e46245fe40: Already exists
e4a1001ab6e5: Pulling fs layer
574b0fde62a3: Pulling fs layer
e153c28eb839: Pulling fs layer
38620628d3c7: Pulling fs layer
af3228b34eff: Pulling fs layer
38620628d3c7: Waiting
af3228b34eff: Waiting
e153c28eb839: Verifying Checksum
e153c28eb839: Download complete
38620628d3c7: Download complete
af3228b34eff: Verifying Checksum
af3228b34eff: Download complete
574b0fde62a3: Verifying Checksum
574b0fde62a3: Download complete
e4a1001ab6e5: Verifying Checksum
e4a1001ab6e5: Download complete
e4a1001ab6e5: Pull complete
574b0fde62a3: Pull complete
e153c28eb839: Pull complete
38620628d3c7: Pull complete
af3228b34eff: Pull complete
Digest: sha256:bc70848086a50bad58a2f41e166098e8ed351bf4dbe7af83caeb7a29f35b4395
+ echo done with openshift/origin-logging-fluentd
done with openshift/origin-logging-fluentd
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-curator ...
pulling image openshift/origin-logging-curator ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-curator' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-curator ...
latest: Pulling from docker.io/openshift/origin-logging-curator
d5e46245fe40: Already exists
9b159b6e6e2b: Pulling fs layer
e4616c6e28d7: Pulling fs layer
9b159b6e6e2b: Download complete
e4616c6e28d7: Verifying Checksum
e4616c6e28d7: Download complete
9b159b6e6e2b: Pull complete
e4616c6e28d7: Pull complete
Digest: sha256:e820338ca7fb0addfaec25d80d40a49f5ea25b24ff056ab6adbb42dd9eec94b4
+ echo done with openshift/origin-logging-curator
done with openshift/origin-logging-curator
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-kibana ...
pulling image openshift/origin-logging-kibana ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-kibana' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-kibana ...
latest: Pulling from docker.io/openshift/origin-logging-kibana
45a2e645736c: Already exists
734fb161cf89: Already exists
78efc9e155c4: Already exists
8a3400b7e31a: Already exists
51a36b029166: Pulling fs layer
e57c029afcc6: Pulling fs layer
89f2e4ae387a: Pulling fs layer
b036afb2cb60: Pulling fs layer
1c68a5b6ade6: Pulling fs layer
6e5af8882c65: Pulling fs layer
b036afb2cb60: Waiting
6e5af8882c65: Waiting
1c68a5b6ade6: Waiting
51a36b029166: Verifying Checksum
51a36b029166: Download complete
89f2e4ae387a: Download complete
b036afb2cb60: Download complete
1c68a5b6ade6: Verifying Checksum
1c68a5b6ade6: Download complete
51a36b029166: Pull complete
e57c029afcc6: Verifying Checksum
e57c029afcc6: Download complete
6e5af8882c65: Verifying Checksum
6e5af8882c65: Download complete
e57c029afcc6: Pull complete
89f2e4ae387a: Pull complete
b036afb2cb60: Pull complete
1c68a5b6ade6: Pull complete
6e5af8882c65: Pull complete
Digest: sha256:70ead525ed596b73301e8df3ac229e33dd7f8431ec1233b37e96544c556530e9
+ echo done with openshift/origin-logging-kibana
done with openshift/origin-logging-kibana
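Note: the loop above pre-pulls every image the logging tests will need, so the test run itself does not pay registry latency. Reconstructed from the xtrace, it is essentially:

    # Warm the Docker cache on the instance, one `vagrant ssh` per image
    # (`-- -n` keeps ssh from consuming the loop's stdin).
    for image in openshift/base-centos7 centos:centos7 \
                 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd \
                 openshift/origin-logging-curator openshift/origin-logging-kibana; do
        echo "pulling image $image ..."
        vagrant ssh -c "docker pull $image" -- -n
        echo "done with $image"
    done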
+ vagrant test-origin-aggregated-logging -d --env GIT_URL=https://github.com/openshift/origin-aggregated-logging --env GIT_BRANCH=master --env O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging --env OS_ROOT=/data/src/github.com/openshift/origin --env ENABLE_OPS_CLUSTER=true --env USE_LOCAL_SOURCE=true --env TEST_PERF=false --env VERBOSE=1 --env OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible --env OS_ANSIBLE_BRANCH=master
***************************************************
Running GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh...
/data/src/github.com/openshift/origin /data/src/github.com/openshift/origin-aggregated-logging/hack/testing
/data/src/github.com/openshift/origin-aggregated-logging/hack/testing
/data/src/github.com/openshift/origin-aggregated-logging /data/src/github.com/openshift/origin-aggregated-logging/hack/testing
/data/src/github.com/openshift/origin-aggregated-logging/hack/testing
Loaded plugins: amazon-id, rhui-lb, search-disabled-repos
Metadata Cache Created
Loaded plugins: amazon-id, rhui-lb, search-disabled-repos
Resolving Dependencies
--> Running transaction check
---> Package ansible.noarch 0:2.3.0.0-3.el7 will be installed
--> Processing Dependency: sshpass for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-paramiko for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-keyczar for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-httplib2 for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-crypto for package: ansible-2.3.0.0-3.el7.noarch
---> Package python2-pip.noarch 0:8.1.2-5.el7 will be installed
---> Package python2-ruamel-yaml.x86_64 0:0.12.14-9.el7 will be installed
--> Processing Dependency: python2-typing for package: python2-ruamel-yaml-0.12.14-9.el7.x86_64
--> Processing Dependency: python2-ruamel-ordereddict for package: python2-ruamel-yaml-0.12.14-9.el7.x86_64
--> Running transaction check
---> Package python-httplib2.noarch 0:0.9.1-2.el7aos will be installed
---> Package python-keyczar.noarch 0:0.71c-2.el7aos will be installed
--> Processing Dependency: python-pyasn1 for package: python-keyczar-0.71c-2.el7aos.noarch
---> Package python-paramiko.noarch 0:2.1.1-1.el7 will be installed
--> Processing Dependency: python-cryptography for package: python-paramiko-2.1.1-1.el7.noarch
---> Package python2-crypto.x86_64 0:2.6.1-13.el7 will be installed
--> Processing Dependency: libtomcrypt.so.0()(64bit) for package: python2-crypto-2.6.1-13.el7.x86_64
---> Package python2-ruamel-ordereddict.x86_64 0:0.4.9-3.el7 will be installed
---> Package python2-typing.noarch 0:3.5.2.2-3.el7 will be installed
---> Package sshpass.x86_64 0:1.06-1.el7 will be installed
--> Running transaction check
---> Package libtomcrypt.x86_64 0:1.17-23.el7 will be installed
--> Processing Dependency: libtommath >= 0.42.0 for package: libtomcrypt-1.17-23.el7.x86_64
--> Processing Dependency: libtommath.so.0()(64bit) for package: libtomcrypt-1.17-23.el7.x86_64
---> Package python2-cryptography.x86_64 0:1.3.1-3.el7 will be installed
--> Processing Dependency: python-idna >= 2.0 for package: python2-cryptography-1.3.1-3.el7.x86_64
--> Processing Dependency: python-cffi >= 1.4.1 for package: python2-cryptography-1.3.1-3.el7.x86_64
--> Processing Dependency: python-ipaddress for package: python2-cryptography-1.3.1-3.el7.x86_64
--> Processing Dependency: python-enum34 for package: python2-cryptography-1.3.1-3.el7.x86_64
---> Package python2-pyasn1.noarch 0:0.1.9-7.el7 will be installed
--> Running transaction check
---> Package libtommath.x86_64 0:0.42.0-4.el7 will be installed
---> Package python-cffi.x86_64 0:1.6.0-5.el7 will be installed
--> Processing Dependency: python-pycparser for package: python-cffi-1.6.0-5.el7.x86_64
---> Package python-enum34.noarch 0:1.0.4-1.el7 will be installed
---> Package python-idna.noarch 0:2.0-1.el7 will be installed
---> Package python-ipaddress.noarch 0:1.0.16-2.el7 will be installed
--> Running transaction check
---> Package python-pycparser.noarch 0:2.14-1.el7 will be installed
--> Processing Dependency: python-ply for package: python-pycparser-2.14-1.el7.noarch
--> Running transaction check
---> Package python-ply.noarch 0:3.4-10.el7 will be installed
--> Finished Dependency Resolution

Dependencies Resolved

================================================================================
 Package                     Arch    Version         Repository           Size
================================================================================
Installing:
 ansible                     noarch  2.3.0.0-3.el7   epel                5.7 M
 python2-pip                 noarch  8.1.2-5.el7     epel                1.7 M
 python2-ruamel-yaml         x86_64  0.12.14-9.el7   li                  245 k
Installing for dependencies:
 libtomcrypt                 x86_64  1.17-23.el7     epel                224 k
 libtommath                  x86_64  0.42.0-4.el7    epel                 35 k
 python-cffi                 x86_64  1.6.0-5.el7     oso-rhui-rhel-server-releases  218 k
 python-enum34               noarch  1.0.4-1.el7     oso-rhui-rhel-server-releases   52 k
 python-httplib2             noarch  0.9.1-2.el7aos  li                  115 k
 python-idna                 noarch  2.0-1.el7       oso-rhui-rhel-server-releases   92 k
 python-ipaddress            noarch  1.0.16-2.el7    oso-rhui-rhel-server-releases   34 k
 python-keyczar              noarch  0.71c-2.el7aos  rhel-7-server-ose-3.1-rpms     217 k
 python-paramiko             noarch  2.1.1-1.el7     rhel-7-server-ose-3.4-rpms     266 k
 python-ply                  noarch  3.4-10.el7      oso-rhui-rhel-server-releases  123 k
 python-pycparser            noarch  2.14-1.el7      oso-rhui-rhel-server-releases  105 k
 python2-crypto              x86_64  2.6.1-13.el7    epel                476 k
 python2-cryptography        x86_64  1.3.1-3.el7     oso-rhui-rhel-server-releases  471 k
 python2-pyasn1              noarch  0.1.9-7.el7     oso-rhui-rhel-server-releases  100 k
 python2-ruamel-ordereddict  x86_64  0.4.9-3.el7     li                   38 k
 python2-typing              noarch  3.5.2.2-3.el7   epel                 39 k
 sshpass                     x86_64  1.06-1.el7      epel                 21 k

Transaction Summary
================================================================================
Install  3 Packages (+17 Dependent packages)

Total download size: 10 M
Installed size: 47 M
Downloading packages:
--------------------------------------------------------------------------------
Total                                              5.2 MB/s | 10 MB  00:01
Running transaction check
Running transaction test
Transaction test succeeded
Running transaction
  Installing : python2-pyasn1-0.1.9-7.el7.noarch                          1/20
  Installing : sshpass-1.06-1.el7.x86_64                                  2/20
  Installing : libtommath-0.42.0-4.el7.x86_64                             3/20
  Installing : libtomcrypt-1.17-23.el7.x86_64                             4/20
  Installing : python2-crypto-2.6.1-13.el7.x86_64                         5/20
  Installing : python-keyczar-0.71c-2.el7aos.noarch                       6/20
  Installing : python-enum34-1.0.4-1.el7.noarch                           7/20
  Installing : python-ply-3.4-10.el7.noarch                               8/20
  Installing : python-pycparser-2.14-1.el7.noarch                         9/20
  Installing : python-cffi-1.6.0-5.el7.x86_64                            10/20
  Installing : python-httplib2-0.9.1-2.el7aos.noarch                     11/20
  Installing : python-idna-2.0-1.el7.noarch                              12/20
  Installing : python2-ruamel-ordereddict-0.4.9-3.el7.x86_64             13/20
  Installing : python2-typing-3.5.2.2-3.el7.noarch                       14/20
  Installing : python-ipaddress-1.0.16-2.el7.noarch                      15/20
  Installing : python2-cryptography-1.3.1-3.el7.x86_64                   16/20
  Installing : python-paramiko-2.1.1-1.el7.noarch                        17/20
  Installing : ansible-2.3.0.0-3.el7.noarch                              18/20
  Installing : python2-ruamel-yaml-0.12.14-9.el7.x86_64                  19/20
  Installing : python2-pip-8.1.2-5.el7.noarch                            20/20
  Verifying  : python-pycparser-2.14-1.el7.noarch                         1/20
  Verifying  : python-ipaddress-1.0.16-2.el7.noarch                       2/20
  Verifying  : ansible-2.3.0.0-3.el7.noarch                               3/20
  Verifying  : python2-typing-3.5.2.2-3.el7.noarch                        4/20
  Verifying  : python2-pip-8.1.2-5.el7.noarch                             5/20
  Verifying  : python2-pyasn1-0.1.9-7.el7.noarch                          6/20
  Verifying  : libtomcrypt-1.17-23.el7.x86_64                             7/20
  Verifying  : python-cffi-1.6.0-5.el7.x86_64                             8/20
  Verifying  : python2-ruamel-yaml-0.12.14-9.el7.x86_64                   9/20
  Verifying  : python2-ruamel-ordereddict-0.4.9-3.el7.x86_64             10/20
  Verifying  : python-idna-2.0-1.el7.noarch                              11/20
  Verifying  : python-httplib2-0.9.1-2.el7aos.noarch                     12/20
  Verifying  : python-ply-3.4-10.el7.noarch                              13/20
  Verifying  : python-enum34-1.0.4-1.el7.noarch                          14/20
  Verifying  : python-keyczar-0.71c-2.el7aos.noarch                      15/20
  Verifying  : libtommath-0.42.0-4.el7.x86_64                            16/20
  Verifying  : sshpass-1.06-1.el7.x86_64                                 17/20
  Verifying  : python2-cryptography-1.3.1-3.el7.x86_64                   18/20
  Verifying  : python-paramiko-2.1.1-1.el7.noarch                        19/20
  Verifying  : python2-crypto-2.6.1-13.el7.x86_64                        20/20

Installed:
  ansible.noarch 0:2.3.0.0-3.el7        python2-pip.noarch 0:8.1.2-5.el7
  python2-ruamel-yaml.x86_64 0:0.12.14-9.el7

Dependency Installed:
  libtomcrypt.x86_64 0:1.17-23.el7
  libtommath.x86_64 0:0.42.0-4.el7
  python-cffi.x86_64 0:1.6.0-5.el7
  python-enum34.noarch 0:1.0.4-1.el7
  python-httplib2.noarch 0:0.9.1-2.el7aos
  python-idna.noarch 0:2.0-1.el7
  python-ipaddress.noarch 0:1.0.16-2.el7
  python-keyczar.noarch 0:0.71c-2.el7aos
  python-paramiko.noarch 0:2.1.1-1.el7
  python-ply.noarch 0:3.4-10.el7
  python-pycparser.noarch 0:2.14-1.el7
  python2-crypto.x86_64 0:2.6.1-13.el7
  python2-cryptography.x86_64 0:1.3.1-3.el7
  python2-pyasn1.noarch 0:0.1.9-7.el7
  python2-ruamel-ordereddict.x86_64 0:0.4.9-3.el7
  python2-typing.noarch 0:3.5.2.2-3.el7
  sshpass.x86_64 0:1.06-1.el7

Complete!
Cloning into '/tmp/tmp.NlS0ai2Nx2/openhift-ansible'...
Copying oc from path to /usr/local/bin for use by openshift-ansible
Copying oc from path to /usr/bin for use by openshift-ansible
Copying oadm from path to /usr/local/bin for use by openshift-ansible
Copying oadm from path to /usr/bin for use by openshift-ansible
[INFO] Starting logging tests at Thu Jun  8 10:28:12 EDT 2017
Generated new key pair as /tmp/openshift/origin-aggregated-logging/openshift.local.config/master/serviceaccounts.public.key and /tmp/openshift/origin-aggregated-logging/openshift.local.config/master/serviceaccounts.private.key
Generating node credentials ...
Created node config for 172.18.5.99 in /tmp/openshift/origin-aggregated-logging/openshift.local.config/node-172.18.5.99
Wrote master config to: /tmp/openshift/origin-aggregated-logging/openshift.local.config/master/master-config.yaml
Running hack/lib/start.sh:352: executing 'oc get --raw /healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s...
SUCCESS after 27.678s: hack/lib/start.sh:352: executing 'oc get --raw /healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s
Standard output from the command:
ok
Standard error from the command:
The connection to the server 172.18.5.99:8443 was refused - did you specify the right host or port?
... repeated 58 times
Error from server (Forbidden): User "system:admin" cannot "get" on "/healthz"
... repeated 6 times
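Note: each `Running ... expecting ...` / `SUCCESS after ...` pair here and below comes from Origin's os::cmd helpers (hack/lib/cmd.sh), which re-run a command until its output matches an expectation or a deadline passes; the "connection refused ... repeated 58 times" stderr above is the retry churn while the master comes up. A minimal sketch of that polling idiom, assuming plain bash rather than the real helpers:

    # Poll a command until its output contains an expected string, in the
    # spirit of "re-trying every 0.25s until completion or 80.000s".
    try_until_text() {
        local cmd=$1 expected=$2 interval=${3:-0.25} timeout=${4:-80}
        local deadline=$(( $(date +%s) + timeout ))
        while (( $(date +%s) < deadline )); do
            if eval "$cmd" 2>/dev/null | grep -q "$expected"; then
                return 0               # matched: report SUCCESS
            fi
            sleep "$interval"          # not ready yet; retry
        done
        return 1                       # timed out: report FAILURE
    }
    # e.g.: try_until_text "oc get --raw /healthz --config=$ADMIN_KUBECONFIG" ok 0.25 80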
Running hack/lib/start.sh:353: executing 'oc get --raw https://172.18.5.99:10250/healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.5s until completion or 120.000s...
SUCCESS after 0.190s: hack/lib/start.sh:353: executing 'oc get --raw https://172.18.5.99:10250/healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.5s until completion or 120.000s
Standard output from the command:
ok
There was no error output from the command.
Running hack/lib/start.sh:354: executing 'oc get --raw /healthz/ready --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s...
SUCCESS after 0.783s: hack/lib/start.sh:354: executing 'oc get --raw /healthz/ready --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s
Standard output from the command:
ok
Standard error from the command:
Error from server (InternalError): an error on the server ("") has prevented the request from succeeding
Running hack/lib/start.sh:355: executing 'oc get service kubernetes --namespace default --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 160.000s...
SUCCESS after 0.428s: hack/lib/start.sh:355: executing 'oc get service kubernetes --namespace default --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 160.000s
Standard output from the command:
NAME         CLUSTER-IP   EXTERNAL-IP   PORT(S)                 AGE
kubernetes   172.30.0.1   <none>        443/TCP,53/UDP,53/TCP   4s

There was no error output from the command.
Running hack/lib/start.sh:356: executing 'oc get --raw /api/v1/nodes/172.18.5.99 --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 80.000s...
SUCCESS after 0.291s: hack/lib/start.sh:356: executing 'oc get --raw /api/v1/nodes/172.18.5.99 --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 80.000s
Standard output from the command:
{"kind":"Node","apiVersion":"v1","metadata":{"name":"172.18.5.99","selfLink":"/api/v1/nodes/172.18.5.99","uid":"cf4b679d-4c56-11e7-b45c-0ee3ca2e9d16","resourceVersion":"287","creationTimestamp":"2017-06-08T14:28:58Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/hostname":"172.18.5.99"},"annotations":{"volumes.kubernetes.io/controller-managed-attach-detach":"true"}},"spec":{"externalID":"172.18.5.99","providerID":"aws:////i-0595778c10a3c3d79"},"status":{"capacity":{"cpu":"4","memory":"7231688Ki","pods":"40"},"allocatable":{"cpu":"4","memory":"7129288Ki","pods":"40"},"conditions":[{"type":"OutOfDisk","status":"False","lastHeartbeatTime":"2017-06-08T14:28:58Z","lastTransitionTime":"2017-06-08T14:28:58Z","reason":"KubeletHasSufficientDisk","message":"kubelet has sufficient disk space available"},{"type":"MemoryPressure","status":"False","lastHeartbeatTime":"2017-06-08T14:28:58Z","lastTransitionTime":"2017-06-08T14:28:58Z","reason":"KubeletHasSufficientMemory","message":"kubelet has sufficient memory available"},{"type":"DiskPressure","status":"False","lastHeartbeatTime":"2017-06-08T14:28:58Z","lastTransitionTime":"2017-06-08T14:28:58Z","reason":"KubeletHasNoDiskPressure","message":"kubelet has no disk pressure"},{"type":"Ready","status":"True","lastHeartbeatTime":"2017-06-08T14:28:58Z","lastTransitionTime":"2017-06-08T14:28:58Z","reason":"KubeletReady","message":"kubelet is posting ready status"}],"addresses":[{"type":"LegacyHostIP","address":"172.18.5.99"},{"type":"InternalIP","address":"172.18.5.99"},{"type":"Hostname","address":"172.18.5.99"}],"daemonEndpoints":{"kubeletEndpoint":{"Port":10250}},"nodeInfo":{"machineID":"f9370ed252a14f73b014c1301a9b6d1b","systemUUID":"EC294375-D0A5-8D41-E873-7AA385EB3040","bootID":"018d0681-b50c-4944-b009-fae4ea4dc670","kernelVersion":"3.10.0-327.22.2.el7.x86_64","osImage":"Red Hat Enterprise Linux Server 7.3 (Maipo)","containerRuntimeVersion":"docker://1.12.6","kubeletVersion":"v1.6.1+5115d708d7","kubeProxyVersion":"v1.6.1+5115d708d7","operatingSystem":"linux","architecture":"amd64"},"images":[{"names":["openshift/origin-federation:6acabdc","openshift/origin-federation:latest"],"sizeBytes":1205885664},{"names":["openshift/origin-docker-registry:6acabdc","openshift/origin-docker-registry:latest"],"sizeBytes":1100164272},{"names":["openshift/origin-gitserver:6acabdc","openshift/origin-gitserver:latest"],"sizeBytes":1086520226},{"names":["openshift/openvswitch:6acabdc","openshift/openvswitch:latest"],"sizeBytes":1053403667},{"names":["openshift/node:6acabdc","openshift/node:latest"],"sizeBytes":1051721928},{"names":["openshift/origin-keepalived-ipfailover:6acabdc","openshift/origin-keepalived-ipfailover:latest"],"sizeBytes":1028529711},{"names":["openshift/origin-haproxy-router:6acabdc","openshift/origin-haproxy-router:latest"],"sizeBytes":1022758742},{"names":["openshift/origin:6acabdc","openshift/origin:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-f5-router:6acabdc","openshift/origin-f5-router:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-sti-builder:6acabdc","openshift/origin-sti-builder:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-recycler:6acabdc","openshift/origin-recycler:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-deployer:6acabdc","openshift/origin-deployer:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-docker-builder:6acabdc","openshift/origin-docker-builder:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-cluster-capacity:6acabdc","openshift/origin-cluster-capacity:latest"],"sizeBytes":962455026},{"names":["rhel7.1:latest"],"sizeBytes":765301508},{"names":["openshift/dind-master:latest"],"sizeBytes":731456758},{"names":["openshift/dind-node:latest"],"sizeBytes":731453034},{"names":["\u003cnone\u003e@\u003cnone\u003e","\u003cnone\u003e:\u003cnone\u003e"],"sizeBytes":709532011},{"names":["docker.io/openshift/origin-logging-kibana@sha256:70ead525ed596b73301e8df3ac229e33dd7f8431ec1233b37e96544c556530e9","docker.io/openshift/origin-logging-kibana:latest"],"sizeBytes":682851528},{"names":["openshift/dind:latest"],"sizeBytes":640650210},{"names":["docker.io/openshift/origin-logging-elasticsearch@sha256:1e72563ad0551f5c15fc6aa8057a64cc9d0c21b2c40bca7efabdd1b55a4fc2e4","docker.io/openshift/origin-logging-elasticsearch:latest"],"sizeBytes":425433997},{"names":["docker.io/openshift/base-centos7@sha256:aea292a3bddba020cde0ee83e6a45807931eb607c164ec6a3674f67039d8cd7c","docker.io/openshift/base-centos7:latest"],"sizeBytes":383049978},{"names":["rhel7.2:latest"],"sizeBytes":377493597},{"names":["openshift/origin-egress-router:6acabdc","openshift/origin-egress-router:latest"],"sizeBytes":364745713},{"names":["openshift/origin-base:latest"],"sizeBytes":363070172},{"names":["\u003cnone\u003e@\u003cnone\u003e","\u003cnone\u003e:\u003cnone\u003e"],"sizeBytes":363024702},{"names":["docker.io/openshift/origin-logging-fluentd@sha256:bc70848086a50bad58a2f41e166098e8ed351bf4dbe7af83caeb7a29f35b4395","docker.io/openshift/origin-logging-fluentd:latest"],"sizeBytes":359217371},{"names":["docker.io/fedora@sha256:69281ddd7b2600e5f2b17f1e12d7fba25207f459204fb2d15884f8432c479136","docker.io/fedora:25"],"sizeBytes":230864375},{"names":["docker.io/openshift/origin-logging-curator@sha256:e820338ca7fb0addfaec25d80d40a49f5ea25b24ff056ab6adbb42dd9eec94b4","docker.io/openshift/origin-logging-curator:latest"],"sizeBytes":224977691},{"names":["rhel7.3:latest","rhel7:latest"],"sizeBytes":219121266},{"names":["openshift/origin-pod:6acabdc","openshift/origin-pod:latest"],"sizeBytes":213199843},{"names":["registry.access.redhat.com/rhel7.2@sha256:98e6ca5d226c26e31a95cd67716afe22833c943e1926a21daf1a030906a02249","registry.access.redhat.com/rhel7.2:latest"],"sizeBytes":201376319},{"names":["registry.access.redhat.com/rhel7.3@sha256:1e232401d8e0ba53b36b757b4712fbcbd1dab9c21db039c45a84871a74e89e68","registry.access.redhat.com/rhel7.3:latest"],"sizeBytes":192693772},{"names":["docker.io/centos@sha256:bba1de7c9d900a898e3cadbae040dfe8a633c06bc104a0df76ae24483e03c077"],"sizeBytes":192548999},{"names":["openshift/origin-source:latest"],"sizeBytes":192548894},{"names":["docker.io/centos@sha256:aebf12af704307dfa0079b3babdca8d7e8ff6564696882bcb5d11f1d461f9ee9","docker.io/centos:7","docker.io/centos:centos7"],"sizeBytes":192548537},{"names":["registry.access.redhat.com/rhel7.1@sha256:1bc5a4c43bbb29a5a96a61896ff696933be3502e2f5fdc4cde02d9e101731fdd","registry.access.redhat.com/rhel7.1:latest"],"sizeBytes":158229901},{"names":["openshift/hello-openshift:6acabdc","openshift/hello-openshift:latest"],"sizeBytes":5643318}]}}
There was no error output from the command.
serviceaccount "registry" created
clusterrolebinding "registry-registry-role" created
deploymentconfig "docker-registry" created
service "docker-registry" created
--> Creating router router ...
    info: password for stats user admin has been set to GOWmysxkhJ
    serviceaccount "router" created
    clusterrolebinding "router-router-role" created
    deploymentconfig "router" created
    service "router" created
--> Success
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:162: executing 'oadm new-project logging --node-selector=''' expecting success...
SUCCESS after 0.579s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:162: executing 'oadm new-project logging --node-selector=''' expecting success
Standard output from the command:
Created project logging
There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:163: executing 'oc project logging > /dev/null' expecting success...
SUCCESS after 0.310s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:163: executing 'oc project logging > /dev/null' expecting success
There was no output from the command.
There was no error output from the command.
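Note: before the logging stack is deployed, the harness stands up the integrated registry and router and creates the logging project, as the output above shows. The exact invocation is not in the log, so the flags below are assumptions; a minimal sketch of equivalent commands for this Origin vintage:

    # Hypothetical equivalent of the provisioning whose output appears above.
    KUBECONFIG=/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig
    oadm registry --config="$KUBECONFIG"           # deploymentconfig/service "docker-registry"
    oadm router router --config="$KUBECONFIG"      # deploymentconfig/service "router"
    oadm new-project logging --node-selector=''    # empty selector: logging pods may land on any node
    oc project logging                             # make it the active project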
apiVersion: v1
items:
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-elasticsearch
      component: development
      logging-infra: development
      provider: openshift
    name: logging-elasticsearch
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-fluentd
      component: development
      logging-infra: development
      provider: openshift
    name: logging-fluentd
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-kibana
      component: development
      logging-infra: development
      provider: openshift
    name: logging-kibana
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-curator
      component: development
      logging-infra: development
      provider: openshift
    name: logging-curator
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-auth-proxy
      component: development
      logging-infra: development
      provider: openshift
    name: logging-auth-proxy
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-deployment
      component: development
      logging-infra: development
      provider: openshift
    name: origin
  spec:
    dockerImageRepository: openshift/origin
    tags:
    - from:
        kind: DockerImage
        name: openshift/origin:v1.5.0-alpha.2
      name: v1.5.0-alpha.2
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      app: logging-elasticsearch
      component: development
      logging-infra: development
      provider: openshift
    name: logging-elasticsearch
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-elasticsearch:latest
    resources: {}
    source:
      contextDir: elasticsearch
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-fluentd
      component: development
      logging-infra: development
      provider: openshift
    name: logging-fluentd
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-fluentd:latest
    resources: {}
    source:
      contextDir: fluentd
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-kibana
      component: development
      logging-infra: development
      provider: openshift
    name: logging-kibana
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-kibana:latest
    resources: {}
    source:
      contextDir: kibana
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-curator
      component: development
      logging-infra: development
      provider: openshift
    name: logging-curator
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-curator:latest
    resources: {}
    source:
      contextDir: curator
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-auth-proxy
      component: development
      logging-infra: development
      provider: openshift
    name: logging-auth-proxy
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-auth-proxy:latest
    resources: {}
    source:
      contextDir: kibana-proxy
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: library/node:0.10.36
      type: Docker
kind: List
metadata: {}
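Note: the List above is the dev-builds template rendered by the command that follows. After the objects are created, each BuildConfig is fed the local source tree as binary input, which is what the "Uploading directory ... as binary input" lines below are. A minimal sketch of that flow, omitting the harness's build_filter helper (whose definition is not in the log):

    # Render the template with the fork URL/branch, create the objects, then
    # start each build from the local working tree.
    oc process -o yaml -f hack/templates/dev-builds-wo-deployer.yaml \
        -p LOGGING_FORK_URL=https://github.com/openshift/origin-aggregated-logging \
        -p LOGGING_FORK_BRANCH=master | oc create -f -
    for bc in logging-auth-proxy logging-curator logging-elasticsearch logging-fluentd logging-kibana; do
        oc start-build "$bc" --from-dir=/data/src/github.com/openshift/origin-aggregated-logging
    done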
Running hack/testing/build-images:31: executing 'oc process -o yaml -f /data/src/github.com/openshift/origin-aggregated-logging/hack/templates/dev-builds-wo-deployer.yaml -p LOGGING_FORK_URL=https://github.com/openshift/origin-aggregated-logging -p LOGGING_FORK_BRANCH=master | build_filter | oc create -f -' expecting success...
SUCCESS after 0.351s: hack/testing/build-images:31: executing 'oc process -o yaml -f /data/src/github.com/openshift/origin-aggregated-logging/hack/templates/dev-builds-wo-deployer.yaml -p LOGGING_FORK_URL=https://github.com/openshift/origin-aggregated-logging -p LOGGING_FORK_BRANCH=master | build_filter | oc create -f -' expecting success
Standard output from the command:
imagestream "logging-elasticsearch" created
imagestream "logging-fluentd" created
imagestream "logging-kibana" created
imagestream "logging-curator" created
imagestream "logging-auth-proxy" created
imagestream "origin" created
buildconfig "logging-elasticsearch" created
buildconfig "logging-fluentd" created
buildconfig "logging-kibana" created
buildconfig "logging-curator" created
buildconfig "logging-auth-proxy" created
There was no error output from the command.
Running hack/testing/build-images:9: executing 'oc get imagestreamtag origin:latest' expecting success; re-trying every 0.2s until completion or 60.000s...
SUCCESS after 1.008s: hack/testing/build-images:9: executing 'oc get imagestreamtag origin:latest' expecting success; re-trying every 0.2s until completion or 60.000s
Standard output from the command:
NAME            DOCKER REF                                                                                         UPDATED        IMAGENAME
origin:latest   openshift/origin@sha256:4ce85347f606fb161cee6f0f58c68ddbd557716b6742e18e9ca7d9183372480e           1 second ago   sha256:4ce85347f606fb161cee6f0f58c68ddbd557716b6742e18e9ca7d9183372480e
Standard error from the command:
Error from server (NotFound): imagestreamtags.image.openshift.io "origin:latest" not found
... repeated 2 times
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-auth-proxy-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-curator-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-elasticsearch-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-fluentd-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-kibana-1" started
Running hack/testing/build-images:33: executing 'wait_for_builds_complete' expecting success...
FAILURE after 4231.993s: hack/testing/build-images:33: executing 'wait_for_builds_complete' expecting success: the command returned the wrong error code
Standard output from the command:
build "logging-kibana-3" started
build "logging-kibana-4" started
build in progress for logging-kibana - delete failed build logging-kibana-3 status running
build "logging-kibana-3" deleted
build in progress for logging-kibana - delete failed build logging-kibana-1 status complete
build "logging-kibana-1" deleted
error builds are not complete
NAME                      TYPE     FROM             STATUS                       STARTED             DURATION
logging-auth-proxy-1      Docker   Binary@a0be43d   Complete                     About an hour ago   2m50s
logging-curator-1         Docker   Binary@a0be43d   Complete                     About an hour ago   1m54s
logging-elasticsearch-1   Docker   Binary@a0be43d   Complete                     About an hour ago   2m13s
logging-fluentd-1         Docker   Binary@a0be43d   Complete                     About an hour ago   2m5s
logging-kibana-2          Docker   Binary@a0be43d   Cancelled (CancelledBuild)   About an hour ago
logging-kibana-4          Docker   Binary@a0be43d   Complete                     About an hour ago   2m41s

Standard error from the command:
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
Error from server (BadRequest): cannot upload file to build logging-kibana-2 with status Pending
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
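Note: wait_for_builds_complete is the harness helper that timed out here. Judging from its output, it polls the project's builds, deleting and restarting stuck ones, until everything is Complete or a deadline passes; the exact implementation is not in the log. A minimal sketch of just the polling core under that assumption:

    # Hypothetical polling core: succeed once no build in the current
    # project is still New/Pending/Running; fail on any terminal failure.
    wait_for_builds_complete() {
        local deadline=$(( $(date +%s) + 3600 ))
        while (( $(date +%s) < deadline )); do
            local phases
            phases=$(oc get builds -o jsonpath='{.items[*].status.phase}')
            case "$phases" in
                *New*|*Pending*|*Running*) sleep 10 ;;                       # still building
                *Failed*|*Error*|*Cancelled*) echo "error builds are not complete"; return 1 ;;
                *) return 0 ;;                                               # all Complete
            esac
        done
        echo "error builds are not complete"
        return 1
    }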
[ERROR] PID 4246: hack/lib/cmd.sh:241: `return "${return_code}"` exited with status 1.
[INFO] Stack Trace:
[INFO]   1: hack/lib/cmd.sh:241: `return "${return_code}"`
[INFO]   2: hack/testing/build-images:33: os::cmd::expect_success
[INFO]   3: hack/testing/init-log-stack:14: source
[INFO]   4: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:166: source
[INFO]   Exiting with code 1.
/data/src/github.com/openshift/origin-aggregated-logging/hack/lib/log/system.sh: line 31:  4604 Terminated              sar -A -o "${binary_logfile}" 1 86400 > /dev/null 2> "${stderr_logfile}"
[INFO] [CLEANUP] Beginning cleanup routines...
[INFO] [CLEANUP] Dumping cluster events to /tmp/origin-aggregated-logging/artifacts/events.txt
[INFO] [CLEANUP] Dumping etcd contents to /tmp/origin-aggregated-logging/artifacts/etcd
[WARNING] No compiled `etcdhelper` binary was found. Attempting to build one using:
[WARNING]   $ hack/build-go.sh tools/etcdhelper
++ Building go targets for linux/amd64: tools/etcdhelper
/data/src/github.com/openshift/origin-aggregated-logging/../origin/hack/build-go.sh took 51 seconds
2017-06-08 11:50:26.408526 I | warning: ignoring ServerName for user-provided CA for backwards compatibility is deprecated
[INFO] [CLEANUP] Dumping container logs to /tmp/origin-aggregated-logging/logs/containers
[INFO] [CLEANUP] Truncating log files over 200M
[INFO] [CLEANUP] Stopping docker containers
[INFO] [CLEANUP] Removing docker containers
[INFO] [CLEANUP] Killing child processes
[INFO] [CLEANUP] Pruning etcd data directory
[ERROR] /data/src/github.com/openshift/origin-aggregated-logging/logging.sh exited with code 1 after 01h 27m 55s
Error while running ssh/sudo command:
set -e
pushd /data/src/github.com/openshift//origin-aggregated-logging/hack/testing >/dev/null
export PATH=$GOPATH/bin:$PATH
echo '***************************************************'
echo 'Running GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh...'
time GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh
echo 'Finished GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh'
echo '***************************************************'
popd >/dev/null
The SSH command responded with a non-zero exit status. Vagrant assumes that
this means the command failed. The output for this command should be in the log
above. Please read the output to determine what went wrong.
==> openshiftdev: Downloading logs
==> openshiftdev: Downloading artifacts from '/var/log/yum.log' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin/artifacts/yum.log'
==> openshiftdev: Downloading artifacts from '/var/log/secure' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin/artifacts/secure'
==> openshiftdev: Downloading artifacts from '/var/log/audit/audit.log' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin/artifacts/audit.log'
==> openshiftdev: Downloading artifacts from '/tmp/origin-aggregated-logging/' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin/artifacts'
Build step 'Execute shell' marked build as failure
[description-setter] Could not determine description.
[PostBuildScript] - Execution post build scripts.
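Note: the post-build script whose xtrace follows tears the instance down, but first checks the instance's audit log for SELinux AVC denials and only stays green when ausearch reports '<no matches>'. A minimal sketch of that gate, reconstructed from the trace below:

    # Reconstruction of the AVC gate: any SELinux denial recorded on the
    # instance flips rc to a failure before teardown.
    rc=0
    ausearchresult=$(/usr/bin/vagrant ssh -c 'sudo ausearch -m avc')
    if [ "$ausearchresult" != '<no matches>' ]; then
        rc=1                           # denials found: fail this step
    fi
    /usr/bin/vagrant destroy -f        # always terminate the EC2 instance
    exit $rc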
[workspace@2] $ /bin/sh -xe /tmp/hudson2371450256970039401.sh
+ INSTANCE_NAME=origin_logging-rhel7-1627
+ pushd origin
~/jobs/test-origin-aggregated-logging/workspace@2/origin ~/jobs/test-origin-aggregated-logging/workspace@2
+ rc=0
+ '[' -f .vagrant-openshift.json ']'
++ /usr/bin/vagrant ssh -c 'sudo ausearch -m avc'
+ ausearchresult='<no matches>'
+ rc=1
+ '[' '<no matches>' = '<no matches>' ']'
+ rc=0
+ /usr/bin/vagrant destroy -f
==> openshiftdev: Terminating the instance...
==> openshiftdev: Running cleanup tasks for 'shell' provisioner...
+ popd
~/jobs/test-origin-aggregated-logging/workspace@2
+ exit 0
[BFA] Scanning build for known causes...
[BFA] Found failure cause(s):
[BFA] Command Failure from category failure
[BFA] Done. 0s
Finished: FAILURE