Console Output
Started by upstream project "test_pull_request_origin_aggregated_logging" build number 89
originally caused by:
Started by remote host 50.17.198.52
[EnvInject] - Loading node environment variables.
Building in workspace /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
OS_ROOT=/data/src/github.com/openshift/origin
INSTANCE_TYPE=c4.xlarge
GITHUB_REPO=openshift
OS=rhel7
TESTNAME=logging
[EnvInject] - Variables injected successfully.
[workspace@2] $ /bin/sh -xe /tmp/hudson1287636239146572317.sh
+ false
+ unset GOPATH
+ REPO_NAME=origin-aggregated-logging
+ rm -rf origin-aggregated-logging
+ vagrant origin-local-checkout --replace --repo origin-aggregated-logging -b master
You don't seem to have the GOPATH environment variable set on your system.
See: 'go help gopath' for more details about GOPATH.
Waiting for the cloning process to finish
Cloning origin-aggregated-logging ...
Submodule 'deployer/common' (https://github.com/openshift/origin-integration-common) registered for path 'deployer/common'
Submodule 'kibana-proxy' (https://github.com/fabric8io/openshift-auth-proxy.git) registered for path 'kibana-proxy'
Cloning into 'deployer/common'...
Submodule path 'deployer/common': checked out '45bf993212cdcbab5cbce3b3fab74a72b851402e'
Cloning into 'kibana-proxy'...
Submodule path 'kibana-proxy': checked out '118dfb40f7a8082d370ba7f4805255c9ec7c8178'
Origin repositories cloned into /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2
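For orientation, the checkout above (vagrant origin-local-checkout plus submodule init) amounts to a recursive clone; a minimal equivalent sketch, assuming direct git access to the same GitHub remote:

    # Sketch only: the job actually drives this through the vagrant-openshift plugin.
    git clone --recursive https://github.com/openshift/origin-aggregated-logging.git
    cd origin-aggregated-logging
    git checkout master   # submodules deployer/common and kibana-proxy come along via --recursive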
+ pushd origin-aggregated-logging
~/jobs/test-origin-aggregated-logging/workspace@2/origin-aggregated-logging ~/jobs/test-origin-aggregated-logging/workspace@2
+ git checkout master
Already on 'master'
+ popd
~/jobs/test-origin-aggregated-logging/workspace@2
+ '[' -n '' ']'
+ vagrant origin-local-checkout --replace
You don't seem to have the GOPATH environment variable set on your system.
See: 'go help gopath' for more details about GOPATH.
Waiting for the cloning process to finish
Checking repo integrity for /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin
~/jobs/test-origin-aggregated-logging/workspace@2/origin ~/jobs/test-origin-aggregated-logging/workspace@2
# On branch master
# Untracked files:
# (use "git add <file>..." to include in what will be committed)
#
# artifacts/
nothing added to commit but untracked files present (use "git add" to track)
~/jobs/test-origin-aggregated-logging/workspace@2
Replacing: /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin
~/jobs/test-origin-aggregated-logging/workspace@2/origin ~/jobs/test-origin-aggregated-logging/workspace@2
From https://github.com/openshift/origin
71efe29..2458531 master -> origin/master
Already on 'master'
Your branch is behind 'origin/master' by 5 commits, and can be fast-forwarded.
(use "git pull" to update your local branch)
HEAD is now at 2458531 Merge pull request #14451 from chlunde/use-https-centos-paas-yum-repo
Removing .vagrant-openshift.json
Removing .vagrant/
Removing artifacts/
fatal: branch name required
~/jobs/test-origin-aggregated-logging/workspace@2
Origin repositories cloned into /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2
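The second origin-local-checkout pass refreshes the existing origin clone rather than recreating it. One plausible equivalent of the "Replacing:" block above, sketched under the assumption that it fast-forwards master and scrubs untracked files:

    # Assumed semantics of the refresh above (fetch, fast-forward, clean).
    cd /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin
    git fetch origin                     # 71efe29..2458531  master -> origin/master
    git checkout master
    git merge --ff-only origin/master    # "behind 'origin/master' by 5 commits, and can be fast-forwarded"
    git clean -fd                        # the "Removing ..." lines above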
+ pushd origin
~/jobs/test-origin-aggregated-logging/workspace@2/origin ~/jobs/test-origin-aggregated-logging/workspace@2
+ INSTANCE_NAME=origin_logging-rhel7-1650
+ GIT_URL=https://github.com/openshift/origin-aggregated-logging
++ echo https://github.com/openshift/origin-aggregated-logging
++ sed s,https://,,
+ OAL_LOCAL_PATH=github.com/openshift/origin-aggregated-logging
+ OS_O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging
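The two `++` xtrace lines above strip the scheme from the repo URL to build a GOPATH-style directory; the same derivation in isolation:

    GIT_URL=https://github.com/openshift/origin-aggregated-logging
    # Strip the scheme to get a GOPATH-style import path, then anchor it under /data/src.
    OAL_LOCAL_PATH=$(echo "$GIT_URL" | sed 's,https://,,')
    OS_O_A_L_DIR=/data/src/$OAL_LOCAL_PATH   # /data/src/github.com/openshift/origin-aggregated-logging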
+ env
+ sort
_=/bin/env
BRANCH=master
BUILD_CAUSE=UPSTREAMTRIGGER
BUILD_CAUSE_UPSTREAMTRIGGER=true
BUILD_DISPLAY_NAME=#1650
BUILD_ID=1650
BUILD_NUMBER=1650
BUILD_TAG=jenkins-test-origin-aggregated-logging-1650
BUILD_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/1650/
EXECUTOR_NUMBER=12
GITHUB_REPO=openshift
HOME=/var/lib/jenkins
HUDSON_COOKIE=34bf1955-b32d-4565-8ea5-b04b89adf26d
HUDSON_HOME=/var/lib/jenkins
HUDSON_SERVER_COOKIE=ec11f8b2841c966f
HUDSON_URL=https://ci.openshift.redhat.com/jenkins/
INSTANCE_TYPE=c4.xlarge
JENKINS_HOME=/var/lib/jenkins
JENKINS_SERVER_COOKIE=ec11f8b2841c966f
JENKINS_URL=https://ci.openshift.redhat.com/jenkins/
JOB_BASE_NAME=test-origin-aggregated-logging
JOB_DISPLAY_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/display/redirect
JOB_NAME=test-origin-aggregated-logging
JOB_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/
LANG=en_US.UTF-8
LOGNAME=jenkins
MERGE=false
MERGE_SEVERITY=none
NLSPATH=/usr/dt/lib/nls/msg/%L/%N.cat
NODE_LABELS=master
NODE_NAME=master
OLDPWD=/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2
OPENSHIFT_ANSIBLE_TARGET_BRANCH=master
ORIGIN_AGGREGATED_LOGGING_PULL_ID=464
ORIGIN_AGGREGATED_LOGGING_TARGET_BRANCH=master
OS_ANSIBLE_BRANCH=master
OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible
OS=rhel7
OS_ROOT=/data/src/github.com/openshift/origin
PATH=/sbin:/usr/sbin:/bin:/usr/bin
PWD=/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin
ROOT_BUILD_CAUSE=REMOTECAUSE
ROOT_BUILD_CAUSE_REMOTECAUSE=true
RUN_CHANGES_DISPLAY_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/1650/display/redirect?page=changes
RUN_DISPLAY_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/1650/display/redirect
SHELL=/bin/bash
SHLVL=3
TESTNAME=logging
TEST_PERF=false
USER=jenkins
WORKSPACE=/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2
XFILESEARCHPATH=/usr/dt/app-defaults/%L/Dt
+ vagrant origin-init --stage inst --os rhel7 --instance-type c4.xlarge origin_logging-rhel7-1650
Reading AWS credentials from /var/lib/jenkins/.awscred
Searching devenv-rhel7_* for latest base AMI (required_name_tag=)
Found: ami-83a1fc95 (devenv-rhel7_6323)
++ seq 0 2
+ for i in '$(seq 0 2)'
+ vagrant up --provider aws
Bringing machine 'openshiftdev' up with 'aws' provider...
==> openshiftdev: Warning! The AWS provider doesn't support any of the Vagrant
==> openshiftdev: high-level network configurations (`config.vm.network`). They
==> openshiftdev: will be silently ignored.
==> openshiftdev: Warning! You're launching this instance into a VPC without an
==> openshiftdev: elastic IP. Please verify you're properly connected to a VPN so
==> openshiftdev: you can access this machine, otherwise Vagrant will not be able
==> openshiftdev: to SSH into it.
==> openshiftdev: Launching an instance with the following settings...
==> openshiftdev: -- Type: c4.xlarge
==> openshiftdev: -- AMI: ami-83a1fc95
==> openshiftdev: -- Region: us-east-1
==> openshiftdev: -- Keypair: libra
==> openshiftdev: -- Subnet ID: subnet-cf57c596
==> openshiftdev: -- User Data: yes
==> openshiftdev: -- User Data:
==> openshiftdev: # cloud-config
==> openshiftdev:
==> openshiftdev: growpart:
==> openshiftdev: mode: auto
==> openshiftdev: devices: ['/']
==> openshiftdev: runcmd:
==> openshiftdev: - [ sh, -xc, "sed -i s/^Defaults.*requiretty/#Defaults requiretty/g /etc/sudoers"]
==> openshiftdev:
==> openshiftdev: -- Block Device Mapping: [{"DeviceName"=>"/dev/sda1", "Ebs.VolumeSize"=>25, "Ebs.VolumeType"=>"gp2"}, {"DeviceName"=>"/dev/sdb", "Ebs.VolumeSize"=>35, "Ebs.VolumeType"=>"gp2"}]
==> openshiftdev: -- Terminate On Shutdown: false
==> openshiftdev: -- Monitoring: false
==> openshiftdev: -- EBS optimized: false
==> openshiftdev: -- Assigning a public IP address in a VPC: false
==> openshiftdev: Waiting for instance to become "ready"...
==> openshiftdev: Waiting for SSH to become available...
==> openshiftdev: Machine is booted and ready for use!
==> openshiftdev: Running provisioner: setup (shell)...
openshiftdev: Running: /tmp/vagrant-shell20170608-22400-1gclebf.sh
==> openshiftdev: Host: ec2-34-207-97-147.compute-1.amazonaws.com
+ break
+ vagrant sync-origin-aggregated-logging -c -s
Running ssh/sudo command 'rm -rf /data/src/github.com/openshift/origin-aggregated-logging-bare;
' with timeout 14400. Attempt #0
Running ssh/sudo command 'mkdir -p /ec2-user/.ssh;
mv /tmp/file20170608-22907-11isunq /ec2-user/.ssh/config &&
chown ec2-user:ec2-user /ec2-user/.ssh/config &&
chmod 0600 /ec2-user/.ssh/config' with timeout 14400. Attempt #0
Running ssh/sudo command 'mkdir -p /data/src/github.com/openshift/' with timeout 14400. Attempt #0
Running ssh/sudo command 'mkdir -p /data/src/github.com/openshift/builder && chown -R ec2-user:ec2-user /data/src/github.com/openshift/' with timeout 14400. Attempt #0
Running ssh/sudo command 'set -e
rm -fr /data/src/github.com/openshift/origin-aggregated-logging-bare;
if [ ! -d /data/src/github.com/openshift/origin-aggregated-logging-bare ]; then
git clone --quiet --bare https://github.com/openshift/origin-aggregated-logging.git /data/src/github.com/openshift/origin-aggregated-logging-bare >/dev/null
fi
' with timeout 14400. Attempt #0
Synchronizing local sources
Synchronizing [origin-aggregated-logging@master] from origin-aggregated-logging...
Warning: Permanently added '34.207.97.147' (ECDSA) to the list of known hosts.
Running ssh/sudo command 'set -e
if [ -d /data/src/github.com/openshift/origin-aggregated-logging-bare ]; then
rm -rf /data/src/github.com/openshift/origin-aggregated-logging
echo 'Cloning origin-aggregated-logging ...'
git clone --quiet --recurse-submodules /data/src/github.com/openshift/origin-aggregated-logging-bare /data/src/github.com/openshift/origin-aggregated-logging
else
MISSING_REPO+='origin-aggregated-logging-bare'
fi
if [ -n "$MISSING_REPO" ]; then
echo 'Missing required upstream repositories:'
echo $MISSING_REPO
echo 'To fix, execute command: vagrant clone-upstream-repos'
fi
' with timeout 14400. Attempt #0
Cloning origin-aggregated-logging ...
Submodule 'deployer/common' (https://github.com/openshift/origin-integration-common) registered for path 'deployer/common'
Submodule 'kibana-proxy' (https://github.com/fabric8io/openshift-auth-proxy.git) registered for path 'kibana-proxy'
Cloning into 'deployer/common'...
Submodule path 'deployer/common': checked out '45bf993212cdcbab5cbce3b3fab74a72b851402e'
Cloning into 'kibana-proxy'...
Submodule path 'kibana-proxy': checked out '118dfb40f7a8082d370ba7f4805255c9ec7c8178'
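The sync above follows a bare-mirror pattern: a bare clone is (re)created on the instance, and the working tree is then cloned from that local bare repo. Condensed from the quoted ssh/sudo commands:

    # Condensed from the ssh/sudo commands above; paths as used on the instance.
    BARE=/data/src/github.com/openshift/origin-aggregated-logging-bare
    WORK=/data/src/github.com/openshift/origin-aggregated-logging
    rm -rf "$BARE"
    git clone --quiet --bare https://github.com/openshift/origin-aggregated-logging.git "$BARE" >/dev/null
    rm -rf "$WORK"
    git clone --quiet --recurse-submodules "$BARE" "$WORK"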
+ vagrant ssh -c 'if [ ! -d /tmp/openshift ] ; then mkdir /tmp/openshift ; fi ; sudo chmod 777 /tmp/openshift'
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/base-centos7 ...
pulling image openshift/base-centos7 ...
+ vagrant ssh -c 'docker pull openshift/base-centos7' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/base-centos7 ...
latest: Pulling from docker.io/openshift/base-centos7
45a2e645736c: Pulling fs layer
734fb161cf89: Pulling fs layer
78efc9e155c4: Pulling fs layer
8a3400b7e31a: Pulling fs layer
8a3400b7e31a: Waiting
734fb161cf89: Verifying Checksum
734fb161cf89: Download complete
8a3400b7e31a: Verifying Checksum
8a3400b7e31a: Download complete
45a2e645736c: Verifying Checksum
45a2e645736c: Download complete
78efc9e155c4: Verifying Checksum
78efc9e155c4: Download complete
45a2e645736c: Pull complete
734fb161cf89: Pull complete
78efc9e155c4: Pull complete
8a3400b7e31a: Pull complete
Digest: sha256:aea292a3bddba020cde0ee83e6a45807931eb607c164ec6a3674f67039d8cd7c
+ echo done with openshift/base-centos7
done with openshift/base-centos7
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image centos:centos7 ...
pulling image centos:centos7 ...
+ vagrant ssh -c 'docker pull centos:centos7' -- -n
Trying to pull repository docker.io/library/centos ...
centos7: Pulling from docker.io/library/centos
Digest: sha256:aebf12af704307dfa0079b3babdca8d7e8ff6564696882bcb5d11f1d461f9ee9
+ echo done with centos:centos7
done with centos:centos7
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-elasticsearch ...
pulling image openshift/origin-logging-elasticsearch ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-elasticsearch' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-elasticsearch ...
latest: Pulling from docker.io/openshift/origin-logging-elasticsearch
d5e46245fe40: Already exists
ab4780386529: Pulling fs layer
80503ae3b0fe: Pulling fs layer
110d90898f8a: Pulling fs layer
b110708dfac6: Pulling fs layer
8f9ecbfd25ab: Pulling fs layer
29d7ed0baa52: Pulling fs layer
17ebbcb3d605: Pulling fs layer
d37a5fc9cbde: Pulling fs layer
060ad1853242: Pulling fs layer
eee851304b3a: Pulling fs layer
b110708dfac6: Waiting
8f9ecbfd25ab: Waiting
29d7ed0baa52: Waiting
060ad1853242: Waiting
d37a5fc9cbde: Waiting
17ebbcb3d605: Waiting
eee851304b3a: Waiting
ab4780386529: Verifying Checksum
110d90898f8a: Verifying Checksum
110d90898f8a: Download complete
b110708dfac6: Download complete
29d7ed0baa52: Verifying Checksum
29d7ed0baa52: Download complete
8f9ecbfd25ab: Verifying Checksum
8f9ecbfd25ab: Download complete
17ebbcb3d605: Verifying Checksum
17ebbcb3d605: Download complete
060ad1853242: Verifying Checksum
060ad1853242: Download complete
eee851304b3a: Verifying Checksum
eee851304b3a: Download complete
d37a5fc9cbde: Verifying Checksum
d37a5fc9cbde: Download complete
80503ae3b0fe: Verifying Checksum
80503ae3b0fe: Download complete
ab4780386529: Pull complete
80503ae3b0fe: Pull complete
110d90898f8a: Pull complete
b110708dfac6: Pull complete
8f9ecbfd25ab: Pull complete
29d7ed0baa52: Pull complete
17ebbcb3d605: Pull complete
d37a5fc9cbde: Pull complete
060ad1853242: Pull complete
eee851304b3a: Pull complete
Digest: sha256:3a4d359a10d7655cdca2cfa3a89771d6825ffe1d50de4ac7bb570e79f862ccfb
+ echo done with openshift/origin-logging-elasticsearch
done with openshift/origin-logging-elasticsearch
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-fluentd ...
pulling image openshift/origin-logging-fluentd ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-fluentd' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-fluentd ...
latest: Pulling from docker.io/openshift/origin-logging-fluentd
d5e46245fe40: Already exists
e0f9da45960a: Pulling fs layer
b7564a1b49c3: Pulling fs layer
1f0ac0ad59f6: Pulling fs layer
a036466e4202: Pulling fs layer
954e91cd4a3c: Pulling fs layer
a036466e4202: Waiting
954e91cd4a3c: Waiting
1f0ac0ad59f6: Verifying Checksum
1f0ac0ad59f6: Download complete
a036466e4202: Verifying Checksum
a036466e4202: Download complete
b7564a1b49c3: Verifying Checksum
b7564a1b49c3: Download complete
954e91cd4a3c: Verifying Checksum
954e91cd4a3c: Download complete
e0f9da45960a: Verifying Checksum
e0f9da45960a: Download complete
e0f9da45960a: Pull complete
b7564a1b49c3: Pull complete
1f0ac0ad59f6: Pull complete
a036466e4202: Pull complete
954e91cd4a3c: Pull complete
Digest: sha256:8e382dfb002d4f0788d8c5d30ec1baff8005c548bc49fa061fc24d9a0302d9e9
+ echo done with openshift/origin-logging-fluentd
done with openshift/origin-logging-fluentd
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-curator ...
pulling image openshift/origin-logging-curator ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-curator' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-curator ...
latest: Pulling from docker.io/openshift/origin-logging-curator
d5e46245fe40: Already exists
45b57d2b5ea1: Pulling fs layer
a2722b2a33b6: Pulling fs layer
45b57d2b5ea1: Verifying Checksum
45b57d2b5ea1: Download complete
45b57d2b5ea1: Pull complete
a2722b2a33b6: Verifying Checksum
a2722b2a33b6: Download complete
a2722b2a33b6: Pull complete
Digest: sha256:ee6d3de66a3dac118b6c961786fc075276bda8c688f9bf8e24f6559b38f0fbeb
+ echo done with openshift/origin-logging-curator
done with openshift/origin-logging-curator
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-kibana ...
pulling image openshift/origin-logging-kibana ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-kibana' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-kibana ...
latest: Pulling from docker.io/openshift/origin-logging-kibana
45a2e645736c: Already exists
734fb161cf89: Already exists
78efc9e155c4: Already exists
8a3400b7e31a: Already exists
6e4f505c5772: Pulling fs layer
a746d34fe6c3: Pulling fs layer
2e2d74c80385: Pulling fs layer
8f4b9444f21e: Pulling fs layer
9a5f7882bf53: Pulling fs layer
1a2586e469f9: Pulling fs layer
8f4b9444f21e: Waiting
9a5f7882bf53: Waiting
1a2586e469f9: Waiting
6e4f505c5772: Verifying Checksum
6e4f505c5772: Download complete
2e2d74c80385: Verifying Checksum
2e2d74c80385: Download complete
8f4b9444f21e: Verifying Checksum
8f4b9444f21e: Download complete
9a5f7882bf53: Verifying Checksum
9a5f7882bf53: Download complete
6e4f505c5772: Pull complete
a746d34fe6c3: Verifying Checksum
a746d34fe6c3: Download complete
1a2586e469f9: Verifying Checksum
1a2586e469f9: Download complete
a746d34fe6c3: Pull complete
2e2d74c80385: Pull complete
8f4b9444f21e: Pull complete
9a5f7882bf53: Pull complete
1a2586e469f9: Pull complete
Digest: sha256:9e3e11edb1f14c744ecf9587a3212e7648934a8bb302513ba84a8c6b058a1229
+ echo done with openshift/origin-logging-kibana
done with openshift/origin-logging-kibana
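All six pulls above come from one loop in the driving script, restated here as plain shell; the point is to warm the instance's docker cache so deployments later in the run do not pay image-pull latency:

    for image in openshift/base-centos7 centos:centos7 \
        openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd \
        openshift/origin-logging-curator openshift/origin-logging-kibana; do
      echo "pulling image $image ..."
      vagrant ssh -c "docker pull $image" -- -n
      echo "done with $image"
    done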
+ vagrant test-origin-aggregated-logging -d --env GIT_URL=https://github.com/openshift/origin-aggregated-logging --env GIT_BRANCH=master --env O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging --env OS_ROOT=/data/src/github.com/openshift/origin --env ENABLE_OPS_CLUSTER=true --env USE_LOCAL_SOURCE=true --env TEST_PERF=false --env VERBOSE=1 --env OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible --env OS_ANSIBLE_BRANCH=master
***************************************************
Running GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh...
/data/src/github.com/openshift/origin /data/src/github.com/openshift/origin-aggregated-logging/hack/testing
/data/src/github.com/openshift/origin-aggregated-logging/hack/testing
/data/src/github.com/openshift/origin-aggregated-logging /data/src/github.com/openshift/origin-aggregated-logging/hack/testing
/data/src/github.com/openshift/origin-aggregated-logging/hack/testing
Loaded plugins: amazon-id, rhui-lb, search-disabled-repos
Metadata Cache Created
Loaded plugins: amazon-id, rhui-lb, search-disabled-repos
Resolving Dependencies
--> Running transaction check
---> Package ansible.noarch 0:2.3.0.0-3.el7 will be installed
--> Processing Dependency: sshpass for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-paramiko for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-keyczar for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-httplib2 for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-crypto for package: ansible-2.3.0.0-3.el7.noarch
---> Package python2-pip.noarch 0:8.1.2-5.el7 will be installed
---> Package python2-ruamel-yaml.x86_64 0:0.12.14-9.el7 will be installed
--> Processing Dependency: python2-typing for package: python2-ruamel-yaml-0.12.14-9.el7.x86_64
--> Processing Dependency: python2-ruamel-ordereddict for package: python2-ruamel-yaml-0.12.14-9.el7.x86_64
--> Running transaction check
---> Package python-httplib2.noarch 0:0.9.1-2.el7aos will be installed
---> Package python-keyczar.noarch 0:0.71c-2.el7aos will be installed
--> Processing Dependency: python-pyasn1 for package: python-keyczar-0.71c-2.el7aos.noarch
---> Package python-paramiko.noarch 0:2.1.1-1.el7 will be installed
--> Processing Dependency: python-cryptography for package: python-paramiko-2.1.1-1.el7.noarch
---> Package python2-crypto.x86_64 0:2.6.1-13.el7 will be installed
--> Processing Dependency: libtomcrypt.so.0()(64bit) for package: python2-crypto-2.6.1-13.el7.x86_64
---> Package python2-ruamel-ordereddict.x86_64 0:0.4.9-3.el7 will be installed
---> Package python2-typing.noarch 0:3.5.2.2-3.el7 will be installed
---> Package sshpass.x86_64 0:1.06-1.el7 will be installed
--> Running transaction check
---> Package libtomcrypt.x86_64 0:1.17-23.el7 will be installed
--> Processing Dependency: libtommath >= 0.42.0 for package: libtomcrypt-1.17-23.el7.x86_64
--> Processing Dependency: libtommath.so.0()(64bit) for package: libtomcrypt-1.17-23.el7.x86_64
---> Package python2-cryptography.x86_64 0:1.3.1-3.el7 will be installed
--> Processing Dependency: python-idna >= 2.0 for package: python2-cryptography-1.3.1-3.el7.x86_64
--> Processing Dependency: python-cffi >= 1.4.1 for package: python2-cryptography-1.3.1-3.el7.x86_64
--> Processing Dependency: python-ipaddress for package: python2-cryptography-1.3.1-3.el7.x86_64
--> Processing Dependency: python-enum34 for package: python2-cryptography-1.3.1-3.el7.x86_64
---> Package python2-pyasn1.noarch 0:0.1.9-7.el7 will be installed
--> Running transaction check
---> Package libtommath.x86_64 0:0.42.0-4.el7 will be installed
---> Package python-cffi.x86_64 0:1.6.0-5.el7 will be installed
--> Processing Dependency: python-pycparser for package: python-cffi-1.6.0-5.el7.x86_64
---> Package python-enum34.noarch 0:1.0.4-1.el7 will be installed
---> Package python-idna.noarch 0:2.0-1.el7 will be installed
---> Package python-ipaddress.noarch 0:1.0.16-2.el7 will be installed
--> Running transaction check
---> Package python-pycparser.noarch 0:2.14-1.el7 will be installed
--> Processing Dependency: python-ply for package: python-pycparser-2.14-1.el7.noarch
--> Running transaction check
---> Package python-ply.noarch 0:3.4-10.el7 will be installed
--> Finished Dependency Resolution
Dependencies Resolved
================================================================================
 Package                      Arch     Version          Repository                      Size
================================================================================
Installing:
 ansible                      noarch   2.3.0.0-3.el7    epel                           5.7 M
 python2-pip                  noarch   8.1.2-5.el7      epel                           1.7 M
 python2-ruamel-yaml          x86_64   0.12.14-9.el7    li                             245 k
Installing for dependencies:
 libtomcrypt                  x86_64   1.17-23.el7      epel                           224 k
 libtommath                   x86_64   0.42.0-4.el7     epel                            35 k
 python-cffi                  x86_64   1.6.0-5.el7      oso-rhui-rhel-server-releases  218 k
 python-enum34                noarch   1.0.4-1.el7      oso-rhui-rhel-server-releases   52 k
 python-httplib2              noarch   0.9.1-2.el7aos   li                             115 k
 python-idna                  noarch   2.0-1.el7        oso-rhui-rhel-server-releases   92 k
 python-ipaddress             noarch   1.0.16-2.el7     oso-rhui-rhel-server-releases   34 k
 python-keyczar               noarch   0.71c-2.el7aos   rhel-7-server-ose-3.1-rpms     217 k
 python-paramiko              noarch   2.1.1-1.el7      rhel-7-server-ose-3.4-rpms     266 k
 python-ply                   noarch   3.4-10.el7       oso-rhui-rhel-server-releases  123 k
 python-pycparser             noarch   2.14-1.el7       oso-rhui-rhel-server-releases  105 k
 python2-crypto               x86_64   2.6.1-13.el7     epel                           476 k
 python2-cryptography         x86_64   1.3.1-3.el7      oso-rhui-rhel-server-releases  471 k
 python2-pyasn1               noarch   0.1.9-7.el7      oso-rhui-rhel-server-releases  100 k
 python2-ruamel-ordereddict   x86_64   0.4.9-3.el7      li                              38 k
 python2-typing               noarch   3.5.2.2-3.el7    epel                            39 k
 sshpass                      x86_64   1.06-1.el7       epel                            21 k

Transaction Summary
================================================================================
Install  3 Packages (+17 Dependent packages)
Total download size: 10 M
Installed size: 47 M
Downloading packages:
--------------------------------------------------------------------------------
Total 13 MB/s | 10 MB 00:00
Running transaction check
Running transaction test
Transaction test succeeded
Running transaction
Installing : python2-pyasn1-0.1.9-7.el7.noarch 1/20
Installing : sshpass-1.06-1.el7.x86_64 2/20
Installing : libtommath-0.42.0-4.el7.x86_64 3/20
Installing : libtomcrypt-1.17-23.el7.x86_64 4/20
Installing : python2-crypto-2.6.1-13.el7.x86_64 5/20
Installing : python-keyczar-0.71c-2.el7aos.noarch 6/20
Installing : python-enum34-1.0.4-1.el7.noarch 7/20
Installing : python-ply-3.4-10.el7.noarch 8/20
Installing : python-pycparser-2.14-1.el7.noarch 9/20
Installing : python-cffi-1.6.0-5.el7.x86_64 10/20
Installing : python-httplib2-0.9.1-2.el7aos.noarch 11/20
Installing : python-idna-2.0-1.el7.noarch 12/20
Installing : python2-ruamel-ordereddict-0.4.9-3.el7.x86_64 13/20
Installing : python2-typing-3.5.2.2-3.el7.noarch 14/20
Installing : python-ipaddress-1.0.16-2.el7.noarch 15/20
Installing : python2-cryptography-1.3.1-3.el7.x86_64 16/20
Installing : python-paramiko-2.1.1-1.el7.noarch 17/20
Installing : ansible-2.3.0.0-3.el7.noarch 18/20
Installing : python2-ruamel-yaml-0.12.14-9.el7.x86_64 19/20
Installing : python2-pip-8.1.2-5.el7.noarch 20/20
Verifying : python-pycparser-2.14-1.el7.noarch 1/20
Verifying : python-ipaddress-1.0.16-2.el7.noarch 2/20
Verifying : ansible-2.3.0.0-3.el7.noarch 3/20
Verifying : python2-typing-3.5.2.2-3.el7.noarch 4/20
Verifying : python2-pip-8.1.2-5.el7.noarch 5/20
Verifying : python2-pyasn1-0.1.9-7.el7.noarch 6/20
Verifying : libtomcrypt-1.17-23.el7.x86_64 7/20
Verifying : python-cffi-1.6.0-5.el7.x86_64 8/20
Verifying : python2-ruamel-yaml-0.12.14-9.el7.x86_64 9/20
Verifying : python2-ruamel-ordereddict-0.4.9-3.el7.x86_64 10/20
Verifying : python-idna-2.0-1.el7.noarch 11/20
Verifying : python-httplib2-0.9.1-2.el7aos.noarch 12/20
Verifying : python-ply-3.4-10.el7.noarch 13/20
Verifying : python-enum34-1.0.4-1.el7.noarch 14/20
Verifying : python-keyczar-0.71c-2.el7aos.noarch 15/20
Verifying : libtommath-0.42.0-4.el7.x86_64 16/20
Verifying : sshpass-1.06-1.el7.x86_64 17/20
Verifying : python2-cryptography-1.3.1-3.el7.x86_64 18/20
Verifying : python-paramiko-2.1.1-1.el7.noarch 19/20
Verifying : python2-crypto-2.6.1-13.el7.x86_64 20/20
Installed:
ansible.noarch 0:2.3.0.0-3.el7 python2-pip.noarch 0:8.1.2-5.el7
python2-ruamel-yaml.x86_64 0:0.12.14-9.el7
Dependency Installed:
libtomcrypt.x86_64 0:1.17-23.el7
libtommath.x86_64 0:0.42.0-4.el7
python-cffi.x86_64 0:1.6.0-5.el7
python-enum34.noarch 0:1.0.4-1.el7
python-httplib2.noarch 0:0.9.1-2.el7aos
python-idna.noarch 0:2.0-1.el7
python-ipaddress.noarch 0:1.0.16-2.el7
python-keyczar.noarch 0:0.71c-2.el7aos
python-paramiko.noarch 0:2.1.1-1.el7
python-ply.noarch 0:3.4-10.el7
python-pycparser.noarch 0:2.14-1.el7
python2-crypto.x86_64 0:2.6.1-13.el7
python2-cryptography.x86_64 0:1.3.1-3.el7
python2-pyasn1.noarch 0:0.1.9-7.el7
python2-ruamel-ordereddict.x86_64 0:0.4.9-3.el7
python2-typing.noarch 0:3.5.2.2-3.el7
sshpass.x86_64 0:1.06-1.el7
Complete!
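The transaction above resolves from three explicitly requested packages; presumably a single install along these lines (exact flags and repo setup come from the harness and are not shown in the log):

    # Assumption: the harness issues roughly this; only the transaction output is logged.
    yum install -y ansible python2-pip python2-ruamel-yaml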
Cloning into '/tmp/tmp.tLTpRAtiGP/openhift-ansible'...
Copying oc from path to /usr/local/bin for use by openshift-ansible
Copying oc from path to /usr/bin for use by openshift-ansible
Copying oadm from path to /usr/local/bin for use by openshift-ansible
Copying oadm from path to /usr/bin for use by openshift-ansible
[INFO] Starting logging tests at Thu Jun 8 22:44:19 EDT 2017
Generated new key pair as /tmp/openshift/origin-aggregated-logging/openshift.local.config/master/serviceaccounts.public.key and /tmp/openshift/origin-aggregated-logging/openshift.local.config/master/serviceaccounts.private.key
Generating node credentials ...
Created node config for 172.18.15.77 in /tmp/openshift/origin-aggregated-logging/openshift.local.config/node-172.18.15.77
Wrote master config to: /tmp/openshift/origin-aggregated-logging/openshift.local.config/master/master-config.yaml
Running hack/lib/start.sh:352: executing 'oc get --raw /healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s...
SUCCESS after 36.188s: hack/lib/start.sh:352: executing 'oc get --raw /healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s
Standard output from the command:
ok
Standard error from the command:
The connection to the server 172.18.15.77:8443 was refused - did you specify the right host or port?
... repeated 76 times
Error from server (Forbidden): User "system:admin" cannot "get" on "/healthz"
... repeated 6 times
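The "re-trying every 0.25s until completion or 80.000s" lines come from origin's os::cmd helpers (hack/lib/cmd.sh). A hypothetical stand-alone equivalent of the expect-text retry, for illustration only, not the real implementation:

    # Hypothetical helper: re-run CMD every INTERVAL seconds until its output
    # contains WANT, or give up once TIMEOUT seconds have elapsed.
    try_until_text() {
      local cmd=$1 want=$2 interval=${3:-0.25} timeout=${4:-80}
      local deadline=$(( $(date +%s) + timeout ))
      while [ "$(date +%s)" -le "$deadline" ]; do
        if eval "$cmd" 2>&1 | grep -q -- "$want"; then
          return 0
        fi
        sleep "$interval"
      done
      return 1
    }
    try_until_text "oc get --raw /healthz --config=/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig" ok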
Running hack/lib/start.sh:353: executing 'oc get --raw https://172.18.15.77:10250/healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.5s until completion or 120.000s...
SUCCESS after 0.214s: hack/lib/start.sh:353: executing 'oc get --raw https://172.18.15.77:10250/healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.5s until completion or 120.000s
Standard output from the command:
ok
There was no error output from the command.
Running hack/lib/start.sh:354: executing 'oc get --raw /healthz/ready --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s...
SUCCESS after 0.857s: hack/lib/start.sh:354: executing 'oc get --raw /healthz/ready --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s
Standard output from the command:
ok
Standard error from the command:
Error from server (InternalError): an error on the server ("") has prevented the request from succeeding
Running hack/lib/start.sh:355: executing 'oc get service kubernetes --namespace default --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 160.000s...
SUCCESS after 0.393s: hack/lib/start.sh:355: executing 'oc get service kubernetes --namespace default --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 160.000s
Standard output from the command:
NAME         CLUSTER-IP   EXTERNAL-IP   PORT(S)                 AGE
kubernetes   172.30.0.1   <none>        443/TCP,53/UDP,53/TCP   4s
There was no error output from the command.
Running hack/lib/start.sh:356: executing 'oc get --raw /api/v1/nodes/172.18.15.77 --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 80.000s...
SUCCESS after 0.299s: hack/lib/start.sh:356: executing 'oc get --raw /api/v1/nodes/172.18.15.77 --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 80.000s
Standard output from the command:
{"kind":"Node","apiVersion":"v1","metadata":{"name":"172.18.15.77","selfLink":"/api/v1/nodes/172.18.15.77","uid":"a45c28b4-4cbd-11e7-aafb-0eae810685c8","resourceVersion":"299","creationTimestamp":"2017-06-09T02:45:04Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/hostname":"172.18.15.77"},"annotations":{"volumes.kubernetes.io/controller-managed-attach-detach":"true"}},"spec":{"externalID":"172.18.15.77","providerID":"aws:////i-0ac18d5c02ba24ce7"},"status":{"capacity":{"cpu":"4","memory":"7231688Ki","pods":"40"},"allocatable":{"cpu":"4","memory":"7129288Ki","pods":"40"},"conditions":[{"type":"OutOfDisk","status":"False","lastHeartbeatTime":"2017-06-09T02:45:04Z","lastTransitionTime":"2017-06-09T02:45:04Z","reason":"KubeletHasSufficientDisk","message":"kubelet has sufficient disk space available"},{"type":"MemoryPressure","status":"False","lastHeartbeatTime":"2017-06-09T02:45:04Z","lastTransitionTime":"2017-06-09T02:45:04Z","reason":"KubeletHasSufficientMemory","message":"kubelet has sufficient memory available"},{"type":"DiskPressure","status":"False","lastHeartbeatTime":"2017-06-09T02:45:04Z","lastTransitionTime":"2017-06-09T02:45:04Z","reason":"KubeletHasNoDiskPressure","message":"kubelet has no disk pressure"},{"type":"Ready","status":"True","lastHeartbeatTime":"2017-06-09T02:45:04Z","lastTransitionTime":"2017-06-09T02:45:04Z","reason":"KubeletReady","message":"kubelet is posting ready status"}],"addresses":[{"type":"LegacyHostIP","address":"172.18.15.77"},{"type":"InternalIP","address":"172.18.15.77"},{"type":"Hostname","address":"172.18.15.77"}],"daemonEndpoints":{"kubeletEndpoint":{"Port":10250}},"nodeInfo":{"machineID":"f9370ed252a14f73b014c1301a9b6d1b","systemUUID":"EC22BC1C-E38B-CDF5-4388-AC6219749290","bootID":"678da122-eea5-4bbb-b09d-82dece77a370","kernelVersion":"3.10.0-327.22.2.el7.x86_64","osImage":"Red Hat Enterprise Linux Server 7.3 
(Maipo)","containerRuntimeVersion":"docker://1.12.6","kubeletVersion":"v1.6.1+5115d708d7","kubeProxyVersion":"v1.6.1+5115d708d7","operatingSystem":"linux","architecture":"amd64"},"images":[{"names":["openshift/origin-federation:6acabdc","openshift/origin-federation:latest"],"sizeBytes":1205885664},{"names":["openshift/origin-docker-registry:6acabdc","openshift/origin-docker-registry:latest"],"sizeBytes":1100164272},{"names":["openshift/origin-gitserver:6acabdc","openshift/origin-gitserver:latest"],"sizeBytes":1086520226},{"names":["openshift/openvswitch:6acabdc","openshift/openvswitch:latest"],"sizeBytes":1053403667},{"names":["openshift/node:6acabdc","openshift/node:latest"],"sizeBytes":1051721928},{"names":["openshift/origin-keepalived-ipfailover:6acabdc","openshift/origin-keepalived-ipfailover:latest"],"sizeBytes":1028529711},{"names":["openshift/origin-haproxy-router:6acabdc","openshift/origin-haproxy-router:latest"],"sizeBytes":1022758742},{"names":["openshift/origin:6acabdc","openshift/origin:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-f5-router:6acabdc","openshift/origin-f5-router:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-sti-builder:6acabdc","openshift/origin-sti-builder:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-recycler:6acabdc","openshift/origin-recycler:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-deployer:6acabdc","openshift/origin-deployer:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-docker-builder:6acabdc","openshift/origin-docker-builder:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-cluster-capacity:6acabdc","openshift/origin-cluster-capacity:latest"],"sizeBytes":962455026},{"names":["rhel7.1:latest"],"sizeBytes":765301508},{"names":["openshift/dind-master:latest"],"sizeBytes":731456758},{"names":["openshift/dind-node:latest"],"sizeBytes":731453034},{"names":["\u003cnone\u003e@\u003cnone\u003e","\u003cnone\u003e:\u003cnone\u003e"],"sizeBytes":709532011},{"names":["docker.io/openshift/origin-logging-kibana@sha256:9e3e11edb1f14c744ecf9587a3212e7648934a8bb302513ba84a8c6b058a1229","docker.io/openshift/origin-logging-kibana:latest"],"sizeBytes":682851463},{"names":["openshift/dind:latest"],"sizeBytes":640650210},{"names":["docker.io/openshift/origin-logging-elasticsearch@sha256:3a4d359a10d7655cdca2cfa3a89771d6825ffe1d50de4ac7bb570e79f862ccfb","docker.io/openshift/origin-logging-elasticsearch:latest"],"sizeBytes":425433788},{"names":["docker.io/openshift/base-centos7@sha256:aea292a3bddba020cde0ee83e6a45807931eb607c164ec6a3674f67039d8cd7c","docker.io/openshift/base-centos7:latest"],"sizeBytes":383049978},{"names":["rhel7.2:latest"],"sizeBytes":377493597},{"names":["openshift/origin-egress-router:6acabdc","openshift/origin-egress-router:latest"],"sizeBytes":364745713},{"names":["openshift/origin-base:latest"],"sizeBytes":363070172},{"names":["\u003cnone\u003e@\u003cnone\u003e","\u003cnone\u003e:\u003cnone\u003e"],"sizeBytes":363024702},{"names":["docker.io/openshift/origin-logging-fluentd@sha256:8e382dfb002d4f0788d8c5d30ec1baff8005c548bc49fa061fc24d9a0302d9e9","docker.io/openshift/origin-logging-fluentd:latest"],"sizeBytes":359223094},{"names":["docker.io/fedora@sha256:69281ddd7b2600e5f2b17f1e12d7fba25207f459204fb2d15884f8432c479136","docker.io/fedora:25"],"sizeBytes":230864375},{"names":["docker.io/openshift/origin-logging-curator@sha256:ee6d3de66a3dac118b6c961786fc075276bda8c688f9bf8e24f6559b38f0fbeb","docker.io/openshift/origin-logging-curator:latest"],"sizeBytes":224977536},{"nam
es":["rhel7.3:latest","rhel7:latest"],"sizeBytes":219121266},{"names":["openshift/origin-pod:6acabdc","openshift/origin-pod:latest"],"sizeBytes":213199843},{"names":["registry.access.redhat.com/rhel7.2@sha256:98e6ca5d226c26e31a95cd67716afe22833c943e1926a21daf1a030906a02249","registry.access.redhat.com/rhel7.2:latest"],"sizeBytes":201376319},{"names":["registry.access.redhat.com/rhel7.3@sha256:1e232401d8e0ba53b36b757b4712fbcbd1dab9c21db039c45a84871a74e89e68","registry.access.redhat.com/rhel7.3:latest"],"sizeBytes":192693772},{"names":["docker.io/centos@sha256:bba1de7c9d900a898e3cadbae040dfe8a633c06bc104a0df76ae24483e03c077"],"sizeBytes":192548999},{"names":["openshift/origin-source:latest"],"sizeBytes":192548894},{"names":["docker.io/centos@sha256:aebf12af704307dfa0079b3babdca8d7e8ff6564696882bcb5d11f1d461f9ee9","docker.io/centos:7","docker.io/centos:centos7"],"sizeBytes":192548537},{"names":["registry.access.redhat.com/rhel7.1@sha256:1bc5a4c43bbb29a5a96a61896ff696933be3502e2f5fdc4cde02d9e101731fdd","registry.access.redhat.com/rhel7.1:latest"],"sizeBytes":158229901},{"names":["openshift/hello-openshift:6acabdc","openshift/hello-openshift:latest"],"sizeBytes":5643318}]}}
There was no error output from the command.
serviceaccount "registry" created
clusterrolebinding "registry-registry-role" created
deploymentconfig "docker-registry" created
service "docker-registry" created
--> Creating router router ...
info: password for stats user admin has been set to ktEbF1C6oX
serviceaccount "router" created
clusterrolebinding "router-router-role" created
deploymentconfig "router" created
service "router" created
--> Success
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:162: executing 'oadm new-project logging --node-selector=''' expecting success...
SUCCESS after 0.409s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:162: executing 'oadm new-project logging --node-selector=''' expecting success
Standard output from the command:
Created project logging
There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:163: executing 'oc project logging > /dev/null' expecting success...
SUCCESS after 0.236s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:163: executing 'oc project logging > /dev/null' expecting success
There was no output from the command.
There was no error output from the command.
apiVersion: v1
items:
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-elasticsearch
      component: development
      logging-infra: development
      provider: openshift
    name: logging-elasticsearch
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-fluentd
      component: development
      logging-infra: development
      provider: openshift
    name: logging-fluentd
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-kibana
      component: development
      logging-infra: development
      provider: openshift
    name: logging-kibana
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-curator
      component: development
      logging-infra: development
      provider: openshift
    name: logging-curator
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-auth-proxy
      component: development
      logging-infra: development
      provider: openshift
    name: logging-auth-proxy
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-deployment
      component: development
      logging-infra: development
      provider: openshift
    name: origin
  spec:
    dockerImageRepository: openshift/origin
    tags:
    - from:
        kind: DockerImage
        name: openshift/origin:v1.5.0-alpha.2
      name: v1.5.0-alpha.2
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      app: logging-elasticsearch
      component: development
      logging-infra: development
      provider: openshift
    name: logging-elasticsearch
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-elasticsearch:latest
    resources: {}
    source:
      contextDir: elasticsearch
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-fluentd
      component: development
      logging-infra: development
      provider: openshift
    name: logging-fluentd
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-fluentd:latest
    resources: {}
    source:
      contextDir: fluentd
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-kibana
      component: development
      logging-infra: development
      provider: openshift
    name: logging-kibana
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-kibana:latest
    resources: {}
    source:
      contextDir: kibana
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-curator
      component: development
      logging-infra: development
      provider: openshift
    name: logging-curator
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-curator:latest
    resources: {}
    source:
      contextDir: curator
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-auth-proxy
      component: development
      logging-infra: development
      provider: openshift
    name: logging-auth-proxy
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-auth-proxy:latest
    resources: {}
    source:
      contextDir: kibana-proxy
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: library/node:0.10.36
      type: Docker
kind: List
metadata: {}
Running hack/testing/build-images:31: executing 'oc process -o yaml -f /data/src/github.com/openshift/origin-aggregated-logging/hack/templates/dev-builds-wo-deployer.yaml -p LOGGING_FORK_URL=https://github.com/openshift/origin-aggregated-logging -p LOGGING_FORK_BRANCH=master | build_filter | oc create -f -' expecting success...
SUCCESS after 0.446s: hack/testing/build-images:31: executing 'oc process -o yaml -f /data/src/github.com/openshift/origin-aggregated-logging/hack/templates/dev-builds-wo-deployer.yaml -p LOGGING_FORK_URL=https://github.com/openshift/origin-aggregated-logging -p LOGGING_FORK_BRANCH=master | build_filter | oc create -f -' expecting success
Standard output from the command:
imagestream "logging-elasticsearch" created
imagestream "logging-fluentd" created
imagestream "logging-kibana" created
imagestream "logging-curator" created
imagestream "logging-auth-proxy" created
imagestream "origin" created
buildconfig "logging-elasticsearch" created
buildconfig "logging-fluentd" created
buildconfig "logging-kibana" created
buildconfig "logging-curator" created
buildconfig "logging-auth-proxy" created
There was no error output from the command.
Running hack/testing/build-images:9: executing 'oc get imagestreamtag origin:latest' expecting success; re-trying every 0.2s until completion or 60.000s...
SUCCESS after 1.015s: hack/testing/build-images:9: executing 'oc get imagestreamtag origin:latest' expecting success; re-trying every 0.2s until completion or 60.000s
Standard output from the command:
NAME            DOCKER REF                                                                                 UPDATED                  IMAGENAME
origin:latest   openshift/origin@sha256:9768710ca6181ef1ed3424e3951d2d8d66fd18bc10fac5dec73711314ef12dfc   Less than a second ago   sha256:9768710ca6181ef1ed3424e3951d2d8d66fd18bc10fac5dec73711314ef12dfc
Standard error from the command:
Error from server (NotFound): imagestreamtags.image.openshift.io "origin:latest" not found
... repeated 2 times
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-auth-proxy-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-curator-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-elasticsearch-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-fluentd-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-kibana-1" started
Running hack/testing/build-images:33: executing 'wait_for_builds_complete' expecting success...
FAILURE after 4138.484s: hack/testing/build-images:33: executing 'wait_for_builds_complete' expecting success: the command returned the wrong error code
Standard output from the command:
build "logging-kibana-3" started
build in progress for logging-kibana - delete failed build logging-kibana-1 status complete
build "logging-kibana-1" deleted
error builds are not complete
NAME                      TYPE     FROM             STATUS                       STARTED             DURATION
logging-auth-proxy-1      Docker   Binary@3a32de0   Complete                     About an hour ago   2m59s
logging-curator-1         Docker   Binary@3a32de0   Complete                     About an hour ago   1m22s
logging-elasticsearch-1   Docker   Binary@3a32de0   Complete                     About an hour ago   2m0s
logging-fluentd-1         Docker   Binary@3a32de0   Complete                     About an hour ago   1m54s
logging-kibana-2          Docker   Binary@3a32de0   Cancelled (CancelledBuild)   About an hour ago
logging-kibana-3          Docker   Binary@3a32de0   Complete                     About an hour ago   3m30s
Standard error from the command:
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
Error from server (BadRequest): cannot upload file to build logging-kibana-2 with status Pending
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
[ERROR] PID 4236: hack/lib/cmd.sh:241: `return "${return_code}"` exited with status 1.
[INFO] Stack Trace:
[INFO] 1: hack/lib/cmd.sh:241: `return "${return_code}"`
[INFO] 2: hack/testing/build-images:33: os::cmd::expect_success
[INFO] 3: hack/testing/init-log-stack:14: source
[INFO] 4: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:166: source
[INFO] Exiting with code 1.
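wait_for_builds_complete is a helper in the aggregated-logging test harness; the failure above means at least one build never reached a terminal state inside its window. A simplified sketch of the condition being polled, under assumed semantics (the real helper also restarts and deletes builds, per the logging-kibana lines above):

    # Assumed semantics only, not the harness implementation: poll `oc get builds`
    # until no build is still New/Pending/Running, or give up at the deadline.
    wait_for_builds_sketch() {
      local deadline=$(( $(date +%s) + 3600 ))
      while [ "$(date +%s)" -le "$deadline" ]; do
        local active
        # Column 4 of `oc get builds` is STATUS.
        active=$(oc get builds --no-headers | awk '$4=="New" || $4=="Pending" || $4=="Running"')
        [ -z "$active" ] && return 0
        sleep 10
      done
      echo "error builds are not complete"
      return 1
    }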
/data/src/github.com/openshift/origin-aggregated-logging/hack/lib/log/system.sh: line 31: 4596 Terminated sar -A -o "${binary_logfile}" 1 86400 > /dev/null 2> "${stderr_logfile}"
[INFO] [CLEANUP] Beginning cleanup routines...
[INFO] [CLEANUP] Dumping cluster events to /tmp/origin-aggregated-logging/artifacts/events.txt
[INFO] [CLEANUP] Dumping etcd contents to /tmp/origin-aggregated-logging/artifacts/etcd
[WARNING] No compiled `etcdhelper` binary was found. Attempting to build one using:
[WARNING] $ hack/build-go.sh tools/etcdhelper
++ Building go targets for linux/amd64: tools/etcdhelper
/data/src/github.com/openshift/origin-aggregated-logging/../origin/hack/build-go.sh took 180 seconds
2017-06-09 00:06:07.936168 I | warning: ignoring ServerName for user-provided CA for backwards compatibility is deprecated
[INFO] [CLEANUP] Dumping container logs to /tmp/origin-aggregated-logging/logs/containers
[INFO] [CLEANUP] Truncating log files over 200M
[INFO] [CLEANUP] Stopping docker containers
[INFO] [CLEANUP] Removing docker containers
[INFO] [CLEANUP] Killing child processes
[INFO] [CLEANUP] Pruning etcd data directory
[ERROR] /data/src/github.com/openshift/origin-aggregated-logging/logging.sh exited with code 1 after 01h 27m 36s
Error while running ssh/sudo command:
set -e
pushd /data/src/github.com/openshift//origin-aggregated-logging/hack/testing >/dev/null
export PATH=$GOPATH/bin:$PATH
echo '***************************************************'
echo 'Running GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh...'
time GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh
echo 'Finished GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh'
echo '***************************************************'
popd >/dev/null
The SSH command responded with a non-zero exit status. Vagrant
assumes that this means the command failed. The output for this command
should be in the log above. Please read the output to determine what
went wrong.
==> openshiftdev: Downloading logs
==> openshiftdev: Downloading artifacts from '/var/log/yum.log' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin/artifacts/yum.log'
==> openshiftdev: Downloading artifacts from '/var/log/secure' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin/artifacts/secure'
==> openshiftdev: Downloading artifacts from '/var/log/audit/audit.log' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin/artifacts/audit.log'
==> openshiftdev: Downloading artifacts from '/tmp/origin-aggregated-logging/' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace@2/origin/artifacts'
Build step 'Execute shell' marked build as failure
[description-setter] Could not determine description.
[PostBuildScript] - Execution post build scripts.
[workspace@2] $ /bin/sh -xe /tmp/hudson3623132560563368564.sh
+ INSTANCE_NAME=origin_logging-rhel7-1650
+ pushd origin
~/jobs/test-origin-aggregated-logging/workspace@2/origin ~/jobs/test-origin-aggregated-logging/workspace@2
+ rc=0
+ '[' -f .vagrant-openshift.json ']'
++ /usr/bin/vagrant ssh -c 'sudo ausearch -m avc'
+ ausearchresult='<no matches>'
+ rc=1
+ '[' '<no matches>' = '<no matches>' ']'
+ rc=0
+ /usr/bin/vagrant destroy -f
==> openshiftdev: Terminating the instance...
==> openshiftdev: Running cleanup tasks for 'shell' provisioner...
+ popd
~/jobs/test-origin-aggregated-logging/workspace@2
+ exit 0
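The post-build step above gates on SELinux AVC denials: ausearch printing the literal '<no matches>' is treated as a pass. The traced logic, restated:

    ausearchresult=$(/usr/bin/vagrant ssh -c 'sudo ausearch -m avc')
    if [ "$ausearchresult" = '<no matches>' ]; then
      rc=0   # no AVC denials recorded by auditd on the instance
    else
      rc=1   # denials found; this post-build step would fail
    fi
    /usr/bin/vagrant destroy -f
    exit $rc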
[BFA] Scanning build for known causes...
[BFA] Found failure cause(s):
[BFA] Command Failure from category failure
[BFA] Done. 0s
Finished: FAILURE