Console Output
Started by upstream project "test_pull_request_origin_aggregated_logging" build number 75
originally caused by:
Started by remote host 50.17.198.52
[EnvInject] - Loading node environment variables.
Building in workspace /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
OS_ROOT=/data/src/github.com/openshift/origin
INSTANCE_TYPE=c4.xlarge
GITHUB_REPO=openshift
OS=rhel7
TESTNAME=logging
[EnvInject] - Variables injected successfully.
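The EnvInject step above loads a small properties block as environment variables before the shell step runs. For reference, a minimal sketch of the same variables exported by hand (values copied from the log; reproducing the job outside Jenkins is an assumption):

    # Rough local equivalent of the injected properties (sketch)
    export OS_ROOT=/data/src/github.com/openshift/origin
    export INSTANCE_TYPE=c4.xlarge
    export GITHUB_REPO=openshift
    export OS=rhel7
    export TESTNAME=logging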
[workspace] $ /bin/sh -xe /tmp/hudson4036518102927540808.sh
+ false
+ unset GOPATH
+ REPO_NAME=origin-aggregated-logging
+ rm -rf origin-aggregated-logging
+ vagrant origin-local-checkout --replace --repo origin-aggregated-logging -b master
You don't seem to have the GOPATH environment variable set on your system.
See: 'go help gopath' for more details about GOPATH.
Waiting for the cloning process to finish
Cloning origin-aggregated-logging ...
Submodule 'deployer/common' (https://github.com/openshift/origin-integration-common) registered for path 'deployer/common'
Submodule 'kibana-proxy' (https://github.com/fabric8io/openshift-auth-proxy.git) registered for path 'kibana-proxy'
Cloning into 'deployer/common'...
Submodule path 'deployer/common': checked out '45bf993212cdcbab5cbce3b3fab74a72b851402e'
Cloning into 'kibana-proxy'...
Submodule path 'kibana-proxy': checked out '118dfb40f7a8082d370ba7f4805255c9ec7c8178'
Origin repositories cloned into /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace
+ pushd origin-aggregated-logging
~/jobs/test-origin-aggregated-logging/workspace/origin-aggregated-logging ~/jobs/test-origin-aggregated-logging/workspace
+ git checkout master
Already on 'master'
+ popd
~/jobs/test-origin-aggregated-logging/workspace
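The vagrant origin-local-checkout step above replaces any previous checkout and clones the repository together with its two submodules. A plain-git approximation, assuming the plugin does nothing beyond what its output shows:

    # Sketch: plain-git equivalent of
    # 'vagrant origin-local-checkout --replace --repo origin-aggregated-logging -b master'
    rm -rf origin-aggregated-logging
    git clone https://github.com/openshift/origin-aggregated-logging
    cd origin-aggregated-logging
    git checkout master
    git submodule update --init    # deployer/common and kibana-proxy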
+ '[' -n '' ']'
+ vagrant origin-local-checkout --replace
You don't seem to have the GOPATH environment variable set on your system.
See: 'go help gopath' for more details about GOPATH.
Waiting for the cloning process to finish
Checking repo integrity for /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin
~/jobs/test-origin-aggregated-logging/workspace/origin ~/jobs/test-origin-aggregated-logging/workspace
# On branch master
# Untracked files:
# (use "git add <file>..." to include in what will be committed)
#
# artifacts/
nothing added to commit but untracked files present (use "git add" to track)
~/jobs/test-origin-aggregated-logging/workspace
Replacing: /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin
~/jobs/test-origin-aggregated-logging/workspace/origin ~/jobs/test-origin-aggregated-logging/workspace
Already on 'master'
HEAD is now at ba62cde Merge pull request #14474 from deads2k/client-ca-retry
Removing .vagrant-openshift.json
Removing .vagrant/
Removing artifacts/
fatal: branch name required
~/jobs/test-origin-aggregated-logging/workspace
Origin repositories cloned into /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace
+ pushd origin
~/jobs/test-origin-aggregated-logging/workspace/origin ~/jobs/test-origin-aggregated-logging/workspace
+ INSTANCE_NAME=origin_logging-rhel7-1628
+ GIT_URL=https://github.com/openshift/origin-aggregated-logging
++ echo https://github.com/openshift/origin-aggregated-logging
++ sed s,https://,,
+ OAL_LOCAL_PATH=github.com/openshift/origin-aggregated-logging
+ OS_O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging
+ sort
+ env
_=/bin/env
BRANCH=master
BUILD_CAUSE=UPSTREAMTRIGGER
BUILD_CAUSE_UPSTREAMTRIGGER=true
BUILD_DISPLAY_NAME=#1628
BUILD_ID=1628
BUILD_NUMBER=1628
BUILD_TAG=jenkins-test-origin-aggregated-logging-1628
BUILD_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/1628/
EXECUTOR_NUMBER=95
GITHUB_REPO=openshift
HOME=/var/lib/jenkins
HUDSON_COOKIE=0465109f-6570-4684-820d-368e21a2dc71
HUDSON_HOME=/var/lib/jenkins
HUDSON_SERVER_COOKIE=ec11f8b2841c966f
HUDSON_URL=https://ci.openshift.redhat.com/jenkins/
INSTANCE_TYPE=c4.xlarge
JENKINS_HOME=/var/lib/jenkins
JENKINS_SERVER_COOKIE=ec11f8b2841c966f
JENKINS_URL=https://ci.openshift.redhat.com/jenkins/
JOB_BASE_NAME=test-origin-aggregated-logging
JOB_DISPLAY_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/display/redirect
JOB_NAME=test-origin-aggregated-logging
JOB_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/
LANG=en_US.UTF-8
LOGNAME=jenkins
MERGE=false
MERGE_SEVERITY=none
NLSPATH=/usr/dt/lib/nls/msg/%L/%N.cat
NODE_LABELS=master
NODE_NAME=master
OLDPWD=/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace
OPENSHIFT_ANSIBLE_TARGET_BRANCH=master
ORIGIN_AGGREGATED_LOGGING_PULL_ID=471
ORIGIN_AGGREGATED_LOGGING_TARGET_BRANCH=master
OS_ANSIBLE_BRANCH=master
OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible
OS=rhel7
OS_ROOT=/data/src/github.com/openshift/origin
PATH=/sbin:/usr/sbin:/bin:/usr/bin
PWD=/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin
ROOT_BUILD_CAUSE=REMOTECAUSE
ROOT_BUILD_CAUSE_REMOTECAUSE=true
RUN_CHANGES_DISPLAY_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/1628/display/redirect?page=changes
RUN_DISPLAY_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/1628/display/redirect
SHELL=/bin/bash
SHLVL=3
TESTNAME=logging
TEST_PERF=false
USER=jenkins
WORKSPACE=/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace
XFILESEARCHPATH=/usr/dt/app-defaults/%L/Dt
+ vagrant origin-init --stage inst --os rhel7 --instance-type c4.xlarge origin_logging-rhel7-1628
Reading AWS credentials from /var/lib/jenkins/.awscred
Searching devenv-rhel7_* for latest base AMI (required_name_tag=)
Found: ami-83a1fc95 (devenv-rhel7_6323)
++ seq 0 2
+ for i in '$(seq 0 2)'
+ vagrant up --provider aws
Bringing machine 'openshiftdev' up with 'aws' provider...
==> openshiftdev: Warning! The AWS provider doesn't support any of the Vagrant
==> openshiftdev: high-level network configurations (`config.vm.network`). They
==> openshiftdev: will be silently ignored.
==> openshiftdev: Warning! You're launching this instance into a VPC without an
==> openshiftdev: elastic IP. Please verify you're properly connected to a VPN so
==> openshiftdev: you can access this machine, otherwise Vagrant will not be able
==> openshiftdev: to SSH into it.
==> openshiftdev: Launching an instance with the following settings...
==> openshiftdev: -- Type: c4.xlarge
==> openshiftdev: -- AMI: ami-83a1fc95
==> openshiftdev: -- Region: us-east-1
==> openshiftdev: -- Keypair: libra
==> openshiftdev: -- Subnet ID: subnet-cf57c596
==> openshiftdev: -- User Data: yes
==> openshiftdev: -- User Data:
==> openshiftdev: # cloud-config
==> openshiftdev:
==> openshiftdev: growpart:
==> openshiftdev: mode: auto
==> openshiftdev: devices: ['/']
==> openshiftdev: runcmd:
==> openshiftdev: - [ sh, -xc, "sed -i s/^Defaults.*requiretty/#Defaults requiretty/g /etc/sudoers"]
==> openshiftdev:
==> openshiftdev: -- Block Device Mapping: [{"DeviceName"=>"/dev/sda1", "Ebs.VolumeSize"=>25, "Ebs.VolumeType"=>"gp2"}, {"DeviceName"=>"/dev/sdb", "Ebs.VolumeSize"=>35, "Ebs.VolumeType"=>"gp2"}]
==> openshiftdev: -- Terminate On Shutdown: false
==> openshiftdev: -- Monitoring: false
==> openshiftdev: -- EBS optimized: false
==> openshiftdev: -- Assigning a public IP address in a VPC: false
==> openshiftdev: Waiting for instance to become "ready"...
==> openshiftdev: Waiting for SSH to become available...
==> openshiftdev: Machine is booted and ready for use!
==> openshiftdev: Running provisioner: setup (shell)...
openshiftdev: Running: /tmp/vagrant-shell20170608-31625-14rh4gs.sh
==> openshiftdev: Host: ec2-34-207-246-124.compute-1.amazonaws.com
+ break
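The `seq 0 2`, `for i in ...`, and trailing `break` traces indicate that `vagrant up` is wrapped in a small retry loop. A sketch of its likely shape (the real script may add cleanup between attempts):

    # Retry vagrant up a few times, stop on the first success (sketch)
    for i in $(seq 0 2); do
        vagrant up --provider aws && break
    done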
+ vagrant sync-origin-aggregated-logging -c -s
Running ssh/sudo command 'rm -rf /data/src/github.com/openshift/origin-aggregated-logging-bare;
' with timeout 14400. Attempt #0
Running ssh/sudo command 'mkdir -p /ec2-user/.ssh;
mv /tmp/file20170608-788-v0d694 /ec2-user/.ssh/config &&
chown ec2-user:ec2-user /ec2-user/.ssh/config &&
chmod 0600 /ec2-user/.ssh/config' with timeout 14400. Attempt #0
Running ssh/sudo command 'mkdir -p /data/src/github.com/openshift/' with timeout 14400. Attempt #0
Running ssh/sudo command 'mkdir -p /data/src/github.com/openshift/builder && chown -R ec2-user:ec2-user /data/src/github.com/openshift/' with timeout 14400. Attempt #0
Running ssh/sudo command 'set -e
rm -fr /data/src/github.com/openshift/origin-aggregated-logging-bare;
if [ ! -d /data/src/github.com/openshift/origin-aggregated-logging-bare ]; then
git clone --quiet --bare https://github.com/openshift/origin-aggregated-logging.git /data/src/github.com/openshift/origin-aggregated-logging-bare >/dev/null
fi
' with timeout 14400. Attempt #0
Synchronizing local sources
Synchronizing [origin-aggregated-logging@master] from origin-aggregated-logging...
Warning: Permanently added '34.207.246.124' (ECDSA) to the list of known hosts.
Running ssh/sudo command 'set -e
if [ -d /data/src/github.com/openshift/origin-aggregated-logging-bare ]; then
rm -rf /data/src/github.com/openshift/origin-aggregated-logging
echo 'Cloning origin-aggregated-logging ...'
git clone --quiet --recurse-submodules /data/src/github.com/openshift/origin-aggregated-logging-bare /data/src/github.com/openshift/origin-aggregated-logging
else
MISSING_REPO+='origin-aggregated-logging-bare'
fi
if [ -n "$MISSING_REPO" ]; then
echo 'Missing required upstream repositories:'
echo $MISSING_REPO
echo 'To fix, execute command: vagrant clone-upstream-repos'
fi
' with timeout 14400. Attempt #0
Cloning origin-aggregated-logging ...
Submodule 'deployer/common' (https://github.com/openshift/origin-integration-common) registered for path 'deployer/common'
Submodule 'kibana-proxy' (https://github.com/fabric8io/openshift-auth-proxy.git) registered for path 'kibana-proxy'
Cloning into 'deployer/common'...
Submodule path 'deployer/common': checked out '45bf993212cdcbab5cbce3b3fab74a72b851402e'
Cloning into 'kibana-proxy'...
Submodule path 'kibana-proxy': checked out '118dfb40f7a8082d370ba7f4805255c9ec7c8178'
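The sync step mirrors the repository in two stages on the instance: a bare clone from GitHub, then a local clone (with submodules) taken from that mirror. The same commands run directly, outside the ssh/sudo wrapper, would look like this (sketch):

    # Stage 1: bare mirror from GitHub
    git clone --quiet --bare https://github.com/openshift/origin-aggregated-logging.git \
        /data/src/github.com/openshift/origin-aggregated-logging-bare
    # Stage 2: working clone (with submodules) from the local mirror
    git clone --quiet --recurse-submodules \
        /data/src/github.com/openshift/origin-aggregated-logging-bare \
        /data/src/github.com/openshift/origin-aggregated-logging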
+ vagrant ssh -c 'if [ ! -d /tmp/openshift ] ; then mkdir /tmp/openshift ; fi ; sudo chmod 777 /tmp/openshift'
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/base-centos7 ...
pulling image openshift/base-centos7 ...
+ vagrant ssh -c 'docker pull openshift/base-centos7' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/base-centos7 ...
latest: Pulling from docker.io/openshift/base-centos7
45a2e645736c: Pulling fs layer
734fb161cf89: Pulling fs layer
78efc9e155c4: Pulling fs layer
8a3400b7e31a: Pulling fs layer
8a3400b7e31a: Waiting
734fb161cf89: Verifying Checksum
734fb161cf89: Download complete
8a3400b7e31a: Verifying Checksum
8a3400b7e31a: Download complete
78efc9e155c4: Download complete
45a2e645736c: Verifying Checksum
45a2e645736c: Download complete
45a2e645736c: Pull complete
734fb161cf89: Pull complete
78efc9e155c4: Pull complete
8a3400b7e31a: Pull complete
Digest: sha256:aea292a3bddba020cde0ee83e6a45807931eb607c164ec6a3674f67039d8cd7c
+ echo done with openshift/base-centos7
done with openshift/base-centos7
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image centos:centos7 ...
pulling image centos:centos7 ...
+ vagrant ssh -c 'docker pull centos:centos7' -- -n
Trying to pull repository docker.io/library/centos ...
centos7: Pulling from docker.io/library/centos
Digest: sha256:aebf12af704307dfa0079b3babdca8d7e8ff6564696882bcb5d11f1d461f9ee9
+ echo done with centos:centos7
done with centos:centos7
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-elasticsearch ...
pulling image openshift/origin-logging-elasticsearch ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-elasticsearch' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-elasticsearch ...
latest: Pulling from docker.io/openshift/origin-logging-elasticsearch
d5e46245fe40: Already exists
a7f338c4f8f1: Pulling fs layer
9e2e7a74201a: Pulling fs layer
fef68d5538a8: Pulling fs layer
8d01a96d29f1: Pulling fs layer
dbc1ff1ecc57: Pulling fs layer
1bd6b3975e11: Pulling fs layer
50f97f247f0a: Pulling fs layer
1661b1dc3fa9: Pulling fs layer
33ff5ec495e5: Pulling fs layer
239942808138: Pulling fs layer
1bd6b3975e11: Waiting
33ff5ec495e5: Waiting
239942808138: Waiting
1661b1dc3fa9: Waiting
50f97f247f0a: Waiting
8d01a96d29f1: Waiting
dbc1ff1ecc57: Waiting
fef68d5538a8: Verifying Checksum
fef68d5538a8: Download complete
a7f338c4f8f1: Verifying Checksum
a7f338c4f8f1: Download complete
8d01a96d29f1: Verifying Checksum
8d01a96d29f1: Download complete
dbc1ff1ecc57: Verifying Checksum
dbc1ff1ecc57: Download complete
1bd6b3975e11: Verifying Checksum
1bd6b3975e11: Download complete
50f97f247f0a: Verifying Checksum
50f97f247f0a: Download complete
33ff5ec495e5: Verifying Checksum
33ff5ec495e5: Download complete
239942808138: Verifying Checksum
239942808138: Download complete
1661b1dc3fa9: Verifying Checksum
1661b1dc3fa9: Download complete
9e2e7a74201a: Verifying Checksum
9e2e7a74201a: Download complete
a7f338c4f8f1: Pull complete
9e2e7a74201a: Pull complete
fef68d5538a8: Pull complete
8d01a96d29f1: Pull complete
dbc1ff1ecc57: Pull complete
1bd6b3975e11: Pull complete
50f97f247f0a: Pull complete
1661b1dc3fa9: Pull complete
33ff5ec495e5: Pull complete
239942808138: Pull complete
Digest: sha256:1e72563ad0551f5c15fc6aa8057a64cc9d0c21b2c40bca7efabdd1b55a4fc2e4
+ echo done with openshift/origin-logging-elasticsearch
done with openshift/origin-logging-elasticsearch
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-fluentd ...
pulling image openshift/origin-logging-fluentd ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-fluentd' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-fluentd ...
latest: Pulling from docker.io/openshift/origin-logging-fluentd
d5e46245fe40: Already exists
e4a1001ab6e5: Pulling fs layer
574b0fde62a3: Pulling fs layer
e153c28eb839: Pulling fs layer
38620628d3c7: Pulling fs layer
af3228b34eff: Pulling fs layer
af3228b34eff: Waiting
e153c28eb839: Verifying Checksum
e153c28eb839: Download complete
38620628d3c7: Verifying Checksum
38620628d3c7: Download complete
af3228b34eff: Verifying Checksum
af3228b34eff: Download complete
574b0fde62a3: Verifying Checksum
574b0fde62a3: Download complete
e4a1001ab6e5: Verifying Checksum
e4a1001ab6e5: Download complete
e4a1001ab6e5: Pull complete
574b0fde62a3: Pull complete
e153c28eb839: Pull complete
38620628d3c7: Pull complete
af3228b34eff: Pull complete
Digest: sha256:bc70848086a50bad58a2f41e166098e8ed351bf4dbe7af83caeb7a29f35b4395
+ echo done with openshift/origin-logging-fluentd
done with openshift/origin-logging-fluentd
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-curator ...
pulling image openshift/origin-logging-curator ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-curator' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-curator ...
latest: Pulling from docker.io/openshift/origin-logging-curator
d5e46245fe40: Already exists
9b159b6e6e2b: Pulling fs layer
e4616c6e28d7: Pulling fs layer
9b159b6e6e2b: Verifying Checksum
9b159b6e6e2b: Download complete
9b159b6e6e2b: Pull complete
e4616c6e28d7: Verifying Checksum
e4616c6e28d7: Download complete
e4616c6e28d7: Pull complete
Digest: sha256:e820338ca7fb0addfaec25d80d40a49f5ea25b24ff056ab6adbb42dd9eec94b4
+ echo done with openshift/origin-logging-curator
done with openshift/origin-logging-curator
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-kibana ...
pulling image openshift/origin-logging-kibana ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-kibana' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-kibana ...
latest: Pulling from docker.io/openshift/origin-logging-kibana
45a2e645736c: Already exists
734fb161cf89: Already exists
78efc9e155c4: Already exists
8a3400b7e31a: Already exists
51a36b029166: Pulling fs layer
e57c029afcc6: Pulling fs layer
89f2e4ae387a: Pulling fs layer
b036afb2cb60: Pulling fs layer
1c68a5b6ade6: Pulling fs layer
6e5af8882c65: Pulling fs layer
b036afb2cb60: Waiting
1c68a5b6ade6: Waiting
6e5af8882c65: Waiting
51a36b029166: Verifying Checksum
89f2e4ae387a: Verifying Checksum
89f2e4ae387a: Download complete
1c68a5b6ade6: Download complete
b036afb2cb60: Verifying Checksum
b036afb2cb60: Download complete
51a36b029166: Pull complete
6e5af8882c65: Verifying Checksum
6e5af8882c65: Download complete
e57c029afcc6: Verifying Checksum
e57c029afcc6: Download complete
e57c029afcc6: Pull complete
89f2e4ae387a: Pull complete
b036afb2cb60: Pull complete
1c68a5b6ade6: Pull complete
6e5af8882c65: Pull complete
Digest: sha256:70ead525ed596b73301e8df3ac229e33dd7f8431ec1233b37e96544c556530e9
+ echo done with openshift/origin-logging-kibana
done with openshift/origin-logging-kibana
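All six images are pre-pulled through the same trace pattern, which reconstructs to a loop like the following (taken directly from the `+` lines above):

    # Pre-pull the logging images inside the instance (reconstruction of the traced loop)
    for image in openshift/base-centos7 centos:centos7 \
                 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd \
                 openshift/origin-logging-curator openshift/origin-logging-kibana; do
        echo "pulling image $image ..."
        vagrant ssh -c "docker pull $image" -- -n
        echo "done with $image"
    done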
+ vagrant test-origin-aggregated-logging -d --env GIT_URL=https://github.com/openshift/origin-aggregated-logging --env GIT_BRANCH=master --env O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging --env OS_ROOT=/data/src/github.com/openshift/origin --env ENABLE_OPS_CLUSTER=true --env USE_LOCAL_SOURCE=true --env TEST_PERF=false --env VERBOSE=1 --env OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible --env OS_ANSIBLE_BRANCH=master
***************************************************
Running GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh...
/data/src/github.com/openshift/origin /data/src/github.com/openshift/origin-aggregated-logging/hack/testing
/data/src/github.com/openshift/origin-aggregated-logging/hack/testing
/data/src/github.com/openshift/origin-aggregated-logging /data/src/github.com/openshift/origin-aggregated-logging/hack/testing
/data/src/github.com/openshift/origin-aggregated-logging/hack/testing
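The `Running GIT_URL=... ./logging.sh...` line above shows the effective test entry point. Invoked directly on the instance it would look roughly like this (paths inferred from the directory-stack output; running it by hand is an assumption):

    cd /data/src/github.com/openshift/origin-aggregated-logging/hack/testing
    GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master \
    O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging \
    OS_ROOT=/data/src/github.com/openshift/origin \
    ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 \
    OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master \
    ./logging.sh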
Loaded plugins: amazon-id, rhui-lb, search-disabled-repos
Metadata Cache Created
Loaded plugins: amazon-id, rhui-lb, search-disabled-repos
Resolving Dependencies
--> Running transaction check
---> Package ansible.noarch 0:2.3.0.0-3.el7 will be installed
--> Processing Dependency: sshpass for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-paramiko for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-keyczar for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-httplib2 for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-crypto for package: ansible-2.3.0.0-3.el7.noarch
---> Package python2-pip.noarch 0:8.1.2-5.el7 will be installed
---> Package python2-ruamel-yaml.x86_64 0:0.12.14-9.el7 will be installed
--> Processing Dependency: python2-typing for package: python2-ruamel-yaml-0.12.14-9.el7.x86_64
--> Processing Dependency: python2-ruamel-ordereddict for package: python2-ruamel-yaml-0.12.14-9.el7.x86_64
--> Running transaction check
---> Package python-httplib2.noarch 0:0.9.1-2.el7aos will be installed
---> Package python-keyczar.noarch 0:0.71c-2.el7aos will be installed
--> Processing Dependency: python-pyasn1 for package: python-keyczar-0.71c-2.el7aos.noarch
---> Package python-paramiko.noarch 0:2.1.1-1.el7 will be installed
--> Processing Dependency: python-cryptography for package: python-paramiko-2.1.1-1.el7.noarch
---> Package python2-crypto.x86_64 0:2.6.1-13.el7 will be installed
--> Processing Dependency: libtomcrypt.so.0()(64bit) for package: python2-crypto-2.6.1-13.el7.x86_64
---> Package python2-ruamel-ordereddict.x86_64 0:0.4.9-3.el7 will be installed
---> Package python2-typing.noarch 0:3.5.2.2-3.el7 will be installed
---> Package sshpass.x86_64 0:1.06-1.el7 will be installed
--> Running transaction check
---> Package libtomcrypt.x86_64 0:1.17-23.el7 will be installed
--> Processing Dependency: libtommath >= 0.42.0 for package: libtomcrypt-1.17-23.el7.x86_64
--> Processing Dependency: libtommath.so.0()(64bit) for package: libtomcrypt-1.17-23.el7.x86_64
---> Package python2-cryptography.x86_64 0:1.3.1-3.el7 will be installed
--> Processing Dependency: python-idna >= 2.0 for package: python2-cryptography-1.3.1-3.el7.x86_64
--> Processing Dependency: python-cffi >= 1.4.1 for package: python2-cryptography-1.3.1-3.el7.x86_64
--> Processing Dependency: python-ipaddress for package: python2-cryptography-1.3.1-3.el7.x86_64
--> Processing Dependency: python-enum34 for package: python2-cryptography-1.3.1-3.el7.x86_64
---> Package python2-pyasn1.noarch 0:0.1.9-7.el7 will be installed
--> Running transaction check
---> Package libtommath.x86_64 0:0.42.0-4.el7 will be installed
---> Package python-cffi.x86_64 0:1.6.0-5.el7 will be installed
--> Processing Dependency: python-pycparser for package: python-cffi-1.6.0-5.el7.x86_64
---> Package python-enum34.noarch 0:1.0.4-1.el7 will be installed
---> Package python-idna.noarch 0:2.0-1.el7 will be installed
---> Package python-ipaddress.noarch 0:1.0.16-2.el7 will be installed
--> Running transaction check
---> Package python-pycparser.noarch 0:2.14-1.el7 will be installed
--> Processing Dependency: python-ply for package: python-pycparser-2.14-1.el7.noarch
--> Running transaction check
---> Package python-ply.noarch 0:3.4-10.el7 will be installed
--> Finished Dependency Resolution
Dependencies Resolved
================================================================================
Package Arch Version Repository Size
================================================================================
Installing:
ansible noarch 2.3.0.0-3.el7 epel 5.7 M
python2-pip noarch 8.1.2-5.el7 epel 1.7 M
python2-ruamel-yaml x86_64 0.12.14-9.el7 li 245 k
Installing for dependencies:
libtomcrypt x86_64 1.17-23.el7 epel 224 k
libtommath x86_64 0.42.0-4.el7 epel 35 k
python-cffi x86_64 1.6.0-5.el7 oso-rhui-rhel-server-releases 218 k
python-enum34 noarch 1.0.4-1.el7 oso-rhui-rhel-server-releases 52 k
python-httplib2 noarch 0.9.1-2.el7aos li 115 k
python-idna noarch 2.0-1.el7 oso-rhui-rhel-server-releases 92 k
python-ipaddress noarch 1.0.16-2.el7 oso-rhui-rhel-server-releases 34 k
python-keyczar noarch 0.71c-2.el7aos rhel-7-server-ose-3.1-rpms 217 k
python-paramiko noarch 2.1.1-1.el7 rhel-7-server-ose-3.4-rpms 266 k
python-ply noarch 3.4-10.el7 oso-rhui-rhel-server-releases 123 k
python-pycparser noarch 2.14-1.el7 oso-rhui-rhel-server-releases 105 k
python2-crypto x86_64 2.6.1-13.el7 epel 476 k
python2-cryptography x86_64 1.3.1-3.el7 oso-rhui-rhel-server-releases 471 k
python2-pyasn1 noarch 0.1.9-7.el7 oso-rhui-rhel-server-releases 100 k
python2-ruamel-ordereddict
x86_64 0.4.9-3.el7 li 38 k
python2-typing noarch 3.5.2.2-3.el7 epel 39 k
sshpass x86_64 1.06-1.el7 epel 21 k
Transaction Summary
================================================================================
Install 3 Packages (+17 Dependent packages)
Total download size: 10 M
Installed size: 47 M
Downloading packages:
--------------------------------------------------------------------------------
Total 5.3 MB/s | 10 MB 00:01
Running transaction check
Running transaction test
Transaction test succeeded
Running transaction
Installing : python2-pyasn1-0.1.9-7.el7.noarch 1/20
Installing : sshpass-1.06-1.el7.x86_64 2/20
Installing : libtommath-0.42.0-4.el7.x86_64 3/20
Installing : libtomcrypt-1.17-23.el7.x86_64 4/20
Installing : python2-crypto-2.6.1-13.el7.x86_64 5/20
Installing : python-keyczar-0.71c-2.el7aos.noarch 6/20
Installing : python-enum34-1.0.4-1.el7.noarch 7/20
Installing : python-ply-3.4-10.el7.noarch 8/20
Installing : python-pycparser-2.14-1.el7.noarch 9/20
Installing : python-cffi-1.6.0-5.el7.x86_64 10/20
Installing : python-httplib2-0.9.1-2.el7aos.noarch 11/20
Installing : python-idna-2.0-1.el7.noarch 12/20
Installing : python2-ruamel-ordereddict-0.4.9-3.el7.x86_64 13/20
Installing : python2-typing-3.5.2.2-3.el7.noarch 14/20
Installing : python-ipaddress-1.0.16-2.el7.noarch 15/20
Installing : python2-cryptography-1.3.1-3.el7.x86_64 16/20
Installing : python-paramiko-2.1.1-1.el7.noarch 17/20
Installing : ansible-2.3.0.0-3.el7.noarch 18/20
Installing : python2-ruamel-yaml-0.12.14-9.el7.x86_64 19/20
Installing : python2-pip-8.1.2-5.el7.noarch 20/20
Verifying : python-pycparser-2.14-1.el7.noarch 1/20
Verifying : python-ipaddress-1.0.16-2.el7.noarch 2/20
Verifying : ansible-2.3.0.0-3.el7.noarch 3/20
Verifying : python2-typing-3.5.2.2-3.el7.noarch 4/20
Verifying : python2-pip-8.1.2-5.el7.noarch 5/20
Verifying : python2-pyasn1-0.1.9-7.el7.noarch 6/20
Verifying : libtomcrypt-1.17-23.el7.x86_64 7/20
Verifying : python-cffi-1.6.0-5.el7.x86_64 8/20
Verifying : python2-ruamel-yaml-0.12.14-9.el7.x86_64 9/20
Verifying : python2-ruamel-ordereddict-0.4.9-3.el7.x86_64 10/20
Verifying : python-idna-2.0-1.el7.noarch 11/20
Verifying : python-httplib2-0.9.1-2.el7aos.noarch 12/20
Verifying : python-ply-3.4-10.el7.noarch 13/20
Verifying : python-enum34-1.0.4-1.el7.noarch 14/20
Verifying : python-keyczar-0.71c-2.el7aos.noarch 15/20
Verifying : libtommath-0.42.0-4.el7.x86_64 16/20
Verifying : sshpass-1.06-1.el7.x86_64 17/20
Verifying : python2-cryptography-1.3.1-3.el7.x86_64 18/20
Verifying : python-paramiko-2.1.1-1.el7.noarch 19/20
Verifying : python2-crypto-2.6.1-13.el7.x86_64 20/20
Installed:
ansible.noarch 0:2.3.0.0-3.el7 python2-pip.noarch 0:8.1.2-5.el7
python2-ruamel-yaml.x86_64 0:0.12.14-9.el7
Dependency Installed:
libtomcrypt.x86_64 0:1.17-23.el7
libtommath.x86_64 0:0.42.0-4.el7
python-cffi.x86_64 0:1.6.0-5.el7
python-enum34.noarch 0:1.0.4-1.el7
python-httplib2.noarch 0:0.9.1-2.el7aos
python-idna.noarch 0:2.0-1.el7
python-ipaddress.noarch 0:1.0.16-2.el7
python-keyczar.noarch 0:0.71c-2.el7aos
python-paramiko.noarch 0:2.1.1-1.el7
python-ply.noarch 0:3.4-10.el7
python-pycparser.noarch 0:2.14-1.el7
python2-crypto.x86_64 0:2.6.1-13.el7
python2-cryptography.x86_64 0:1.3.1-3.el7
python2-pyasn1.noarch 0:0.1.9-7.el7
python2-ruamel-ordereddict.x86_64 0:0.4.9-3.el7
python2-typing.noarch 0:3.5.2.2-3.el7
sshpass.x86_64 0:1.06-1.el7
Complete!
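The transaction above installs ansible, python2-pip, and python2-ruamel-yaml plus 17 dependencies; it most likely corresponds to an install along these lines (an assumption, the exact yum options used by the harness may differ):

    yum install -y ansible python2-pip python2-ruamel-yaml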
Cloning into '/tmp/tmp.giX4WBVEkB/openhift-ansible'...
Copying oc from path to /usr/local/bin for use by openshift-ansible
Copying oc from path to /usr/bin for use by openshift-ansible
Copying oadm from path to /usr/local/bin for use by openshift-ansible
Copying oadm from path to /usr/bin for use by openshift-ansible
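The oc and oadm client binaries are copied from the PATH into both /usr/local/bin and /usr/bin so openshift-ansible can find them. A sketch of that step, assuming the binaries are already on $PATH:

    for bin in oc oadm; do
        cp "$(which $bin)" /usr/local/bin/
        cp "$(which $bin)" /usr/bin/
    done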
[INFO] Starting logging tests at Thu Jun 8 11:37:44 EDT 2017
Generated new key pair as /tmp/openshift/origin-aggregated-logging/openshift.local.config/master/serviceaccounts.public.key and /tmp/openshift/origin-aggregated-logging/openshift.local.config/master/serviceaccounts.private.key
Generating node credentials ...
Created node config for 172.18.3.237 in /tmp/openshift/origin-aggregated-logging/openshift.local.config/node-172.18.3.237
Wrote master config to: /tmp/openshift/origin-aggregated-logging/openshift.local.config/master/master-config.yaml
Running hack/lib/start.sh:352: executing 'oc get --raw /healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s...
SUCCESS after 29.645s: hack/lib/start.sh:352: executing 'oc get --raw /healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s
Standard output from the command:
ok
Standard error from the command:
The connection to the server 172.18.3.237:8443 was refused - did you specify the right host or port?
... repeated 62 times
Error from server (Forbidden): User "system:admin" cannot "get" on "/healthz"
... repeated 6 times
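hack/lib/start.sh:352 re-runs the health check every 0.25s for up to 80s until the output contains 'ok'. A minimal stand-alone version of that wait (a sketch, not the actual os::cmd implementation):

    KUBECONFIG=/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig
    deadline=$((SECONDS + 80))
    until oc get --raw /healthz --config="$KUBECONFIG" 2>/dev/null | grep -q ok; do
        [ "$SECONDS" -ge "$deadline" ] && { echo "timed out waiting for /healthz"; exit 1; }
        sleep 0.25
    done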
Running hack/lib/start.sh:353: executing 'oc get --raw https://172.18.3.237:10250/healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.5s until completion or 120.000s...
SUCCESS after 0.213s: hack/lib/start.sh:353: executing 'oc get --raw https://172.18.3.237:10250/healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.5s until completion or 120.000s
Standard output from the command:
ok
There was no error output from the command.
Running hack/lib/start.sh:354: executing 'oc get --raw /healthz/ready --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s...
SUCCESS after 1.265s: hack/lib/start.sh:354: executing 'oc get --raw /healthz/ready --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s
Standard output from the command:
ok
Standard error from the command:
Error from server (InternalError): an error on the server ("") has prevented the request from succeeding
... repeated 2 times
Running hack/lib/start.sh:355: executing 'oc get service kubernetes --namespace default --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 160.000s...
SUCCESS after 0.460s: hack/lib/start.sh:355: executing 'oc get service kubernetes --namespace default --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 160.000s
Standard output from the command:
NAME CLUSTER-IP EXTERNAL-IP PORT(S) AGE
kubernetes 172.30.0.1 <none> 443/TCP,53/UDP,53/TCP 4s
There was no error output from the command.
Running hack/lib/start.sh:356: executing 'oc get --raw /api/v1/nodes/172.18.3.237 --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 80.000s...
SUCCESS after 0.260s: hack/lib/start.sh:356: executing 'oc get --raw /api/v1/nodes/172.18.3.237 --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 80.000s
Standard output from the command:
{"kind":"Node","apiVersion":"v1","metadata":{"name":"172.18.3.237","selfLink":"/api/v1/nodes/172.18.3.237","uid":"8768a5bc-4c60-11e7-94aa-0e1649350dc2","resourceVersion":"292","creationTimestamp":"2017-06-08T15:38:33Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/hostname":"172.18.3.237"},"annotations":{"volumes.kubernetes.io/controller-managed-attach-detach":"true"}},"spec":{"externalID":"172.18.3.237","providerID":"aws:////i-06878b3e9e9644cee"},"status":{"capacity":{"cpu":"4","memory":"7231688Ki","pods":"40"},"allocatable":{"cpu":"4","memory":"7129288Ki","pods":"40"},"conditions":[{"type":"OutOfDisk","status":"False","lastHeartbeatTime":"2017-06-08T15:38:33Z","lastTransitionTime":"2017-06-08T15:38:33Z","reason":"KubeletHasSufficientDisk","message":"kubelet has sufficient disk space available"},{"type":"MemoryPressure","status":"False","lastHeartbeatTime":"2017-06-08T15:38:33Z","lastTransitionTime":"2017-06-08T15:38:33Z","reason":"KubeletHasSufficientMemory","message":"kubelet has sufficient memory available"},{"type":"DiskPressure","status":"False","lastHeartbeatTime":"2017-06-08T15:38:33Z","lastTransitionTime":"2017-06-08T15:38:33Z","reason":"KubeletHasNoDiskPressure","message":"kubelet has no disk pressure"},{"type":"Ready","status":"True","lastHeartbeatTime":"2017-06-08T15:38:33Z","lastTransitionTime":"2017-06-08T15:38:33Z","reason":"KubeletReady","message":"kubelet is posting ready status"}],"addresses":[{"type":"LegacyHostIP","address":"172.18.3.237"},{"type":"InternalIP","address":"172.18.3.237"},{"type":"Hostname","address":"172.18.3.237"}],"daemonEndpoints":{"kubeletEndpoint":{"Port":10250}},"nodeInfo":{"machineID":"f9370ed252a14f73b014c1301a9b6d1b","systemUUID":"EC2388AB-03B5-9846-ECC4-052DA3A164CF","bootID":"c2cfc298-5593-4726-958d-742f09f4df0d","kernelVersion":"3.10.0-327.22.2.el7.x86_64","osImage":"Red Hat Enterprise Linux Server 7.3 
(Maipo)","containerRuntimeVersion":"docker://1.12.6","kubeletVersion":"v1.6.1+5115d708d7","kubeProxyVersion":"v1.6.1+5115d708d7","operatingSystem":"linux","architecture":"amd64"},"images":[{"names":["openshift/origin-federation:6acabdc","openshift/origin-federation:latest"],"sizeBytes":1205885664},{"names":["openshift/origin-docker-registry:6acabdc","openshift/origin-docker-registry:latest"],"sizeBytes":1100164272},{"names":["openshift/origin-gitserver:6acabdc","openshift/origin-gitserver:latest"],"sizeBytes":1086520226},{"names":["openshift/openvswitch:6acabdc","openshift/openvswitch:latest"],"sizeBytes":1053403667},{"names":["openshift/node:6acabdc","openshift/node:latest"],"sizeBytes":1051721928},{"names":["openshift/origin-keepalived-ipfailover:6acabdc","openshift/origin-keepalived-ipfailover:latest"],"sizeBytes":1028529711},{"names":["openshift/origin-haproxy-router:6acabdc","openshift/origin-haproxy-router:latest"],"sizeBytes":1022758742},{"names":["openshift/origin:6acabdc","openshift/origin:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-f5-router:6acabdc","openshift/origin-f5-router:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-sti-builder:6acabdc","openshift/origin-sti-builder:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-recycler:6acabdc","openshift/origin-recycler:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-deployer:6acabdc","openshift/origin-deployer:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-docker-builder:6acabdc","openshift/origin-docker-builder:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-cluster-capacity:6acabdc","openshift/origin-cluster-capacity:latest"],"sizeBytes":962455026},{"names":["rhel7.1:latest"],"sizeBytes":765301508},{"names":["openshift/dind-master:latest"],"sizeBytes":731456758},{"names":["openshift/dind-node:latest"],"sizeBytes":731453034},{"names":["\u003cnone\u003e@\u003cnone\u003e","\u003cnone\u003e:\u003cnone\u003e"],"sizeBytes":709532011},{"names":["docker.io/openshift/origin-logging-kibana@sha256:70ead525ed596b73301e8df3ac229e33dd7f8431ec1233b37e96544c556530e9","docker.io/openshift/origin-logging-kibana:latest"],"sizeBytes":682851528},{"names":["openshift/dind:latest"],"sizeBytes":640650210},{"names":["docker.io/openshift/origin-logging-elasticsearch@sha256:1e72563ad0551f5c15fc6aa8057a64cc9d0c21b2c40bca7efabdd1b55a4fc2e4","docker.io/openshift/origin-logging-elasticsearch:latest"],"sizeBytes":425433997},{"names":["docker.io/openshift/base-centos7@sha256:aea292a3bddba020cde0ee83e6a45807931eb607c164ec6a3674f67039d8cd7c","docker.io/openshift/base-centos7:latest"],"sizeBytes":383049978},{"names":["rhel7.2:latest"],"sizeBytes":377493597},{"names":["openshift/origin-egress-router:6acabdc","openshift/origin-egress-router:latest"],"sizeBytes":364745713},{"names":["openshift/origin-base:latest"],"sizeBytes":363070172},{"names":["\u003cnone\u003e@\u003cnone\u003e","\u003cnone\u003e:\u003cnone\u003e"],"sizeBytes":363024702},{"names":["docker.io/openshift/origin-logging-fluentd@sha256:bc70848086a50bad58a2f41e166098e8ed351bf4dbe7af83caeb7a29f35b4395","docker.io/openshift/origin-logging-fluentd:latest"],"sizeBytes":359217371},{"names":["docker.io/fedora@sha256:69281ddd7b2600e5f2b17f1e12d7fba25207f459204fb2d15884f8432c479136","docker.io/fedora:25"],"sizeBytes":230864375},{"names":["docker.io/openshift/origin-logging-curator@sha256:e820338ca7fb0addfaec25d80d40a49f5ea25b24ff056ab6adbb42dd9eec94b4","docker.io/openshift/origin-logging-curator:latest"],"sizeBytes":224977691},{"nam
es":["rhel7.3:latest","rhel7:latest"],"sizeBytes":219121266},{"names":["openshift/origin-pod:6acabdc","openshift/origin-pod:latest"],"sizeBytes":213199843},{"names":["registry.access.redhat.com/rhel7.2@sha256:98e6ca5d226c26e31a95cd67716afe22833c943e1926a21daf1a030906a02249","registry.access.redhat.com/rhel7.2:latest"],"sizeBytes":201376319},{"names":["registry.access.redhat.com/rhel7.3@sha256:1e232401d8e0ba53b36b757b4712fbcbd1dab9c21db039c45a84871a74e89e68","registry.access.redhat.com/rhel7.3:latest"],"sizeBytes":192693772},{"names":["docker.io/centos@sha256:bba1de7c9d900a898e3cadbae040dfe8a633c06bc104a0df76ae24483e03c077"],"sizeBytes":192548999},{"names":["openshift/origin-source:latest"],"sizeBytes":192548894},{"names":["docker.io/centos@sha256:aebf12af704307dfa0079b3babdca8d7e8ff6564696882bcb5d11f1d461f9ee9","docker.io/centos:7","docker.io/centos:centos7"],"sizeBytes":192548537},{"names":["registry.access.redhat.com/rhel7.1@sha256:1bc5a4c43bbb29a5a96a61896ff696933be3502e2f5fdc4cde02d9e101731fdd","registry.access.redhat.com/rhel7.1:latest"],"sizeBytes":158229901},{"names":["openshift/hello-openshift:6acabdc","openshift/hello-openshift:latest"],"sizeBytes":5643318}]}}
There was no error output from the command.
serviceaccount "registry" created
clusterrolebinding "registry-registry-role" created
deploymentconfig "docker-registry" created
service "docker-registry" created
--> Creating router router ...
info: password for stats user admin has been set to xJlzxVymlr
serviceaccount "router" created
clusterrolebinding "router-router-role" created
deploymentconfig "router" created
service "router" created
--> Success
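The registry and router objects created above (service accounts, role bindings, deployment configs, services) are what `oadm registry` and `oadm router` produce; the harness presumably runs something close to the following (an assumption, extra flags such as an image spec may be passed):

    oadm registry --config="$KUBECONFIG"
    oadm router --config="$KUBECONFIG"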
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:162: executing 'oadm new-project logging --node-selector=''' expecting success...
SUCCESS after 0.425s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:162: executing 'oadm new-project logging --node-selector=''' expecting success
Standard output from the command:
Created project logging
There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:163: executing 'oc project logging > /dev/null' expecting success...
SUCCESS after 0.219s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:163: executing 'oc project logging > /dev/null' expecting success
There was no output from the command.
There was no error output from the command.
apiVersion: v1
items:
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-elasticsearch
      component: development
      logging-infra: development
      provider: openshift
    name: logging-elasticsearch
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-fluentd
      component: development
      logging-infra: development
      provider: openshift
    name: logging-fluentd
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-kibana
      component: development
      logging-infra: development
      provider: openshift
    name: logging-kibana
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-curator
      component: development
      logging-infra: development
      provider: openshift
    name: logging-curator
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-auth-proxy
      component: development
      logging-infra: development
      provider: openshift
    name: logging-auth-proxy
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-deployment
      component: development
      logging-infra: development
      provider: openshift
    name: origin
  spec:
    dockerImageRepository: openshift/origin
    tags:
    - from:
        kind: DockerImage
        name: openshift/origin:v1.5.0-alpha.2
      name: v1.5.0-alpha.2
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      app: logging-elasticsearch
      component: development
      logging-infra: development
      provider: openshift
    name: logging-elasticsearch
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-elasticsearch:latest
    resources: {}
    source:
      contextDir: elasticsearch
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-fluentd
      component: development
      logging-infra: development
      provider: openshift
    name: logging-fluentd
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-fluentd:latest
    resources: {}
    source:
      contextDir: fluentd
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-kibana
      component: development
      logging-infra: development
      provider: openshift
    name: logging-kibana
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-kibana:latest
    resources: {}
    source:
      contextDir: kibana
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-curator
      component: development
      logging-infra: development
      provider: openshift
    name: logging-curator
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-curator:latest
    resources: {}
    source:
      contextDir: curator
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-auth-proxy
      component: development
      logging-infra: development
      provider: openshift
    name: logging-auth-proxy
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-auth-proxy:latest
    resources: {}
    source:
      contextDir: kibana-proxy
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: library/node:0.10.36
      type: Docker
kind: List
metadata: {}
Running hack/testing/build-images:31: executing 'oc process -o yaml -f /data/src/github.com/openshift/origin-aggregated-logging/hack/templates/dev-builds-wo-deployer.yaml -p LOGGING_FORK_URL=https://github.com/openshift/origin-aggregated-logging -p LOGGING_FORK_BRANCH=master | build_filter | oc create -f -' expecting success...
SUCCESS after 0.346s: hack/testing/build-images:31: executing 'oc process -o yaml -f /data/src/github.com/openshift/origin-aggregated-logging/hack/templates/dev-builds-wo-deployer.yaml -p LOGGING_FORK_URL=https://github.com/openshift/origin-aggregated-logging -p LOGGING_FORK_BRANCH=master | build_filter | oc create -f -' expecting success
Standard output from the command:
imagestream "logging-elasticsearch" created
imagestream "logging-fluentd" created
imagestream "logging-kibana" created
imagestream "logging-curator" created
imagestream "logging-auth-proxy" created
imagestream "origin" created
buildconfig "logging-elasticsearch" created
buildconfig "logging-fluentd" created
buildconfig "logging-kibana" created
buildconfig "logging-curator" created
buildconfig "logging-auth-proxy" created
There was no error output from the command.
Running hack/testing/build-images:9: executing 'oc get imagestreamtag origin:latest' expecting success; re-trying every 0.2s until completion or 60.000s...
SUCCESS after 0.625s: hack/testing/build-images:9: executing 'oc get imagestreamtag origin:latest' expecting success; re-trying every 0.2s until completion or 60.000s
Standard output from the command:
NAME DOCKER REF UPDATED IMAGENAME
origin:latest openshift/origin@sha256:4ce85347f606fb161cee6f0f58c68ddbd557716b6742e18e9ca7d9183372480e Less than a second ago sha256:4ce85347f606fb161cee6f0f58c68ddbd557716b6742e18e9ca7d9183372480e
Standard error from the command:
Error from server (NotFound): imagestreamtags.image.openshift.io "origin:latest" not found
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-auth-proxy-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-curator-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-elasticsearch-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-fluentd-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-kibana-1" started
Running hack/testing/build-images:33: executing 'wait_for_builds_complete' expecting success...
SUCCESS after 699.338s: hack/testing/build-images:33: executing 'wait_for_builds_complete' expecting success
Standard output from the command:
build "logging-kibana-2" started
build "logging-kibana-3" started
build in progress for logging-kibana - delete failed build logging-kibana-2 status running
build "logging-kibana-2" deleted
build in progress for logging-kibana - delete failed build logging-kibana-1 status complete
build "logging-kibana-1" deleted
Builds are complete
Standard error from the command:
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
/tmp/tmp.giX4WBVEkB/openhift-ansible /data/src/github.com/openshift/origin-aggregated-logging
### Created host inventory file ###
[oo_first_master]
openshift
[oo_first_master:vars]
ansible_become=true
ansible_connection=local
containerized=true
docker_protect_installed_version=true
openshift_deployment_type=origin
deployment_type=origin
required_packages=[]
openshift_hosted_logging_hostname=kibana.127.0.0.1.xip.io
openshift_master_logging_public_url=https://kibana.127.0.0.1.xip.io
openshift_logging_master_public_url=https://172.18.3.237:8443
openshift_logging_image_prefix=172.30.255.47:5000/logging/
openshift_logging_use_ops=true
openshift_logging_fluentd_journal_read_from_head=False
openshift_logging_es_log_appenders=['console']
openshift_logging_use_mux=false
openshift_logging_mux_allow_external=false
openshift_logging_use_mux_client=false
###################################
Running hack/testing/init-log-stack:58: executing 'oc login -u system:admin' expecting success...
SUCCESS after 0.216s: hack/testing/init-log-stack:58: executing 'oc login -u system:admin' expecting success
Standard output from the command:
Logged into "https://172.18.3.237:8443" as "system:admin" using existing credentials.
You have access to the following projects and can switch between them with 'oc project <projectname>':
default
kube-public
kube-system
* logging
openshift
openshift-infra
Using project "logging".
There was no error output from the command.
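With the inventory written and `oc login` done, the harness runs the openshift-logging playbook from the openshift-ansible checkout in /tmp (see the ansible.cfg and PLAYBOOK lines that follow). The invocation is roughly this (an assumption; the inventory filename and verbosity flags are not shown in the log):

    cd /tmp/tmp.giX4WBVEkB/openhift-ansible
    ansible-playbook -vvv -i <inventory-file> \
        playbooks/byo/openshift-cluster/openshift-logging.yml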
Using /tmp/tmp.giX4WBVEkB/openhift-ansible/ansible.cfg as config file
PLAYBOOK: openshift-logging.yml ************************************************
4 plays in /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/byo/openshift-cluster/openshift-logging.yml
PLAY [Create initial host groups for localhost] ********************************
META: ran handlers
TASK [include_vars] ************************************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/byo/openshift-cluster/initialize_groups.yml:10
ok: [localhost] => {
"ansible_facts": {
"g_all_hosts": "{{ g_master_hosts | union(g_node_hosts) | union(g_etcd_hosts) | union(g_lb_hosts) | union(g_nfs_hosts) | union(g_new_node_hosts)| union(g_new_master_hosts) | default([]) }}",
"g_etcd_hosts": "{{ groups.etcd | default([]) }}",
"g_glusterfs_hosts": "{{ groups.glusterfs | default([]) }}",
"g_glusterfs_registry_hosts": "{{ groups.glusterfs_registry | default(g_glusterfs_hosts) }}",
"g_lb_hosts": "{{ groups.lb | default([]) }}",
"g_master_hosts": "{{ groups.masters | default([]) }}",
"g_new_master_hosts": "{{ groups.new_masters | default([]) }}",
"g_new_node_hosts": "{{ groups.new_nodes | default([]) }}",
"g_nfs_hosts": "{{ groups.nfs | default([]) }}",
"g_node_hosts": "{{ groups.nodes | default([]) }}"
},
"changed": false
}
META: ran handlers
META: ran handlers
PLAY [Populate config host groups] *********************************************
META: ran handlers
TASK [Evaluate groups - g_etcd_hosts required] *********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:8
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [Evaluate groups - g_master_hosts or g_new_master_hosts required] *********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:13
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [Evaluate groups - g_node_hosts or g_new_node_hosts required] *************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:18
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [Evaluate groups - g_lb_hosts required] ***********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:23
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [Evaluate groups - g_nfs_hosts required] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:28
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [Evaluate groups - g_nfs_hosts is single host] ****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:33
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [Evaluate groups - g_glusterfs_hosts required] ****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:38
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [Evaluate oo_all_hosts] ***************************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:43
TASK [Evaluate oo_masters] *****************************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:52
TASK [Evaluate oo_first_master] ************************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:61
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [Evaluate oo_masters_to_config] *******************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:70
TASK [Evaluate oo_etcd_to_config] **********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:79
TASK [Evaluate oo_first_etcd] **************************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:88
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [Evaluate oo_etcd_hosts_to_upgrade] ***************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:100
TASK [Evaluate oo_etcd_hosts_to_backup] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:107
creating host via 'add_host': hostname=openshift
ok: [localhost] => (item=openshift) => {
"add_host": {
"groups": [
"oo_etcd_hosts_to_backup"
],
"host_name": "openshift",
"host_vars": {}
},
"changed": false,
"item": "openshift"
}
TASK [Evaluate oo_nodes_to_config] *********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:114
TASK [Add master to oo_nodes_to_config] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:124
TASK [Evaluate oo_lb_to_config] ************************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:134
TASK [Evaluate oo_nfs_to_config] ***********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:143
TASK [Evaluate oo_glusterfs_to_config] *****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:152
META: ran handlers
META: ran handlers
PLAY [OpenShift Aggregated Logging] ********************************************
TASK [Gathering Facts] *********************************************************
ok: [openshift]
META: ran handlers
TASK [openshift_sanitize_inventory : Abort when conflicting deployment type variables are set] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_sanitize_inventory/tasks/main.yml:2
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_sanitize_inventory : Standardize on latest variable names] *****
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_sanitize_inventory/tasks/main.yml:15
ok: [openshift] => {
"ansible_facts": {
"deployment_type": "origin",
"openshift_deployment_type": "origin"
},
"changed": false
}
TASK [openshift_sanitize_inventory : Abort when deployment type is invalid] ****
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_sanitize_inventory/tasks/main.yml:23
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_sanitize_inventory : Normalize openshift_release] **************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_sanitize_inventory/tasks/main.yml:31
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_sanitize_inventory : Abort when openshift_release is invalid] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_sanitize_inventory/tasks/main.yml:41
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Detecting Operating System] ****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:2
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_facts : set_fact] **********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:8
ok: [openshift] => {
"ansible_facts": {
"l_is_atomic": false
},
"changed": false
}
TASK [openshift_facts : set_fact] **********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:10
ok: [openshift] => {
"ansible_facts": {
"l_is_containerized": true,
"l_is_etcd_system_container": false,
"l_is_master_system_container": false,
"l_is_node_system_container": false,
"l_is_openvswitch_system_container": false
},
"changed": false
}
TASK [openshift_facts : set_fact] **********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:16
ok: [openshift] => {
"ansible_facts": {
"l_any_system_container": false
},
"changed": false
}
TASK [openshift_facts : set_fact] **********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:18
ok: [openshift] => {
"ansible_facts": {
"l_etcd_runtime": "docker"
},
"changed": false
}
TASK [openshift_facts : Validate python version] *******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:22
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Validate python version] *******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:29
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Determine Atomic Host Docker Version] ******************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:42
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : assert] ************************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:46
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Load variables] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:53
ok: [openshift] => (item=/tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/vars/default.yml) => {
"ansible_facts": {
"required_packages": [
"iproute",
"python-dbus",
"PyYAML",
"yum-utils"
]
},
"item": "/tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/vars/default.yml"
}
TASK [openshift_facts : Ensure various deps are installed] *********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:59
ok: [openshift] => (item=iproute) => {
"changed": false,
"item": "iproute",
"rc": 0,
"results": [
"iproute-3.10.0-74.el7.x86_64 providing iproute is already installed"
]
}
ok: [openshift] => (item=python-dbus) => {
"changed": false,
"item": "python-dbus",
"rc": 0,
"results": [
"dbus-python-1.1.1-9.el7.x86_64 providing python-dbus is already installed"
]
}
ok: [openshift] => (item=PyYAML) => {
"changed": false,
"item": "PyYAML",
"rc": 0,
"results": [
"PyYAML-3.10-11.el7.x86_64 providing PyYAML is already installed"
]
}
ok: [openshift] => (item=yum-utils) => {
"changed": false,
"item": "yum-utils",
"rc": 0,
"results": [
"yum-utils-1.1.31-40.el7.noarch providing yum-utils is already installed"
]
}
TASK [openshift_facts : Ensure various deps for running system containers are installed] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:64
skipping: [openshift] => (item=atomic) => {
"changed": false,
"item": "atomic",
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [openshift] => (item=ostree) => {
"changed": false,
"item": "ostree",
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [openshift] => (item=runc) => {
"changed": false,
"item": "runc",
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Gather Cluster facts and set is_containerized if needed] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:71
changed: [openshift] => {
"ansible_facts": {
"openshift": {
"common": {
"admin_binary": "/usr/local/bin/oadm",
"all_hostnames": [
"ip-172-18-3-237.ec2.internal",
"172.18.3.237",
"ec2-34-207-246-124.compute-1.amazonaws.com",
"34.207.246.124"
],
"cli_image": "openshift/origin",
"client_binary": "/usr/local/bin/oc",
"cluster_id": "default",
"config_base": "/etc/origin",
"data_dir": "/var/lib/origin",
"debug_level": "2",
"deployer_image": "openshift/origin-deployer",
"deployment_subtype": "basic",
"deployment_type": "origin",
"dns_domain": "cluster.local",
"etcd_runtime": "docker",
"examples_content_version": "v3.6",
"generate_no_proxy_hosts": true,
"hostname": "ip-172-18-3-237.ec2.internal",
"install_examples": true,
"internal_hostnames": [
"ip-172-18-3-237.ec2.internal",
"172.18.3.237"
],
"ip": "172.18.3.237",
"is_atomic": false,
"is_containerized": true,
"is_etcd_system_container": false,
"is_master_system_container": false,
"is_node_system_container": false,
"is_openvswitch_system_container": false,
"kube_svc_ip": "172.30.0.1",
"pod_image": "openshift/origin-pod",
"portal_net": "172.30.0.0/16",
"public_hostname": "ec2-34-207-246-124.compute-1.amazonaws.com",
"public_ip": "34.207.246.124",
"registry_image": "openshift/origin-docker-registry",
"router_image": "openshift/origin-haproxy-router",
"sdn_network_plugin_name": "redhat/openshift-ovs-subnet",
"service_type": "origin",
"use_calico": false,
"use_contiv": false,
"use_dnsmasq": true,
"use_flannel": false,
"use_manageiq": true,
"use_nuage": false,
"use_openshift_sdn": true,
"version_gte_3_1_1_or_1_1_1": true,
"version_gte_3_1_or_1_1": true,
"version_gte_3_2_or_1_2": true,
"version_gte_3_3_or_1_3": true,
"version_gte_3_4_or_1_4": true,
"version_gte_3_5_or_1_5": true,
"version_gte_3_6": true
},
"current_config": {
"roles": [
"node",
"docker"
]
},
"docker": {
"api_version": 1.24,
"disable_push_dockerhub": false,
"gte_1_10": true,
"options": "--log-driver=journald",
"service_name": "docker",
"version": "1.12.6"
},
"hosted": {
"logging": {
"selector": null
},
"metrics": {
"selector": null
},
"registry": {
"selector": "region=infra"
},
"router": {
"selector": "region=infra"
}
},
"node": {
"annotations": {},
"iptables_sync_period": "30s",
"kubelet_args": {
"node-labels": []
},
"labels": {},
"local_quota_per_fsgroup": "",
"node_image": "openshift/node",
"node_system_image": "openshift/node",
"nodename": "ip-172-18-3-237.ec2.internal",
"ovs_image": "openshift/openvswitch",
"ovs_system_image": "openshift/openvswitch",
"registry_url": "openshift/origin-${component}:${version}",
"schedulable": true,
"sdn_mtu": "8951",
"set_node_ip": false,
"storage_plugin_deps": [
"ceph",
"glusterfs",
"iscsi"
]
},
"provider": {
"metadata": {
"ami-id": "ami-83a1fc95",
"ami-launch-index": "0",
"ami-manifest-path": "(unknown)",
"block-device-mapping": {
"ami": "/dev/sda1",
"ebs17": "sdb",
"root": "/dev/sda1"
},
"hostname": "ip-172-18-3-237.ec2.internal",
"instance-action": "none",
"instance-id": "i-06878b3e9e9644cee",
"instance-type": "c4.xlarge",
"local-hostname": "ip-172-18-3-237.ec2.internal",
"local-ipv4": "172.18.3.237",
"mac": "0e:16:49:35:0d:c2",
"metrics": {
"vhostmd": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
},
"network": {
"interfaces": {
"macs": {
"0e:16:49:35:0d:c2": {
"device-number": "0",
"interface-id": "eni-a2404b78",
"ipv4-associations": {
"34.207.246.124": "172.18.3.237"
},
"local-hostname": "ip-172-18-3-237.ec2.internal",
"local-ipv4s": "172.18.3.237",
"mac": "0e:16:49:35:0d:c2",
"owner-id": "531415883065",
"public-hostname": "ec2-34-207-246-124.compute-1.amazonaws.com",
"public-ipv4s": "34.207.246.124",
"security-group-ids": "sg-7e73221a",
"security-groups": "default",
"subnet-id": "subnet-cf57c596",
"subnet-ipv4-cidr-block": "172.18.0.0/20",
"vpc-id": "vpc-69705d0c",
"vpc-ipv4-cidr-block": "172.18.0.0/16",
"vpc-ipv4-cidr-blocks": "172.18.0.0/16"
}
}
}
},
"placement": {
"availability-zone": "us-east-1d"
},
"profile": "default-hvm",
"public-hostname": "ec2-34-207-246-124.compute-1.amazonaws.com",
"public-ipv4": "34.207.246.124",
"public-keys/": "0=libra",
"reservation-id": "r-01f13c65359b7bb11",
"security-groups": "default",
"services": {
"domain": "amazonaws.com",
"partition": "aws"
}
},
"name": "aws",
"network": {
"hostname": "ip-172-18-3-237.ec2.internal",
"interfaces": [
{
"ips": [
"172.18.3.237"
],
"network_id": "subnet-cf57c596",
"network_type": "vpc",
"public_ips": [
"34.207.246.124"
]
}
],
"ip": "172.18.3.237",
"ipv6_enabled": false,
"public_hostname": "ec2-34-207-246-124.compute-1.amazonaws.com",
"public_ip": "34.207.246.124"
},
"zone": "us-east-1d"
}
}
},
"changed": true
}
TASK [openshift_facts : Set repoquery command] *********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_facts/tasks/main.yml:99
ok: [openshift] => {
"ansible_facts": {
"repoquery_cmd": "repoquery --plugins"
},
"changed": false
}
TASK [openshift_logging : fail] ************************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/main.yaml:2
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Set default image variables based on deployment_type] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/main.yaml:6
ok: [openshift] => (item=/tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/vars/default_images.yml) => {
"ansible_facts": {
"__openshift_logging_image_prefix": "{{ openshift_hosted_logging_deployer_prefix | default('docker.io/openshift/origin-') }}",
"__openshift_logging_image_version": "{{ openshift_hosted_logging_deployer_version | default('latest') }}"
},
"item": "/tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/vars/default_images.yml"
}
TASK [openshift_logging : Set logging image facts] *****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/main.yaml:12
ok: [openshift] => {
"ansible_facts": {
"openshift_logging_image_prefix": "172.30.255.47:5000/logging/",
"openshift_logging_image_version": "latest"
},
"changed": false
}
TASK [openshift_logging : Create temp directory for doing work in] *************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/main.yaml:17
ok: [openshift] => {
"changed": false,
"cmd": [
"mktemp",
"-d",
"/tmp/openshift-logging-ansible-XXXXXX"
],
"delta": "0:00:00.002045",
"end": "2017-06-08 11:59:46.742595",
"rc": 0,
"start": "2017-06-08 11:59:46.740550"
}
STDOUT:
/tmp/openshift-logging-ansible-KrPjVl
TASK [openshift_logging : debug] ***********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/main.yaml:24
ok: [openshift] => {
"changed": false
}
MSG:
Created temp dir /tmp/openshift-logging-ansible-KrPjVl
TASK [openshift_logging : Create local temp directory for doing work in] *******
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/main.yaml:26
ok: [openshift -> 127.0.0.1] => {
"changed": false,
"cmd": [
"mktemp",
"-d",
"/tmp/openshift-logging-ansible-XXXXXX"
],
"delta": "0:00:00.002029",
"end": "2017-06-08 11:59:46.897125",
"rc": 0,
"start": "2017-06-08 11:59:46.895096"
}
STDOUT:
/tmp/openshift-logging-ansible-BE0YYi
TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/main.yaml:33
included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml for openshift
TASK [openshift_logging : Gather OpenShift Logging Facts] **********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:2
ok: [openshift] => {
"ansible_facts": {
"openshift_logging_facts": {
"curator": {
"clusterrolebindings": {},
"configmaps": {},
"daemonsets": {},
"deploymentconfigs": {},
"oauthclients": {},
"pvcs": {},
"rolebindings": {},
"routes": {},
"sccs": {},
"secrets": {},
"services": {}
},
"curator_ops": {
"clusterrolebindings": {},
"configmaps": {},
"daemonsets": {},
"deploymentconfigs": {},
"oauthclients": {},
"pvcs": {},
"rolebindings": {},
"routes": {},
"sccs": {},
"secrets": {},
"services": {}
},
"elasticsearch": {
"clusterrolebindings": {},
"configmaps": {},
"daemonsets": {},
"deploymentconfigs": {},
"oauthclients": {},
"pvcs": {},
"rolebindings": {},
"routes": {},
"sccs": {},
"secrets": {},
"services": {}
},
"elasticsearch_ops": {
"clusterrolebindings": {},
"configmaps": {},
"daemonsets": {},
"deploymentconfigs": {},
"oauthclients": {},
"pvcs": {},
"rolebindings": {},
"routes": {},
"sccs": {},
"secrets": {},
"services": {}
},
"fluentd": {
"clusterrolebindings": {},
"configmaps": {},
"daemonsets": {},
"deploymentconfigs": {},
"oauthclients": {},
"pvcs": {},
"rolebindings": {},
"routes": {},
"sccs": {},
"secrets": {},
"services": {}
},
"kibana": {
"clusterrolebindings": {},
"configmaps": {},
"daemonsets": {},
"deploymentconfigs": {},
"oauthclients": {},
"pvcs": {},
"rolebindings": {},
"routes": {},
"sccs": {},
"secrets": {},
"services": {}
},
"kibana_ops": {
"clusterrolebindings": {},
"configmaps": {},
"daemonsets": {},
"deploymentconfigs": {},
"oauthclients": {},
"pvcs": {},
"rolebindings": {},
"routes": {},
"sccs": {},
"secrets": {},
"services": {}
}
}
},
"changed": false
}
TASK [openshift_logging : Set logging project] *********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:7
ok: [openshift] => {
"changed": false,
"results": {
"cmd": "/bin/oc get namespace logging -o json",
"results": {
"apiVersion": "v1",
"kind": "Namespace",
"metadata": {
"annotations": {
"openshift.io/description": "",
"openshift.io/display-name": "",
"openshift.io/node-selector": "",
"openshift.io/sa.scc.mcs": "s0:c7,c4",
"openshift.io/sa.scc.supplemental-groups": "1000050000/10000",
"openshift.io/sa.scc.uid-range": "1000050000/10000"
},
"creationTimestamp": "2017-06-08T15:38:35Z",
"name": "logging",
"resourceVersion": "813",
"selfLink": "/api/v1/namespaces/logging",
"uid": "8904b0ac-4c60-11e7-94aa-0e1649350dc2"
},
"spec": {
"finalizers": [
"openshift.io/origin",
"kubernetes"
]
},
"status": {
"phase": "Active"
}
},
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging : Labeling logging project] ****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:13
TASK [openshift_logging : Labeling logging project] ****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:26
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Create logging cert directory] ***********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:39
ok: [openshift] => {
"changed": false,
"gid": 0,
"group": "root",
"mode": "0755",
"owner": "root",
"path": "/etc/origin/logging",
"secontext": "unconfined_u:object_r:etc_t:s0",
"size": 6,
"state": "directory",
"uid": 0
}
TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:47
included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml for openshift
TASK [openshift_logging : Checking for ca.key] *********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:3
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Checking for ca.crt] *********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:8
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Checking for ca.serial.txt] **************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:13
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Generate certificates] *******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:18
changed: [openshift] => {
"changed": true,
"cmd": [
"/usr/local/bin/oc",
"adm",
"--config=/tmp/openshift-logging-ansible-KrPjVl/admin.kubeconfig",
"ca",
"create-signer-cert",
"--key=/etc/origin/logging/ca.key",
"--cert=/etc/origin/logging/ca.crt",
"--serial=/etc/origin/logging/ca.serial.txt",
"--name=logging-signer-test"
],
"delta": "0:00:00.582442",
"end": "2017-06-08 11:59:51.387575",
"rc": 0,
"start": "2017-06-08 11:59:50.805133"
}
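The task above created the logging signer CA under /etc/origin/logging/. A minimal inspection sketch, assuming those paths and not part of the playbook output, to confirm the CA cert and key were written consistently:
    # Print the subject and validity window of the freshly created signer cert
    openssl x509 -in /etc/origin/logging/ca.crt -noout -subject -dates
    # The two moduli should match if key and certificate belong together
    openssl rsa  -in /etc/origin/logging/ca.key -noout -modulus | openssl md5
    openssl x509 -in /etc/origin/logging/ca.crt -noout -modulus | openssl md5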
TASK [openshift_logging : Checking for signing.conf] ***************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:29
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : template] ********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:34
changed: [openshift] => {
"changed": true,
"checksum": "a5a1bda430be44f982fa9097778b7d35d2e42780",
"dest": "/etc/origin/logging/signing.conf",
"gid": 0,
"group": "root",
"md5sum": "449087446670073f2899aac33113350c",
"mode": "0644",
"owner": "root",
"secontext": "system_u:object_r:etc_t:s0",
"size": 4263,
"src": "/root/.ansible/tmp/ansible-tmp-1496937591.55-221646280654359/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:39
included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml for openshift
included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml for openshift
included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml for openshift
TASK [openshift_logging : Checking for kibana.crt] *****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:2
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Checking for kibana.key] *****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:7
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Trying to discover server cert variable name for kibana] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Trying to discover the server key variable name for kibana] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:20
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Creating signed server cert and key for kibana] ******
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:28
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Copying server key for kibana to generated certs directory] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:40
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Copying Server cert for kibana to generated certs directory] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:50
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Checking for kibana-ops.crt] *************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:2
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Checking for kibana-ops.key] *************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:7
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Trying to discover server cert variable name for kibana-ops] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Trying to discover the server key variable name for kibana-ops] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:20
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Creating signed server cert and key for kibana-ops] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:28
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Copying server key for kibana-ops to generated certs directory] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:40
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Copying Server cert for kibana-ops to generated certs directory] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:50
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Checking for kibana-internal.crt] ********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:2
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Checking for kibana-internal.key] ********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:7
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Trying to discover server cert variable name for kibana-internal] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Trying to discover the server key variable name for kibana-internal] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:20
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Creating signed server cert and key for kibana-internal] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:28
changed: [openshift] => {
"changed": true,
"cmd": [
"/usr/local/bin/oc",
"adm",
"--config=/tmp/openshift-logging-ansible-KrPjVl/admin.kubeconfig",
"ca",
"create-server-cert",
"--key=/etc/origin/logging/kibana-internal.key",
"--cert=/etc/origin/logging/kibana-internal.crt",
"--hostnames=kibana, kibana-ops, kibana.127.0.0.1.xip.io, kibana-ops.router.default.svc.cluster.local",
"--signer-cert=/etc/origin/logging/ca.crt",
"--signer-key=/etc/origin/logging/ca.key",
"--signer-serial=/etc/origin/logging/ca.serial.txt"
],
"delta": "0:00:00.297097",
"end": "2017-06-08 11:59:53.501379",
"rc": 0,
"start": "2017-06-08 11:59:53.204282"
}
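The --hostnames value above (rendered with spaces after the commas, exactly as the inventory variable expanded) becomes the certificate's subject alternative names. A short check sketch, assuming the cert path created by that task:
    # List the Subject Alternative Names baked into the kibana-internal server cert
    openssl x509 -in /etc/origin/logging/kibana-internal.crt -noout -text \
      | grep -A1 'Subject Alternative Name'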
TASK [openshift_logging : Copying server key for kibana-internal to generated certs directory] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:40
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Copying Server cert for kibana-internal to generated certs directory] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:50
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:48
skipping: [openshift] => (item={u'procure_component': u'mux', u'hostnames': u'logging-mux, mux.router.default.svc.cluster.local'}) => {
"cert_info": {
"hostnames": "logging-mux, mux.router.default.svc.cluster.local",
"procure_component": "mux"
},
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:56
skipping: [openshift] => (item={u'procure_component': u'mux'}) => {
"changed": false,
"shared_key_info": {
"procure_component": "mux"
},
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:63
skipping: [openshift] => (item={u'procure_component': u'es', u'hostnames': u'es, es.router.default.svc.cluster.local'}) => {
"cert_info": {
"hostnames": "es, es.router.default.svc.cluster.local",
"procure_component": "es"
},
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:71
skipping: [openshift] => (item={u'procure_component': u'es-ops', u'hostnames': u'es-ops, es-ops.router.default.svc.cluster.local'}) => {
"cert_info": {
"hostnames": "es-ops, es-ops.router.default.svc.cluster.local",
"procure_component": "es-ops"
},
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Copy proxy TLS configuration file] *******************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:81
changed: [openshift] => {
"changed": true,
"checksum": "36991681e03970736a99be9f084773521c44db06",
"dest": "/etc/origin/logging/server-tls.json",
"gid": 0,
"group": "root",
"md5sum": "2a954195add2b2fdde4ed09ff5c8e1c5",
"mode": "0644",
"owner": "root",
"secontext": "system_u:object_r:etc_t:s0",
"size": 321,
"src": "/root/.ansible/tmp/ansible-tmp-1496937593.95-210641088745208/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging : Copy proxy TLS configuration file] *******************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:86
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Checking for ca.db] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:91
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : copy] ************************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:96
changed: [openshift] => {
"changed": true,
"checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
"dest": "/etc/origin/logging/ca.db",
"gid": 0,
"group": "root",
"md5sum": "d41d8cd98f00b204e9800998ecf8427e",
"mode": "0644",
"owner": "root",
"secontext": "system_u:object_r:etc_t:s0",
"size": 0,
"src": "/root/.ansible/tmp/ansible-tmp-1496937594.31-170215528232455/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging : Checking for ca.crt.srl] *****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:101
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : copy] ************************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:106
changed: [openshift] => {
"changed": true,
"checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
"dest": "/etc/origin/logging/ca.crt.srl",
"gid": 0,
"group": "root",
"md5sum": "d41d8cd98f00b204e9800998ecf8427e",
"mode": "0644",
"owner": "root",
"secontext": "system_u:object_r:etc_t:s0",
"size": 0,
"src": "/root/.ansible/tmp/ansible-tmp-1496937594.64-112690480367514/source",
"state": "file",
"uid": 0
}
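The two copy tasks above seed empty bookkeeping files for the openssl CA: the checksums shown (da39a3ee... and d41d8cd98...) are the SHA-1 and MD5 of zero-byte content. A hand-run equivalent, assuming the same paths:
    # Equivalent manual seeding of the CA bookkeeping files used by `openssl ca`
    touch /etc/origin/logging/ca.db        # index of issued certificates
    touch /etc/origin/logging/ca.crt.srl   # serial-number state file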
TASK [openshift_logging : Generate PEM certs] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:111
included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml for openshift
included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml for openshift
included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml for openshift
included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml for openshift
TASK [openshift_logging : Checking for system.logging.fluentd.key] *************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:2
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Checking for system.logging.fluentd.crt] *************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:7
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Creating cert req for system.logging.fluentd] ********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Creating cert req for system.logging.fluentd] ********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:22
changed: [openshift] => {
"changed": true,
"cmd": [
"openssl",
"req",
"-out",
"/etc/origin/logging/system.logging.fluentd.csr",
"-new",
"-newkey",
"rsa:2048",
"-keyout",
"/etc/origin/logging/system.logging.fluentd.key",
"-subj",
"/CN=system.logging.fluentd/OU=OpenShift/O=Logging",
"-days",
"712",
"-nodes"
],
"delta": "0:00:00.259215",
"end": "2017-06-08 11:59:55.688838",
"rc": 0,
"start": "2017-06-08 11:59:55.429623"
}
STDERR:
Generating a 2048 bit RSA private key
...........................................................+++
...........+++
writing new private key to '/etc/origin/logging/system.logging.fluentd.key'
-----
TASK [openshift_logging : Sign cert request with CA for system.logging.fluentd] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:31
changed: [openshift] => {
"changed": true,
"cmd": [
"openssl",
"ca",
"-in",
"/etc/origin/logging/system.logging.fluentd.csr",
"-notext",
"-out",
"/etc/origin/logging/system.logging.fluentd.crt",
"-config",
"/etc/origin/logging/signing.conf",
"-extensions",
"v3_req",
"-batch",
"-extensions",
"server_ext"
],
"delta": "0:00:00.009966",
"end": "2017-06-08 11:59:55.821577",
"rc": 0,
"start": "2017-06-08 11:59:55.811611"
}
STDERR:
Using configuration from /etc/origin/logging/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
Serial Number: 2 (0x2)
Validity
Not Before: Jun 8 15:59:55 2017 GMT
Not After : Jun 8 15:59:55 2019 GMT
Subject:
organizationName = Logging
organizationalUnitName = OpenShift
commonName = system.logging.fluentd
X509v3 extensions:
X509v3 Key Usage: critical
Digital Signature, Key Encipherment
X509v3 Basic Constraints:
CA:FALSE
X509v3 Extended Key Usage:
TLS Web Server Authentication, TLS Web Client Authentication
X509v3 Subject Key Identifier:
42:39:F9:94:22:C5:AD:8D:D1:F9:CF:87:47:11:5B:6E:98:B1:04:07
X509v3 Authority Key Identifier:
0.
Certificate is to be certified until Jun 8 15:59:55 2019 GMT (730 days)
Write out database with 1 new entries
Data Base Updated
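The openssl req / openssl ca pair above generated and signed the fluentd client certificate. A minimal verification sketch, assuming the paths from those commands (not output from this run):
    # Confirm the signed client cert chains back to the logging signer CA
    openssl verify -CAfile /etc/origin/logging/ca.crt \
      /etc/origin/logging/system.logging.fluentd.crt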
TASK [openshift_logging : Checking for system.logging.kibana.key] **************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:2
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Checking for system.logging.kibana.crt] **************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:7
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Creating cert req for system.logging.kibana] *********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Creating cert req for system.logging.kibana] *********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:22
changed: [openshift] => {
"changed": true,
"cmd": [
"openssl",
"req",
"-out",
"/etc/origin/logging/system.logging.kibana.csr",
"-new",
"-newkey",
"rsa:2048",
"-keyout",
"/etc/origin/logging/system.logging.kibana.key",
"-subj",
"/CN=system.logging.kibana/OU=OpenShift/O=Logging",
"-days",
"712",
"-nodes"
],
"delta": "0:00:00.089781",
"end": "2017-06-08 11:59:56.288993",
"rc": 0,
"start": "2017-06-08 11:59:56.199212"
}
STDERR:
Generating a 2048 bit RSA private key
..........................................+++
..................+++
writing new private key to '/etc/origin/logging/system.logging.kibana.key'
-----
TASK [openshift_logging : Sign cert request with CA for system.logging.kibana] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:31
changed: [openshift] => {
"changed": true,
"cmd": [
"openssl",
"ca",
"-in",
"/etc/origin/logging/system.logging.kibana.csr",
"-notext",
"-out",
"/etc/origin/logging/system.logging.kibana.crt",
"-config",
"/etc/origin/logging/signing.conf",
"-extensions",
"v3_req",
"-batch",
"-extensions",
"server_ext"
],
"delta": "0:00:00.007585",
"end": "2017-06-08 11:59:56.415873",
"rc": 0,
"start": "2017-06-08 11:59:56.408288"
}
STDERR:
Using configuration from /etc/origin/logging/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
Serial Number: 3 (0x3)
Validity
Not Before: Jun 8 15:59:56 2017 GMT
Not After : Jun 8 15:59:56 2019 GMT
Subject:
organizationName = Logging
organizationalUnitName = OpenShift
commonName = system.logging.kibana
X509v3 extensions:
X509v3 Key Usage: critical
Digital Signature, Key Encipherment
X509v3 Basic Constraints:
CA:FALSE
X509v3 Extended Key Usage:
TLS Web Server Authentication, TLS Web Client Authentication
X509v3 Subject Key Identifier:
61:46:5D:FC:8F:37:47:30:8C:6B:3A:3D:14:B6:CA:80:99:DD:6D:45
X509v3 Authority Key Identifier:
0.
Certificate is to be certified until Jun 8 15:59:56 2019 GMT (730 days)
Write out database with 1 new entries
Data Base Updated
TASK [openshift_logging : Checking for system.logging.curator.key] *************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:2
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Checking for system.logging.curator.crt] *************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:7
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Creating cert req for system.logging.curator] ********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Creating cert req for system.logging.curator] ********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:22
changed: [openshift] => {
"changed": true,
"cmd": [
"openssl",
"req",
"-out",
"/etc/origin/logging/system.logging.curator.csr",
"-new",
"-newkey",
"rsa:2048",
"-keyout",
"/etc/origin/logging/system.logging.curator.key",
"-subj",
"/CN=system.logging.curator/OU=OpenShift/O=Logging",
"-days",
"712",
"-nodes"
],
"delta": "0:00:00.057296",
"end": "2017-06-08 11:59:56.866563",
"rc": 0,
"start": "2017-06-08 11:59:56.809267"
}
STDERR:
Generating a 2048 bit RSA private key
...........+++
.........................+++
writing new private key to '/etc/origin/logging/system.logging.curator.key'
-----
TASK [openshift_logging : Sign cert request with CA for system.logging.curator] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:31
changed: [openshift] => {
"changed": true,
"cmd": [
"openssl",
"ca",
"-in",
"/etc/origin/logging/system.logging.curator.csr",
"-notext",
"-out",
"/etc/origin/logging/system.logging.curator.crt",
"-config",
"/etc/origin/logging/signing.conf",
"-extensions",
"v3_req",
"-batch",
"-extensions",
"server_ext"
],
"delta": "0:00:00.007582",
"end": "2017-06-08 11:59:56.994991",
"rc": 0,
"start": "2017-06-08 11:59:56.987409"
}
STDERR:
Using configuration from /etc/origin/logging/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
Serial Number: 4 (0x4)
Validity
Not Before: Jun 8 15:59:56 2017 GMT
Not After : Jun 8 15:59:56 2019 GMT
Subject:
organizationName = Logging
organizationalUnitName = OpenShift
commonName = system.logging.curator
X509v3 extensions:
X509v3 Key Usage: critical
Digital Signature, Key Encipherment
X509v3 Basic Constraints:
CA:FALSE
X509v3 Extended Key Usage:
TLS Web Server Authentication, TLS Web Client Authentication
X509v3 Subject Key Identifier:
48:F9:32:2E:80:FB:05:29:35:02:2B:6E:57:70:41:76:53:C0:48:B7
X509v3 Authority Key Identifier:
0.
Certificate is to be certified until Jun 8 15:59:56 2019 GMT (730 days)
Write out database with 1 new entries
Data Base Updated
TASK [openshift_logging : Checking for system.admin.key] ***********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:2
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Checking for system.admin.crt] ***********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:7
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Creating cert req for system.admin] ******************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Creating cert req for system.admin] ******************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:22
changed: [openshift] => {
"changed": true,
"cmd": [
"openssl",
"req",
"-out",
"/etc/origin/logging/system.admin.csr",
"-new",
"-newkey",
"rsa:2048",
"-keyout",
"/etc/origin/logging/system.admin.key",
"-subj",
"/CN=system.admin/OU=OpenShift/O=Logging",
"-days",
"712",
"-nodes"
],
"delta": "0:00:00.032556",
"end": "2017-06-08 11:59:57.406777",
"rc": 0,
"start": "2017-06-08 11:59:57.374221"
}
STDERR:
Generating a 2048 bit RSA private key
......+++
...........+++
writing new private key to '/etc/origin/logging/system.admin.key'
-----
TASK [openshift_logging : Sign cert request with CA for system.admin] **********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:31
changed: [openshift] => {
"changed": true,
"cmd": [
"openssl",
"ca",
"-in",
"/etc/origin/logging/system.admin.csr",
"-notext",
"-out",
"/etc/origin/logging/system.admin.crt",
"-config",
"/etc/origin/logging/signing.conf",
"-extensions",
"v3_req",
"-batch",
"-extensions",
"server_ext"
],
"delta": "0:00:00.007482",
"end": "2017-06-08 11:59:57.533566",
"rc": 0,
"start": "2017-06-08 11:59:57.526084"
}
STDERR:
Using configuration from /etc/origin/logging/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
Serial Number: 5 (0x5)
Validity
Not Before: Jun 8 15:59:57 2017 GMT
Not After : Jun 8 15:59:57 2019 GMT
Subject:
organizationName = Logging
organizationalUnitName = OpenShift
commonName = system.admin
X509v3 extensions:
X509v3 Key Usage: critical
Digital Signature, Key Encipherment
X509v3 Basic Constraints:
CA:FALSE
X509v3 Extended Key Usage:
TLS Web Server Authentication, TLS Web Client Authentication
X509v3 Subject Key Identifier:
B0:C6:1B:D6:A5:28:D3:7B:21:04:E9:CC:2E:93:3F:1F:CB:4A:C9:22
X509v3 Authority Key Identifier:
0.
Certificate is to be certified until Jun 8 15:59:57 2019 GMT (730 days)
Write out database with 1 new entries
Data Base Updated
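The system.admin key/cert pair signed above is the client credential used for privileged calls to the logging Elasticsearch cluster. A hedged sketch of the kind of mutual-TLS request it enables; the hostname and port are assumptions based on the logging-es service name and Elasticsearch's conventional 9200, not something shown in this log:
    # Example mutual-TLS call with the admin client cert (endpoint assumed)
    curl --cacert /etc/origin/logging/ca.crt \
         --cert   /etc/origin/logging/system.admin.crt \
         --key    /etc/origin/logging/system.admin.key \
         https://logging-es:9200/_cat/health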
TASK [openshift_logging : Generate PEM cert for mux] ***************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:121
skipping: [openshift] => (item=system.logging.mux) => {
"changed": false,
"node_name": "system.logging.mux",
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Generate PEM cert for Elasticsearch external route] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:129
skipping: [openshift] => (item=system.logging.es) => {
"changed": false,
"node_name": "system.logging.es",
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Creating necessary JKS certs] ************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:137
included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml for openshift
TASK [openshift_logging : Checking for elasticsearch.jks] **********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:3
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Checking for logging-es.jks] *************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:8
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Checking for system.admin.jks] ***********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:13
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Checking for truststore.jks] *************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:18
ok: [openshift] => {
"changed": false,
"stat": {
"exists": false
}
}
TASK [openshift_logging : Create placeholder for previously created JKS certs to prevent recreating...] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:23
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Create placeholder for previously created JKS certs to prevent recreating...] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:28
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Create placeholder for previously created JKS certs to prevent recreating...] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:33
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Create placeholder for previously created JKS certs to prevent recreating...] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:38
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : pulling down signing items from host] ****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:43
changed: [openshift] => (item=ca.crt) => {
"changed": true,
"checksum": "408ca533c4ddee73cdcdea3694942d9fa1f4023c",
"dest": "/tmp/openshift-logging-ansible-BE0YYi/ca.crt",
"item": "ca.crt",
"md5sum": "08e2da8a513b9a8b579e36a99d1b0731",
"remote_checksum": "408ca533c4ddee73cdcdea3694942d9fa1f4023c",
"remote_md5sum": null
}
changed: [openshift] => (item=ca.key) => {
"changed": true,
"checksum": "90d789d9455d5e74b9b0a59f3b697d4e18f72fe4",
"dest": "/tmp/openshift-logging-ansible-BE0YYi/ca.key",
"item": "ca.key",
"md5sum": "3cead80beb0dda3b4d1c43607797835d",
"remote_checksum": "90d789d9455d5e74b9b0a59f3b697d4e18f72fe4",
"remote_md5sum": null
}
changed: [openshift] => (item=ca.serial.txt) => {
"changed": true,
"checksum": "b649682b92a811746098e5c91e891e5142a41950",
"dest": "/tmp/openshift-logging-ansible-BE0YYi/ca.serial.txt",
"item": "ca.serial.txt",
"md5sum": "76b01ce73ac53fdac1c67d27ac040473",
"remote_checksum": "b649682b92a811746098e5c91e891e5142a41950",
"remote_md5sum": null
}
ok: [openshift] => (item=ca.crl.srl) => {
"changed": false,
"file": "/etc/origin/logging/ca.crl.srl",
"item": "ca.crl.srl"
}
MSG:
the remote file does not exist, not transferring, ignored
changed: [openshift] => (item=ca.db) => {
"changed": true,
"checksum": "c462d41bbb8a53918c077d56ecbb924ff1c9c982",
"dest": "/tmp/openshift-logging-ansible-BE0YYi/ca.db",
"item": "ca.db",
"md5sum": "95d9e9cad6562976cca128442dbd9ce1",
"remote_checksum": "c462d41bbb8a53918c077d56ecbb924ff1c9c982",
"remote_md5sum": null
}
TASK [openshift_logging : template] ********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:56
changed: [openshift -> 127.0.0.1] => {
"changed": true,
"checksum": "ff2f0e5383f01a141f8b96704a4b206a86ef76c1",
"dest": "/tmp/openshift-logging-ansible-BE0YYi/signing.conf",
"gid": 0,
"group": "root",
"md5sum": "9ecfaa5d12c371657e3a6b42e508bde1",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 4281,
"src": "/root/.ansible/tmp/ansible-tmp-1496937598.99-277628297843340/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging : Run JKS generation script] ***************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:61
changed: [openshift -> 127.0.0.1] => {
"changed": true,
"rc": 0
}
STDOUT:
Generating keystore and certificate for node system.admin
Generating certificate signing request for node system.admin
Sign certificate request with CA
Import back to keystore (including CA chain)
All done for system.admin
Generating keystore and certificate for node elasticsearch
Generating certificate signing request for node elasticsearch
Sign certificate request with CA
Import back to keystore (including CA chain)
All done for elasticsearch
Generating keystore and certificate for node logging-es
Generating certificate signing request for node logging-es
Sign certificate request with CA
Import back to keystore (including CA chain)
All done for logging-es
Import CA to truststore for validating client certs
STDERR:
+ '[' 2 -lt 1 ']'
+ dir=/tmp/openshift-logging-ansible-BE0YYi
+ SCRATCH_DIR=/tmp/openshift-logging-ansible-BE0YYi
+ PROJECT=logging
+ [[ ! -f /tmp/openshift-logging-ansible-BE0YYi/system.admin.jks ]]
+ generate_JKS_client_cert system.admin
+ NODE_NAME=system.admin
+ ks_pass=kspass
+ ts_pass=tspass
+ dir=/tmp/openshift-logging-ansible-BE0YYi
+ echo Generating keystore and certificate for node system.admin
+ keytool -genkey -alias system.admin -keystore /tmp/openshift-logging-ansible-BE0YYi/system.admin.jks -keyalg RSA -keysize 2048 -validity 712 -keypass kspass -storepass kspass -dname 'CN=system.admin, OU=OpenShift, O=Logging'
+ echo Generating certificate signing request for node system.admin
+ keytool -certreq -alias system.admin -keystore /tmp/openshift-logging-ansible-BE0YYi/system.admin.jks -file /tmp/openshift-logging-ansible-BE0YYi/system.admin.jks.csr -keyalg rsa -keypass kspass -storepass kspass -dname 'CN=system.admin, OU=OpenShift, O=Logging'
+ echo Sign certificate request with CA
+ openssl ca -in /tmp/openshift-logging-ansible-BE0YYi/system.admin.jks.csr -notext -out /tmp/openshift-logging-ansible-BE0YYi/system.admin.jks.crt -config /tmp/openshift-logging-ansible-BE0YYi/signing.conf -extensions v3_req -batch -extensions server_ext
Using configuration from /tmp/openshift-logging-ansible-BE0YYi/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
Serial Number: 6 (0x6)
Validity
Not Before: Jun 8 16:00:21 2017 GMT
Not After : Jun 8 16:00:21 2019 GMT
Subject:
organizationName = Logging
organizationalUnitName = OpenShift
commonName = system.admin
X509v3 extensions:
X509v3 Key Usage: critical
Digital Signature, Key Encipherment
X509v3 Basic Constraints:
CA:FALSE
X509v3 Extended Key Usage:
TLS Web Server Authentication, TLS Web Client Authentication
X509v3 Subject Key Identifier:
AE:A3:70:E2:AE:E6:AE:A2:ED:24:C5:1F:FB:9F:CC:49:80:7A:CB:9D
X509v3 Authority Key Identifier:
0.
Certificate is to be certified until Jun 8 16:00:21 2019 GMT (730 days)
Write out database with 1 new entries
Data Base Updated
+ echo 'Import back to keystore (including CA chain)'
+ keytool -import -file /tmp/openshift-logging-ansible-BE0YYi/ca.crt -keystore /tmp/openshift-logging-ansible-BE0YYi/system.admin.jks -storepass kspass -noprompt -alias sig-ca
Certificate was added to keystore
+ keytool -import -file /tmp/openshift-logging-ansible-BE0YYi/system.admin.jks.crt -keystore /tmp/openshift-logging-ansible-BE0YYi/system.admin.jks -storepass kspass -noprompt -alias system.admin
Certificate reply was installed in keystore
+ echo All done for system.admin
+ [[ ! -f /tmp/openshift-logging-ansible-BE0YYi/elasticsearch.jks ]]
++ join , logging-es logging-es-ops
++ local IFS=,
++ shift
++ echo logging-es,logging-es-ops
+ generate_JKS_chain true elasticsearch logging-es,logging-es-ops
+ dir=/tmp/openshift-logging-ansible-BE0YYi
+ ADD_OID=true
+ NODE_NAME=elasticsearch
+ CERT_NAMES=logging-es,logging-es-ops
+ ks_pass=kspass
+ ts_pass=tspass
+ rm -rf elasticsearch
+ extension_names=
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es-ops
+ '[' true = true ']'
+ extension_names=,dns:logging-es,dns:logging-es-ops,oid:1.2.3.4.5.5
+ echo Generating keystore and certificate for node elasticsearch
+ keytool -genkey -alias elasticsearch -keystore /tmp/openshift-logging-ansible-BE0YYi/elasticsearch.jks -keypass kspass -storepass kspass -keyalg RSA -keysize 2048 -validity 712 -dname 'CN=elasticsearch, OU=OpenShift, O=Logging' -ext san=dns:localhost,ip:127.0.0.1,dns:logging-es,dns:logging-es-ops,oid:1.2.3.4.5.5
+ echo Generating certificate signing request for node elasticsearch
+ keytool -certreq -alias elasticsearch -keystore /tmp/openshift-logging-ansible-BE0YYi/elasticsearch.jks -storepass kspass -file /tmp/openshift-logging-ansible-BE0YYi/elasticsearch.csr -keyalg rsa -dname 'CN=elasticsearch, OU=OpenShift, O=Logging' -ext san=dns:localhost,ip:127.0.0.1,dns:logging-es,dns:logging-es-ops,oid:1.2.3.4.5.5
+ echo Sign certificate request with CA
+ openssl ca -in /tmp/openshift-logging-ansible-BE0YYi/elasticsearch.csr -notext -out /tmp/openshift-logging-ansible-BE0YYi/elasticsearch.crt -config /tmp/openshift-logging-ansible-BE0YYi/signing.conf -extensions v3_req -batch -extensions server_ext
Using configuration from /tmp/openshift-logging-ansible-BE0YYi/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
Serial Number: 7 (0x7)
Validity
Not Before: Jun 8 16:00:23 2017 GMT
Not After : Jun 8 16:00:23 2019 GMT
Subject:
organizationName = Logging
organizationalUnitName = OpenShift
commonName = elasticsearch
X509v3 extensions:
X509v3 Key Usage: critical
Digital Signature, Key Encipherment
X509v3 Basic Constraints:
CA:FALSE
X509v3 Extended Key Usage:
TLS Web Server Authentication, TLS Web Client Authentication
X509v3 Subject Key Identifier:
C9:78:D8:A6:73:BE:F3:98:20:99:D7:D1:6C:B7:5E:27:6D:40:02:26
X509v3 Authority Key Identifier:
0.
X509v3 Subject Alternative Name:
DNS:localhost, IP Address:127.0.0.1, DNS:logging-es, DNS:logging-es-ops, Registered ID:1.2.3.4.5.5
Certificate is to be certified until Jun 8 16:00:23 2019 GMT (730 days)
Write out database with 1 new entries
Data Base Updated
+ echo 'Import back to keystore (including CA chain)'
+ keytool -import -file /tmp/openshift-logging-ansible-BE0YYi/ca.crt -keystore /tmp/openshift-logging-ansible-BE0YYi/elasticsearch.jks -storepass kspass -noprompt -alias sig-ca
Certificate was added to keystore
+ keytool -import -file /tmp/openshift-logging-ansible-BE0YYi/elasticsearch.crt -keystore /tmp/openshift-logging-ansible-BE0YYi/elasticsearch.jks -storepass kspass -noprompt -alias elasticsearch
Certificate reply was installed in keystore
+ echo All done for elasticsearch
+ [[ ! -f /tmp/openshift-logging-ansible-BE0YYi/logging-es.jks ]]
++ join , logging-es logging-es.logging.svc.cluster.local logging-es-cluster logging-es-cluster.logging.svc.cluster.local logging-es-ops logging-es-ops.logging.svc.cluster.local logging-es-ops-cluster logging-es-ops-cluster.logging.svc.cluster.local
++ local IFS=,
++ shift
++ echo logging-es,logging-es.logging.svc.cluster.local,logging-es-cluster,logging-es-cluster.logging.svc.cluster.local,logging-es-ops,logging-es-ops.logging.svc.cluster.local,logging-es-ops-cluster,logging-es-ops-cluster.logging.svc.cluster.local
+ generate_JKS_chain false logging-es logging-es,logging-es.logging.svc.cluster.local,logging-es-cluster,logging-es-cluster.logging.svc.cluster.local,logging-es-ops,logging-es-ops.logging.svc.cluster.local,logging-es-ops-cluster,logging-es-ops-cluster.logging.svc.cluster.local
+ dir=/tmp/openshift-logging-ansible-BE0YYi
+ ADD_OID=false
+ NODE_NAME=logging-es
+ CERT_NAMES=logging-es,logging-es.logging.svc.cluster.local,logging-es-cluster,logging-es-cluster.logging.svc.cluster.local,logging-es-ops,logging-es-ops.logging.svc.cluster.local,logging-es-ops-cluster,logging-es-ops-cluster.logging.svc.cluster.local
+ ks_pass=kspass
+ ts_pass=tspass
+ rm -rf logging-es
+ extension_names=
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local,dns:logging-es-ops
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local,dns:logging-es-ops,dns:logging-es-ops.logging.svc.cluster.local
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local,dns:logging-es-ops,dns:logging-es-ops.logging.svc.cluster.local,dns:logging-es-ops-cluster
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local,dns:logging-es-ops,dns:logging-es-ops.logging.svc.cluster.local,dns:logging-es-ops-cluster,dns:logging-es-ops-cluster.logging.svc.cluster.local
+ '[' false = true ']'
+ echo Generating keystore and certificate for node logging-es
+ keytool -genkey -alias logging-es -keystore /tmp/openshift-logging-ansible-BE0YYi/logging-es.jks -keypass kspass -storepass kspass -keyalg RSA -keysize 2048 -validity 712 -dname 'CN=logging-es, OU=OpenShift, O=Logging' -ext san=dns:localhost,ip:127.0.0.1,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local,dns:logging-es-ops,dns:logging-es-ops.logging.svc.cluster.local,dns:logging-es-ops-cluster,dns:logging-es-ops-cluster.logging.svc.cluster.local
+ echo Generating certificate signing request for node logging-es
+ keytool -certreq -alias logging-es -keystore /tmp/openshift-logging-ansible-BE0YYi/logging-es.jks -storepass kspass -file /tmp/openshift-logging-ansible-BE0YYi/logging-es.csr -keyalg rsa -dname 'CN=logging-es, OU=OpenShift, O=Logging' -ext san=dns:localhost,ip:127.0.0.1,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local,dns:logging-es-ops,dns:logging-es-ops.logging.svc.cluster.local,dns:logging-es-ops-cluster,dns:logging-es-ops-cluster.logging.svc.cluster.local
+ echo Sign certificate request with CA
+ openssl ca -in /tmp/openshift-logging-ansible-BE0YYi/logging-es.csr -notext -out /tmp/openshift-logging-ansible-BE0YYi/logging-es.crt -config /tmp/openshift-logging-ansible-BE0YYi/signing.conf -extensions v3_req -batch -extensions server_ext
Using configuration from /tmp/openshift-logging-ansible-BE0YYi/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
Serial Number: 8 (0x8)
Validity
Not Before: Jun 8 16:00:24 2017 GMT
Not After : Jun 8 16:00:24 2019 GMT
Subject:
organizationName = Logging
organizationalUnitName = OpenShift
commonName = logging-es
X509v3 extensions:
X509v3 Key Usage: critical
Digital Signature, Key Encipherment
X509v3 Basic Constraints:
CA:FALSE
X509v3 Extended Key Usage:
TLS Web Server Authentication, TLS Web Client Authentication
X509v3 Subject Key Identifier:
0E:BF:C1:CD:01:77:93:9A:F5:D2:39:6A:E4:D3:F0:1B:8B:41:6E:1E
X509v3 Authority Key Identifier:
0.
X509v3 Subject Alternative Name:
DNS:localhost, IP Address:127.0.0.1, DNS:logging-es, DNS:logging-es.logging.svc.cluster.local, DNS:logging-es-cluster, DNS:logging-es-cluster.logging.svc.cluster.local, DNS:logging-es-ops, DNS:logging-es-ops.logging.svc.cluster.local, DNS:logging-es-ops-cluster, DNS:logging-es-ops-cluster.logging.svc.cluster.local
Certificate is to be certified until Jun 8 16:00:24 2019 GMT (730 days)
Write out database with 1 new entries
Data Base Updated
+ echo 'Import back to keystore (including CA chain)'
+ keytool -import -file /tmp/openshift-logging-ansible-BE0YYi/ca.crt -keystore /tmp/openshift-logging-ansible-BE0YYi/logging-es.jks -storepass kspass -noprompt -alias sig-ca
Certificate was added to keystore
+ keytool -import -file /tmp/openshift-logging-ansible-BE0YYi/logging-es.crt -keystore /tmp/openshift-logging-ansible-BE0YYi/logging-es.jks -storepass kspass -noprompt -alias logging-es
Certificate reply was installed in keystore
+ echo All done for logging-es
+ '[' '!' -f /tmp/openshift-logging-ansible-BE0YYi/truststore.jks ']'
+ createTruststore
+ echo 'Import CA to truststore for validating client certs'
+ keytool -import -file /tmp/openshift-logging-ansible-BE0YYi/ca.crt -keystore /tmp/openshift-logging-ansible-BE0YYi/truststore.jks -storepass tspass -noprompt -alias sig-ca
Certificate was added to keystore
+ exit 0
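The trace above completes generate_JKS_chain: a key pair is created in each node keystore, a CSR is exported, signed by the local CA via openssl ca, and the CA plus the signed certificate are imported back under the sig-ca and node aliases. If the SAN list ever needs checking by hand, keytool can print it from the generated keystore; a minimal sketch reusing the path and store password from the trace above (the temp directory name is random per run):

  # print the signed node certificate; the SAN entries appear under "Extensions"
  keytool -list -v \
    -keystore /tmp/openshift-logging-ansible-BE0YYi/logging-es.jks \
    -storepass kspass -alias logging-es | grep -A12 SubjectAlternativeName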
TASK [openshift_logging : Pushing locally generated JKS certs to remote host...] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:66
changed: [openshift] => {
"changed": true,
"checksum": "12ad3bc24c0505614cf37f35d113e7ae588cdd47",
"dest": "/etc/origin/logging/elasticsearch.jks",
"gid": 0,
"group": "root",
"md5sum": "524522b36c261199897a59f654147f1c",
"mode": "0644",
"owner": "root",
"secontext": "system_u:object_r:etc_t:s0",
"size": 3767,
"src": "/root/.ansible/tmp/ansible-tmp-1496937625.06-182569862996872/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging : Pushing locally generated JKS certs to remote host...] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:72
changed: [openshift] => {
"changed": true,
"checksum": "502d38c4ba4f34cc8b20216f66bfcfd72f001199",
"dest": "/etc/origin/logging/logging-es.jks",
"gid": 0,
"group": "root",
"md5sum": "26e373f81bb36607e90b54ee66dbf221",
"mode": "0644",
"owner": "root",
"secontext": "system_u:object_r:etc_t:s0",
"size": 3983,
"src": "/root/.ansible/tmp/ansible-tmp-1496937625.29-243263064809690/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging : Pushing locally generated JKS certs to remote host...] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:78
changed: [openshift] => {
"changed": true,
"checksum": "74b7e3209448ef49a8c721fb6ce86b4cbc743b35",
"dest": "/etc/origin/logging/system.admin.jks",
"gid": 0,
"group": "root",
"md5sum": "7ced55ae688e4db6a6c16fabcd152501",
"mode": "0644",
"owner": "root",
"secontext": "system_u:object_r:etc_t:s0",
"size": 3701,
"src": "/root/.ansible/tmp/ansible-tmp-1496937625.51-254001572636126/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging : Pushing locally generated JKS certs to remote host...] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:84
changed: [openshift] => {
"changed": true,
"checksum": "5752a16d649628ad1b5680ea79e0711ebda31a22",
"dest": "/etc/origin/logging/truststore.jks",
"gid": 0,
"group": "root",
"md5sum": "8222e3d8fe006ed1855128bc73fd18be",
"mode": "0644",
"owner": "root",
"secontext": "system_u:object_r:etc_t:s0",
"size": 797,
"src": "/root/.ansible/tmp/ansible-tmp-1496937625.74-103763546674424/source",
"state": "file",
"uid": 0
}
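The four locally generated keystores now sit under /etc/origin/logging on the target host. Ansible's "checksum" field is the SHA-1 of the copied file, so the transfer can be spot-checked on the remote host against the values reported above; a minimal sketch:

  # on the target host: compare against the checksums reported by the copy tasks
  sha1sum /etc/origin/logging/elasticsearch.jks   # expect 12ad3bc24c0505614cf37f35d113e7ae588cdd47
  sha1sum /etc/origin/logging/truststore.jks      # expect 5752a16d649628ad1b5680ea79e0711ebda31a22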
TASK [openshift_logging : Generate proxy session] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:141
ok: [openshift] => {
"ansible_facts": {
"session_secret": "LgrO9p3nbn5KHZdE1531nGe5MJSFIsc2HC68y2dsK83jBz1YmH47Y9SWz88V58vccrFSK5yTeWA8pnT8WEdij4y4LZ1c2Tu0zOkTkrBYtPDXQkZ2Uezmbtp78pikLOjBWqiubuy8a6VOHYufpNeJ00a4gxL8GVYCqL8tUCyQmjSCtrirJMYpOEUTkzn69tUCjNjTdJEh"
},
"changed": false
}
TASK [openshift_logging : Generate oauth client secret] ************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:146
ok: [openshift] => {
"ansible_facts": {
"oauth_secret": "An6ifjugXdZohKXfYwG3QyRdTC3ek5i6nSWY2JZ3m2qyZdXp3DCkhsvkZ2gFJ9a5"
},
"changed": false
}
TASK [openshift_logging : set_fact] ********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:53
TASK [openshift_logging : set_fact] ********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:57
ok: [openshift] => {
"ansible_facts": {
"es_indices": "[]"
},
"changed": false
}
TASK [openshift_logging : set_fact] ********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:60
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:64
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:85
statically included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml
TASK [openshift_logging_elasticsearch : Validate Elasticsearch cluster size] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:2
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Validate Elasticsearch Ops cluster size] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:6
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : fail] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:10
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:14
ok: [openshift] => {
"ansible_facts": {
"elasticsearch_name": "logging-elasticsearch",
"es_component": "es"
},
"changed": false
}
TASK [openshift_logging_elasticsearch : fail] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:3
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:7
ok: [openshift] => {
"ansible_facts": {
"es_version": "3_5"
},
"changed": false
}
TASK [openshift_logging_elasticsearch : debug] *********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:11
ok: [openshift] => {
"changed": false,
"openshift_logging_image_version": "latest"
}
TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:14
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : fail] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:17
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Create temp directory for doing work in] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:21
ok: [openshift] => {
"changed": false,
"cmd": [
"mktemp",
"-d",
"/tmp/openshift-logging-ansible-XXXXXX"
],
"delta": "0:00:00.002103",
"end": "2017-06-08 12:00:26.668680",
"rc": 0,
"start": "2017-06-08 12:00:26.666577"
}
STDOUT:
/tmp/openshift-logging-ansible-8OdbB5
TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:26
ok: [openshift] => {
"ansible_facts": {
"tempdir": "/tmp/openshift-logging-ansible-8OdbB5"
},
"changed": false
}
TASK [openshift_logging_elasticsearch : Create templates subdirectory] *********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:30
ok: [openshift] => {
"changed": false,
"gid": 0,
"group": "root",
"mode": "0755",
"owner": "root",
"path": "/tmp/openshift-logging-ansible-8OdbB5/templates",
"secontext": "unconfined_u:object_r:user_tmp_t:s0",
"size": 6,
"state": "directory",
"uid": 0
}
TASK [openshift_logging_elasticsearch : Create ES service account] *************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:40
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Create ES service account] *************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:48
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get sa aggregated-logging-elasticsearch -o json -n logging",
"results": [
{
"apiVersion": "v1",
"imagePullSecrets": [
{
"name": "aggregated-logging-elasticsearch-dockercfg-fmdws"
}
],
"kind": "ServiceAccount",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:27Z",
"name": "aggregated-logging-elasticsearch",
"namespace": "logging",
"resourceVersion": "1438",
"selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-elasticsearch",
"uid": "96eae493-4c63-11e7-94aa-0e1649350dc2"
},
"secrets": [
{
"name": "aggregated-logging-elasticsearch-token-p6q9b"
},
{
"name": "aggregated-logging-elasticsearch-dockercfg-fmdws"
}
]
}
],
"returncode": 0
},
"state": "present"
}
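The task queries the service account with oc get and only creates it when missing, which keeps the run idempotent (the second pass for the ops cluster further below reports "ok" instead of "changed"). A hypothetical manual equivalent, shown only for orientation, would be something like:

  # get-or-create pattern used by the role, done by hand
  oc get sa aggregated-logging-elasticsearch -n logging -o json \
    || oc create serviceaccount aggregated-logging-elasticsearch -n logging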
TASK [openshift_logging_elasticsearch : copy] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:57
changed: [openshift] => {
"changed": true,
"checksum": "e5015364391ac609da8655a9a1224131599a5cea",
"dest": "/tmp/openshift-logging-ansible-8OdbB5/rolebinding-reader.yml",
"gid": 0,
"group": "root",
"md5sum": "446fb96447527f48f97e69bb41bad7be",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 135,
"src": "/root/.ansible/tmp/ansible-tmp-1496937627.96-15090371375507/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_elasticsearch : Create rolebinding-reader role] ********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:61
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get clusterrole rolebinding-reader -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "ClusterRole",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:28Z",
"name": "rolebinding-reader",
"resourceVersion": "122",
"selfLink": "/oapi/v1/clusterroles/rolebinding-reader",
"uid": "979e7514-4c63-11e7-94aa-0e1649350dc2"
},
"rules": [
{
"apiGroups": [
""
],
"attributeRestrictions": null,
"resources": [
"clusterrolebindings"
],
"verbs": [
"get"
]
}
]
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_elasticsearch : Set rolebinding-reader permissions for ES] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:72
changed: [openshift] => {
"changed": true,
"present": "present",
"results": {
"cmd": "/bin/oc adm policy add-cluster-role-to-user rolebinding-reader system:serviceaccount:logging:aggregated-logging-elasticsearch -n logging",
"results": "",
"returncode": 0
}
}
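rolebinding-reader is a minimal ClusterRole that only allows "get" on clusterrolebindings, and the add-cluster-role-to-user call above grants it to the aggregated-logging-elasticsearch service account. The result can be checked afterwards; a sketch, assuming the same oc client:

  # confirm the role definition and that a binding for it exists
  oc get clusterrole rolebinding-reader -o yaml
  oc get clusterrolebindings | grep rolebinding-reader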
TASK [openshift_logging_elasticsearch : Generate logging-elasticsearch-view-role] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:81
ok: [openshift] => {
"changed": false,
"checksum": "d752c09323565f80ed14fa806d42284f0c5aef2a",
"dest": "/tmp/openshift-logging-ansible-8OdbB5/logging-elasticsearch-view-role.yaml",
"gid": 0,
"group": "root",
"md5sum": "8299dca2fb036c06ba7c4f620680e0f6",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 183,
"src": "/root/.ansible/tmp/ansible-tmp-1496937629.71-47747142833431/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_elasticsearch : Set logging-elasticsearch-view-role role] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:94
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get rolebinding logging-elasticsearch-view-role -o json -n logging",
"results": [
{
"apiVersion": "v1",
"groupNames": null,
"kind": "RoleBinding",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:30Z",
"name": "logging-elasticsearch-view-role",
"namespace": "logging",
"resourceVersion": "880",
"selfLink": "/oapi/v1/namespaces/logging/rolebindings/logging-elasticsearch-view-role",
"uid": "98a6437b-4c63-11e7-94aa-0e1649350dc2"
},
"roleRef": {
"name": "view"
},
"subjects": [
{
"kind": "ServiceAccount",
"name": "aggregated-logging-elasticsearch",
"namespace": "logging"
}
],
"userNames": [
"system:serviceaccount:logging:aggregated-logging-elasticsearch"
]
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_elasticsearch : template] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:105
ok: [openshift] => {
"changed": false,
"checksum": "f91458d5dad42c496e2081ef872777a6f6eb9ff9",
"dest": "/tmp/openshift-logging-ansible-8OdbB5/elasticsearch-logging.yml",
"gid": 0,
"group": "root",
"md5sum": "e4be7c33c1927bbdd8c909bfbe3d9f0b",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 2171,
"src": "/root/.ansible/tmp/ansible-tmp-1496937630.73-37111401919690/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_elasticsearch : template] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:111
ok: [openshift] => {
"changed": false,
"checksum": "6d4f976f6e77a6e0c8dca7e01fb5bedb68678b1d",
"dest": "/tmp/openshift-logging-ansible-8OdbB5/elasticsearch.yml",
"gid": 0,
"group": "root",
"md5sum": "75abfd3a190832e593a8e5e7c5695e8e",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 2454,
"src": "/root/.ansible/tmp/ansible-tmp-1496937630.97-208896343508904/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_elasticsearch : copy] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:121
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : copy] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:127
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Set ES configmap] **********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:133
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get configmap logging-elasticsearch -o json -n logging",
"results": [
{
"apiVersion": "v1",
"data": {
"elasticsearch.yml": "cluster:\n name: ${CLUSTER_NAME}\n\nscript:\n inline: on\n indexed: on\n\nindex:\n number_of_shards: 1\n number_of_replicas: 0\n unassigned.node_left.delayed_timeout: 2m\n translog:\n flush_threshold_size: 256mb\n flush_threshold_period: 5m\n\nnode:\n master: ${IS_MASTER}\n data: ${HAS_DATA}\n\nnetwork:\n host: 0.0.0.0\n\ncloud:\n kubernetes:\n service: ${SERVICE_DNS}\n namespace: ${NAMESPACE}\n\ndiscovery:\n type: kubernetes\n zen.ping.multicast.enabled: false\n zen.minimum_master_nodes: ${NODE_QUORUM}\n\ngateway:\n recover_after_nodes: ${NODE_QUORUM}\n expected_nodes: ${RECOVER_EXPECTED_NODES}\n recover_after_time: ${RECOVER_AFTER_TIME}\n\nio.fabric8.elasticsearch.authentication.users: [\"system.logging.kibana\", \"system.logging.fluentd\", \"system.logging.curator\", \"system.admin\"]\nio.fabric8.elasticsearch.kibana.mapping.app: /usr/share/elasticsearch/index_patterns/com.redhat.viaq-openshift.index-pattern.json\nio.fabric8.elasticsearch.kibana.mapping.ops: /usr/share/elasticsearch/index_patterns/com.redhat.viaq-openshift.index-pattern.json\nio.fabric8.elasticsearch.kibana.mapping.empty: /usr/share/elasticsearch/index_patterns/com.redhat.viaq-openshift.index-pattern.json\n\nopenshift.config:\n use_common_data_model: true\n project_index_prefix: \"project\"\n time_field_name: \"@timestamp\"\n\nopenshift.searchguard:\n keystore.path: /etc/elasticsearch/secret/admin.jks\n truststore.path: /etc/elasticsearch/secret/searchguard.truststore\n\nopenshift.operations.allow_cluster_reader: false\n\npath:\n data: /elasticsearch/persistent/${CLUSTER_NAME}/data\n logs: /elasticsearch/${CLUSTER_NAME}/logs\n work: /elasticsearch/${CLUSTER_NAME}/work\n scripts: /elasticsearch/${CLUSTER_NAME}/scripts\n\nsearchguard:\n authcz.admin_dn:\n - CN=system.admin,OU=OpenShift,O=Logging\n config_index_name: \".searchguard.${HOSTNAME}\"\n ssl:\n transport:\n enabled: true\n enforce_hostname_verification: false\n keystore_type: JKS\n keystore_filepath: /etc/elasticsearch/secret/searchguard.key\n keystore_password: kspass\n truststore_type: JKS\n truststore_filepath: /etc/elasticsearch/secret/searchguard.truststore\n truststore_password: tspass\n http:\n enabled: true\n keystore_type: JKS\n keystore_filepath: /etc/elasticsearch/secret/key\n keystore_password: kspass\n clientauth_mode: OPTIONAL\n truststore_type: JKS\n truststore_filepath: /etc/elasticsearch/secret/truststore\n truststore_password: tspass\n",
"logging.yml": "# you can override this using by setting a system property, for example -Des.logger.level=DEBUG\nes.logger.level: INFO\nrootLogger: ${es.logger.level}, console, file\nlogger:\n # log action execution errors for easier debugging\n action: WARN\n # reduce the logging for aws, too much is logged under the default INFO\n com.amazonaws: WARN\n io.fabric8.elasticsearch: ${PLUGIN_LOGLEVEL}\n io.fabric8.kubernetes: ${PLUGIN_LOGLEVEL}\n\n # gateway\n #gateway: DEBUG\n #index.gateway: DEBUG\n\n # peer shard recovery\n #indices.recovery: DEBUG\n\n # discovery\n #discovery: TRACE\n\n index.search.slowlog: TRACE, index_search_slow_log_file\n index.indexing.slowlog: TRACE, index_indexing_slow_log_file\n\n # search-guard\n com.floragunn.searchguard: WARN\n\nadditivity:\n index.search.slowlog: false\n index.indexing.slowlog: false\n\nappender:\n console:\n type: console\n layout:\n type: consolePattern\n conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n file:\n type: dailyRollingFile\n file: ${path.logs}/${cluster.name}.log\n datePattern: \"'.'yyyy-MM-dd\"\n layout:\n type: pattern\n conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n # Use the following log4j-extras RollingFileAppender to enable gzip compression of log files.\n # For more information see https://logging.apache.org/log4j/extras/apidocs/org/apache/log4j/rolling/RollingFileAppender.html\n #file:\n #type: extrasRollingFile\n #file: ${path.logs}/${cluster.name}.log\n #rollingPolicy: timeBased\n #rollingPolicy.FileNamePattern: ${path.logs}/${cluster.name}.log.%d{yyyy-MM-dd}.gz\n #layout:\n #type: pattern\n #conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n index_search_slow_log_file:\n type: dailyRollingFile\n file: ${path.logs}/${cluster.name}_index_search_slowlog.log\n datePattern: \"'.'yyyy-MM-dd\"\n layout:\n type: pattern\n conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n index_indexing_slow_log_file:\n type: dailyRollingFile\n file: ${path.logs}/${cluster.name}_index_indexing_slowlog.log\n datePattern: \"'.'yyyy-MM-dd\"\n layout:\n type: pattern\n conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n"
},
"kind": "ConfigMap",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:31Z",
"name": "logging-elasticsearch",
"namespace": "logging",
"resourceVersion": "1447",
"selfLink": "/api/v1/namespaces/logging/configmaps/logging-elasticsearch",
"uid": "997b951d-4c63-11e7-94aa-0e1649350dc2"
}
}
],
"returncode": 0
},
"state": "present"
}
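The configmap carries the rendered elasticsearch.yml and logging.yml that the pod mounts at /usr/share/java/elasticsearch/config (see the elasticsearch-config volume in the deployment config further below). To read the files back out of the cluster for review, something like the following should work; oc extract is assumed to be available in this client version, otherwise plain oc get -o yaml serves the same purpose:

  # dump the configmap contents to local files
  oc extract configmap/logging-elasticsearch -n logging --to=/tmp/es-config
  cat /tmp/es-config/elasticsearch.yml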
TASK [openshift_logging_elasticsearch : Set ES secret] *************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:144
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc secrets new logging-elasticsearch key=/etc/origin/logging/logging-es.jks truststore=/etc/origin/logging/truststore.jks searchguard.key=/etc/origin/logging/elasticsearch.jks searchguard.truststore=/etc/origin/logging/truststore.jks admin-key=/etc/origin/logging/system.admin.key admin-cert=/etc/origin/logging/system.admin.crt admin-ca=/etc/origin/logging/ca.crt admin.jks=/etc/origin/logging/system.admin.jks -n logging",
"results": "",
"returncode": 0
},
"state": "present"
}
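The oc secrets new call packs the JKS stores and the PEM admin credentials into a single secret, logging-elasticsearch; the deployment config below mounts it at /etc/elasticsearch/secret, which is where the elasticsearch.yml above expects key, truststore, searchguard.key, searchguard.truststore and admin.jks. A quick way to confirm the entry names and sizes without printing the data:

  # list the entries of the combined secret
  oc describe secret logging-elasticsearch -n logging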
TASK [openshift_logging_elasticsearch : Set logging-es-cluster service] ********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:168
changed: [openshift] => {
"changed": true,
"results": {
"clusterip": "172.30.122.90",
"cmd": "/bin/oc get service logging-es-cluster -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "Service",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:33Z",
"name": "logging-es-cluster",
"namespace": "logging",
"resourceVersion": "1450",
"selfLink": "/api/v1/namespaces/logging/services/logging-es-cluster",
"uid": "9a8a683b-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"clusterIP": "172.30.122.90",
"ports": [
{
"port": 9300,
"protocol": "TCP",
"targetPort": 9300
}
],
"selector": {
"component": "es",
"provider": "openshift"
},
"sessionAffinity": "None",
"type": "ClusterIP"
},
"status": {
"loadBalancer": {}
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_elasticsearch : Set logging-es service] ****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:182
changed: [openshift] => {
"changed": true,
"results": {
"clusterip": "172.30.247.96",
"cmd": "/bin/oc get service logging-es -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "Service",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:34Z",
"name": "logging-es",
"namespace": "logging",
"resourceVersion": "1454",
"selfLink": "/api/v1/namespaces/logging/services/logging-es",
"uid": "9b26e7b9-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"clusterIP": "172.30.247.96",
"ports": [
{
"port": 9200,
"protocol": "TCP",
"targetPort": "restapi"
}
],
"selector": {
"component": "es",
"provider": "openshift"
},
"sessionAffinity": "None",
"type": "ClusterIP"
},
"status": {
"loadBalancer": {}
}
}
],
"returncode": 0
},
"state": "present"
}
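Two ClusterIP services front the same pods (selector component=es, provider=openshift): logging-es-cluster exposes 9300 for the transport/discovery traffic referenced by SERVICE_DNS in the deployment config, while logging-es exposes 9200 and forwards to the container's named "restapi" port. Once an ES pod passes its readiness probe, both services should list endpoints; a minimal check:

  # endpoints stay empty until an Elasticsearch pod is ready
  oc get endpoints logging-es logging-es-cluster -n logging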
TASK [openshift_logging_elasticsearch : Creating ES storage template] **********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:197
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Creating ES storage template] **********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:210
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Set ES storage] ************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:225
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:237
ok: [openshift] => {
"ansible_facts": {
"es_deploy_name": "logging-es-data-master-9s2p0i8l"
},
"changed": false
}
TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:241
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Set ES dc templates] *******************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:246
changed: [openshift] => {
"changed": true,
"checksum": "5094c4abee414f2a99bc897c6bfc94c8d2ea757b",
"dest": "/tmp/openshift-logging-ansible-8OdbB5/templates/logging-es-dc.yml",
"gid": 0,
"group": "root",
"md5sum": "11281434d5e801083236afee7a0de4d5",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 3138,
"src": "/root/.ansible/tmp/ansible-tmp-1496937635.13-191183517237032/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_elasticsearch : Set ES dc] *****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:262
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get dc logging-es-data-master-9s2p0i8l -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "DeploymentConfig",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:35Z",
"generation": 2,
"labels": {
"component": "es",
"deployment": "logging-es-data-master-9s2p0i8l",
"logging-infra": "elasticsearch",
"provider": "openshift"
},
"name": "logging-es-data-master-9s2p0i8l",
"namespace": "logging",
"resourceVersion": "1468",
"selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-es-data-master-9s2p0i8l",
"uid": "9bde876d-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"replicas": 1,
"selector": {
"component": "es",
"deployment": "logging-es-data-master-9s2p0i8l",
"logging-infra": "elasticsearch",
"provider": "openshift"
},
"strategy": {
"activeDeadlineSeconds": 21600,
"recreateParams": {
"timeoutSeconds": 600
},
"resources": {},
"type": "Recreate"
},
"template": {
"metadata": {
"creationTimestamp": null,
"labels": {
"component": "es",
"deployment": "logging-es-data-master-9s2p0i8l",
"logging-infra": "elasticsearch",
"provider": "openshift"
},
"name": "logging-es-data-master-9s2p0i8l"
},
"spec": {
"containers": [
{
"env": [
{
"name": "NAMESPACE",
"valueFrom": {
"fieldRef": {
"apiVersion": "v1",
"fieldPath": "metadata.namespace"
}
}
},
{
"name": "KUBERNETES_TRUST_CERT",
"value": "true"
},
{
"name": "SERVICE_DNS",
"value": "logging-es-cluster"
},
{
"name": "CLUSTER_NAME",
"value": "logging-es"
},
{
"name": "INSTANCE_RAM",
"value": "8Gi"
},
{
"name": "NODE_QUORUM",
"value": "1"
},
{
"name": "RECOVER_EXPECTED_NODES",
"value": "1"
},
{
"name": "RECOVER_AFTER_TIME",
"value": "5m"
},
{
"name": "READINESS_PROBE_TIMEOUT",
"value": "30"
},
{
"name": "IS_MASTER",
"value": "true"
},
{
"name": "HAS_DATA",
"value": "true"
}
],
"image": "172.30.255.47:5000/logging/logging-elasticsearch:latest",
"imagePullPolicy": "Always",
"name": "elasticsearch",
"ports": [
{
"containerPort": 9200,
"name": "restapi",
"protocol": "TCP"
},
{
"containerPort": 9300,
"name": "cluster",
"protocol": "TCP"
}
],
"readinessProbe": {
"exec": {
"command": [
"/usr/share/elasticsearch/probe/readiness.sh"
]
},
"failureThreshold": 3,
"initialDelaySeconds": 10,
"periodSeconds": 5,
"successThreshold": 1,
"timeoutSeconds": 30
},
"resources": {
"limits": {
"cpu": "1",
"memory": "8Gi"
},
"requests": {
"memory": "512Mi"
}
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/etc/elasticsearch/secret",
"name": "elasticsearch",
"readOnly": true
},
{
"mountPath": "/usr/share/java/elasticsearch/config",
"name": "elasticsearch-config",
"readOnly": true
},
{
"mountPath": "/elasticsearch/persistent",
"name": "elasticsearch-storage"
}
]
}
],
"dnsPolicy": "ClusterFirst",
"restartPolicy": "Always",
"schedulerName": "default-scheduler",
"securityContext": {
"supplementalGroups": [
65534
]
},
"serviceAccount": "aggregated-logging-elasticsearch",
"serviceAccountName": "aggregated-logging-elasticsearch",
"terminationGracePeriodSeconds": 30,
"volumes": [
{
"name": "elasticsearch",
"secret": {
"defaultMode": 420,
"secretName": "logging-elasticsearch"
}
},
{
"configMap": {
"defaultMode": 420,
"name": "logging-elasticsearch"
},
"name": "elasticsearch-config"
},
{
"emptyDir": {},
"name": "elasticsearch-storage"
}
]
}
},
"test": false,
"triggers": [
{
"type": "ConfigChange"
}
]
},
"status": {
"availableReplicas": 0,
"conditions": [
{
"lastTransitionTime": "2017-06-08T16:00:35Z",
"lastUpdateTime": "2017-06-08T16:00:35Z",
"message": "Deployment config does not have minimum availability.",
"status": "False",
"type": "Available"
},
{
"lastTransitionTime": "2017-06-08T16:00:35Z",
"lastUpdateTime": "2017-06-08T16:00:35Z",
"message": "replication controller \"logging-es-data-master-9s2p0i8l-1\" is waiting for pod \"logging-es-data-master-9s2p0i8l-1-deploy\" to run",
"status": "Unknown",
"type": "Progressing"
}
],
"details": {
"causes": [
{
"type": "ConfigChange"
}
],
"message": "config change"
},
"latestVersion": 1,
"observedGeneration": 2,
"replicas": 0,
"unavailableReplicas": 0,
"updatedReplicas": 0
}
}
],
"returncode": 0
},
"state": "present"
}
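At this point the deployment config exists but reports no available replicas, which is expected while the first replication controller is still waiting for its deployer pod. Progress can be followed with the usual commands; a sketch using the generated name from the task above:

  # wait for the first ES deployment to complete, then check the pod
  oc rollout status dc/logging-es-data-master-9s2p0i8l -n logging
  oc get pods -n logging -l component=es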
TASK [openshift_logging_elasticsearch : Delete temp directory] *****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:274
ok: [openshift] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-8OdbB5",
"state": "absent"
}
TASK [openshift_logging : set_fact] ********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:99
TASK [openshift_logging : set_fact] ********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:105
ok: [openshift] => {
"ansible_facts": {
"es_ops_indices": "[]"
},
"changed": false
}
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:109
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:132
statically included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml
TASK [openshift_logging_elasticsearch : Validate Elasticsearch cluster size] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:2
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Validate Elasticsearch Ops cluster size] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:6
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : fail] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:10
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:14
ok: [openshift] => {
"ansible_facts": {
"elasticsearch_name": "logging-elasticsearch-ops",
"es_component": "es-ops"
},
"changed": false
}
TASK [openshift_logging_elasticsearch : fail] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:3
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:7
ok: [openshift] => {
"ansible_facts": {
"es_version": "3_5"
},
"changed": false
}
TASK [openshift_logging_elasticsearch : debug] *********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:11
ok: [openshift] => {
"changed": false,
"openshift_logging_image_version": "latest"
}
TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:14
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : fail] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:17
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Create temp directory for doing work in] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:21
ok: [openshift] => {
"changed": false,
"cmd": [
"mktemp",
"-d",
"/tmp/openshift-logging-ansible-XXXXXX"
],
"delta": "0:00:00.002030",
"end": "2017-06-08 12:00:37.079699",
"rc": 0,
"start": "2017-06-08 12:00:37.077669"
}
STDOUT:
/tmp/openshift-logging-ansible-iwnCe7
TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:26
ok: [openshift] => {
"ansible_facts": {
"tempdir": "/tmp/openshift-logging-ansible-iwnCe7"
},
"changed": false
}
TASK [openshift_logging_elasticsearch : Create templates subdirectory] *********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:30
ok: [openshift] => {
"changed": false,
"gid": 0,
"group": "root",
"mode": "0755",
"owner": "root",
"path": "/tmp/openshift-logging-ansible-iwnCe7/templates",
"secontext": "unconfined_u:object_r:user_tmp_t:s0",
"size": 6,
"state": "directory",
"uid": 0
}
TASK [openshift_logging_elasticsearch : Create ES service account] *************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:40
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Create ES service account] *************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:48
ok: [openshift] => {
"changed": false,
"results": {
"cmd": "/bin/oc get sa aggregated-logging-elasticsearch -o json -n logging",
"results": [
{
"apiVersion": "v1",
"imagePullSecrets": [
{
"name": "aggregated-logging-elasticsearch-dockercfg-fmdws"
}
],
"kind": "ServiceAccount",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:27Z",
"name": "aggregated-logging-elasticsearch",
"namespace": "logging",
"resourceVersion": "1438",
"selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-elasticsearch",
"uid": "96eae493-4c63-11e7-94aa-0e1649350dc2"
},
"secrets": [
{
"name": "aggregated-logging-elasticsearch-token-p6q9b"
},
{
"name": "aggregated-logging-elasticsearch-dockercfg-fmdws"
}
]
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_elasticsearch : copy] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:57
changed: [openshift] => {
"changed": true,
"checksum": "e5015364391ac609da8655a9a1224131599a5cea",
"dest": "/tmp/openshift-logging-ansible-iwnCe7/rolebinding-reader.yml",
"gid": 0,
"group": "root",
"md5sum": "446fb96447527f48f97e69bb41bad7be",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 135,
"src": "/root/.ansible/tmp/ansible-tmp-1496937637.87-257237133629870/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_elasticsearch : Create rolebinding-reader role] ********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:61
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get clusterrole rolebinding-reader -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "ClusterRole",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:28Z",
"name": "rolebinding-reader",
"resourceVersion": "122",
"selfLink": "/oapi/v1/clusterroles/rolebinding-reader",
"uid": "979e7514-4c63-11e7-94aa-0e1649350dc2"
},
"rules": [
{
"apiGroups": [
""
],
"attributeRestrictions": null,
"resources": [
"clusterrolebindings"
],
"verbs": [
"get"
]
}
]
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_elasticsearch : Set rolebinding-reader permissions for ES] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:72
ok: [openshift] => {
"changed": false,
"present": "present"
}
TASK [openshift_logging_elasticsearch : Generate logging-elasticsearch-view-role] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:81
ok: [openshift] => {
"changed": false,
"checksum": "d752c09323565f80ed14fa806d42284f0c5aef2a",
"dest": "/tmp/openshift-logging-ansible-iwnCe7/logging-elasticsearch-view-role.yaml",
"gid": 0,
"group": "root",
"md5sum": "8299dca2fb036c06ba7c4f620680e0f6",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 183,
"src": "/root/.ansible/tmp/ansible-tmp-1496937639.56-203126685507818/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_elasticsearch : Set logging-elasticsearch-view-role role] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:94
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get rolebinding logging-elasticsearch-view-role -o json -n logging",
"results": [
{
"apiVersion": "v1",
"groupNames": null,
"kind": "RoleBinding",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:30Z",
"name": "logging-elasticsearch-view-role",
"namespace": "logging",
"resourceVersion": "1445",
"selfLink": "/oapi/v1/namespaces/logging/rolebindings/logging-elasticsearch-view-role",
"uid": "98a6437b-4c63-11e7-94aa-0e1649350dc2"
},
"roleRef": {
"name": "view"
},
"subjects": [
{
"kind": "ServiceAccount",
"name": "aggregated-logging-elasticsearch",
"namespace": "logging"
}
],
"userNames": [
"system:serviceaccount:logging:aggregated-logging-elasticsearch"
]
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_elasticsearch : template] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:105
ok: [openshift] => {
"changed": false,
"checksum": "f91458d5dad42c496e2081ef872777a6f6eb9ff9",
"dest": "/tmp/openshift-logging-ansible-iwnCe7/elasticsearch-logging.yml",
"gid": 0,
"group": "root",
"md5sum": "e4be7c33c1927bbdd8c909bfbe3d9f0b",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 2171,
"src": "/root/.ansible/tmp/ansible-tmp-1496937640.91-66504919822938/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_elasticsearch : template] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:111
ok: [openshift] => {
"changed": false,
"checksum": "6d4f976f6e77a6e0c8dca7e01fb5bedb68678b1d",
"dest": "/tmp/openshift-logging-ansible-iwnCe7/elasticsearch.yml",
"gid": 0,
"group": "root",
"md5sum": "75abfd3a190832e593a8e5e7c5695e8e",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 2454,
"src": "/root/.ansible/tmp/ansible-tmp-1496937641.2-171805950672149/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_elasticsearch : copy] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:121
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : copy] **********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:127
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Set ES configmap] **********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:133
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get configmap logging-elasticsearch-ops -o json -n logging",
"results": [
{
"apiVersion": "v1",
"data": {
"elasticsearch.yml": "cluster:\n name: ${CLUSTER_NAME}\n\nscript:\n inline: on\n indexed: on\n\nindex:\n number_of_shards: 1\n number_of_replicas: 0\n unassigned.node_left.delayed_timeout: 2m\n translog:\n flush_threshold_size: 256mb\n flush_threshold_period: 5m\n\nnode:\n master: ${IS_MASTER}\n data: ${HAS_DATA}\n\nnetwork:\n host: 0.0.0.0\n\ncloud:\n kubernetes:\n service: ${SERVICE_DNS}\n namespace: ${NAMESPACE}\n\ndiscovery:\n type: kubernetes\n zen.ping.multicast.enabled: false\n zen.minimum_master_nodes: ${NODE_QUORUM}\n\ngateway:\n recover_after_nodes: ${NODE_QUORUM}\n expected_nodes: ${RECOVER_EXPECTED_NODES}\n recover_after_time: ${RECOVER_AFTER_TIME}\n\nio.fabric8.elasticsearch.authentication.users: [\"system.logging.kibana\", \"system.logging.fluentd\", \"system.logging.curator\", \"system.admin\"]\nio.fabric8.elasticsearch.kibana.mapping.app: /usr/share/elasticsearch/index_patterns/com.redhat.viaq-openshift.index-pattern.json\nio.fabric8.elasticsearch.kibana.mapping.ops: /usr/share/elasticsearch/index_patterns/com.redhat.viaq-openshift.index-pattern.json\nio.fabric8.elasticsearch.kibana.mapping.empty: /usr/share/elasticsearch/index_patterns/com.redhat.viaq-openshift.index-pattern.json\n\nopenshift.config:\n use_common_data_model: true\n project_index_prefix: \"project\"\n time_field_name: \"@timestamp\"\n\nopenshift.searchguard:\n keystore.path: /etc/elasticsearch/secret/admin.jks\n truststore.path: /etc/elasticsearch/secret/searchguard.truststore\n\nopenshift.operations.allow_cluster_reader: false\n\npath:\n data: /elasticsearch/persistent/${CLUSTER_NAME}/data\n logs: /elasticsearch/${CLUSTER_NAME}/logs\n work: /elasticsearch/${CLUSTER_NAME}/work\n scripts: /elasticsearch/${CLUSTER_NAME}/scripts\n\nsearchguard:\n authcz.admin_dn:\n - CN=system.admin,OU=OpenShift,O=Logging\n config_index_name: \".searchguard.${HOSTNAME}\"\n ssl:\n transport:\n enabled: true\n enforce_hostname_verification: false\n keystore_type: JKS\n keystore_filepath: /etc/elasticsearch/secret/searchguard.key\n keystore_password: kspass\n truststore_type: JKS\n truststore_filepath: /etc/elasticsearch/secret/searchguard.truststore\n truststore_password: tspass\n http:\n enabled: true\n keystore_type: JKS\n keystore_filepath: /etc/elasticsearch/secret/key\n keystore_password: kspass\n clientauth_mode: OPTIONAL\n truststore_type: JKS\n truststore_filepath: /etc/elasticsearch/secret/truststore\n truststore_password: tspass\n",
"logging.yml": "# you can override this using by setting a system property, for example -Des.logger.level=DEBUG\nes.logger.level: INFO\nrootLogger: ${es.logger.level}, console, file\nlogger:\n # log action execution errors for easier debugging\n action: WARN\n # reduce the logging for aws, too much is logged under the default INFO\n com.amazonaws: WARN\n io.fabric8.elasticsearch: ${PLUGIN_LOGLEVEL}\n io.fabric8.kubernetes: ${PLUGIN_LOGLEVEL}\n\n # gateway\n #gateway: DEBUG\n #index.gateway: DEBUG\n\n # peer shard recovery\n #indices.recovery: DEBUG\n\n # discovery\n #discovery: TRACE\n\n index.search.slowlog: TRACE, index_search_slow_log_file\n index.indexing.slowlog: TRACE, index_indexing_slow_log_file\n\n # search-guard\n com.floragunn.searchguard: WARN\n\nadditivity:\n index.search.slowlog: false\n index.indexing.slowlog: false\n\nappender:\n console:\n type: console\n layout:\n type: consolePattern\n conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n file:\n type: dailyRollingFile\n file: ${path.logs}/${cluster.name}.log\n datePattern: \"'.'yyyy-MM-dd\"\n layout:\n type: pattern\n conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n # Use the following log4j-extras RollingFileAppender to enable gzip compression of log files.\n # For more information see https://logging.apache.org/log4j/extras/apidocs/org/apache/log4j/rolling/RollingFileAppender.html\n #file:\n #type: extrasRollingFile\n #file: ${path.logs}/${cluster.name}.log\n #rollingPolicy: timeBased\n #rollingPolicy.FileNamePattern: ${path.logs}/${cluster.name}.log.%d{yyyy-MM-dd}.gz\n #layout:\n #type: pattern\n #conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n index_search_slow_log_file:\n type: dailyRollingFile\n file: ${path.logs}/${cluster.name}_index_search_slowlog.log\n datePattern: \"'.'yyyy-MM-dd\"\n layout:\n type: pattern\n conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n index_indexing_slow_log_file:\n type: dailyRollingFile\n file: ${path.logs}/${cluster.name}_index_indexing_slowlog.log\n datePattern: \"'.'yyyy-MM-dd\"\n layout:\n type: pattern\n conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n"
},
"kind": "ConfigMap",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:42Z",
"name": "logging-elasticsearch-ops",
"namespace": "logging",
"resourceVersion": "1495",
"selfLink": "/api/v1/namespaces/logging/configmaps/logging-elasticsearch-ops",
"uid": "9f9a8426-4c63-11e7-94aa-0e1649350dc2"
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_elasticsearch : Set ES secret] *************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:144
ok: [openshift] => {
"changed": false,
"results": {
"apiVersion": "v1",
"data": {
"admin-ca": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUxTlRrMU1Gb1hEVEl5TURZd056RTFOVGsxTVZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU9relNqcnBVaVRaRytrUUh5UFNtOFFVcnBJTzFMaE1XRUdiZjF5bERXZ00KdDdJVkFKZ05FQXcwRy8wK1pVSFpVQUN1T1M2NEtaZkE1NnhHMTIrL24vYjRtR2Z1QXhjakxBZ2xhNmRadmVqdApEZTdLSnducVBVNXordysrQWRhc3BBS3kyNlJyZCtWVHllbk5hZiswMjdRdDR4bjNETnhLcjFuOXFJcjN3cnZWCnZYZGZVU215RmJjY0N2ZnBpRjZqZE1aL1BhR2VEYXhFelpSUDJrL2liVEIvSTlxTmhESE1xcUxsQXZERmkzZlMKU0JxMTdWbEh4WHJPeENyQm4vNzVyOE9DZVk5a25pTXB1N0NmVFY3Nm85eU81SnlnVWg3TldTQmY5a2w0cHVIMQpLV1htcmVZSWdyR3FxakVFQ2Qxa0cyRVZDMVgvZ1FWZEJHbjBqMkk4d1BjQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFBaE8KbE5uS1RJNjRhWE8yOEVYa1pwcTZtMkdwY2xQTnpzM0dDVGE0RWYvMkJCa3pSd0ttNlgvN2NpZzh1S0lCNDE3Ugp1V1Jia2kvaHBGVGJYakhaRTJDOHEwSEpMVlVJKzdPQkcvS0cyOG5venpPNzMwZ0p4Q1o5NWZZNldsUjV4SURLCkJVeE9xYWhhRWlRTEx6MEtCUDBudUNxNlZCZWRIVEJPY1JGRDJlSXp6RlMvTFZqS2lqZ1MwZFZsb1VSV3JnZGYKdG1LSnhZYlZDOW9sK0MvdzFqYnp1cnFHZ1dRQ24wZGlOZ05HZ3I4TXVYRTZmQ0hMcnJSMVVGUTNXN0R6OUZ2SwpHTGNRcHJEQ3hUbUo0eTk5cy85aUEyUDhDbGQvWXMrVlB6WlB6UmVZcVRkdkVjQXpHb0drdVRiQXhQbmUyUEJICm5wMDM3WkY0UGFwZFdudGdxR3M9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K",
"admin-cert": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURQRENDQWlTZ0F3SUJBZ0lCQlRBTkJna3Foa2lHOXcwQkFRVUZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUxTlRrMU4xb1hEVEU1TURZd09ERTFOVGsxTjFvdwpQVEVRTUE0R0ExVUVDZ3dIVEc5bloybHVaekVTTUJBR0ExVUVDd3dKVDNCbGJsTm9hV1owTVJVd0V3WURWUVFECkRBeHplWE4wWlcwdVlXUnRhVzR3Z2dFaU1BMEdDU3FHU0liM0RRRUJBUVVBQTRJQkR3QXdnZ0VLQW9JQkFRRFMKZ3ZqNjVzZ1FtYW1OMk5BYUlRQi9TNW8xeS9KUjNUbVUwR2JVRHp5TGxmNHB5ZjJvaHFHRmJMZVQvUExheUcxTwpaQ3hGcU5ZMS9oMUpoYjQyWU5OT3NDelJEZ0MxU2dFandubHZCQnpMdlN3SG5GVnZPZHhKUE1GdEhTbkErcys5CmhpL3p2SGpOeGVPT3pqRkRVYzl3bXNwR2o4NFpXcGlwWkRrQ2lrTGtZc0lrTnB3TEswdnBvNjFoU1NRRlZjYVMKT0xWdk83ZWtjRjlZbWhuYTBMUXNkN0RrYWw2NFBUY2FsMUJBVmR1QW1NUVV1TFlnRGV2aTZRTmRqenE0VW5LdwpGL29JUy91TEhqVTZQeGU3Z25mRzBYelFwSUx2R0lpb0lsZExiWXp1bU4xS1VRckJJWm4xZmIyNzFQUzM3ZXNECmxOV1NqNHMrL0pjcjNPYVNBV09sQWdNQkFBR2paakJrTUE0R0ExVWREd0VCL3dRRUF3SUZvREFKQmdOVkhSTUUKQWpBQU1CMEdBMVVkSlFRV01CUUdDQ3NHQVFVRkJ3TUJCZ2dyQmdFRkJRY0RBakFkQmdOVkhRNEVGZ1FVc01ZYgoxcVVvMDNzaEJPbk1McE0vSDh0S3lTSXdDUVlEVlIwakJBSXdBREFOQmdrcWhraUc5dzBCQVFVRkFBT0NBUUVBCjVzVUlVTHFGcklLZDhNWXpHSU1qcHZkTzZFNDZRYlJiS1ZDWjR6RGpEc1gxOTliVTBXbXhYUG9wNFNhcDVQNVMKdnRuNkZJWmp1cjFBU3NjUDdEUHlRVlREUm5FLzR4Tmd5Mnl0V1pibnBkZG9BNUNkL01sODJ3REUzSFp4dC9zMApZVEpwUVNQUzEzNnh3amtuNm5SVHpQWDFKU096NmVpSHhkeFNWaFc2VnVpQ2gwYk92KzJ1eVdxbmdTNEliRWR6CnhoalQvNXVuRUNTc05IRFk0M1MwUDhFZW1Ca1FmbitsbE5reVNNc1djV3BHdHdQbm1kcmxYbVJNTGpVQWpkcW0KMWpldWdwV3RHTVd5QjhLbkhrSUg0OFdmbUVaMGpRQTBsbU51VFF0d3d2L3dlWU1yVFdhdmVYeVlwT2pvekJUeQpZYVR1VFQ3UEhDWDliWUpmR0pEMWJnPT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=",
"admin-key": "LS0tLS1CRUdJTiBQUklWQVRFIEtFWS0tLS0tCk1JSUV3QUlCQURBTkJna3Foa2lHOXcwQkFRRUZBQVNDQktvd2dnU21BZ0VBQW9JQkFRRFNndmo2NXNnUW1hbU4KMk5BYUlRQi9TNW8xeS9KUjNUbVUwR2JVRHp5TGxmNHB5ZjJvaHFHRmJMZVQvUExheUcxT1pDeEZxTlkxL2gxSgpoYjQyWU5OT3NDelJEZ0MxU2dFandubHZCQnpMdlN3SG5GVnZPZHhKUE1GdEhTbkErcys5aGkvenZIak54ZU9PCnpqRkRVYzl3bXNwR2o4NFpXcGlwWkRrQ2lrTGtZc0lrTnB3TEswdnBvNjFoU1NRRlZjYVNPTFZ2Tzdla2NGOVkKbWhuYTBMUXNkN0RrYWw2NFBUY2FsMUJBVmR1QW1NUVV1TFlnRGV2aTZRTmRqenE0VW5Ld0Yvb0lTL3VMSGpVNgpQeGU3Z25mRzBYelFwSUx2R0lpb0lsZExiWXp1bU4xS1VRckJJWm4xZmIyNzFQUzM3ZXNEbE5XU2o0cysvSmNyCjNPYVNBV09sQWdNQkFBRUNnZ0VCQUxrSWdGcmhZWGNkNmVnUmJCR2Jjbmx4ZFpIVnV4L3o2UG9DR0hCTEdEZkYKVm90YXpZaEZGbEpWMWlMUkVwSDEzQVlzakxHS0R3ZVVYaC9TMi9lbU1mWVhhcjB5cHJLeTRyK1EzL1lSMGxITwo3SldmOTRSMWdrZ3BqQWNodkNLeEVzcU8yVFZQTTR3RFpkcVBLTVA0Yy9LUlREeXJpRWZHMXlzRk9nc2VobCtCClNyaXJabjNTU0xHWEpQTElBK3RKSGNQYkN4bXJ6anJib2lBdlI3VmlnWmovOU0yUVZlWXRFcU9zUmlPUUNwK3MKUW5IcmUzTmVFN1dEdjJNOGFoMUdhWG9kbUJhRnY1Y1JqZ1k3MXlJUS9FRGxVL2dQOWNYdTB5QjYyVENpeXJGYQpnOGxzVkZhQ05aQjFMc3N1U0RqZkhRUG0zWVRFR1YrUklrcnRacFZKTlowQ2dZRUE4QitLcW1lbFhBekRmYjIyCnZGSUJieW1MRkpmeXZkdGdCUlNvRkhkRHdCRHBWaTdPaXdDU0tHN0tPL3ltd0tlRDlRVU1wS3lpSWl6azZSdkkKN2pYdzJxWW1pNm9ydUVYUDlxV1drN3NPMno4MmJVaVUvb2hFeThmODRzcDNBUmRzQU4rMFMvcjg2L2VkQTF0VgpNaHNUd0tySXRvbVRhRjJPQUp3dTlZVUc4cGNDZ1lFQTRHNDFhQ1Nlb1Z2ck0xVVJXTFIxUlkyRUVWRWcxQnNRCjlpVmh2WitOcHI1RGtPNzU5OGg5TkU4WURrTGRrUkh4aTNHdWhpcDB4Z0xMUTNRVDZhekF3N2VYQlRHdWhXTEQKd1B2UWVaZ2RESFF2NFNTRjNQc3F5SWJMVHc5M1lldEluMVpzSXh6WUxNT3dQeWFGZzNMWWNRNWtHT2RvUkxwZwpGbmxzeXp1dnJ5TUNnWUVBclJzd0RqVDJJdXdZcXJkdjIwZUxQU1pIMjZySEZsUG1Yd05MUUtYU004NTlTT0lwCmlFLzdEaWNBcUVvMXRNR1BBYjZJSk5kSGN6ZkdOcjhQRlp2UmJPOUc3bnpaVTlrUVdaRjZnam5pTmZEV1BlYUwKY3d1dC9QTEp2bjJUV1RnWFpuSGZPcjlRUXBTWm51ZmkvV0pEMVN2SjByNVBoSGVkZlpjYXkya0JYelVDZ1lFQQp1L3kwY1lKUytWZlVJQkErVW0zOTJQK2J3eUdRZlBvYThBTUo1dHF4dlJ5YjNkZSt1OFZNMHJiNXNHQmoxUE1SCnZ6cSs5QlMxOFhJL3JrV1p0RjhHckNtb3J5b1pSQk1oWXhIS1l6OTkvWU9JNDI1RFRvRjRLYlYyc01lRXVSbHEKb0VTMG1ZaThybXBQdytYUmFmU1Z4Y1ZISlBFWWpwRUtrV2l1TDJIVVc3OENnWUVBZzB3RjVqMHNKY3MyaTRFbQo2aHRQL081c2poUXQ1UzhoczE4UWVmeDVWZXExLzBJaTFPWnc2SVFTRCtXdmtsT3FNaDdRMmN1OHcwNmFGcmtGClNZUHJoR2xUb05GYjAzVWg0d2V2VHl0M3M2Q212M3grRFg1cWtDR1ZLNUFIaThuS2VORkF6UVFlNmk2b3lWOTQKSFZOOGVhWGZBTkJCN0padmwrQVFGUENWUmFnPQotLS0tLUVORCBQUklWQVRFIEtFWS0tLS0tCg==",
"admin.jks": "/u3+7QAAAAIAAAACAAAAAQAMc3lzdGVtLmFkbWluAAABXIhvalwAAAUCMIIE/jAOBgorBgEEASoCEQEBBQAEggTqPQ4OuuNmeyOfeWORyaMLLtPvE62Ab/z29cXC8E7tr9YENdfseUsep+NWlIOeIwo6a7ylNHyGlnLlcBGDd5878mshRQge0Nc/2i1RRMr5bNOyTXd0pIb3lj5IEDl75+dmPymR+6pM8kfOhMmSHocxqGANZvKsazqAR38tLIjh9o0DtZ0AtIVc3FRciM7Qz3hYlblEBlp3ZCyrHXui2mHyNcPb0bO6MLMvtssA801fu+vJ/7yqlJI36psrGU20ISDxOlXre+Jnvc/z1u6MMJTmeO58g3nF2pH9x7snxL+T2Nuiq5LKkmTOR+GJK1cUoJYhY8Xiaz5sHi3xoQB9EXf5WKR6U+zmTGKmNeu/lUJQGl/jH+eQaPIKOMbg3ck+HoZ0xiwABgTgfgZ3bi8fcC8I+W7YKjH6NmrC0VcyfpbEicjtY7c42X/KfkOeu9qbF1jLjMcGRDSDe4eIxpbNxBaSfx/ThB9SF3Q++KPbdikHeRk1XpMaK5waPSAlN51eqsfZx81/C1Mz/8x8EJUn3U01a4I+yeqPkbIrWYKysGBV6D3kLRhCrmreemCeyXXpmPjUWvoPA8fwaYEmtINltP2LxqrYlMFbH9ZDuxgKlBOEa1uDPKHjYGyBQvhWjaQB7xgpla/rN5qEd/x5fyeGVHMCcSdHyc9cv86sWvibbYnsGqKgnVNHowd+miYmx7DuMa2PKYxxLNfJeVE0Vh3o9kXz+wGM+ptKIyINfNavJh8dSBlVqnHJwv0Vu/wfUAUD9DdHvZIFw4o3yr4dQdm5pOw3f+SEHUsqG30fvu8U1Ad4Fg5h2Rgg343EYYSM/D1StO0b0fF+OMGjZwErFuzM7GD8h1FvWUKoNzfD8MgO7jmwObTGQ1BSgGhixGMZVr67CzF9lC8uv8YBBeFlxpV/77eZuuAS8bcgoS6TzqxrF1HyWHwhYHboyEktTNWJQ3xTz8CLm4tvH0LyD8rrFtUav3EXwLQ4LHHJ+E8sI/rO2WGtm7DWz+WO2GfcTjm28dKfy6/jFqkdLv2STBNtPdiPvyW4uJEZKq5d7hr6xqLN4zUG7BfE0wyiIksGqZknAF1oMX15YKiqII/3sTSLhrO294snG8PPlU6AJj7KW+9SjB0b3fJ1JjQt3me37FQ3wx1PtGrUYi1tYzJthO7fIcJOgvM8Et0n6WrAbWkXnNgTCdt/G+Ywqvanh2226GBRn/RPa75QH+90lvHRpVl2vjZ9m6Sb+XKPdCFse/uPIGPlb4brSddME8n4oKDZMir4YJM+BQNXmzsR16M4kyE9DIYbhMuKq/g19TxLo+Ujhuu3IIcvdWfI9NIJuBMHw1k7nhM1lqkfXZG02zA9UzhkpC2pkAo9iwBmZwhvLtsWho1AodELCXIP+NseWjg6rcIBIbJH7YMYLFoyweb5mCV2U6uwgPP3+1H+wkNYPgc6rXUbsq/dvfXxCPWjbGqRbRatL7Qw6TPHNPxf6cD40P0DHzDljRFcS7TL1C7uGO4Fq6YHoFi3dwUWpvguBECT9TwLJVqmd6hC0Y+ZtvOKL3zmVRT52qUJsqgt8jRKsRn76jXs8Fi5C+kfzVfu0XJTs7RSY4TLDwT49Jo+BJiXdxCcZc9c1O5GPipPsMD7fTePI20qRdWgJPQ97SP29mgfqajk11BHncsNruwgPvSn/6D3KQAAAAIABVguNTA5AAADQDCCAzwwggIkoAMCAQICAQYwDQYJKoZIhvcNAQEFBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDgxNjAwMjFaFw0xOTA2MDgxNjAwMjFaMD0xEDAOBgNVBAoTB0xvZ2dpbmcxEjAQBgNVBAsTCU9wZW5TaGlmdDEVMBMGA1UEAxMMc3lzdGVtLmFkbWluMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAqQVBMeOt4bYuvWznPbHW+zagEdrVoefxJ9XqwfUWSnV+IGzWFWaMCa9oI7Xif5c/YIbeqVhm++PKtVPGen/naUFbyHUEj+h9/8+SngD5JNmRfuntA825NS4ttAJ4qOwI86nHwY99oN/gP4TIWWTC0HDXdrB5kWS60+YVec3ZfvLSPBkA9Q/4xXjGA/MHlsQHOzc2UxFOt/jaHHZszxDJytnzf0nQUwk96kG2HBpN04MlY+EoNe1+klSEray5SVWcgS143mygbld2CDB7YluubH9iI2wDXyiiOW4bRMoJGUXbiZ7v4KkPAKDCr72x0k6VOK1uPkmgkB8gRTuUGkj+yQIDAQABo2YwZDAOBgNVHQ8BAf8EBAMCBaAwCQYDVR0TBAIwADAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwHQYDVR0OBBYEFK6jcOKu5q6i7STFH/ufzEmAesudMAkGA1UdIwQCMAAwDQYJKoZIhvcNAQEFBQADggEBACQDL2lUz+RXU92QJQmMfZusy49JcJt56Nn1CvU1wiPMYAspaVOIFYQuQ+6VnzGtoGYG+vtlmL+uPiFKtcHRZs1xrop0O/i6Ye47PJFQPFsprMB4DWx8bqyibrrg+AZjJ/xBxF3d9IlFXtZOb79l4dtrjuDrUOwqMeUxL6cTu8aUBP1rEPUxexze6F7bOkG7IkT2uE6yI+ni/0vHZNMZpbawazxc5jZBOcbf3zBWz9wPNbS5tJITnXZaDmMgDqOsmX+vr0Vv6mrHD0lpvbCjRm6+7O8qWOfWnzUjoaQ6a4sbbGqV6W8k5p1cT5lcyyF7Nc2Squkxpt+OTR29AxYuW08ABVguNTA5AAAC3jCCAtowggHCoAMCAQICAQEwDQYJKoZIhvcNAQELBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDgxNTU5NTBaFw0yMjA2MDcxNTU5NTFaMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDpM0o66VIk2RvpEB8j0pvEFK6SDtS4TFhBm39cpQ1oDLeyFQCYDRAMNBv9PmVB2VAArjkuuCmXwOesRtdvv5/2+Jhn7gMXIywIJWunWb3o7Q3uyicJ6j1Oc/sPvgHWrKQCstuka3flU8npzWn/tNu0LeMZ9wzcSq9Z/aiK98K71b13X1EpshW3HAr36Yheo3TGfz2hng2sRM2UT9pP4m0wfyPajYQxzKqi5QLwxYt30kgate1ZR8V6zsQqwZ/++a/DgnmPZJ4jKbuwn01e+qPcjuScoFIezVkgX/ZJeKbh9Sll5q3mCIKxqqoxBAndZBthFQtV/4EFXQRp9I9iPMD3AgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA
4IBAQAITpTZykyOuGlztvBF5GaaupthqXJTzc7Nxgk2uBH/9gQZM0cCpul/+3IoPLiiAeNe0blkW5Iv4aRU214x2RNgvKtByS1VCPuzgRvyhtvJ6M8zu99ICcQmfeX2OlpUecSAygVMTqmoWhIkCy89CgT9J7gqulQXnR0wTnERQ9niM8xUvy1Yyoo4EtHVZaFEVq4HX7ZiicWG1QvaJfgv8NY287q6hoFkAp9HYjYDRoK/DLlxOnwhy660dVBUN1uw8/Rbyhi3EKawwsU5ieMvfbP/YgNj/ApXf2LPlT82T80XmKk3bxHAMxqBpLk2wMT53tjwR56dN+2ReD2qXVp7YKhrAAAAAgAGc2lnLWNhAAABXIhvac4ABVguNTA5AAAC3jCCAtowggHCoAMCAQICAQEwDQYJKoZIhvcNAQELBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDgxNTU5NTBaFw0yMjA2MDcxNTU5NTFaMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDpM0o66VIk2RvpEB8j0pvEFK6SDtS4TFhBm39cpQ1oDLeyFQCYDRAMNBv9PmVB2VAArjkuuCmXwOesRtdvv5/2+Jhn7gMXIywIJWunWb3o7Q3uyicJ6j1Oc/sPvgHWrKQCstuka3flU8npzWn/tNu0LeMZ9wzcSq9Z/aiK98K71b13X1EpshW3HAr36Yheo3TGfz2hng2sRM2UT9pP4m0wfyPajYQxzKqi5QLwxYt30kgate1ZR8V6zsQqwZ/++a/DgnmPZJ4jKbuwn01e+qPcjuScoFIezVkgX/ZJeKbh9Sll5q3mCIKxqqoxBAndZBthFQtV/4EFXQRp9I9iPMD3AgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQAITpTZykyOuGlztvBF5GaaupthqXJTzc7Nxgk2uBH/9gQZM0cCpul/+3IoPLiiAeNe0blkW5Iv4aRU214x2RNgvKtByS1VCPuzgRvyhtvJ6M8zu99ICcQmfeX2OlpUecSAygVMTqmoWhIkCy89CgT9J7gqulQXnR0wTnERQ9niM8xUvy1Yyoo4EtHVZaFEVq4HX7ZiicWG1QvaJfgv8NY287q6hoFkAp9HYjYDRoK/DLlxOnwhy660dVBUN1uw8/Rbyhi3EKawwsU5ieMvfbP/YgNj/ApXf2LPlT82T80XmKk3bxHAMxqBpLk2wMT53tjwR56dN+2ReD2qXVp7YKhrG2qAbAa/3cfall8uSXrWyuAEMfo=",
"key": "/u3+7QAAAAIAAAACAAAAAQAKbG9nZ2luZy1lcwAAAVyIb3TYAAAFAjCCBP4wDgYKKwYBBAEqAhEBAQUABIIE6hvnLvF/Ov9/KzGOI4OGLLQEQ31e2zAfK0Sf8ROlbh9HweUuRk8PniphFo6fFKQV4IEd78RFj2PIpe0hxlhLu9KjHEIx5DnA79alYAHjn+zhgTFL1ubelJLhiMhQ1qxvfvPzUPD0+N4ZBsnFS42B3rDgEqN8w1vUgoR6Ps8q2R4RfcMtCnyNSey6qCYL33l/eYxYdQAMss1+Kb1SbYHY109Q9I1Rn+2vKcetJ66x/Q7TX/BDXiKHfC5nr0N+sOOEqerxMMVCok8vFQCL9AxO5mJqHCjsOsIH5AwJ+5b2waAwg4H2LSGDnJwNndkqLQj1jblh9v3S3TMc4MnuIG4f28yTi+0NX34Kcd0XCr0VtJq1DmsitD01ffHpq9JeSi9X4N8Mn2uGe12QcJUHFv4Zt0FpPwtfWmZh24tb2iUyUED7QgdjuAXQzFOc7e2oh0uoJAkezcu9hBAFvvf2sWOS8O4xZ82CiX68wCPsSrSmOCCuZqvl/ra6BGYRzym2a7PylXm8C/n0ic5sgxyNi9SqOAfEQnlDOyQNvFkaNcBovIPuYwg2UYSF2f1kDL0Uok4yzC/z/emuX2X/kUPWy9RUtfHwBFabqLCUX7CT6QFaS+XDK3uXneunI/Vuj3pUAHGJzERnJuRfgSZH62St9n4dhv9VcYsHSI9OCXz7JxCyvS/ESvzPB2swLpBYNS/DZPg3DIyQoUgTCWjuI9BUSA8aaQ4SIpsSrTWiLHTM58sUTw3Knm8okG+2X8gdtulPoEp0K6Mzp4krsLkumYIOt9NdpYUZTgMDDs7uUbeXrHrlnHLGH3DrVp6dZc5dhO592V5yE1Y7oBRYwRDS74cVl3NKIRvopp3ia9+XD8etJZzlkSEmQQ0YEEia5aP+t6xw9dSso8+bkM109HmxIld22nS7kXzshP/vBba0IZ9LlR89/lQyArWlq7eW/r0exASioTHMjffBBCyXVaZf8OZi57WlvOacfUsQDmGtn8GDTyOmsgNgYHrqM+xane1+uDYjp40jhJzkbdCGHx/aq1dpriuUhUgEDUgSX4IiCVEnhKd9Y6LP0hP/DdpaW/IRJNfQ7KmAdPfd4s7saNTb4r0XFAY+mcn0zJIe/eSuiw0wYUd9Zl7LAakNL/pXoHpCklbdQKeJrDRYvnksnJBMJ65S044NOHCP9Dh5HWwW7eO/9lUoWPIRmQYmvymz8oHEiv0Ki0HDWg5ylOmCfDgnHZMKH451EnmEdZhV5pPDPAhnUWTmY3eyV4aq7yBBAXkA/HrfcmmuqJjkp/4KjfyNAPkXdkOOVBtLbKIGaM7DOyyd5Qezlb9SeQqSAUR8e1SZJm3iFefxqa8m+hQfmf2cPDqh3kjdLJd6E7ChklzsW3+tI5VwjAVWA2w92zgTTZwGc6jYLvDpwNOjt7zSOA4bQ+oSfqiXYeJ9pZdIkoRxYFOImSYheYAte50QZaOsMis+MAQ57VspeELwW/rq3DJHcRPHim2TZU1jOztGgWRpiky/eR1tGHHME5Y6qsOYixyQZGts8F3MNBqS6/+hXFfbPlKmLJ825LHxbQRB4CmkNHVO1I0axm46SdGu9wl+4Z8Vq1uldIu7Zc9l2u9euX8/9oO/P8y4g4Oltjuac4/yu3Uz302leuF1/LAOcb8cJ7UcHfceDZwRc0DS9wQPsIfNWOoAAAACAAVYLjUwOQAABFwwggRYMIIDQKADAgECAgEIMA0GCSqGSIb3DQEBBQUAMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwHhcNMTcwNjA4MTYwMDI0WhcNMTkwNjA4MTYwMDI0WjA7MRAwDgYDVQQKEwdMb2dnaW5nMRIwEAYDVQQLEwlPcGVuU2hpZnQxEzARBgNVBAMTCmxvZ2dpbmctZXMwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQC5kOp30M9ozJrh/7+gwq7KyA4modmXS/jshySLckGMnHSpNjB2NbvXOMhF+fL7s8mYiEnHMCOAjQBh7M8lC3nJh+z9/Trvyl/cvCHSqeH3ab83m+FwW2HsMYtgOGutOOi7v6SSj9Uq0urRIevYfz+p8oEtPXMG2y01n12YCWdlvu0TmnmAYIhSrq+rcUScybKdad2yFjykcWlAA/yfmDaGY6eEs+7w2yaf7QfT5jWZJitNTqqBhrCdZ/nVwwfJjJY5hFJjewRsT3gwk/T330yECxhB7I0FIXCBFhm4K5+q/xy6oIVNIhGy9pyjFnHx/H+fR4xQ6xdKTgJFBiB2z19bAgMBAAGjggGCMIIBfjAOBgNVHQ8BAf8EBAMCBaAwCQYDVR0TBAIwADAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwHQYDVR0OBBYEFA6/wc0Bd5Oa9dI5auTT8BuLQW4eMAkGA1UdIwQCMAAwggEWBgNVHREEggENMIIBCYIJbG9jYWxob3N0hwR/AAABggpsb2dnaW5nLWVzgiRsb2dnaW5nLWVzLmxvZ2dpbmcuc3ZjLmNsdXN0ZXIubG9jYWyCEmxvZ2dpbmctZXMtY2x1c3RlcoIsbG9nZ2luZy1lcy1jbHVzdGVyLmxvZ2dpbmcuc3ZjLmNsdXN0ZXIubG9jYWyCDmxvZ2dpbmctZXMtb3Bzgihsb2dnaW5nLWVzLW9wcy5sb2dnaW5nLnN2Yy5jbHVzdGVyLmxvY2FsghZsb2dnaW5nLWVzLW9wcy1jbHVzdGVygjBsb2dnaW5nLWVzLW9wcy1jbHVzdGVyLmxvZ2dpbmcuc3ZjLmNsdXN0ZXIubG9jYWwwDQYJKoZIhvcNAQEFBQADggEBAILd+Uavx6zxTHLk73S5o7QkqXj+yauw9wdNVkQIeT5EzOc41dcgcVu9lc4SvXReH+RBghXHt1eClUj3jbOPgB54fYV4RcaglEZ2ckGhemyQTQe5bnf0s+F2Upbngd92/E9C1jYxzSoVziJM9n1hSneBj3s64PKj0LEzqxNWp6yq2R+SOgarAkXbivdRY6Ba7aM9Uhrd5+oRIFD4q2uJZUejS8sm0XP2jtK0H8frULzFyDdj198CvvMiF/q9nX6x5GP3ijUtXIktT6PsVtJPRjSuS25WeE1DnqOa2qaXaO7K/hlLBhAFBMvqkahOauYgEQ/DyEXJiFyaHn7jPIbKrsUABVguNTA5AAAC3jCCAtowggHCoAMCAQICAQEwDQYJKoZIhvcNAQELBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDgxNTU5NTBaFw0yMjA2MDcxNTU5NTFaMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDpM0o66VIk2RvpEB8j0pvEFK6SDtS4TFhBm39cpQ1oDLeyFQC
YDRAMNBv9PmVB2VAArjkuuCmXwOesRtdvv5/2+Jhn7gMXIywIJWunWb3o7Q3uyicJ6j1Oc/sPvgHWrKQCstuka3flU8npzWn/tNu0LeMZ9wzcSq9Z/aiK98K71b13X1EpshW3HAr36Yheo3TGfz2hng2sRM2UT9pP4m0wfyPajYQxzKqi5QLwxYt30kgate1ZR8V6zsQqwZ/++a/DgnmPZJ4jKbuwn01e+qPcjuScoFIezVkgX/ZJeKbh9Sll5q3mCIKxqqoxBAndZBthFQtV/4EFXQRp9I9iPMD3AgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQAITpTZykyOuGlztvBF5GaaupthqXJTzc7Nxgk2uBH/9gQZM0cCpul/+3IoPLiiAeNe0blkW5Iv4aRU214x2RNgvKtByS1VCPuzgRvyhtvJ6M8zu99ICcQmfeX2OlpUecSAygVMTqmoWhIkCy89CgT9J7gqulQXnR0wTnERQ9niM8xUvy1Yyoo4EtHVZaFEVq4HX7ZiicWG1QvaJfgv8NY287q6hoFkAp9HYjYDRoK/DLlxOnwhy660dVBUN1uw8/Rbyhi3EKawwsU5ieMvfbP/YgNj/ApXf2LPlT82T80XmKk3bxHAMxqBpLk2wMT53tjwR56dN+2ReD2qXVp7YKhrAAAAAgAGc2lnLWNhAAABXIhvdEkABVguNTA5AAAC3jCCAtowggHCoAMCAQICAQEwDQYJKoZIhvcNAQELBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDgxNTU5NTBaFw0yMjA2MDcxNTU5NTFaMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDpM0o66VIk2RvpEB8j0pvEFK6SDtS4TFhBm39cpQ1oDLeyFQCYDRAMNBv9PmVB2VAArjkuuCmXwOesRtdvv5/2+Jhn7gMXIywIJWunWb3o7Q3uyicJ6j1Oc/sPvgHWrKQCstuka3flU8npzWn/tNu0LeMZ9wzcSq9Z/aiK98K71b13X1EpshW3HAr36Yheo3TGfz2hng2sRM2UT9pP4m0wfyPajYQxzKqi5QLwxYt30kgate1ZR8V6zsQqwZ/++a/DgnmPZJ4jKbuwn01e+qPcjuScoFIezVkgX/ZJeKbh9Sll5q3mCIKxqqoxBAndZBthFQtV/4EFXQRp9I9iPMD3AgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQAITpTZykyOuGlztvBF5GaaupthqXJTzc7Nxgk2uBH/9gQZM0cCpul/+3IoPLiiAeNe0blkW5Iv4aRU214x2RNgvKtByS1VCPuzgRvyhtvJ6M8zu99ICcQmfeX2OlpUecSAygVMTqmoWhIkCy89CgT9J7gqulQXnR0wTnERQ9niM8xUvy1Yyoo4EtHVZaFEVq4HX7ZiicWG1QvaJfgv8NY287q6hoFkAp9HYjYDRoK/DLlxOnwhy660dVBUN1uw8/Rbyhi3EKawwsU5ieMvfbP/YgNj/ApXf2LPlT82T80XmKk3bxHAMxqBpLk2wMT53tjwR56dN+2ReD2qXVp7YKhrwgGAvnFN0GoVzryxm68LpaLWUL4=",
"searchguard.key": "/u3+7QAAAAIAAAACAAAAAQANZWxhc3RpY3NlYXJjaAAAAVyIb2+vAAAFATCCBP0wDgYKKwYBBAEqAhEBAQUABIIE6e1Q1v40d1cE607MNpsK0TMVjZCdmMyo3WJ1KcS4fO3RaMKAON9jwD6afLRJS5ehUdaybNItPdnFgacyavhNH3XdEuDnOo+9cJMbJRr96+H3yJuCbQWAmaXzk4vvu4IPXEqd6RePX4U6BccKI4rmoEf0PIhsNTnLNz6zaaUhIvd/+9jh6XUHsFRVWR7ysfCXNNGf1orOxnGVgp8YNLRNdh98VHSQpMh3DwHxOZ7h1MpFfOwCPTh4HkCPEtjGIO7LjuottSwFEjIOcuaqF1VhU/bi6pdmdTwZMMo6PXl0AEV1811dHVhSQV2VBd0v1RdNumzFAR2Vf9CXKERPdRbM3/FkCmP4KZMhVQ31GgdPUxXORcXJufgsTTuFdYCU3QBY6r2ZAawOgnujwj0Rm7xOz4KlYNGYFv3XUMRYbkFGTRTvoYf3qVFL1Yf0xqik8vWsmjUYpYPgv0tKT8yPFa0xKwu2JAbaz46FU1RqQt73oCnX0uvIwKXhMoTYjnT5KXUPcfYBYleif/I9SJVor0koX0f4iN/8tYf3Btj3TUom8NoSzloFo0dApNJoUcBSuoYlcy9Z7/FriLS2fY5Q1/swAOmF32ulEAAlFI0tNnbDZcRn9zHXouYJ2EqIvfHmjuB+uO7Cl2+I1WomX5+fueI0Msc13VXR72rmBYuKzL9LOHMxWAahpBJ61mWYVHluaK9X+p/sVUfjQHFea86n1VksVuBjJN1xR9pyR5XvXlYj4X/ibh7nZckiSmsWLIV5URpEE2/at0rf+wshNQjZOPcJdOeuBoxAUKZNkTxuWOVj3Jcw32fl3Hq6X81ndnrjZR8Q7DYh9yR63QbMttqFX8KlU3yDhL4CJOYR91U8k/9iU0r1jUuNWGr0Sr3vlGvcY36bifcsX6Jm8fr9+KPpaqg72TYAQ55GMj9JhiDU3SMwld9Lxt/XgoHPS7wup/Q0mKGSp2vvXS5QZruKyRnb92zHU3IaItASJlhNtSORQlBAaYuZzEoj1QHHJAIJ2iBej/teMtG/h1jECLpSq/PREBmx7jGGTOAg6qRShM8c+8Xi8YTp7XgbY1/YGyvUIBgQ+M6MTTfD2824wcGrXdYIgdvA0SZeKEcjc07dg+DFGCgozGlyU7aItjU7Y01JmWqYguYIXJFXZmQb1K73zUQ3bSaNd/LGuJryrSlYEIlBHLKPOo+aD24aQ75PR0QY+zUq0dJKtUAtx43bXeSNWraQkWQhJ/ii2JLTlZ0UbQfV6/LGhS5F5sd7B9RNUzAuHvc90ozt31iVqOrv0TCizw1p/KkLIoV1gVAiIfDWErmLZQt/EbMx3pObW9+VCW/E0ybV5E7XP1IU2OKytgW4dBZPcVD7G5/DzUJeY+2sQqkdBsCt8XizMhRDJNm6X1pC3zoaIQu/zCeqQ5eywm9Z4AFB2drW0+9Ed3yJIMf7tVBd5W3vZp47Dzks8DqLpxpG8nJoFS8slpj4f7Oix3lvL01DSzB6fjSJBLxoNfrgQ3KnRExYPnE7NcJJQ7dXGfCEE4rsM5gRd6q9q2yDhSZpbW/mr1O/uG6r8L19d2+eGvl8WzZgclhzODaFb9vjO3SrwyPrZ8I0kpO4Lv3OHDYgYY2hd59trfbANb2F3mHK5rhfxRsONu2cpR672T7z2MGenXHwcc1YrC8NNCSQsHe3SQAAAAIABVguNTA5AAADgjCCA34wggJmoAMCAQICAQcwDQYJKoZIhvcNAQEFBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDgxNjAwMjNaFw0xOTA2MDgxNjAwMjNaMD4xEDAOBgNVBAoTB0xvZ2dpbmcxEjAQBgNVBAsTCU9wZW5TaGlmdDEWMBQGA1UEAxMNZWxhc3RpY3NlYXJjaDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBANiEX0EJO46rcm3KlOicoE4ekmG6RFhW+24WJi1JXjKos8mO17TKLXTVT0NzPcD4lvuvXxe66Pd9TXky99BBZ7/r0pExbo5P6pxZn/jDhQH16iqoz2jQvGDao/aHU36q+eWCpNvdQQeZ6ZjEkhTA8H213LQaQrqrlqm7hEg29/c+r/RbudwU/OheixR67PcQ7MRA12+o7/QXhsZZfHSuDwQQceA8xiJze3ESU6kz/MF+/qLCXUeHH0UeT1lLdEA6A3Oxw/hLbQRtfZSedZuvxvX4GmCMJnwnNWqiFeCOnGOne5DpaC4kWhks5u8mrnCT+b2i4Ni/6utdjc7eW0FbMc0CAwEAAaOBpjCBozAOBgNVHQ8BAf8EBAMCBaAwCQYDVR0TBAIwADAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwHQYDVR0OBBYEFMl42KZzvvOYIJnX0Wy3XidtQAImMAkGA1UdIwQCMAAwPQYDVR0RBDYwNIIJbG9jYWxob3N0hwR/AAABggpsb2dnaW5nLWVzgg5sb2dnaW5nLWVzLW9wc4gFKgMEBQUwDQYJKoZIhvcNAQEFBQADggEBAJmR/6f3nA6rdZecm+k+U7mZu4eD9tcpRROCijakpohQ6HMHPivWwElMZUiHwTf2dHekbed8oh7VMNU/Rl3iCAQFkaFz42bfUAgjVCrL84Z+lq8kElOvC/kHdVMD6YeDn727oj8zKKGXd6EiGOlEaal1Z22r+FeazW/I+fFScyhDkbHoizQjU4vAm42I1HkD13wQEYNowZX8AbBlHePxAcmtG3VG1kQDOo8qRgYp87X+k2XrRZ1Iu/zYE0xG77oTr8GmGHHPlNW0I+lg3bAqyYD6NUFNUpi8/wbdkCMWHBlfpiM+F+geIajyx0hBt9xJ0ZwPQb9L/vkQ8ZMMJfTZWckABVguNTA5AAAC3jCCAtowggHCoAMCAQICAQEwDQYJKoZIhvcNAQELBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDgxNTU5NTBaFw0yMjA2MDcxNTU5NTFaMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDpM0o66VIk2RvpEB8j0pvEFK6SDtS4TFhBm39cpQ1oDLeyFQCYDRAMNBv9PmVB2VAArjkuuCmXwOesRtdvv5/2+Jhn7gMXIywIJWunWb3o7Q3uyicJ6j1Oc/sPvgHWrKQCstuka3flU8npzWn/tNu0LeMZ9wzcSq9Z/aiK98K71b13X1EpshW3HAr36Yheo3TGfz2hng2sRM2UT9pP4m0wfyPajYQxzKqi5QLwxYt30kgate1ZR8V6zsQqwZ/++a/DgnmPZJ4jKbuwn01e+qPcjuScoFIezVkgX/ZJeKbh9Sll5q3mCIKxqqoxBAndZBthFQt
V/4EFXQRp9I9iPMD3AgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQAITpTZykyOuGlztvBF5GaaupthqXJTzc7Nxgk2uBH/9gQZM0cCpul/+3IoPLiiAeNe0blkW5Iv4aRU214x2RNgvKtByS1VCPuzgRvyhtvJ6M8zu99ICcQmfeX2OlpUecSAygVMTqmoWhIkCy89CgT9J7gqulQXnR0wTnERQ9niM8xUvy1Yyoo4EtHVZaFEVq4HX7ZiicWG1QvaJfgv8NY287q6hoFkAp9HYjYDRoK/DLlxOnwhy660dVBUN1uw8/Rbyhi3EKawwsU5ieMvfbP/YgNj/ApXf2LPlT82T80XmKk3bxHAMxqBpLk2wMT53tjwR56dN+2ReD2qXVp7YKhrAAAAAgAGc2lnLWNhAAABXIhvbw8ABVguNTA5AAAC3jCCAtowggHCoAMCAQICAQEwDQYJKoZIhvcNAQELBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDgxNTU5NTBaFw0yMjA2MDcxNTU5NTFaMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDpM0o66VIk2RvpEB8j0pvEFK6SDtS4TFhBm39cpQ1oDLeyFQCYDRAMNBv9PmVB2VAArjkuuCmXwOesRtdvv5/2+Jhn7gMXIywIJWunWb3o7Q3uyicJ6j1Oc/sPvgHWrKQCstuka3flU8npzWn/tNu0LeMZ9wzcSq9Z/aiK98K71b13X1EpshW3HAr36Yheo3TGfz2hng2sRM2UT9pP4m0wfyPajYQxzKqi5QLwxYt30kgate1ZR8V6zsQqwZ/++a/DgnmPZJ4jKbuwn01e+qPcjuScoFIezVkgX/ZJeKbh9Sll5q3mCIKxqqoxBAndZBthFQtV/4EFXQRp9I9iPMD3AgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQAITpTZykyOuGlztvBF5GaaupthqXJTzc7Nxgk2uBH/9gQZM0cCpul/+3IoPLiiAeNe0blkW5Iv4aRU214x2RNgvKtByS1VCPuzgRvyhtvJ6M8zu99ICcQmfeX2OlpUecSAygVMTqmoWhIkCy89CgT9J7gqulQXnR0wTnERQ9niM8xUvy1Yyoo4EtHVZaFEVq4HX7ZiicWG1QvaJfgv8NY287q6hoFkAp9HYjYDRoK/DLlxOnwhy660dVBUN1uw8/Rbyhi3EKawwsU5ieMvfbP/YgNj/ApXf2LPlT82T80XmKk3bxHAMxqBpLk2wMT53tjwR56dN+2ReD2qXVp7YKhr7Donb/QsEUP5Iy97pGJudf/URMc=",
"searchguard.truststore": "/u3+7QAAAAIAAAABAAAAAgAGc2lnLWNhAAABXIhvdV8ABVguNTA5AAAC3jCCAtowggHCoAMCAQICAQEwDQYJKoZIhvcNAQELBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDgxNTU5NTBaFw0yMjA2MDcxNTU5NTFaMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDpM0o66VIk2RvpEB8j0pvEFK6SDtS4TFhBm39cpQ1oDLeyFQCYDRAMNBv9PmVB2VAArjkuuCmXwOesRtdvv5/2+Jhn7gMXIywIJWunWb3o7Q3uyicJ6j1Oc/sPvgHWrKQCstuka3flU8npzWn/tNu0LeMZ9wzcSq9Z/aiK98K71b13X1EpshW3HAr36Yheo3TGfz2hng2sRM2UT9pP4m0wfyPajYQxzKqi5QLwxYt30kgate1ZR8V6zsQqwZ/++a/DgnmPZJ4jKbuwn01e+qPcjuScoFIezVkgX/ZJeKbh9Sll5q3mCIKxqqoxBAndZBthFQtV/4EFXQRp9I9iPMD3AgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQAITpTZykyOuGlztvBF5GaaupthqXJTzc7Nxgk2uBH/9gQZM0cCpul/+3IoPLiiAeNe0blkW5Iv4aRU214x2RNgvKtByS1VCPuzgRvyhtvJ6M8zu99ICcQmfeX2OlpUecSAygVMTqmoWhIkCy89CgT9J7gqulQXnR0wTnERQ9niM8xUvy1Yyoo4EtHVZaFEVq4HX7ZiicWG1QvaJfgv8NY287q6hoFkAp9HYjYDRoK/DLlxOnwhy660dVBUN1uw8/Rbyhi3EKawwsU5ieMvfbP/YgNj/ApXf2LPlT82T80XmKk3bxHAMxqBpLk2wMT53tjwR56dN+2ReD2qXVp7YKhrvSCG3VFQZaAYThWkjpSvCSmzvAQ=",
"truststore": "/u3+7QAAAAIAAAABAAAAAgAGc2lnLWNhAAABXIhvdV8ABVguNTA5AAAC3jCCAtowggHCoAMCAQICAQEwDQYJKoZIhvcNAQELBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDgxNTU5NTBaFw0yMjA2MDcxNTU5NTFaMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDpM0o66VIk2RvpEB8j0pvEFK6SDtS4TFhBm39cpQ1oDLeyFQCYDRAMNBv9PmVB2VAArjkuuCmXwOesRtdvv5/2+Jhn7gMXIywIJWunWb3o7Q3uyicJ6j1Oc/sPvgHWrKQCstuka3flU8npzWn/tNu0LeMZ9wzcSq9Z/aiK98K71b13X1EpshW3HAr36Yheo3TGfz2hng2sRM2UT9pP4m0wfyPajYQxzKqi5QLwxYt30kgate1ZR8V6zsQqwZ/++a/DgnmPZJ4jKbuwn01e+qPcjuScoFIezVkgX/ZJeKbh9Sll5q3mCIKxqqoxBAndZBthFQtV/4EFXQRp9I9iPMD3AgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQAITpTZykyOuGlztvBF5GaaupthqXJTzc7Nxgk2uBH/9gQZM0cCpul/+3IoPLiiAeNe0blkW5Iv4aRU214x2RNgvKtByS1VCPuzgRvyhtvJ6M8zu99ICcQmfeX2OlpUecSAygVMTqmoWhIkCy89CgT9J7gqulQXnR0wTnERQ9niM8xUvy1Yyoo4EtHVZaFEVq4HX7ZiicWG1QvaJfgv8NY287q6hoFkAp9HYjYDRoK/DLlxOnwhy660dVBUN1uw8/Rbyhi3EKawwsU5ieMvfbP/YgNj/ApXf2LPlT82T80XmKk3bxHAMxqBpLk2wMT53tjwR56dN+2ReD2qXVp7YKhrvSCG3VFQZaAYThWkjpSvCSmzvAQ="
},
"kind": "Secret",
"metadata": {
"creationTimestamp": null,
"name": "logging-elasticsearch"
},
"type": "Opaque"
},
"state": "present"
}
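For reference: the values above (admin.jks, key, truststore, searchguard.key, searchguard.truststore) are base64-encoded Java keystores held in the logging-elasticsearch secret. A minimal, illustrative way to inspect one of them from a host with oc and keytool available, assuming the namespace and secret name shown in this run (keytool prompts for the keystore password, which is generated by the deployer and not shown in this log):
  oc get secret logging-elasticsearch -n logging -o jsonpath='{.data.truststore}' | base64 -d > /tmp/logging-truststore.jks
  keytool -list -keystore /tmp/logging-truststore.jks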
TASK [openshift_logging_elasticsearch : Set logging-es-ops-cluster service] ****
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:168
changed: [openshift] => {
"changed": true,
"results": {
"clusterip": "172.30.187.28",
"cmd": "/bin/oc get service logging-es-ops-cluster -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "Service",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:43Z",
"name": "logging-es-ops-cluster",
"namespace": "logging",
"resourceVersion": "1497",
"selfLink": "/api/v1/namespaces/logging/services/logging-es-ops-cluster",
"uid": "a0afb75a-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"clusterIP": "172.30.187.28",
"ports": [
{
"port": 9300,
"protocol": "TCP",
"targetPort": 9300
}
],
"selector": {
"component": "es-ops",
"provider": "openshift"
},
"sessionAffinity": "None",
"type": "ClusterIP"
},
"status": {
"loadBalancer": {}
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_elasticsearch : Set logging-es-ops service] ************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:182
changed: [openshift] => {
"changed": true,
"results": {
"clusterip": "172.30.179.225",
"cmd": "/bin/oc get service logging-es-ops -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "Service",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:45Z",
"name": "logging-es-ops",
"namespace": "logging",
"resourceVersion": "1500",
"selfLink": "/api/v1/namespaces/logging/services/logging-es-ops",
"uid": "a1729743-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"clusterIP": "172.30.179.225",
"ports": [
{
"port": 9200,
"protocol": "TCP",
"targetPort": "restapi"
}
],
"selector": {
"component": "es-ops",
"provider": "openshift"
},
"sessionAffinity": "None",
"type": "ClusterIP"
},
"status": {
"loadBalancer": {}
}
}
],
"returncode": 0
},
"state": "present"
}
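The two ClusterIP services created above serve different ports: logging-es-ops-cluster exposes 9300 for inter-node transport, while logging-es-ops exposes 9200 for the REST API. An illustrative one-liner to list both, assuming the names and namespace from this run:
  oc get svc logging-es-ops-cluster logging-es-ops -n logging -o custom-columns=NAME:.metadata.name,CLUSTER-IP:.spec.clusterIP,PORT:.spec.ports[0].port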
TASK [openshift_logging_elasticsearch : Creating ES storage template] **********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:197
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Creating ES storage template] **********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:210
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Set ES storage] ************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:225
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:237
ok: [openshift] => {
"ansible_facts": {
"es_deploy_name": "logging-es-ops-data-master-obwim1kt"
},
"changed": false
}
TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:241
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_elasticsearch : Set ES dc templates] *******************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:246
changed: [openshift] => {
"changed": true,
"checksum": "1a5233758f7b1ba5b0e82e3367d1aa07537a3820",
"dest": "/tmp/openshift-logging-ansible-iwnCe7/templates/logging-es-dc.yml",
"gid": 0,
"group": "root",
"md5sum": "85da7da41c0e0f7e767b5723a6643025",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 3178,
"src": "/root/.ansible/tmp/ansible-tmp-1496937645.82-111099138245197/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_elasticsearch : Set ES dc] *****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:262
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get dc logging-es-ops-data-master-obwim1kt -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "DeploymentConfig",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:46Z",
"generation": 2,
"labels": {
"component": "es-ops",
"deployment": "logging-es-ops-data-master-obwim1kt",
"logging-infra": "elasticsearch",
"provider": "openshift"
},
"name": "logging-es-ops-data-master-obwim1kt",
"namespace": "logging",
"resourceVersion": "1514",
"selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-es-ops-data-master-obwim1kt",
"uid": "a25a712f-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"replicas": 1,
"selector": {
"component": "es-ops",
"deployment": "logging-es-ops-data-master-obwim1kt",
"logging-infra": "elasticsearch",
"provider": "openshift"
},
"strategy": {
"activeDeadlineSeconds": 21600,
"recreateParams": {
"timeoutSeconds": 600
},
"resources": {},
"type": "Recreate"
},
"template": {
"metadata": {
"creationTimestamp": null,
"labels": {
"component": "es-ops",
"deployment": "logging-es-ops-data-master-obwim1kt",
"logging-infra": "elasticsearch",
"provider": "openshift"
},
"name": "logging-es-ops-data-master-obwim1kt"
},
"spec": {
"containers": [
{
"env": [
{
"name": "NAMESPACE",
"valueFrom": {
"fieldRef": {
"apiVersion": "v1",
"fieldPath": "metadata.namespace"
}
}
},
{
"name": "KUBERNETES_TRUST_CERT",
"value": "true"
},
{
"name": "SERVICE_DNS",
"value": "logging-es-ops-cluster"
},
{
"name": "CLUSTER_NAME",
"value": "logging-es-ops"
},
{
"name": "INSTANCE_RAM",
"value": "8Gi"
},
{
"name": "NODE_QUORUM",
"value": "1"
},
{
"name": "RECOVER_EXPECTED_NODES",
"value": "1"
},
{
"name": "RECOVER_AFTER_TIME",
"value": "5m"
},
{
"name": "READINESS_PROBE_TIMEOUT",
"value": "30"
},
{
"name": "IS_MASTER",
"value": "true"
},
{
"name": "HAS_DATA",
"value": "true"
}
],
"image": "172.30.255.47:5000/logging/logging-elasticsearch:latest",
"imagePullPolicy": "Always",
"name": "elasticsearch",
"ports": [
{
"containerPort": 9200,
"name": "restapi",
"protocol": "TCP"
},
{
"containerPort": 9300,
"name": "cluster",
"protocol": "TCP"
}
],
"readinessProbe": {
"exec": {
"command": [
"/usr/share/elasticsearch/probe/readiness.sh"
]
},
"failureThreshold": 3,
"initialDelaySeconds": 10,
"periodSeconds": 5,
"successThreshold": 1,
"timeoutSeconds": 30
},
"resources": {
"limits": {
"cpu": "1",
"memory": "8Gi"
},
"requests": {
"memory": "512Mi"
}
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/etc/elasticsearch/secret",
"name": "elasticsearch",
"readOnly": true
},
{
"mountPath": "/usr/share/java/elasticsearch/config",
"name": "elasticsearch-config",
"readOnly": true
},
{
"mountPath": "/elasticsearch/persistent",
"name": "elasticsearch-storage"
}
]
}
],
"dnsPolicy": "ClusterFirst",
"restartPolicy": "Always",
"schedulerName": "default-scheduler",
"securityContext": {
"supplementalGroups": [
65534
]
},
"serviceAccount": "aggregated-logging-elasticsearch",
"serviceAccountName": "aggregated-logging-elasticsearch",
"terminationGracePeriodSeconds": 30,
"volumes": [
{
"name": "elasticsearch",
"secret": {
"defaultMode": 420,
"secretName": "logging-elasticsearch"
}
},
{
"configMap": {
"defaultMode": 420,
"name": "logging-elasticsearch"
},
"name": "elasticsearch-config"
},
{
"emptyDir": {},
"name": "elasticsearch-storage"
}
]
}
},
"test": false,
"triggers": [
{
"type": "ConfigChange"
}
]
},
"status": {
"availableReplicas": 0,
"conditions": [
{
"lastTransitionTime": "2017-06-08T16:00:46Z",
"lastUpdateTime": "2017-06-08T16:00:46Z",
"message": "Deployment config does not have minimum availability.",
"status": "False",
"type": "Available"
},
{
"lastTransitionTime": "2017-06-08T16:00:46Z",
"lastUpdateTime": "2017-06-08T16:00:46Z",
"message": "replication controller \"logging-es-ops-data-master-obwim1kt-1\" is waiting for pod \"logging-es-ops-data-master-obwim1kt-1-deploy\" to run",
"status": "Unknown",
"type": "Progressing"
}
],
"details": {
"causes": [
{
"type": "ConfigChange"
}
],
"message": "config change"
},
"latestVersion": 1,
"observedGeneration": 2,
"replicas": 0,
"unavailableReplicas": 0,
"updatedReplicas": 0
}
}
],
"returncode": 0
},
"state": "present"
}
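At this point the DeploymentConfig reports Available=False because replication controller logging-es-ops-data-master-obwim1kt-1 is still waiting for its deploy pod; the playbook does not wait here. If following the rollout manually, something like the following would apply (dc name and label taken from this run):
  oc rollout status dc/logging-es-ops-data-master-obwim1kt -n logging
  oc get pods -n logging -l component=es-ops -w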
TASK [openshift_logging_elasticsearch : Delete temp directory] *****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:274
ok: [openshift] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-iwnCe7",
"state": "absent"
}
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:151
statically included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml
TASK [openshift_logging_kibana : fail] *****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:3
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:7
ok: [openshift] => {
"ansible_facts": {
"kibana_version": "3_5"
},
"changed": false
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : fail] *****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:15
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : Create temp directory for doing work in] ******
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:7
ok: [openshift] => {
"changed": false,
"cmd": [
"mktemp",
"-d",
"/tmp/openshift-logging-ansible-XXXXXX"
],
"delta": "0:00:00.002002",
"end": "2017-06-08 12:00:48.014003",
"rc": 0,
"start": "2017-06-08 12:00:48.012001"
}
STDOUT:
/tmp/openshift-logging-ansible-HUyAQM
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:12
ok: [openshift] => {
"ansible_facts": {
"tempdir": "/tmp/openshift-logging-ansible-HUyAQM"
},
"changed": false
}
TASK [openshift_logging_kibana : Create templates subdirectory] ****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:16
ok: [openshift] => {
"changed": false,
"gid": 0,
"group": "root",
"mode": "0755",
"owner": "root",
"path": "/tmp/openshift-logging-ansible-HUyAQM/templates",
"secontext": "unconfined_u:object_r:user_tmp_t:s0",
"size": 6,
"state": "directory",
"uid": 0
}
TASK [openshift_logging_kibana : Create Kibana service account] ****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:26
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : Create Kibana service account] ****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:34
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get sa aggregated-logging-kibana -o json -n logging",
"results": [
{
"apiVersion": "v1",
"imagePullSecrets": [
{
"name": "aggregated-logging-kibana-dockercfg-sx7gd"
}
],
"kind": "ServiceAccount",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:48Z",
"name": "aggregated-logging-kibana",
"namespace": "logging",
"resourceVersion": "1525",
"selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-kibana",
"uid": "a3b2f554-4c63-11e7-94aa-0e1649350dc2"
},
"secrets": [
{
"name": "aggregated-logging-kibana-token-q016z"
},
{
"name": "aggregated-logging-kibana-dockercfg-sx7gd"
}
]
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:42
ok: [openshift] => {
"ansible_facts": {
"kibana_component": "kibana",
"kibana_name": "logging-kibana"
},
"changed": false
}
TASK [openshift_logging_kibana : Retrieving the cert to use when generating secrets for the logging components] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:46
ok: [openshift] => (item={u'name': u'ca_file', u'file': u'ca.crt'}) => {
"changed": false,
"content": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUxTlRrMU1Gb1hEVEl5TURZd056RTFOVGsxTVZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU9relNqcnBVaVRaRytrUUh5UFNtOFFVcnBJTzFMaE1XRUdiZjF5bERXZ00KdDdJVkFKZ05FQXcwRy8wK1pVSFpVQUN1T1M2NEtaZkE1NnhHMTIrL24vYjRtR2Z1QXhjakxBZ2xhNmRadmVqdApEZTdLSnducVBVNXordysrQWRhc3BBS3kyNlJyZCtWVHllbk5hZiswMjdRdDR4bjNETnhLcjFuOXFJcjN3cnZWCnZYZGZVU215RmJjY0N2ZnBpRjZqZE1aL1BhR2VEYXhFelpSUDJrL2liVEIvSTlxTmhESE1xcUxsQXZERmkzZlMKU0JxMTdWbEh4WHJPeENyQm4vNzVyOE9DZVk5a25pTXB1N0NmVFY3Nm85eU81SnlnVWg3TldTQmY5a2w0cHVIMQpLV1htcmVZSWdyR3FxakVFQ2Qxa0cyRVZDMVgvZ1FWZEJHbjBqMkk4d1BjQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFBaE8KbE5uS1RJNjRhWE8yOEVYa1pwcTZtMkdwY2xQTnpzM0dDVGE0RWYvMkJCa3pSd0ttNlgvN2NpZzh1S0lCNDE3Ugp1V1Jia2kvaHBGVGJYakhaRTJDOHEwSEpMVlVJKzdPQkcvS0cyOG5venpPNzMwZ0p4Q1o5NWZZNldsUjV4SURLCkJVeE9xYWhhRWlRTEx6MEtCUDBudUNxNlZCZWRIVEJPY1JGRDJlSXp6RlMvTFZqS2lqZ1MwZFZsb1VSV3JnZGYKdG1LSnhZYlZDOW9sK0MvdzFqYnp1cnFHZ1dRQ24wZGlOZ05HZ3I4TXVYRTZmQ0hMcnJSMVVGUTNXN0R6OUZ2SwpHTGNRcHJEQ3hUbUo0eTk5cy85aUEyUDhDbGQvWXMrVlB6WlB6UmVZcVRkdkVjQXpHb0drdVRiQXhQbmUyUEJICm5wMDM3WkY0UGFwZFdudGdxR3M9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K",
"encoding": "base64",
"item": {
"file": "ca.crt",
"name": "ca_file"
},
"source": "/etc/origin/logging/ca.crt"
}
ok: [openshift] => (item={u'name': u'kibana_internal_key', u'file': u'kibana-internal.key'}) => {
"changed": false,
"content": "LS0tLS1CRUdJTiBSU0EgUFJJVkFURSBLRVktLS0tLQpNSUlFb3dJQkFBS0NBUUVBd0I5Y20zaG1ONENzMzVHL2pLZVBiS3hESk5ZUG1xSTQ2MmJWUXJQb3VvWGduSVpTCis3aHlyMGp6bTJMcDhTRjVCcEduaHYzWkcweXlwYkdIN0ZOVG8zNWRacnMvSEE5YnB5ZWxob1F5V0ZuTEw0U0UKSE1PN1pxNVVsdXBqQmducUk5YUVkK1krRjN5ZC9SQ3FuTFJhdDY1MDlFVzRLa3lFa1pTMitXeFVCYkF1R1pudwpqcTJaZm93emEvUFpvYXVWWDFGeCtJWnZHbE0yWDU1eHNsRnBiWHFMa3poSXlIOFZwMnl0d2ZlcXZkOGFRdFBJCmh1ZWhLUDArMXBqMDJIWVFJUDA3bnVqU3hPK0NMU2lrVVJkMHdWYlZuQUk0SHovcFU3MUNZUThsRHFzTmFpT2YKTzZnd0o5SC9NYVZ5ZVVSamtBTzJWMkFjYVBlY2NxK2lDN2FVV3dJREFRQUJBb0lCQUREaFNjeDhhM1UvbGJ3ago5dG93WDN0RXNLaVVsYysvNmo5cUlHUWlKZG9lNmJDcE5EKzdBK0s0NnRIajdxVmM0TS9kQ3dSN1hWdG12aVVOCjhBa2VnaThjbldMZnpRUzBtNXNCcVVsNkpOejVxNHBoYXNOdXdTVTB3V3pNSVhtTjJEWmFBOFlGbkZLWmNCRE0KeHJ1cjMxRFFZQTB1RjljYk5MZGRZTmhBeVhmUWxEQUo4Z2piWWN4ZmVudERJSU13SjFDSENYSWExdklvUXdabwpTWjdHOUN4VFFsNHBlUFVHbjF4ZjJPaEY2ZUlXM1E3WEYvV1FZME9iQmY1SjhrQ2pjcThDbldnMGxoamlVaFJBCmxlaFg3L1lHbmdjMzJPdjZKd1hzOEJyckM3c3k2bjlsMy9TOEU4a1pJWnRhNDlmSW1pSHZRTEQ0U1YvWEN4UjgKZTdrZi95RUNnWUVBejI2ajgwcDlzOEN1Tk1Ca0J2TzhvdnVQQ1dqa3ViWVgzU21KOWpVRXduR0xPUUFNUUUvdwp0L05BcmhSQU1pSDlKbVAvTmdCWWNFaURpUEExcXdhblNWczJaY3J4SUtQUmQ5dGdDa1cwVHZJVlZqMFBZS29ZCkZBV2QyWkY0OWhPNEV4T0xIYndXVXp4a2NEdUNpc2I0NXY2TEU3dTg0akpLWjVLOG9uUGpNSU1DZ1lFQTdSc1IKZUhMTytrbS9EMnl1VVdaNEd1R08rYmxMeHBKbFZqanlQb1ZPRkxXZ0xtbGFLdkpZTVovcThWWTRwcU5OV0tkWApab05jVEN6T3JuckROUVcwenlYOXE2K2tVck9rYllYNWRBdlpxNGx6ZktQNmkvdlpOWkdkdkVYNUkvdk1pYnVWCmFoY3lxalNHcWwvNVZ6NG45ZUdmNWpDRXN5b2NlUVVNb3BiUkZVa0NnWUEvQkhmZWc3VG9sUkxYaDlOYm9WU2YKbHhqL1hOU1A3dGdWSW5kOVN1SWxTR1ZwYmJCTElYNGFCRmFVRENic2xCTGFST3JWdHdrbkk0Q0NhNmVDUzhVcQpyZ0U2cjRyTnhiYnZXTUEybnJLR2dWa21GK3JDRFNxL2VtMVlHYS9MNG5XN3BlWlBwRUtNQ3Y3Z2NkUFk0VlhnCnAxZ05LSzNiY2pmVWUybS9XTUdlalFLQmdRRGhWalJJVUhROGtoR3VTdzl2OVA1NExaMS8zNFlRZGRreEZIWEUKelZQamdxbDA4bExyTmQ1emF4UVJ3R3Vla3R4VFFOWmphcnd3K1BTRUJjKzNlSERaM2JVemtYMk55NFNkUWhKTgpJMlgvREdaaE1rWk8rMDczQmlqdVlXSGh2TkFxcGNmZVI2V3k5TEIzQXpjb25yM0RoR1kra2lYTFVGNDI5WUdJCmsrU3BZUUtCZ0ZIRkJ4VDVxOW9XbUtBd0VMa1d6Z05XU0NqYlFrSjQ0RVROYjlqeEFsMGlyMHlwVGtXVmNDZDMKSFdxMVlzOEJ5bFdNaWEzcTEvSEhaUEQrNGRBSUs2UitSVEMvK1MvNitpUTBTcm1aU2R5S05rT3RZbjdETlBicAordWxTTk1TV2xuUmtsKy8xWTRpeVNrWVE4Y1plUFdtbVc3RStpYm9jU1RRQjZFS0hVQndrCi0tLS0tRU5EIFJTQSBQUklWQVRFIEtFWS0tLS0tCg==",
"encoding": "base64",
"item": {
"file": "kibana-internal.key",
"name": "kibana_internal_key"
},
"source": "/etc/origin/logging/kibana-internal.key"
}
ok: [openshift] => (item={u'name': u'kibana_internal_cert', u'file': u'kibana-internal.crt'}) => {
"changed": false,
"content": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURUakNDQWphZ0F3SUJBZ0lCQWpBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUxTlRrMU1sb1hEVEU1TURZd09ERTFOVGsxTTFvdwpGakVVTUJJR0ExVUVBeE1MSUd0cFltRnVZUzF2Y0hNd2dnRWlNQTBHQ1NxR1NJYjNEUUVCQVFVQUE0SUJEd0F3CmdnRUtBb0lCQVFEQUgxeWJlR1kzZ0t6ZmtiK01wNDlzckVNazFnK2FvampyWnRWQ3MraTZoZUNjaGxMN3VIS3YKU1BPYll1bnhJWGtHa2FlRy9ka2JUTEtsc1lmc1UxT2pmbDFtdXo4Y0QxdW5KNldHaERKWVdjc3ZoSVFjdzd0bQpybFNXNm1NR0Nlb2oxb1IzNWo0WGZKMzlFS3FjdEZxM3JuVDBSYmdxVElTUmxMYjViRlFGc0M0Wm1mQ09yWmwrCmpETnI4OW1ocTVWZlVYSDRobThhVXpaZm5uR3lVV2x0ZW91VE9FaklmeFduYkszQjk2cTkzeHBDMDhpRzU2RW8KL1Q3V21QVFlkaEFnL1R1ZTZOTEU3NEl0S0tSUkYzVEJWdFdjQWpnZlArbFR2VUpoRHlVT3F3MXFJNTg3cURBbgowZjh4cFhKNVJHT1FBN1pYWUJ4bzk1eHlyNklMdHBSYkFnTUJBQUdqZ1o0d2dac3dEZ1lEVlIwUEFRSC9CQVFECkFnV2dNQk1HQTFVZEpRUU1NQW9HQ0NzR0FRVUZCd01CTUF3R0ExVWRFd0VCL3dRQ01BQXdaZ1lEVlIwUkJGOHcKWFlJTElHdHBZbUZ1WVMxdmNIT0NMQ0JyYVdKaGJtRXRiM0J6TG5KdmRYUmxjaTVrWldaaGRXeDBMbk4yWXk1agpiSFZ6ZEdWeUxteHZZMkZzZ2hnZ2EybGlZVzVoTGpFeU55NHdMakF1TVM1NGFYQXVhVytDQm10cFltRnVZVEFOCkJna3Foa2lHOXcwQkFRc0ZBQU9DQVFFQVZla3NNSVBmSGZhbElUTk5FL2NvYzM3a2lCc2xkMTh6Tzc5cXZOZTIKZnV1WHhjOHd0WmpWcGNkZlAwOVRSS1hSSUhyNC9xSDJUVWpTWTNYajc0STRQQmNobURZMnpwK3pIME1oZTR6ZAo3bjEza0c1aTd5aFFiQlRuMTlQaWNpS3lRdGNidnM2TElGVjF4MVplWHh1WW5MdGRMS0s0MnkzbXlHcmFMRmd4CjYxcVYvbFZGTVpwcGVRQTFSVWFrWGEvNTFNcnlCd2pQY3lnR2ovSjNoc29vcUh0WUVvVVR0R1NjaWRCQyt4aW8KbkplckNRVm1LbTRZaXZIcm9PaFpVYzJmeEVVOGR6RXJkbFZaeWF0TjNxNXFWM0hkTjZYUGQ3dDhhOEx5ckNqbwpibmRMeVJ5bnNHVW1PMFFSZTVrUUdjQkI1UVpMNngyK2Y1cnRSOGFJUVkwaTNRPT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQotLS0tLUJFR0lOIENFUlRJRklDQVRFLS0tLS0KTUlJQzJqQ0NBY0tnQXdJQkFnSUJBVEFOQmdrcWhraUc5dzBCQVFzRkFEQWVNUnd3R2dZRFZRUURFeE5zYjJkbgphVzVuTFhOcFoyNWxjaTEwWlhOME1CNFhEVEUzTURZd09ERTFOVGsxTUZvWERUSXlNRFl3TnpFMU5UazFNVm93CkhqRWNNQm9HQTFVRUF4TVRiRzluWjJsdVp5MXphV2R1WlhJdGRHVnpkRENDQVNJd0RRWUpLb1pJaHZjTkFRRUIKQlFBRGdnRVBBRENDQVFvQ2dnRUJBT2t6U2pycFVpVFpHK2tRSHlQU204UVVycElPMUxoTVdFR2JmMXlsRFdnTQp0N0lWQUpnTkVBdzBHLzArWlVIWlVBQ3VPUzY0S1pmQTU2eEcxMisvbi9iNG1HZnVBeGNqTEFnbGE2ZFp2ZWp0CkRlN0tKd25xUFU1eit3KytBZGFzcEFLeTI2UnJkK1ZUeWVuTmFmKzAyN1F0NHhuM0ROeEtyMW45cUlyM3dydlYKdlhkZlVTbXlGYmNjQ3ZmcGlGNmpkTVovUGFHZURheEV6WlJQMmsvaWJUQi9JOXFOaERITXFxTGxBdkRGaTNmUwpTQnExN1ZsSHhYck94Q3JCbi83NXI4T0NlWTlrbmlNcHU3Q2ZUVjc2bzl5TzVKeWdVaDdOV1NCZjlrbDRwdUgxCktXWG1yZVlJZ3JHcXFqRUVDZDFrRzJFVkMxWC9nUVZkQkduMGoySTh3UGNDQXdFQUFhTWpNQ0V3RGdZRFZSMFAKQVFIL0JBUURBZ0trTUE4R0ExVWRFd0VCL3dRRk1BTUJBZjh3RFFZSktvWklodmNOQVFFTEJRQURnZ0VCQUFoTwpsTm5LVEk2NGFYTzI4RVhrWnBxNm0yR3BjbFBOenMzR0NUYTRFZi8yQkJrelJ3S202WC83Y2lnOHVLSUI0MTdSCnVXUmJraS9ocEZUYlhqSFpFMkM4cTBISkxWVUkrN09CRy9LRzI4bm96ek83MzBnSnhDWjk1Zlk2V2xSNXhJREsKQlV4T3FhaGFFaVFMTHowS0JQMG51Q3E2VkJlZEhUQk9jUkZEMmVJenpGUy9MVmpLaWpnUzBkVmxvVVJXcmdkZgp0bUtKeFliVkM5b2wrQy93MWpienVycUdnV1FDbjBkaU5nTkdncjhNdVhFNmZDSExyclIxVUZRM1c3RHo5RnZLCkdMY1FwckRDeFRtSjR5OTlzLzlpQTJQOENsZC9ZcytWUHpaUHpSZVlxVGR2RWNBekdvR2t1VGJBeFBuZTJQQkgKbnAwMzdaRjRQYXBkV250Z3FHcz0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=",
"encoding": "base64",
"item": {
"file": "kibana-internal.crt",
"name": "kibana_internal_cert"
},
"source": "/etc/origin/logging/kibana-internal.crt"
}
ok: [openshift] => (item={u'name': u'server_tls', u'file': u'server-tls.json'}) => {
"changed": false,
"content": "Ly8gU2VlIGZvciBhdmFpbGFibGUgb3B0aW9uczogaHR0cHM6Ly9ub2RlanMub3JnL2FwaS90bHMuaHRtbCN0bHNfdGxzX2NyZWF0ZXNlcnZlcl9vcHRpb25zX3NlY3VyZWNvbm5lY3Rpb25saXN0ZW5lcgp0bHNfb3B0aW9ucyA9IHsKCWNpcGhlcnM6ICdrRUVDREg6K2tFRUNESCtTSEE6a0VESDora0VESCtTSEE6K2tFREgrQ0FNRUxMSUE6a0VDREg6K2tFQ0RIK1NIQTprUlNBOitrUlNBK1NIQTora1JTQStDQU1FTExJQTohYU5VTEw6IWVOVUxMOiFTU0x2MjohUkM0OiFERVM6IUVYUDohU0VFRDohSURFQTorM0RFUycsCglob25vckNpcGhlck9yZGVyOiB0cnVlCn0K",
"encoding": "base64",
"item": {
"file": "server-tls.json",
"name": "server_tls"
},
"source": "/etc/origin/logging/server-tls.json"
}
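Each item slurped above returns its file as a base64 "content" field (encoding: base64). To examine one locally it can be decoded with base64 -d; here $content is only a placeholder for a value copied out of the log:
  echo "$content" | base64 -d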
TASK [openshift_logging_kibana : Set logging-kibana service] *******************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:57
changed: [openshift] => {
"changed": true,
"results": {
"clusterip": "172.30.151.35",
"cmd": "/bin/oc get service logging-kibana -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "Service",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:50Z",
"name": "logging-kibana",
"namespace": "logging",
"resourceVersion": "1543",
"selfLink": "/api/v1/namespaces/logging/services/logging-kibana",
"uid": "a4df81eb-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"clusterIP": "172.30.151.35",
"ports": [
{
"port": 443,
"protocol": "TCP",
"targetPort": "oaproxy"
}
],
"selector": {
"component": "kibana",
"provider": "openshift"
},
"sessionAffinity": "None",
"type": "ClusterIP"
},
"status": {
"loadBalancer": {}
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:74
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_key | trim | length
> 0 }}
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:79
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_cert | trim | length
> 0 }}
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:84
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_ca | trim | length >
0 }}
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:89
ok: [openshift] => {
"ansible_facts": {
"kibana_ca": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUxTlRrMU1Gb1hEVEl5TURZd056RTFOVGsxTVZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU9relNqcnBVaVRaRytrUUh5UFNtOFFVcnBJTzFMaE1XRUdiZjF5bERXZ00KdDdJVkFKZ05FQXcwRy8wK1pVSFpVQUN1T1M2NEtaZkE1NnhHMTIrL24vYjRtR2Z1QXhjakxBZ2xhNmRadmVqdApEZTdLSnducVBVNXordysrQWRhc3BBS3kyNlJyZCtWVHllbk5hZiswMjdRdDR4bjNETnhLcjFuOXFJcjN3cnZWCnZYZGZVU215RmJjY0N2ZnBpRjZqZE1aL1BhR2VEYXhFelpSUDJrL2liVEIvSTlxTmhESE1xcUxsQXZERmkzZlMKU0JxMTdWbEh4WHJPeENyQm4vNzVyOE9DZVk5a25pTXB1N0NmVFY3Nm85eU81SnlnVWg3TldTQmY5a2w0cHVIMQpLV1htcmVZSWdyR3FxakVFQ2Qxa0cyRVZDMVgvZ1FWZEJHbjBqMkk4d1BjQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFBaE8KbE5uS1RJNjRhWE8yOEVYa1pwcTZtMkdwY2xQTnpzM0dDVGE0RWYvMkJCa3pSd0ttNlgvN2NpZzh1S0lCNDE3Ugp1V1Jia2kvaHBGVGJYakhaRTJDOHEwSEpMVlVJKzdPQkcvS0cyOG5venpPNzMwZ0p4Q1o5NWZZNldsUjV4SURLCkJVeE9xYWhhRWlRTEx6MEtCUDBudUNxNlZCZWRIVEJPY1JGRDJlSXp6RlMvTFZqS2lqZ1MwZFZsb1VSV3JnZGYKdG1LSnhZYlZDOW9sK0MvdzFqYnp1cnFHZ1dRQ24wZGlOZ05HZ3I4TXVYRTZmQ0hMcnJSMVVGUTNXN0R6OUZ2SwpHTGNRcHJEQ3hUbUo0eTk5cy85aUEyUDhDbGQvWXMrVlB6WlB6UmVZcVRkdkVjQXpHb0drdVRiQXhQbmUyUEJICm5wMDM3WkY0UGFwZFdudGdxR3M9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K"
},
"changed": false
}
TASK [openshift_logging_kibana : Generating Kibana route template] *************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:94
ok: [openshift] => {
"changed": false,
"checksum": "500fd9d30c73bb2d73d12b4b409258f8379f9738",
"dest": "/tmp/openshift-logging-ansible-HUyAQM/templates/kibana-route.yaml",
"gid": 0,
"group": "root",
"md5sum": "d52dcb3dd25539481da1bc55dd395963",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 2714,
"src": "/root/.ansible/tmp/ansible-tmp-1496937651.5-52285772071439/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_kibana : Setting Kibana route] *************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:114
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get route logging-kibana -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "Route",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:52Z",
"labels": {
"component": "support",
"logging-infra": "support",
"provider": "openshift"
},
"name": "logging-kibana",
"namespace": "logging",
"resourceVersion": "1549",
"selfLink": "/oapi/v1/namespaces/logging/routes/logging-kibana",
"uid": "a5cdf609-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"host": "kibana.router.default.svc.cluster.local",
"tls": {
"caCertificate": "-----BEGIN CERTIFICATE-----\nMIIC2jCCAcKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAeMRwwGgYDVQQDExNsb2dn\naW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwODE1NTk1MFoXDTIyMDYwNzE1NTk1MVow\nHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDCCASIwDQYJKoZIhvcNAQEB\nBQADggEPADCCAQoCggEBAOkzSjrpUiTZG+kQHyPSm8QUrpIO1LhMWEGbf1ylDWgM\nt7IVAJgNEAw0G/0+ZUHZUACuOS64KZfA56xG12+/n/b4mGfuAxcjLAgla6dZvejt\nDe7KJwnqPU5z+w++AdaspAKy26Rrd+VTyenNaf+027Qt4xn3DNxKr1n9qIr3wrvV\nvXdfUSmyFbccCvfpiF6jdMZ/PaGeDaxEzZRP2k/ibTB/I9qNhDHMqqLlAvDFi3fS\nSBq17VlHxXrOxCrBn/75r8OCeY9kniMpu7CfTV76o9yO5JygUh7NWSBf9kl4puH1\nKWXmreYIgrGqqjEECd1kG2EVC1X/gQVdBGn0j2I8wPcCAwEAAaMjMCEwDgYDVR0P\nAQH/BAQDAgKkMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBAAhO\nlNnKTI64aXO28EXkZpq6m2GpclPNzs3GCTa4Ef/2BBkzRwKm6X/7cig8uKIB417R\nuWRbki/hpFTbXjHZE2C8q0HJLVUI+7OBG/KG28nozzO730gJxCZ95fY6WlR5xIDK\nBUxOqahaEiQLLz0KBP0nuCq6VBedHTBOcRFD2eIzzFS/LVjKijgS0dVloURWrgdf\ntmKJxYbVC9ol+C/w1jbzurqGgWQCn0diNgNGgr8MuXE6fCHLrrR1UFQ3W7Dz9FvK\nGLcQprDCxTmJ4y99s/9iA2P8Cld/Ys+VPzZPzReYqTdvEcAzGoGkuTbAxPne2PBH\nnp037ZF4PapdWntgqGs=\n-----END CERTIFICATE-----\n",
"destinationCACertificate": "-----BEGIN CERTIFICATE-----\nMIIC2jCCAcKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAeMRwwGgYDVQQDExNsb2dn\naW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwODE1NTk1MFoXDTIyMDYwNzE1NTk1MVow\nHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDCCASIwDQYJKoZIhvcNAQEB\nBQADggEPADCCAQoCggEBAOkzSjrpUiTZG+kQHyPSm8QUrpIO1LhMWEGbf1ylDWgM\nt7IVAJgNEAw0G/0+ZUHZUACuOS64KZfA56xG12+/n/b4mGfuAxcjLAgla6dZvejt\nDe7KJwnqPU5z+w++AdaspAKy26Rrd+VTyenNaf+027Qt4xn3DNxKr1n9qIr3wrvV\nvXdfUSmyFbccCvfpiF6jdMZ/PaGeDaxEzZRP2k/ibTB/I9qNhDHMqqLlAvDFi3fS\nSBq17VlHxXrOxCrBn/75r8OCeY9kniMpu7CfTV76o9yO5JygUh7NWSBf9kl4puH1\nKWXmreYIgrGqqjEECd1kG2EVC1X/gQVdBGn0j2I8wPcCAwEAAaMjMCEwDgYDVR0P\nAQH/BAQDAgKkMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBAAhO\nlNnKTI64aXO28EXkZpq6m2GpclPNzs3GCTa4Ef/2BBkzRwKm6X/7cig8uKIB417R\nuWRbki/hpFTbXjHZE2C8q0HJLVUI+7OBG/KG28nozzO730gJxCZ95fY6WlR5xIDK\nBUxOqahaEiQLLz0KBP0nuCq6VBedHTBOcRFD2eIzzFS/LVjKijgS0dVloURWrgdf\ntmKJxYbVC9ol+C/w1jbzurqGgWQCn0diNgNGgr8MuXE6fCHLrrR1UFQ3W7Dz9FvK\nGLcQprDCxTmJ4y99s/9iA2P8Cld/Ys+VPzZPzReYqTdvEcAzGoGkuTbAxPne2PBH\nnp037ZF4PapdWntgqGs=\n-----END CERTIFICATE-----\n",
"insecureEdgeTerminationPolicy": "Redirect",
"termination": "reencrypt"
},
"to": {
"kind": "Service",
"name": "logging-kibana",
"weight": 100
},
"wildcardPolicy": "None"
},
"status": {
"ingress": [
{
"conditions": [
{
"lastTransitionTime": "2017-06-08T16:00:52Z",
"status": "True",
"type": "Admitted"
}
],
"host": "kibana.router.default.svc.cluster.local",
"routerName": "router",
"wildcardPolicy": "None"
}
]
}
}
],
"returncode": 0
},
"state": "present"
}
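The route was admitted with reencrypt termination and insecureEdgeTerminationPolicy: Redirect. A quick, illustrative check of the published host and the TLS endpoint, assuming kibana.router.default.svc.cluster.local is resolvable from wherever the check runs:
  oc get route logging-kibana -n logging -o jsonpath='{.spec.host}'
  curl -k -I https://kibana.router.default.svc.cluster.local/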
TASK [openshift_logging_kibana : Generate proxy session] ***********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:125
ok: [openshift] => {
"ansible_facts": {
"session_secret": "bFiK8fH1xm354qYUS5GyArQHgNU99cEPaQDbhyl37X3WMnviNUJqT3ZStJ9PIP0VezeJqFFw1QweGMwITppuddBwsZduaBhFyLECbdQTlm9BK9poI821L7tC7fCLqv8MS25XB0hF071aFwn4ScOtaXc24ihIcMJj64IqTTdqGzUtmkp9hxk2LErdUlhwceVkx0CPyS14"
},
"changed": false
}
TASK [openshift_logging_kibana : Generate oauth client secret] *****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:132
ok: [openshift] => {
"ansible_facts": {
"oauth_secret": "raNePxvdsJybnJ4nQx2pblKa56CqkEggXPWQlj9Dy4RvlEqC47H15C1GV4dSTGrG"
},
"changed": false
}
TASK [openshift_logging_kibana : Create oauth-client template] *****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:138
changed: [openshift] => {
"changed": true,
"checksum": "740b0b0f77d101b77abe166c612bba1d4db8f1bf",
"dest": "/tmp/openshift-logging-ansible-HUyAQM/templates/oauth-client.yml",
"gid": 0,
"group": "root",
"md5sum": "3dc2607cdc050a63893dca7d09e43ad8",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 328,
"src": "/root/.ansible/tmp/ansible-tmp-1496937652.98-255803401153398/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_kibana : Set kibana-proxy oauth-client] ****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:146
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get oauthclient kibana-proxy -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "OAuthClient",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:54Z",
"labels": {
"logging-infra": "support"
},
"name": "kibana-proxy",
"resourceVersion": "1555",
"selfLink": "/oapi/v1/oauthclients/kibana-proxy",
"uid": "a6ba2ddf-4c63-11e7-94aa-0e1649350dc2"
},
"redirectURIs": [
"https://kibana.router.default.svc.cluster.local"
],
"scopeRestrictions": [
{
"literals": [
"user:info",
"user:check-access",
"user:list-projects"
]
}
],
"secret": "raNePxvdsJybnJ4nQx2pblKa56CqkEggXPWQlj9Dy4RvlEqC47H15C1GV4dSTGrG"
}
],
"returncode": 0
},
"state": "present"
}
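The kibana-proxy OAuthClient's redirectURIs entry must line up with the route host admitted earlier; a simple consistency check (illustrative only):
  oc get oauthclient kibana-proxy -o jsonpath='{.redirectURIs[0]}'
  oc get route logging-kibana -n logging -o jsonpath='{.spec.host}'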
TASK [openshift_logging_kibana : Set Kibana secret] ****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:157
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc secrets new logging-kibana ca=/etc/origin/logging/ca.crt key=/etc/origin/logging/system.logging.kibana.key cert=/etc/origin/logging/system.logging.kibana.crt -n logging",
"results": "",
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_kibana : Set Kibana Proxy secret] **********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:171
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc secrets new logging-kibana-proxy oauth-secret=/tmp/oauth-secret-QUrbsO session-secret=/tmp/session-secret-iai_x9 server-key=/tmp/server-key-iLT_Gx server-cert=/tmp/server-cert-rF44hw server-tls.json=/tmp/server-tls.json-TMcYfG -n logging",
"results": "",
"returncode": 0
},
"state": "present"
}
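Both secrets above are created with "oc secrets new", which later oc clients deprecate in favor of "oc create secret generic". An equivalent sketch for the logging-kibana secret, reusing the certificate paths from the recorded command:
  oc create secret generic logging-kibana -n logging --from-file=ca=/etc/origin/logging/ca.crt --from-file=key=/etc/origin/logging/system.logging.kibana.key --from-file=cert=/etc/origin/logging/system.logging.kibana.crt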
TASK [openshift_logging_kibana : Generate Kibana DC template] ******************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:197
changed: [openshift] => {
"changed": true,
"checksum": "2c7e24b14c3acffccefdfca36c2aa1b83d0945c0",
"dest": "/tmp/openshift-logging-ansible-HUyAQM/templates/kibana-dc.yaml",
"gid": 0,
"group": "root",
"md5sum": "125a52b62125e1ce2d1019ec1f329395",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 3737,
"src": "/root/.ansible/tmp/ansible-tmp-1496937655.81-92033783142500/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_kibana : Set Kibana DC] ********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:216
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get dc logging-kibana -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "DeploymentConfig",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:57Z",
"generation": 2,
"labels": {
"component": "kibana",
"logging-infra": "kibana",
"provider": "openshift"
},
"name": "logging-kibana",
"namespace": "logging",
"resourceVersion": "1570",
"selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-kibana",
"uid": "a885b50b-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"replicas": 1,
"selector": {
"component": "kibana",
"logging-infra": "kibana",
"provider": "openshift"
},
"strategy": {
"activeDeadlineSeconds": 21600,
"resources": {},
"rollingParams": {
"intervalSeconds": 1,
"maxSurge": "25%",
"maxUnavailable": "25%",
"timeoutSeconds": 600,
"updatePeriodSeconds": 1
},
"type": "Rolling"
},
"template": {
"metadata": {
"creationTimestamp": null,
"labels": {
"component": "kibana",
"logging-infra": "kibana",
"provider": "openshift"
},
"name": "logging-kibana"
},
"spec": {
"containers": [
{
"env": [
{
"name": "ES_HOST",
"value": "logging-es"
},
{
"name": "ES_PORT",
"value": "9200"
},
{
"name": "KIBANA_MEMORY_LIMIT",
"valueFrom": {
"resourceFieldRef": {
"containerName": "kibana",
"divisor": "0",
"resource": "limits.memory"
}
}
}
],
"image": "172.30.255.47:5000/logging/logging-kibana:latest",
"imagePullPolicy": "Always",
"name": "kibana",
"readinessProbe": {
"exec": {
"command": [
"/usr/share/kibana/probe/readiness.sh"
]
},
"failureThreshold": 3,
"initialDelaySeconds": 5,
"periodSeconds": 5,
"successThreshold": 1,
"timeoutSeconds": 4
},
"resources": {
"limits": {
"memory": "736Mi"
}
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/etc/kibana/keys",
"name": "kibana",
"readOnly": true
}
]
},
{
"env": [
{
"name": "OAP_BACKEND_URL",
"value": "http://localhost:5601"
},
{
"name": "OAP_AUTH_MODE",
"value": "oauth2"
},
{
"name": "OAP_TRANSFORM",
"value": "user_header,token_header"
},
{
"name": "OAP_OAUTH_ID",
"value": "kibana-proxy"
},
{
"name": "OAP_MASTER_URL",
"value": "https://kubernetes.default.svc.cluster.local"
},
{
"name": "OAP_PUBLIC_MASTER_URL",
"value": "https://172.18.3.237:8443"
},
{
"name": "OAP_LOGOUT_REDIRECT",
"value": "https://172.18.3.237:8443/console/logout"
},
{
"name": "OAP_MASTER_CA_FILE",
"value": "/var/run/secrets/kubernetes.io/serviceaccount/ca.crt"
},
{
"name": "OAP_DEBUG",
"value": "False"
},
{
"name": "OAP_OAUTH_SECRET_FILE",
"value": "/secret/oauth-secret"
},
{
"name": "OAP_SERVER_CERT_FILE",
"value": "/secret/server-cert"
},
{
"name": "OAP_SERVER_KEY_FILE",
"value": "/secret/server-key"
},
{
"name": "OAP_SERVER_TLS_FILE",
"value": "/secret/server-tls.json"
},
{
"name": "OAP_SESSION_SECRET_FILE",
"value": "/secret/session-secret"
},
{
"name": "OCP_AUTH_PROXY_MEMORY_LIMIT",
"valueFrom": {
"resourceFieldRef": {
"containerName": "kibana-proxy",
"divisor": "0",
"resource": "limits.memory"
}
}
}
],
"image": "172.30.255.47:5000/logging/logging-auth-proxy:latest",
"imagePullPolicy": "Always",
"name": "kibana-proxy",
"ports": [
{
"containerPort": 3000,
"name": "oaproxy",
"protocol": "TCP"
}
],
"resources": {
"limits": {
"memory": "96Mi"
}
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/secret",
"name": "kibana-proxy",
"readOnly": true
}
]
}
],
"dnsPolicy": "ClusterFirst",
"restartPolicy": "Always",
"schedulerName": "default-scheduler",
"securityContext": {},
"serviceAccount": "aggregated-logging-kibana",
"serviceAccountName": "aggregated-logging-kibana",
"terminationGracePeriodSeconds": 30,
"volumes": [
{
"name": "kibana",
"secret": {
"defaultMode": 420,
"secretName": "logging-kibana"
}
},
{
"name": "kibana-proxy",
"secret": {
"defaultMode": 420,
"secretName": "logging-kibana-proxy"
}
}
]
}
},
"test": false,
"triggers": [
{
"type": "ConfigChange"
}
]
},
"status": {
"availableReplicas": 0,
"conditions": [
{
"lastTransitionTime": "2017-06-08T16:00:57Z",
"lastUpdateTime": "2017-06-08T16:00:57Z",
"message": "Deployment config does not have minimum availability.",
"status": "False",
"type": "Available"
},
{
"lastTransitionTime": "2017-06-08T16:00:57Z",
"lastUpdateTime": "2017-06-08T16:00:57Z",
"message": "replication controller \"logging-kibana-1\" is waiting for pod \"logging-kibana-1-deploy\" to run",
"status": "Unknown",
"type": "Progressing"
}
],
"details": {
"causes": [
{
"type": "ConfigChange"
}
],
"message": "config change"
},
"latestVersion": 1,
"observedGeneration": 2,
"replicas": 0,
"unavailableReplicas": 0,
"updatedReplicas": 0
}
}
],
"returncode": 0
},
"state": "present"
}
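The Kibana DeploymentConfig runs two containers, kibana and kibana-proxy, and like the Elasticsearch dc it starts out with Available=False until its first deployment completes. An illustrative check that both containers come up, using the labels from this run:
  oc rollout status dc/logging-kibana -n logging
  oc get pods -n logging -l component=kibana -o jsonpath='{.items[*].spec.containers[*].name}'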
TASK [openshift_logging_kibana : Delete temp directory] ************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:228
ok: [openshift] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-HUyAQM",
"state": "absent"
}
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:166
statically included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml
TASK [openshift_logging_kibana : fail] *****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:3
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:7
ok: [openshift] => {
"ansible_facts": {
"kibana_version": "3_5"
},
"changed": false
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : fail] *****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:15
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : Create temp directory for doing work in] ******
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:7
ok: [openshift] => {
"changed": false,
"cmd": [
"mktemp",
"-d",
"/tmp/openshift-logging-ansible-XXXXXX"
],
"delta": "0:00:00.004224",
"end": "2017-06-08 12:00:59.209370",
"rc": 0,
"start": "2017-06-08 12:00:59.205146"
}
STDOUT:
/tmp/openshift-logging-ansible-7vUEC3
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:12
ok: [openshift] => {
"ansible_facts": {
"tempdir": "/tmp/openshift-logging-ansible-7vUEC3"
},
"changed": false
}
TASK [openshift_logging_kibana : Create templates subdirectory] ****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:16
ok: [openshift] => {
"changed": false,
"gid": 0,
"group": "root",
"mode": "0755",
"owner": "root",
"path": "/tmp/openshift-logging-ansible-7vUEC3/templates",
"secontext": "unconfined_u:object_r:user_tmp_t:s0",
"size": 6,
"state": "directory",
"uid": 0
}
TASK [openshift_logging_kibana : Create Kibana service account] ****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:26
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : Create Kibana service account] ****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:34
ok: [openshift] => {
"changed": false,
"results": {
"cmd": "/bin/oc get sa aggregated-logging-kibana -o json -n logging",
"results": [
{
"apiVersion": "v1",
"imagePullSecrets": [
{
"name": "aggregated-logging-kibana-dockercfg-sx7gd"
}
],
"kind": "ServiceAccount",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:48Z",
"name": "aggregated-logging-kibana",
"namespace": "logging",
"resourceVersion": "1525",
"selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-kibana",
"uid": "a3b2f554-4c63-11e7-94aa-0e1649350dc2"
},
"secrets": [
{
"name": "aggregated-logging-kibana-token-q016z"
},
{
"name": "aggregated-logging-kibana-dockercfg-sx7gd"
}
]
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:42
ok: [openshift] => {
"ansible_facts": {
"kibana_component": "kibana-ops",
"kibana_name": "logging-kibana-ops"
},
"changed": false
}
TASK [openshift_logging_kibana : Retrieving the cert to use when generating secrets for the logging components] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:46
ok: [openshift] => (item={u'name': u'ca_file', u'file': u'ca.crt'}) => {
"changed": false,
"content": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUxTlRrMU1Gb1hEVEl5TURZd056RTFOVGsxTVZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU9relNqcnBVaVRaRytrUUh5UFNtOFFVcnBJTzFMaE1XRUdiZjF5bERXZ00KdDdJVkFKZ05FQXcwRy8wK1pVSFpVQUN1T1M2NEtaZkE1NnhHMTIrL24vYjRtR2Z1QXhjakxBZ2xhNmRadmVqdApEZTdLSnducVBVNXordysrQWRhc3BBS3kyNlJyZCtWVHllbk5hZiswMjdRdDR4bjNETnhLcjFuOXFJcjN3cnZWCnZYZGZVU215RmJjY0N2ZnBpRjZqZE1aL1BhR2VEYXhFelpSUDJrL2liVEIvSTlxTmhESE1xcUxsQXZERmkzZlMKU0JxMTdWbEh4WHJPeENyQm4vNzVyOE9DZVk5a25pTXB1N0NmVFY3Nm85eU81SnlnVWg3TldTQmY5a2w0cHVIMQpLV1htcmVZSWdyR3FxakVFQ2Qxa0cyRVZDMVgvZ1FWZEJHbjBqMkk4d1BjQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFBaE8KbE5uS1RJNjRhWE8yOEVYa1pwcTZtMkdwY2xQTnpzM0dDVGE0RWYvMkJCa3pSd0ttNlgvN2NpZzh1S0lCNDE3Ugp1V1Jia2kvaHBGVGJYakhaRTJDOHEwSEpMVlVJKzdPQkcvS0cyOG5venpPNzMwZ0p4Q1o5NWZZNldsUjV4SURLCkJVeE9xYWhhRWlRTEx6MEtCUDBudUNxNlZCZWRIVEJPY1JGRDJlSXp6RlMvTFZqS2lqZ1MwZFZsb1VSV3JnZGYKdG1LSnhZYlZDOW9sK0MvdzFqYnp1cnFHZ1dRQ24wZGlOZ05HZ3I4TXVYRTZmQ0hMcnJSMVVGUTNXN0R6OUZ2SwpHTGNRcHJEQ3hUbUo0eTk5cy85aUEyUDhDbGQvWXMrVlB6WlB6UmVZcVRkdkVjQXpHb0drdVRiQXhQbmUyUEJICm5wMDM3WkY0UGFwZFdudGdxR3M9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K",
"encoding": "base64",
"item": {
"file": "ca.crt",
"name": "ca_file"
},
"source": "/etc/origin/logging/ca.crt"
}
ok: [openshift] => (item={u'name': u'kibana_internal_key', u'file': u'kibana-internal.key'}) => {
"changed": false,
"content": "LS0tLS1CRUdJTiBSU0EgUFJJVkFURSBLRVktLS0tLQpNSUlFb3dJQkFBS0NBUUVBd0I5Y20zaG1ONENzMzVHL2pLZVBiS3hESk5ZUG1xSTQ2MmJWUXJQb3VvWGduSVpTCis3aHlyMGp6bTJMcDhTRjVCcEduaHYzWkcweXlwYkdIN0ZOVG8zNWRacnMvSEE5YnB5ZWxob1F5V0ZuTEw0U0UKSE1PN1pxNVVsdXBqQmducUk5YUVkK1krRjN5ZC9SQ3FuTFJhdDY1MDlFVzRLa3lFa1pTMitXeFVCYkF1R1pudwpqcTJaZm93emEvUFpvYXVWWDFGeCtJWnZHbE0yWDU1eHNsRnBiWHFMa3poSXlIOFZwMnl0d2ZlcXZkOGFRdFBJCmh1ZWhLUDArMXBqMDJIWVFJUDA3bnVqU3hPK0NMU2lrVVJkMHdWYlZuQUk0SHovcFU3MUNZUThsRHFzTmFpT2YKTzZnd0o5SC9NYVZ5ZVVSamtBTzJWMkFjYVBlY2NxK2lDN2FVV3dJREFRQUJBb0lCQUREaFNjeDhhM1UvbGJ3ago5dG93WDN0RXNLaVVsYysvNmo5cUlHUWlKZG9lNmJDcE5EKzdBK0s0NnRIajdxVmM0TS9kQ3dSN1hWdG12aVVOCjhBa2VnaThjbldMZnpRUzBtNXNCcVVsNkpOejVxNHBoYXNOdXdTVTB3V3pNSVhtTjJEWmFBOFlGbkZLWmNCRE0KeHJ1cjMxRFFZQTB1RjljYk5MZGRZTmhBeVhmUWxEQUo4Z2piWWN4ZmVudERJSU13SjFDSENYSWExdklvUXdabwpTWjdHOUN4VFFsNHBlUFVHbjF4ZjJPaEY2ZUlXM1E3WEYvV1FZME9iQmY1SjhrQ2pjcThDbldnMGxoamlVaFJBCmxlaFg3L1lHbmdjMzJPdjZKd1hzOEJyckM3c3k2bjlsMy9TOEU4a1pJWnRhNDlmSW1pSHZRTEQ0U1YvWEN4UjgKZTdrZi95RUNnWUVBejI2ajgwcDlzOEN1Tk1Ca0J2TzhvdnVQQ1dqa3ViWVgzU21KOWpVRXduR0xPUUFNUUUvdwp0L05BcmhSQU1pSDlKbVAvTmdCWWNFaURpUEExcXdhblNWczJaY3J4SUtQUmQ5dGdDa1cwVHZJVlZqMFBZS29ZCkZBV2QyWkY0OWhPNEV4T0xIYndXVXp4a2NEdUNpc2I0NXY2TEU3dTg0akpLWjVLOG9uUGpNSU1DZ1lFQTdSc1IKZUhMTytrbS9EMnl1VVdaNEd1R08rYmxMeHBKbFZqanlQb1ZPRkxXZ0xtbGFLdkpZTVovcThWWTRwcU5OV0tkWApab05jVEN6T3JuckROUVcwenlYOXE2K2tVck9rYllYNWRBdlpxNGx6ZktQNmkvdlpOWkdkdkVYNUkvdk1pYnVWCmFoY3lxalNHcWwvNVZ6NG45ZUdmNWpDRXN5b2NlUVVNb3BiUkZVa0NnWUEvQkhmZWc3VG9sUkxYaDlOYm9WU2YKbHhqL1hOU1A3dGdWSW5kOVN1SWxTR1ZwYmJCTElYNGFCRmFVRENic2xCTGFST3JWdHdrbkk0Q0NhNmVDUzhVcQpyZ0U2cjRyTnhiYnZXTUEybnJLR2dWa21GK3JDRFNxL2VtMVlHYS9MNG5XN3BlWlBwRUtNQ3Y3Z2NkUFk0VlhnCnAxZ05LSzNiY2pmVWUybS9XTUdlalFLQmdRRGhWalJJVUhROGtoR3VTdzl2OVA1NExaMS8zNFlRZGRreEZIWEUKelZQamdxbDA4bExyTmQ1emF4UVJ3R3Vla3R4VFFOWmphcnd3K1BTRUJjKzNlSERaM2JVemtYMk55NFNkUWhKTgpJMlgvREdaaE1rWk8rMDczQmlqdVlXSGh2TkFxcGNmZVI2V3k5TEIzQXpjb25yM0RoR1kra2lYTFVGNDI5WUdJCmsrU3BZUUtCZ0ZIRkJ4VDVxOW9XbUtBd0VMa1d6Z05XU0NqYlFrSjQ0RVROYjlqeEFsMGlyMHlwVGtXVmNDZDMKSFdxMVlzOEJ5bFdNaWEzcTEvSEhaUEQrNGRBSUs2UitSVEMvK1MvNitpUTBTcm1aU2R5S05rT3RZbjdETlBicAordWxTTk1TV2xuUmtsKy8xWTRpeVNrWVE4Y1plUFdtbVc3RStpYm9jU1RRQjZFS0hVQndrCi0tLS0tRU5EIFJTQSBQUklWQVRFIEtFWS0tLS0tCg==",
"encoding": "base64",
"item": {
"file": "kibana-internal.key",
"name": "kibana_internal_key"
},
"source": "/etc/origin/logging/kibana-internal.key"
}
ok: [openshift] => (item={u'name': u'kibana_internal_cert', u'file': u'kibana-internal.crt'}) => {
"changed": false,
"content": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURUakNDQWphZ0F3SUJBZ0lCQWpBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUxTlRrMU1sb1hEVEU1TURZd09ERTFOVGsxTTFvdwpGakVVTUJJR0ExVUVBeE1MSUd0cFltRnVZUzF2Y0hNd2dnRWlNQTBHQ1NxR1NJYjNEUUVCQVFVQUE0SUJEd0F3CmdnRUtBb0lCQVFEQUgxeWJlR1kzZ0t6ZmtiK01wNDlzckVNazFnK2FvampyWnRWQ3MraTZoZUNjaGxMN3VIS3YKU1BPYll1bnhJWGtHa2FlRy9ka2JUTEtsc1lmc1UxT2pmbDFtdXo4Y0QxdW5KNldHaERKWVdjc3ZoSVFjdzd0bQpybFNXNm1NR0Nlb2oxb1IzNWo0WGZKMzlFS3FjdEZxM3JuVDBSYmdxVElTUmxMYjViRlFGc0M0Wm1mQ09yWmwrCmpETnI4OW1ocTVWZlVYSDRobThhVXpaZm5uR3lVV2x0ZW91VE9FaklmeFduYkszQjk2cTkzeHBDMDhpRzU2RW8KL1Q3V21QVFlkaEFnL1R1ZTZOTEU3NEl0S0tSUkYzVEJWdFdjQWpnZlArbFR2VUpoRHlVT3F3MXFJNTg3cURBbgowZjh4cFhKNVJHT1FBN1pYWUJ4bzk1eHlyNklMdHBSYkFnTUJBQUdqZ1o0d2dac3dEZ1lEVlIwUEFRSC9CQVFECkFnV2dNQk1HQTFVZEpRUU1NQW9HQ0NzR0FRVUZCd01CTUF3R0ExVWRFd0VCL3dRQ01BQXdaZ1lEVlIwUkJGOHcKWFlJTElHdHBZbUZ1WVMxdmNIT0NMQ0JyYVdKaGJtRXRiM0J6TG5KdmRYUmxjaTVrWldaaGRXeDBMbk4yWXk1agpiSFZ6ZEdWeUxteHZZMkZzZ2hnZ2EybGlZVzVoTGpFeU55NHdMakF1TVM1NGFYQXVhVytDQm10cFltRnVZVEFOCkJna3Foa2lHOXcwQkFRc0ZBQU9DQVFFQVZla3NNSVBmSGZhbElUTk5FL2NvYzM3a2lCc2xkMTh6Tzc5cXZOZTIKZnV1WHhjOHd0WmpWcGNkZlAwOVRSS1hSSUhyNC9xSDJUVWpTWTNYajc0STRQQmNobURZMnpwK3pIME1oZTR6ZAo3bjEza0c1aTd5aFFiQlRuMTlQaWNpS3lRdGNidnM2TElGVjF4MVplWHh1WW5MdGRMS0s0MnkzbXlHcmFMRmd4CjYxcVYvbFZGTVpwcGVRQTFSVWFrWGEvNTFNcnlCd2pQY3lnR2ovSjNoc29vcUh0WUVvVVR0R1NjaWRCQyt4aW8KbkplckNRVm1LbTRZaXZIcm9PaFpVYzJmeEVVOGR6RXJkbFZaeWF0TjNxNXFWM0hkTjZYUGQ3dDhhOEx5ckNqbwpibmRMeVJ5bnNHVW1PMFFSZTVrUUdjQkI1UVpMNngyK2Y1cnRSOGFJUVkwaTNRPT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQotLS0tLUJFR0lOIENFUlRJRklDQVRFLS0tLS0KTUlJQzJqQ0NBY0tnQXdJQkFnSUJBVEFOQmdrcWhraUc5dzBCQVFzRkFEQWVNUnd3R2dZRFZRUURFeE5zYjJkbgphVzVuTFhOcFoyNWxjaTEwWlhOME1CNFhEVEUzTURZd09ERTFOVGsxTUZvWERUSXlNRFl3TnpFMU5UazFNVm93CkhqRWNNQm9HQTFVRUF4TVRiRzluWjJsdVp5MXphV2R1WlhJdGRHVnpkRENDQVNJd0RRWUpLb1pJaHZjTkFRRUIKQlFBRGdnRVBBRENDQVFvQ2dnRUJBT2t6U2pycFVpVFpHK2tRSHlQU204UVVycElPMUxoTVdFR2JmMXlsRFdnTQp0N0lWQUpnTkVBdzBHLzArWlVIWlVBQ3VPUzY0S1pmQTU2eEcxMisvbi9iNG1HZnVBeGNqTEFnbGE2ZFp2ZWp0CkRlN0tKd25xUFU1eit3KytBZGFzcEFLeTI2UnJkK1ZUeWVuTmFmKzAyN1F0NHhuM0ROeEtyMW45cUlyM3dydlYKdlhkZlVTbXlGYmNjQ3ZmcGlGNmpkTVovUGFHZURheEV6WlJQMmsvaWJUQi9JOXFOaERITXFxTGxBdkRGaTNmUwpTQnExN1ZsSHhYck94Q3JCbi83NXI4T0NlWTlrbmlNcHU3Q2ZUVjc2bzl5TzVKeWdVaDdOV1NCZjlrbDRwdUgxCktXWG1yZVlJZ3JHcXFqRUVDZDFrRzJFVkMxWC9nUVZkQkduMGoySTh3UGNDQXdFQUFhTWpNQ0V3RGdZRFZSMFAKQVFIL0JBUURBZ0trTUE4R0ExVWRFd0VCL3dRRk1BTUJBZjh3RFFZSktvWklodmNOQVFFTEJRQURnZ0VCQUFoTwpsTm5LVEk2NGFYTzI4RVhrWnBxNm0yR3BjbFBOenMzR0NUYTRFZi8yQkJrelJ3S202WC83Y2lnOHVLSUI0MTdSCnVXUmJraS9ocEZUYlhqSFpFMkM4cTBISkxWVUkrN09CRy9LRzI4bm96ek83MzBnSnhDWjk1Zlk2V2xSNXhJREsKQlV4T3FhaGFFaVFMTHowS0JQMG51Q3E2VkJlZEhUQk9jUkZEMmVJenpGUy9MVmpLaWpnUzBkVmxvVVJXcmdkZgp0bUtKeFliVkM5b2wrQy93MWpienVycUdnV1FDbjBkaU5nTkdncjhNdVhFNmZDSExyclIxVUZRM1c3RHo5RnZLCkdMY1FwckRDeFRtSjR5OTlzLzlpQTJQOENsZC9ZcytWUHpaUHpSZVlxVGR2RWNBekdvR2t1VGJBeFBuZTJQQkgKbnAwMzdaRjRQYXBkV250Z3FHcz0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=",
"encoding": "base64",
"item": {
"file": "kibana-internal.crt",
"name": "kibana_internal_cert"
},
"source": "/etc/origin/logging/kibana-internal.crt"
}
ok: [openshift] => (item={u'name': u'server_tls', u'file': u'server-tls.json'}) => {
"changed": false,
"content": "Ly8gU2VlIGZvciBhdmFpbGFibGUgb3B0aW9uczogaHR0cHM6Ly9ub2RlanMub3JnL2FwaS90bHMuaHRtbCN0bHNfdGxzX2NyZWF0ZXNlcnZlcl9vcHRpb25zX3NlY3VyZWNvbm5lY3Rpb25saXN0ZW5lcgp0bHNfb3B0aW9ucyA9IHsKCWNpcGhlcnM6ICdrRUVDREg6K2tFRUNESCtTSEE6a0VESDora0VESCtTSEE6K2tFREgrQ0FNRUxMSUE6a0VDREg6K2tFQ0RIK1NIQTprUlNBOitrUlNBK1NIQTora1JTQStDQU1FTExJQTohYU5VTEw6IWVOVUxMOiFTU0x2MjohUkM0OiFERVM6IUVYUDohU0VFRDohSURFQTorM0RFUycsCglob25vckNpcGhlck9yZGVyOiB0cnVlCn0K",
"encoding": "base64",
"item": {
"file": "server-tls.json",
"name": "server_tls"
},
"source": "/etc/origin/logging/server-tls.json"
}
TASK [openshift_logging_kibana : Set logging-kibana-ops service] ***************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:57
changed: [openshift] => {
"changed": true,
"results": {
"clusterip": "172.30.27.219",
"cmd": "/bin/oc get service logging-kibana-ops -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "Service",
"metadata": {
"creationTimestamp": "2017-06-08T16:01:02Z",
"name": "logging-kibana-ops",
"namespace": "logging",
"resourceVersion": "1601",
"selfLink": "/api/v1/namespaces/logging/services/logging-kibana-ops",
"uid": "abe9555f-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"clusterIP": "172.30.27.219",
"ports": [
{
"port": 443,
"protocol": "TCP",
"targetPort": "oaproxy"
}
],
"selector": {
"component": "kibana-ops",
"provider": "openshift"
},
"sessionAffinity": "None",
"type": "ClusterIP"
},
"status": {
"loadBalancer": {}
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:74
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_key | trim | length
> 0 }}
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
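The [WARNING] above (and the matching ones for openshift_logging_kibana_cert and openshift_logging_kibana_ca in the next two tasks) comes from Ansible itself: a when: clause is already evaluated as a Jinja2 expression, so wrapping it in {{ }} is redundant and can lead to double-templating, which is why newer Ansible versions flag it. A minimal sketch of the corrected conditional, assuming only the variable named in the warning (the rest of the task body is elided and purely illustrative):

    # Illustrative fix only - drop the {{ }} delimiters from the when: expression
    - set_fact:
        kibana_key: "..."   # actual value elided
      when: openshift_logging_kibana_key | trim | length > 0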
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:79
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_cert | trim | length
> 0 }}
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:84
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_ca | trim | length >
0 }}
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:89
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : Generating Kibana route template] *************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:94
ok: [openshift] => {
"changed": false,
"checksum": "81d91cb23f49736a59b4e35ceff52ac59387f178",
"dest": "/tmp/openshift-logging-ansible-7vUEC3/templates/kibana-route.yaml",
"gid": 0,
"group": "root",
"md5sum": "3b5f21c8d154735dfc0edc061bec7979",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 2726,
"src": "/root/.ansible/tmp/ansible-tmp-1496937663.63-174574035422648/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_kibana : Setting Kibana route] *************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:114
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get route logging-kibana-ops -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "Route",
"metadata": {
"creationTimestamp": "2017-06-08T16:01:04Z",
"labels": {
"component": "support",
"logging-infra": "support",
"provider": "openshift"
},
"name": "logging-kibana-ops",
"namespace": "logging",
"resourceVersion": "1605",
"selfLink": "/oapi/v1/namespaces/logging/routes/logging-kibana-ops",
"uid": "ad2f58c1-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"host": "kibana-ops.router.default.svc.cluster.local",
"tls": {
"caCertificate": "-----BEGIN CERTIFICATE-----\nMIIC2jCCAcKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAeMRwwGgYDVQQDExNsb2dn\naW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwODE1NTk1MFoXDTIyMDYwNzE1NTk1MVow\nHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDCCASIwDQYJKoZIhvcNAQEB\nBQADggEPADCCAQoCggEBAOkzSjrpUiTZG+kQHyPSm8QUrpIO1LhMWEGbf1ylDWgM\nt7IVAJgNEAw0G/0+ZUHZUACuOS64KZfA56xG12+/n/b4mGfuAxcjLAgla6dZvejt\nDe7KJwnqPU5z+w++AdaspAKy26Rrd+VTyenNaf+027Qt4xn3DNxKr1n9qIr3wrvV\nvXdfUSmyFbccCvfpiF6jdMZ/PaGeDaxEzZRP2k/ibTB/I9qNhDHMqqLlAvDFi3fS\nSBq17VlHxXrOxCrBn/75r8OCeY9kniMpu7CfTV76o9yO5JygUh7NWSBf9kl4puH1\nKWXmreYIgrGqqjEECd1kG2EVC1X/gQVdBGn0j2I8wPcCAwEAAaMjMCEwDgYDVR0P\nAQH/BAQDAgKkMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBAAhO\nlNnKTI64aXO28EXkZpq6m2GpclPNzs3GCTa4Ef/2BBkzRwKm6X/7cig8uKIB417R\nuWRbki/hpFTbXjHZE2C8q0HJLVUI+7OBG/KG28nozzO730gJxCZ95fY6WlR5xIDK\nBUxOqahaEiQLLz0KBP0nuCq6VBedHTBOcRFD2eIzzFS/LVjKijgS0dVloURWrgdf\ntmKJxYbVC9ol+C/w1jbzurqGgWQCn0diNgNGgr8MuXE6fCHLrrR1UFQ3W7Dz9FvK\nGLcQprDCxTmJ4y99s/9iA2P8Cld/Ys+VPzZPzReYqTdvEcAzGoGkuTbAxPne2PBH\nnp037ZF4PapdWntgqGs=\n-----END CERTIFICATE-----\n",
"destinationCACertificate": "-----BEGIN CERTIFICATE-----\nMIIC2jCCAcKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAeMRwwGgYDVQQDExNsb2dn\naW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwODE1NTk1MFoXDTIyMDYwNzE1NTk1MVow\nHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDCCASIwDQYJKoZIhvcNAQEB\nBQADggEPADCCAQoCggEBAOkzSjrpUiTZG+kQHyPSm8QUrpIO1LhMWEGbf1ylDWgM\nt7IVAJgNEAw0G/0+ZUHZUACuOS64KZfA56xG12+/n/b4mGfuAxcjLAgla6dZvejt\nDe7KJwnqPU5z+w++AdaspAKy26Rrd+VTyenNaf+027Qt4xn3DNxKr1n9qIr3wrvV\nvXdfUSmyFbccCvfpiF6jdMZ/PaGeDaxEzZRP2k/ibTB/I9qNhDHMqqLlAvDFi3fS\nSBq17VlHxXrOxCrBn/75r8OCeY9kniMpu7CfTV76o9yO5JygUh7NWSBf9kl4puH1\nKWXmreYIgrGqqjEECd1kG2EVC1X/gQVdBGn0j2I8wPcCAwEAAaMjMCEwDgYDVR0P\nAQH/BAQDAgKkMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBAAhO\nlNnKTI64aXO28EXkZpq6m2GpclPNzs3GCTa4Ef/2BBkzRwKm6X/7cig8uKIB417R\nuWRbki/hpFTbXjHZE2C8q0HJLVUI+7OBG/KG28nozzO730gJxCZ95fY6WlR5xIDK\nBUxOqahaEiQLLz0KBP0nuCq6VBedHTBOcRFD2eIzzFS/LVjKijgS0dVloURWrgdf\ntmKJxYbVC9ol+C/w1jbzurqGgWQCn0diNgNGgr8MuXE6fCHLrrR1UFQ3W7Dz9FvK\nGLcQprDCxTmJ4y99s/9iA2P8Cld/Ys+VPzZPzReYqTdvEcAzGoGkuTbAxPne2PBH\nnp037ZF4PapdWntgqGs=\n-----END CERTIFICATE-----\n",
"insecureEdgeTerminationPolicy": "Redirect",
"termination": "reencrypt"
},
"to": {
"kind": "Service",
"name": "logging-kibana-ops",
"weight": 100
},
"wildcardPolicy": "None"
},
"status": {
"ingress": [
{
"conditions": [
{
"lastTransitionTime": "2017-06-08T16:01:04Z",
"status": "True",
"type": "Admitted"
}
],
"host": "kibana-ops.router.default.svc.cluster.local",
"routerName": "router",
"wildcardPolicy": "None"
}
]
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_kibana : Generate proxy session] ***********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:125
ok: [openshift] => {
"ansible_facts": {
"session_secret": "v3vq4cjwi4UAStY0PtEfneh2uHZ30gleCbNmByaCGDLXmIf0MwgymMybgDiPDJkLlrbar5XijzIA3jSyVL7saGaN6sz10QMSnaHsWRHiZXohkeSKMBgxXkD7GucqxmFLnfikQonsIDtun0G61TehLdHaALhGEoT1kblFJDekl9v3tHJ5G1rXG4cmKPfRQI0loBnEu7QF"
},
"changed": false
}
TASK [openshift_logging_kibana : Generate oauth client secret] *****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:132
ok: [openshift] => {
"ansible_facts": {
"oauth_secret": "WrPK47rl8TwdDyaEglgNKBeVgCo9b6qs1pCGezBpUDdEkPOfancf2FyNDFrzZpmz"
},
"changed": false
}
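Both values above are random strings generated during the play (200 and 64 characters respectively). As a rough sketch of how such secrets can be produced with Ansible's password lookup - illustrative only, not necessarily the mechanism this role actually uses:

    # Illustrative only: generate throwaway random secrets of the observed lengths
    - set_fact:
        session_secret: "{{ lookup('password', '/dev/null length=200 chars=ascii_letters,digits') }}"
        oauth_secret: "{{ lookup('password', '/dev/null length=64 chars=ascii_letters,digits') }}"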
TASK [openshift_logging_kibana : Create oauth-client template] *****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:138
changed: [openshift] => {
"changed": true,
"checksum": "9e92537f04adfdeda260a24bd81d14af4bee0703",
"dest": "/tmp/openshift-logging-ansible-7vUEC3/templates/oauth-client.yml",
"gid": 0,
"group": "root",
"md5sum": "7fb80f5dc90ee3c9cfab6cacf5a0f7fa",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 332,
"src": "/root/.ansible/tmp/ansible-tmp-1496937665.32-29059572341737/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_kibana : Set kibana-proxy oauth-client] ****************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:146
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get oauthclient kibana-proxy -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "OAuthClient",
"metadata": {
"creationTimestamp": "2017-06-08T16:00:54Z",
"labels": {
"logging-infra": "support"
},
"name": "kibana-proxy",
"resourceVersion": "1609",
"selfLink": "/oapi/v1/oauthclients/kibana-proxy",
"uid": "a6ba2ddf-4c63-11e7-94aa-0e1649350dc2"
},
"redirectURIs": [
"https://kibana-ops.router.default.svc.cluster.local"
],
"scopeRestrictions": [
{
"literals": [
"user:info",
"user:check-access",
"user:list-projects"
]
}
],
"secret": "WrPK47rl8TwdDyaEglgNKBeVgCo9b6qs1pCGezBpUDdEkPOfancf2FyNDFrzZpmz"
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_kibana : Set Kibana secret] ****************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:157
ok: [openshift] => {
"changed": false,
"results": {
"apiVersion": "v1",
"data": {
"ca": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUxTlRrMU1Gb1hEVEl5TURZd056RTFOVGsxTVZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU9relNqcnBVaVRaRytrUUh5UFNtOFFVcnBJTzFMaE1XRUdiZjF5bERXZ00KdDdJVkFKZ05FQXcwRy8wK1pVSFpVQUN1T1M2NEtaZkE1NnhHMTIrL24vYjRtR2Z1QXhjakxBZ2xhNmRadmVqdApEZTdLSnducVBVNXordysrQWRhc3BBS3kyNlJyZCtWVHllbk5hZiswMjdRdDR4bjNETnhLcjFuOXFJcjN3cnZWCnZYZGZVU215RmJjY0N2ZnBpRjZqZE1aL1BhR2VEYXhFelpSUDJrL2liVEIvSTlxTmhESE1xcUxsQXZERmkzZlMKU0JxMTdWbEh4WHJPeENyQm4vNzVyOE9DZVk5a25pTXB1N0NmVFY3Nm85eU81SnlnVWg3TldTQmY5a2w0cHVIMQpLV1htcmVZSWdyR3FxakVFQ2Qxa0cyRVZDMVgvZ1FWZEJHbjBqMkk4d1BjQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFBaE8KbE5uS1RJNjRhWE8yOEVYa1pwcTZtMkdwY2xQTnpzM0dDVGE0RWYvMkJCa3pSd0ttNlgvN2NpZzh1S0lCNDE3Ugp1V1Jia2kvaHBGVGJYakhaRTJDOHEwSEpMVlVJKzdPQkcvS0cyOG5venpPNzMwZ0p4Q1o5NWZZNldsUjV4SURLCkJVeE9xYWhhRWlRTEx6MEtCUDBudUNxNlZCZWRIVEJPY1JGRDJlSXp6RlMvTFZqS2lqZ1MwZFZsb1VSV3JnZGYKdG1LSnhZYlZDOW9sK0MvdzFqYnp1cnFHZ1dRQ24wZGlOZ05HZ3I4TXVYRTZmQ0hMcnJSMVVGUTNXN0R6OUZ2SwpHTGNRcHJEQ3hUbUo0eTk5cy85aUEyUDhDbGQvWXMrVlB6WlB6UmVZcVRkdkVjQXpHb0drdVRiQXhQbmUyUEJICm5wMDM3WkY0UGFwZFdudGdxR3M9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K",
"cert": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURSVENDQWkyZ0F3SUJBZ0lCQXpBTkJna3Foa2lHOXcwQkFRVUZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUxTlRrMU5sb1hEVEU1TURZd09ERTFOVGsxTmxvdwpSakVRTUE0R0ExVUVDZ3dIVEc5bloybHVaekVTTUJBR0ExVUVDd3dKVDNCbGJsTm9hV1owTVI0d0hBWURWUVFECkRCVnplWE4wWlcwdWJHOW5aMmx1Wnk1cmFXSmhibUV3Z2dFaU1BMEdDU3FHU0liM0RRRUJBUVVBQTRJQkR3QXcKZ2dFS0FvSUJBUUNsWDBpZTJ0WWZRNnNPL1d6ZE5ONXFPd2F0aHlSK2ZuOThORlVvaFhLcVBQMkdJeit4WW1YcwpvcWhXbHd3WHQ5NTlVQzZtZnJ2b2g3Y0lzZUlnRjhzRjdLMk5QaVJ1VWQzNTdQS3lEYnZyelB0aVhWakhBeklyCnN2MWd1R24vb1RIbkUvZVhDam5kbjIzbUE5OTRMMEFZWUVjV0E4Rm95dUZkZXJYOERNWnFOaDE0TnAvVXlvYmUKUmpqR3I1eWpaeXN2dmhvQkNVL2RldUJ0Y1R2MnpyMUVKOVlEQlhYM3YrZDdsWks1a3BQRnVxUWNNN3ZXQ24zcAp1MFVnUExoS2J1MnI1d0FwR1I0aG1FS1o2bUVPUXVkQ2E2NkIzSmptNTUzOXBTOHZSUXdaZGFzVTVLUTl6T2ozCjNHRDBUYkYwQVRua01XTVl0dkJabENnYXB0aTVEN2VaQWdNQkFBR2paakJrTUE0R0ExVWREd0VCL3dRRUF3SUYKb0RBSkJnTlZIUk1FQWpBQU1CMEdBMVVkSlFRV01CUUdDQ3NHQVFVRkJ3TUJCZ2dyQmdFRkJRY0RBakFkQmdOVgpIUTRFRmdRVVlVWmQvSTgzUnpDTWF6bzlGTGJLZ0puZGJVVXdDUVlEVlIwakJBSXdBREFOQmdrcWhraUc5dzBCCkFRVUZBQU9DQVFFQWJFRkhFY056dHVzaGJ4TzY4QUVnbkxBREdyclgxTU9jbTVqanpBRkliSGttdnNnSTduZ0sKYlpkZGFWYWVaTTFTUWJGSFVLblhKck12U2RFY1hmckZIZC84SjNuQ29jOEJKaVkyUjV3aTVPQXVkVy9IQnhpaQo1WmZFOW9wTDNla0xVcnhZQmJwNVdwb2ZUNkhvV2lDQWtDWlZaRTV2QzV2R0liM1lRZmNpdm9HSUpEUEdXdExiClE1VjdlNFVjUDlpNC9UVEhkOTAvU2lKdWl0MkVNQWo1Sk1NcHlVYVpMNjUzd2FuWDloRUtQZTZqQ1BpYm0vemgKWTZxaWVRYjU1bXE4TGM4d0lPNkIrTTlLOFljVkluQWh0TmpHeFQzMUVwSGNuQ005THNwYzRlTy90RzljKzdXTAptRVpsbWhBTkh6Y2pQZlNocXdMK255OFluSVdFRDlsZlBBPT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=",
"key": "LS0tLS1CRUdJTiBQUklWQVRFIEtFWS0tLS0tCk1JSUV2UUlCQURBTkJna3Foa2lHOXcwQkFRRUZBQVNDQktjd2dnU2pBZ0VBQW9JQkFRQ2xYMGllMnRZZlE2c08KL1d6ZE5ONXFPd2F0aHlSK2ZuOThORlVvaFhLcVBQMkdJeit4WW1Yc29xaFdsd3dYdDk1OVVDNm1mcnZvaDdjSQpzZUlnRjhzRjdLMk5QaVJ1VWQzNTdQS3lEYnZyelB0aVhWakhBeklyc3YxZ3VHbi9vVEhuRS9lWENqbmRuMjNtCkE5OTRMMEFZWUVjV0E4Rm95dUZkZXJYOERNWnFOaDE0TnAvVXlvYmVSampHcjV5alp5c3Z2aG9CQ1UvZGV1QnQKY1R2MnpyMUVKOVlEQlhYM3YrZDdsWks1a3BQRnVxUWNNN3ZXQ24zcHUwVWdQTGhLYnUycjV3QXBHUjRobUVLWgo2bUVPUXVkQ2E2NkIzSmptNTUzOXBTOHZSUXdaZGFzVTVLUTl6T2ozM0dEMFRiRjBBVG5rTVdNWXR2QlpsQ2dhCnB0aTVEN2VaQWdNQkFBRUNnZ0VBWDlsZlBQdDN1YjA5eXoxbHVMeW80UWQvTWxxdTgwWWNXLy8xRDd4eEhteWwKUVBiek5ydmllWW81YVo2NG1wS3V1UkVkU05FUjFvKzZyYVowZXNkdm0rcDNHUE5ZcUR4NUhSK1I5MU53aFJaVQo2Y2xTNkY1WTJPMHdZTHZpYkJDZzNOODgxT1ZQYnIyMk0rZ3d3UGNaL2tiblRNSTJlcWVFZ2VyYXlkRVp1M3JkClZOVTU5L0hFZVQwVTY3UWpSallETWdmNVVVRHRSdlo4R0xGWnRUT2d2bU5nYWlSNnJsQlZQWlFoODZLQlUzWXcKd0x3aC9FR2RPTkQxdkh4MUtsOWVZb05BejFmQ1hOdzgyMkpPSU9IOUlKSmlzVDY5QXFET3FyR0tmTnJ5VFRPQQpxVzVsN0xZd1NRL0E3cnlNTjBrVW1lUjRoNVlhK2ZkdG9BM1FWVmQzZ1FLQmdRRFd0bkRwUlZFTWhOUTJXTHdkCnhYSHRhSUJudVRhcUdYZU56eDM3d25QM1QrbVVmdm92aXJVSWVSaXZzSkNhYm84dzhDWXYxc3VxRS9BT3RjSlEKZitIMzlXMUFVS3VETVJOaDl4TjJCZ3RLc0tnNjFJWG1INlI3cktHU28xTWxRdWF6VFMrUFRjcXhwcDI3clpETwpPajg2UXlKU1FnMkIzMVVZRXJsSDJwTGhhUUtCZ1FERksvdXRleThFWDlaZmdwMDBBSlZuSWE4Wmc4VGh3dSt1CjlJYjBISjllUjdzNG1rajlheGFCcXc5ODJINWlyZHo2L0s2RmtsbnF4RXBERG1VZjJ0dExjS2dpblpUck1qSzkKQTMxWXZrSFVJRVlISjBSUHIxdkQwMFJldzFJTDRKOHlrYms1WkFrQXRsU0swOFB6YjJ3bXl1V3VzTlhocXlwbwpwdlVDcTlrdXNRS0JnR3FoY2xPMjgvaFdveGxXV2g0aTUyQnk3SW9XaGxwVmlYVW9yZ2hRMnN5d3FCenlMb2ViCnlDb3NFYUYyKzJsbWpNQk9FM2pnb0lhWG5qbC85TCtMc3dvMG5ZdzZRK05FWlE3YTZKUk5qaUFLdVpGMTZBV1EKSTF5ME1BMm1CTzNWV3NNakN3S05MS09yVGx4ZFp6T3o0NkNvcEl2YmQ5L09yUERtbzVOV3JtazVBb0dCQUtUQQphSzcvdER5NmU5MFl2WlNiUER1TnFNcndFTTMzM2VEWnovNGdBSVo5OTVHSFVaLzNJRG8vSGxUYWJWaTFJR1hVClIxdXkrMUV3clVDMHdZakpqZDNPaDU5TS93Yzd6YXVrUTlPb1BrY3FwSGtMdFlmRDVqQ04wcDVBSk1scDZud04KeWJDTHh2NENYRWdZUks2ZmxzWWZXYVlMZXR2eTh4KzVDaGN1VXU0eEFvR0FYNnNMdUt1c1VZdlVvb0xGdEZ1cwphTUhnMVpIRnc2WE0wVWZ4ZVFMM2hNazVxTUgyOCtUTTFOcWdaOURaSithSUhiUjlBd25mYnpOK1gzMi9XN0FyCmM3S2tncTNVMzRyQXgzdG55NHd0OUgweC9tcEhXVk1wYy9CbXJyWEFJZFlWeDM0QVFjdkQ2aHUwNHFSODlzT3YKUHdEUUMrT3dSSkpvNHB5eHhsVDVaM289Ci0tLS0tRU5EIFBSSVZBVEUgS0VZLS0tLS0K"
},
"kind": "Secret",
"metadata": {
"creationTimestamp": null,
"name": "logging-kibana"
},
"type": "Opaque"
},
"state": "present"
}
TASK [openshift_logging_kibana : Set Kibana Proxy secret] **********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:171
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc replace -f /tmp/logging-kibana-proxy -n logging",
"results": "",
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_kibana : Generate Kibana DC template] ******************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:197
changed: [openshift] => {
"changed": true,
"checksum": "f2d980813716ecbcb8511ef64f2f90ff65ff483a",
"dest": "/tmp/openshift-logging-ansible-7vUEC3/templates/kibana-dc.yaml",
"gid": 0,
"group": "root",
"md5sum": "0f2814ec7663be90532db4729d9eaf69",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 3761,
"src": "/root/.ansible/tmp/ansible-tmp-1496937668.7-272576019342324/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_kibana : Set Kibana DC] ********************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:216
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get dc logging-kibana-ops -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "DeploymentConfig",
"metadata": {
"creationTimestamp": "2017-06-08T16:01:09Z",
"generation": 2,
"labels": {
"component": "kibana-ops",
"logging-infra": "kibana",
"provider": "openshift"
},
"name": "logging-kibana-ops",
"namespace": "logging",
"resourceVersion": "1624",
"selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-kibana-ops",
"uid": "aff4f5fb-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"replicas": 1,
"selector": {
"component": "kibana-ops",
"logging-infra": "kibana",
"provider": "openshift"
},
"strategy": {
"activeDeadlineSeconds": 21600,
"resources": {},
"rollingParams": {
"intervalSeconds": 1,
"maxSurge": "25%",
"maxUnavailable": "25%",
"timeoutSeconds": 600,
"updatePeriodSeconds": 1
},
"type": "Rolling"
},
"template": {
"metadata": {
"creationTimestamp": null,
"labels": {
"component": "kibana-ops",
"logging-infra": "kibana",
"provider": "openshift"
},
"name": "logging-kibana-ops"
},
"spec": {
"containers": [
{
"env": [
{
"name": "ES_HOST",
"value": "logging-es-ops"
},
{
"name": "ES_PORT",
"value": "9200"
},
{
"name": "KIBANA_MEMORY_LIMIT",
"valueFrom": {
"resourceFieldRef": {
"containerName": "kibana",
"divisor": "0",
"resource": "limits.memory"
}
}
}
],
"image": "172.30.255.47:5000/logging/logging-kibana:latest",
"imagePullPolicy": "Always",
"name": "kibana",
"readinessProbe": {
"exec": {
"command": [
"/usr/share/kibana/probe/readiness.sh"
]
},
"failureThreshold": 3,
"initialDelaySeconds": 5,
"periodSeconds": 5,
"successThreshold": 1,
"timeoutSeconds": 4
},
"resources": {
"limits": {
"memory": "736Mi"
}
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/etc/kibana/keys",
"name": "kibana",
"readOnly": true
}
]
},
{
"env": [
{
"name": "OAP_BACKEND_URL",
"value": "http://localhost:5601"
},
{
"name": "OAP_AUTH_MODE",
"value": "oauth2"
},
{
"name": "OAP_TRANSFORM",
"value": "user_header,token_header"
},
{
"name": "OAP_OAUTH_ID",
"value": "kibana-proxy"
},
{
"name": "OAP_MASTER_URL",
"value": "https://kubernetes.default.svc.cluster.local"
},
{
"name": "OAP_PUBLIC_MASTER_URL",
"value": "https://172.18.3.237:8443"
},
{
"name": "OAP_LOGOUT_REDIRECT",
"value": "https://172.18.3.237:8443/console/logout"
},
{
"name": "OAP_MASTER_CA_FILE",
"value": "/var/run/secrets/kubernetes.io/serviceaccount/ca.crt"
},
{
"name": "OAP_DEBUG",
"value": "False"
},
{
"name": "OAP_OAUTH_SECRET_FILE",
"value": "/secret/oauth-secret"
},
{
"name": "OAP_SERVER_CERT_FILE",
"value": "/secret/server-cert"
},
{
"name": "OAP_SERVER_KEY_FILE",
"value": "/secret/server-key"
},
{
"name": "OAP_SERVER_TLS_FILE",
"value": "/secret/server-tls.json"
},
{
"name": "OAP_SESSION_SECRET_FILE",
"value": "/secret/session-secret"
},
{
"name": "OCP_AUTH_PROXY_MEMORY_LIMIT",
"valueFrom": {
"resourceFieldRef": {
"containerName": "kibana-proxy",
"divisor": "0",
"resource": "limits.memory"
}
}
}
],
"image": "172.30.255.47:5000/logging/logging-auth-proxy:latest",
"imagePullPolicy": "Always",
"name": "kibana-proxy",
"ports": [
{
"containerPort": 3000,
"name": "oaproxy",
"protocol": "TCP"
}
],
"resources": {
"limits": {
"memory": "96Mi"
}
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/secret",
"name": "kibana-proxy",
"readOnly": true
}
]
}
],
"dnsPolicy": "ClusterFirst",
"restartPolicy": "Always",
"schedulerName": "default-scheduler",
"securityContext": {},
"serviceAccount": "aggregated-logging-kibana",
"serviceAccountName": "aggregated-logging-kibana",
"terminationGracePeriodSeconds": 30,
"volumes": [
{
"name": "kibana",
"secret": {
"defaultMode": 420,
"secretName": "logging-kibana"
}
},
{
"name": "kibana-proxy",
"secret": {
"defaultMode": 420,
"secretName": "logging-kibana-proxy"
}
}
]
}
},
"test": false,
"triggers": [
{
"type": "ConfigChange"
}
]
},
"status": {
"availableReplicas": 0,
"conditions": [
{
"lastTransitionTime": "2017-06-08T16:01:09Z",
"lastUpdateTime": "2017-06-08T16:01:09Z",
"message": "Deployment config does not have minimum availability.",
"status": "False",
"type": "Available"
},
{
"lastTransitionTime": "2017-06-08T16:01:09Z",
"lastUpdateTime": "2017-06-08T16:01:09Z",
"message": "replication controller \"logging-kibana-ops-1\" is waiting for pod \"logging-kibana-ops-1-deploy\" to run",
"status": "Unknown",
"type": "Progressing"
}
],
"details": {
"causes": [
{
"type": "ConfigChange"
}
],
"message": "config change"
},
"latestVersion": 1,
"observedGeneration": 2,
"replicas": 0,
"unavailableReplicas": 0,
"updatedReplicas": 0
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_kibana : Delete temp directory] ************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:228
ok: [openshift] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-7vUEC3",
"state": "absent"
}
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:195
statically included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml
TASK [openshift_logging_curator : fail] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:3
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:7
ok: [openshift] => {
"ansible_facts": {
"curator_version": "3_5"
},
"changed": false
}
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : fail] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:15
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : Create temp directory for doing work in] *****
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:5
ok: [openshift] => {
"changed": false,
"cmd": [
"mktemp",
"-d",
"/tmp/openshift-logging-ansible-XXXXXX"
],
"delta": "0:00:00.003915",
"end": "2017-06-08 12:01:11.153327",
"rc": 0,
"start": "2017-06-08 12:01:11.149412"
}
STDOUT:
/tmp/openshift-logging-ansible-mt8iQS
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:10
ok: [openshift] => {
"ansible_facts": {
"tempdir": "/tmp/openshift-logging-ansible-mt8iQS"
},
"changed": false
}
TASK [openshift_logging_curator : Create templates subdirectory] ***************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:14
ok: [openshift] => {
"changed": false,
"gid": 0,
"group": "root",
"mode": "0755",
"owner": "root",
"path": "/tmp/openshift-logging-ansible-mt8iQS/templates",
"secontext": "unconfined_u:object_r:user_tmp_t:s0",
"size": 6,
"state": "directory",
"uid": 0
}
TASK [openshift_logging_curator : Create Curator service account] **************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:24
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : Create Curator service account] **************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:32
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get sa aggregated-logging-curator -o json -n logging",
"results": [
{
"apiVersion": "v1",
"imagePullSecrets": [
{
"name": "aggregated-logging-curator-dockercfg-s1xqh"
}
],
"kind": "ServiceAccount",
"metadata": {
"creationTimestamp": "2017-06-08T16:01:12Z",
"name": "aggregated-logging-curator",
"namespace": "logging",
"resourceVersion": "1638",
"selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-curator",
"uid": "b19692e4-4c63-11e7-94aa-0e1649350dc2"
},
"secrets": [
{
"name": "aggregated-logging-curator-token-2hxcd"
},
{
"name": "aggregated-logging-curator-dockercfg-s1xqh"
}
]
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_curator : copy] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:41
ok: [openshift] => {
"changed": false,
"checksum": "9008efd9a8892dcc42c28c6dfb6708527880a6d8",
"dest": "/tmp/openshift-logging-ansible-mt8iQS/curator.yml",
"gid": 0,
"group": "root",
"md5sum": "5498c5fd98f3dd06e34b20eb1f55dc12",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 320,
"src": "/root/.ansible/tmp/ansible-tmp-1496937672.72-127375036698555/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_curator : copy] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:47
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : Set Curator configmap] ***********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:53
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get configmap logging-curator -o json -n logging",
"results": [
{
"apiVersion": "v1",
"data": {
"config.yaml": "# Logging example curator config file\n\n# uncomment and use this to override the defaults from env vars\n#.defaults:\n# delete:\n# days: 30\n# runhour: 0\n# runminute: 0\n\n# to keep ops logs for a different duration:\n#.operations:\n# delete:\n# weeks: 8\n\n# example for a normal project\n#myapp:\n# delete:\n# weeks: 1\n"
},
"kind": "ConfigMap",
"metadata": {
"creationTimestamp": "2017-06-08T16:01:13Z",
"name": "logging-curator",
"namespace": "logging",
"resourceVersion": "1640",
"selfLink": "/api/v1/namespaces/logging/configmaps/logging-curator",
"uid": "b25aefb9-4c63-11e7-94aa-0e1649350dc2"
}
}
],
"returncode": 0
},
"state": "present"
}
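For readability, the escaped config.yaml payload stored in the logging-curator ConfigMap above expands to the following fully commented-out example, so the defaults continue to come from the environment variables on the Curator deployment (whitespace is as captured in the log):

    # Logging example curator config file

    # uncomment and use this to override the defaults from env vars
    #.defaults:
    # delete:
    # days: 30
    # runhour: 0
    # runminute: 0

    # to keep ops logs for a different duration:
    #.operations:
    # delete:
    # weeks: 8

    # example for a normal project
    #myapp:
    # delete:
    # weeks: 1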
TASK [openshift_logging_curator : Set Curator secret] **************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:62
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc secrets new logging-curator ca=/etc/origin/logging/ca.crt key=/etc/origin/logging/system.logging.curator.key cert=/etc/origin/logging/system.logging.curator.crt -n logging",
"results": "",
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:75
ok: [openshift] => {
"ansible_facts": {
"curator_component": "curator",
"curator_name": "logging-curator"
},
"changed": false
}
TASK [openshift_logging_curator : Generate Curator deploymentconfig] ***********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:81
ok: [openshift] => {
"changed": false,
"checksum": "e3059c2899f1563c9f223ad22fed52a02d791752",
"dest": "/tmp/openshift-logging-ansible-mt8iQS/templates/curator-dc.yaml",
"gid": 0,
"group": "root",
"md5sum": "4935281471701d3b596a2daa02727c3d",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 2340,
"src": "/root/.ansible/tmp/ansible-tmp-1496937674.74-227565484127174/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_curator : Set Curator DC] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:99
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get dc logging-curator -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "DeploymentConfig",
"metadata": {
"creationTimestamp": "2017-06-08T16:01:15Z",
"generation": 2,
"labels": {
"component": "curator",
"logging-infra": "curator",
"provider": "openshift"
},
"name": "logging-curator",
"namespace": "logging",
"resourceVersion": "1656",
"selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-curator",
"uid": "b3b807c7-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"replicas": 1,
"selector": {
"component": "curator",
"logging-infra": "curator",
"provider": "openshift"
},
"strategy": {
"activeDeadlineSeconds": 21600,
"recreateParams": {
"timeoutSeconds": 600
},
"resources": {},
"rollingParams": {
"intervalSeconds": 1,
"maxSurge": "25%",
"maxUnavailable": "25%",
"timeoutSeconds": 600,
"updatePeriodSeconds": 1
},
"type": "Recreate"
},
"template": {
"metadata": {
"creationTimestamp": null,
"labels": {
"component": "curator",
"logging-infra": "curator",
"provider": "openshift"
},
"name": "logging-curator"
},
"spec": {
"containers": [
{
"env": [
{
"name": "K8S_HOST_URL",
"value": "https://kubernetes.default.svc.cluster.local"
},
{
"name": "ES_HOST",
"value": "logging-es"
},
{
"name": "ES_PORT",
"value": "9200"
},
{
"name": "ES_CLIENT_CERT",
"value": "/etc/curator/keys/cert"
},
{
"name": "ES_CLIENT_KEY",
"value": "/etc/curator/keys/key"
},
{
"name": "ES_CA",
"value": "/etc/curator/keys/ca"
},
{
"name": "CURATOR_DEFAULT_DAYS",
"value": "30"
},
{
"name": "CURATOR_RUN_HOUR",
"value": "0"
},
{
"name": "CURATOR_RUN_MINUTE",
"value": "0"
},
{
"name": "CURATOR_RUN_TIMEZONE",
"value": "UTC"
},
{
"name": "CURATOR_SCRIPT_LOG_LEVEL",
"value": "INFO"
},
{
"name": "CURATOR_LOG_LEVEL",
"value": "ERROR"
}
],
"image": "172.30.255.47:5000/logging/logging-curator:latest",
"imagePullPolicy": "Always",
"name": "curator",
"resources": {
"limits": {
"cpu": "100m"
}
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/etc/curator/keys",
"name": "certs",
"readOnly": true
},
{
"mountPath": "/etc/curator/settings",
"name": "config",
"readOnly": true
}
]
}
],
"dnsPolicy": "ClusterFirst",
"restartPolicy": "Always",
"schedulerName": "default-scheduler",
"securityContext": {},
"serviceAccount": "aggregated-logging-curator",
"serviceAccountName": "aggregated-logging-curator",
"terminationGracePeriodSeconds": 30,
"volumes": [
{
"name": "certs",
"secret": {
"defaultMode": 420,
"secretName": "logging-curator"
}
},
{
"configMap": {
"defaultMode": 420,
"name": "logging-curator"
},
"name": "config"
}
]
}
},
"test": false,
"triggers": [
{
"type": "ConfigChange"
}
]
},
"status": {
"availableReplicas": 0,
"conditions": [
{
"lastTransitionTime": "2017-06-08T16:01:15Z",
"lastUpdateTime": "2017-06-08T16:01:15Z",
"message": "Deployment config does not have minimum availability.",
"status": "False",
"type": "Available"
},
{
"lastTransitionTime": "2017-06-08T16:01:15Z",
"lastUpdateTime": "2017-06-08T16:01:15Z",
"message": "replication controller \"logging-curator-1\" is waiting for pod \"logging-curator-1-deploy\" to run",
"status": "Unknown",
"type": "Progressing"
}
],
"details": {
"causes": [
{
"type": "ConfigChange"
}
],
"message": "config change"
},
"latestVersion": 1,
"observedGeneration": 2,
"replicas": 0,
"unavailableReplicas": 0,
"updatedReplicas": 0
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_curator : Delete temp directory] ***********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:109
ok: [openshift] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-mt8iQS",
"state": "absent"
}
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:207
statically included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml
TASK [openshift_logging_curator : fail] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:3
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:7
ok: [openshift] => {
"ansible_facts": {
"curator_version": "3_5"
},
"changed": false
}
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : fail] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:15
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : Create temp directory for doing work in] *****
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:5
ok: [openshift] => {
"changed": false,
"cmd": [
"mktemp",
"-d",
"/tmp/openshift-logging-ansible-XXXXXX"
],
"delta": "0:00:01.003785",
"end": "2017-06-08 12:01:19.817458",
"rc": 0,
"start": "2017-06-08 12:01:18.813673"
}
STDOUT:
/tmp/openshift-logging-ansible-rrg5iz
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:10
ok: [openshift] => {
"ansible_facts": {
"tempdir": "/tmp/openshift-logging-ansible-rrg5iz"
},
"changed": false
}
TASK [openshift_logging_curator : Create templates subdirectory] ***************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:14
ok: [openshift] => {
"changed": false,
"gid": 0,
"group": "root",
"mode": "0755",
"owner": "root",
"path": "/tmp/openshift-logging-ansible-rrg5iz/templates",
"secontext": "unconfined_u:object_r:user_tmp_t:s0",
"size": 6,
"state": "directory",
"uid": 0
}
TASK [openshift_logging_curator : Create Curator service account] **************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:24
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : Create Curator service account] **************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:32
ok: [openshift] => {
"changed": false,
"results": {
"cmd": "/bin/oc get sa aggregated-logging-curator -o json -n logging",
"results": [
{
"apiVersion": "v1",
"imagePullSecrets": [
{
"name": "aggregated-logging-curator-dockercfg-s1xqh"
}
],
"kind": "ServiceAccount",
"metadata": {
"creationTimestamp": "2017-06-08T16:01:12Z",
"name": "aggregated-logging-curator",
"namespace": "logging",
"resourceVersion": "1638",
"selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-curator",
"uid": "b19692e4-4c63-11e7-94aa-0e1649350dc2"
},
"secrets": [
{
"name": "aggregated-logging-curator-token-2hxcd"
},
{
"name": "aggregated-logging-curator-dockercfg-s1xqh"
}
]
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_curator : copy] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:41
ok: [openshift] => {
"changed": false,
"checksum": "9008efd9a8892dcc42c28c6dfb6708527880a6d8",
"dest": "/tmp/openshift-logging-ansible-rrg5iz/curator.yml",
"gid": 0,
"group": "root",
"md5sum": "5498c5fd98f3dd06e34b20eb1f55dc12",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 320,
"src": "/root/.ansible/tmp/ansible-tmp-1496937680.63-62289953895467/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_curator : copy] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:47
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : Set Curator configmap] ***********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:53
ok: [openshift] => {
"changed": false,
"results": {
"cmd": "/bin/oc get configmap logging-curator -o json -n logging",
"results": [
{
"apiVersion": "v1",
"data": {
"config.yaml": "# Logging example curator config file\n\n# uncomment and use this to override the defaults from env vars\n#.defaults:\n# delete:\n# days: 30\n# runhour: 0\n# runminute: 0\n\n# to keep ops logs for a different duration:\n#.operations:\n# delete:\n# weeks: 8\n\n# example for a normal project\n#myapp:\n# delete:\n# weeks: 1\n"
},
"kind": "ConfigMap",
"metadata": {
"creationTimestamp": "2017-06-08T16:01:13Z",
"name": "logging-curator",
"namespace": "logging",
"resourceVersion": "1640",
"selfLink": "/api/v1/namespaces/logging/configmaps/logging-curator",
"uid": "b25aefb9-4c63-11e7-94aa-0e1649350dc2"
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_curator : Set Curator secret] **************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:62
ok: [openshift] => {
"changed": false,
"results": {
"apiVersion": "v1",
"data": {
"ca": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUxTlRrMU1Gb1hEVEl5TURZd056RTFOVGsxTVZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU9relNqcnBVaVRaRytrUUh5UFNtOFFVcnBJTzFMaE1XRUdiZjF5bERXZ00KdDdJVkFKZ05FQXcwRy8wK1pVSFpVQUN1T1M2NEtaZkE1NnhHMTIrL24vYjRtR2Z1QXhjakxBZ2xhNmRadmVqdApEZTdLSnducVBVNXordysrQWRhc3BBS3kyNlJyZCtWVHllbk5hZiswMjdRdDR4bjNETnhLcjFuOXFJcjN3cnZWCnZYZGZVU215RmJjY0N2ZnBpRjZqZE1aL1BhR2VEYXhFelpSUDJrL2liVEIvSTlxTmhESE1xcUxsQXZERmkzZlMKU0JxMTdWbEh4WHJPeENyQm4vNzVyOE9DZVk5a25pTXB1N0NmVFY3Nm85eU81SnlnVWg3TldTQmY5a2w0cHVIMQpLV1htcmVZSWdyR3FxakVFQ2Qxa0cyRVZDMVgvZ1FWZEJHbjBqMkk4d1BjQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFBaE8KbE5uS1RJNjRhWE8yOEVYa1pwcTZtMkdwY2xQTnpzM0dDVGE0RWYvMkJCa3pSd0ttNlgvN2NpZzh1S0lCNDE3Ugp1V1Jia2kvaHBGVGJYakhaRTJDOHEwSEpMVlVJKzdPQkcvS0cyOG5venpPNzMwZ0p4Q1o5NWZZNldsUjV4SURLCkJVeE9xYWhhRWlRTEx6MEtCUDBudUNxNlZCZWRIVEJPY1JGRDJlSXp6RlMvTFZqS2lqZ1MwZFZsb1VSV3JnZGYKdG1LSnhZYlZDOW9sK0MvdzFqYnp1cnFHZ1dRQ24wZGlOZ05HZ3I4TXVYRTZmQ0hMcnJSMVVGUTNXN0R6OUZ2SwpHTGNRcHJEQ3hUbUo0eTk5cy85aUEyUDhDbGQvWXMrVlB6WlB6UmVZcVRkdkVjQXpHb0drdVRiQXhQbmUyUEJICm5wMDM3WkY0UGFwZFdudGdxR3M9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K",
"cert": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURSakNDQWk2Z0F3SUJBZ0lCQkRBTkJna3Foa2lHOXcwQkFRVUZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUxTlRrMU5sb1hEVEU1TURZd09ERTFOVGsxTmxvdwpSekVRTUE0R0ExVUVDZ3dIVEc5bloybHVaekVTTUJBR0ExVUVDd3dKVDNCbGJsTm9hV1owTVI4d0hRWURWUVFECkRCWnplWE4wWlcwdWJHOW5aMmx1Wnk1amRYSmhkRzl5TUlJQklqQU5CZ2txaGtpRzl3MEJBUUVGQUFPQ0FROEEKTUlJQkNnS0NBUUVBcHArV3VxeFVzZUp6c040RnVZamZxOVNQSXZUSWY4Yy81NHJLZU5pZTJJUU8zb2MzSDVxdQo2SFdlWklyUElZeVcxNGJZSXNVVDRqSW4wOVdjVU1sME9nZkpGN1ovT3JYL0FhRGNRMzJncGFVUzhQem1iajJ0Cm03c3dUelV3QzhFYjQ2UXhWaE5XakowWmlJaGlJN3V1UW5ZVzZsSTJqcnpCWHNPTEhtdzB2ZEk0N3hpcDVNZVUKZmwrU1hyN2VWYURqT24zYXBSZDc4ajN1VVBRaGk4MUFlNmFzNk90YlMrNkgwbndxODREcVFSNlNGRVdqVzh5dApVQkhkQzU2WWV2RzN2TURwTkFZMm9SV0ZzMjNuejdzMDN5SnJPMmRxRnR2N0Q4ZXdvUTZMcG4yT3dIcy9nOFpzCjd3aW1jMW5ZKzA2MExzZmFpbzkwSHlUc0k5RnJobVlqU1FJREFRQUJvMll3WkRBT0JnTlZIUThCQWY4RUJBTUMKQmFBd0NRWURWUjBUQkFJd0FEQWRCZ05WSFNVRUZqQVVCZ2dyQmdFRkJRY0RBUVlJS3dZQkJRVUhBd0l3SFFZRApWUjBPQkJZRUZFajVNaTZBK3dVcE5RSXJibGR3UVhaVHdFaTNNQWtHQTFVZEl3UUNNQUF3RFFZSktvWklodmNOCkFRRUZCUUFEZ2dFQkFDZUJnVHRsVnpNdjNoZnRvVGJ1VDBRTmtteDhBNHRJN0xaYkYwS254RHRkZGlkSTJGSUkKcEszV3JKMW5pcTI5dGkvaml3OXE5VTB3RWU4Z0pXYVhoVlB4Z0lsWG9jVHYwMFB6U1RWZUJNT1k0OTVwbmREeQptSGxqNGRkeUp1cVZ2a3JzMU5rR3VYVjM2SnluWmthUGpxSU1DekY4cmZMQlFMMHlicitLdkxDWFNYRlJZcFM3Cm9FTGlCUk5RU0c3amdQUzF5dFBxYndhTE5mVnFvdUZOVHFNdG8wUzdYRUZZLzFhdGV0NnBmTTJHNXZCMjA4U1oKUll3bC9wV3h4NDlBVDZEaGJFWkFrcW5yVTlxWTFqcFI2TGUzTG5Pa1R4aUtLN2doOHAwbEY4eWU2V25JSnVZNQowQ25SZE1OajNmME1DWTFUV1Badm8yckdxTE5YSXRsRWxrRT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=",
"key": "LS0tLS1CRUdJTiBQUklWQVRFIEtFWS0tLS0tCk1JSUV2d0lCQURBTkJna3Foa2lHOXcwQkFRRUZBQVNDQktrd2dnU2xBZ0VBQW9JQkFRQ21uNWE2ckZTeDRuT3cKM2dXNWlOK3IxSThpOU1oL3h6L25pc3A0Mko3WWhBN2VoemNmbXE3b2RaNWtpczhoakpiWGh0Z2l4UlBpTWlmVAoxWnhReVhRNkI4a1h0bjg2dGY4Qm9OeERmYUNscFJMdy9PWnVQYTJidXpCUE5UQUx3UnZqcERGV0UxYU1uUm1JCmlHSWp1NjVDZGhicVVqYU92TUZldzRzZWJEUzkwamp2R0tua3g1UitYNUpldnQ1Vm9PTTZmZHFsRjN2eVBlNVEKOUNHTHpVQjdwcXpvNjF0TDdvZlNmQ3J6Z09wQkhwSVVSYU5ieksxUUVkMExucGg2OGJlOHdPazBCamFoRllXegpiZWZQdXpUZkltczdaMm9XMi9zUHg3Q2hEb3VtZlk3QWV6K0R4bXp2Q0taeldkajdUclF1eDlxS2ozUWZKT3dqCjBXdUdaaU5KQWdNQkFBRUNnZ0VCQUpEYVFHRThtOHloUDA1TlZkQzd5eWRJZkw4NDZtMGJQTlQvOVpFbFVNS04KMjVkZEdYRGlPcGhnV0RpejYrb1FuTG4xd2tSSDdFZENyeGticE84ZWsybzNobnlVN1BxUGFZZHkzc25WbHlrcAozdE9lS0gxQ0pZRXpOSVpIaU9OUEYvM0lxaE5ZY0crQnk4YkVPZjB5bGdXMVA2cXBta0J3bS9MVU1FZGNibVZ0Cnl0Z0d5aGRGWjAzU1AwR28yYXJ3aE4wUGxGK0V1YjlKdmVwVmdRR3dZNVlMT0FpbS95SnRlTnlyWExCTlB1SEUKcU5IQmNqdmE1WnNORW1TRnpxU2gxN3d0Y3djN2NGTmdrdEtTdnFMNDhsNDVBS042TFBLbEpnWG96ZWtvemg3dAp6MDRKRSs3RGY5RlkrdlRGSm1XbUl5UVFzTHF3Q3o5Q2ZEcFU3M3B1RExrQ2dZRUEydWUrNXJma1M4djJ2NTFjCnVWMFJJY0RRQVJkZVozZDYzU3dqRlJuSjRCV25BcWRKN01BZWQxcU9nWlErMHJIaDF6WFlreGxWNmxQa3UvbFAKQWtmWm53ZlJhT3V6dEhrUzB0Q3RJVDhnUXNLZlViMm1YWmF5ZUZnLzVmZkQrM3lHWWNOWlMrUE5GZ1BReUpyaApkTG12QjUvRzZ4QXlyMjZMa3dFWmNOVlJvRHNDZ1lFQXd0dlNSOU5iWjVpZXZOMVV3ZW5BT0p5cGFicWRaNXJvCndFem9nYWtmR0NkUVdKUUJkbzVGdURGaUJieklFNFVlMUR4c21LU0p3SzlMRXl2UVYwRjZ1elhudjlCOEJ6NVcKalNuOXpFTjhDUE12am9NU0N2RTR1TXNTamNsYUlZOEFia3QzZWhMT2htUVh3Wm9hWldCR1EyV2pxUkkwYVBWTgpiRThyZ0pYVGRrc0NnWUVBenVDVkdaWlF1eEZYY0Y5WGNoYnlTZVBlc0NsVm5wTGNHb25MM2ZVeFJBVXlnTjk0CmpiWkRGS2tRWklXbG1abGl4ZkN0Yk5kVXlzL2VLNGZCazNZenhJZXU4R2xRdkE1d0s2dnE4ckNsM0hIeC8xNHMKQythUFpBeUMxdU1BNUhzYXhPbkpTbDlQUXE3NGNaMXQxTkpuQjkySU1ENXVxRHpneTEwT25nUDcwR0VDZ1lFQQpwZlNoK3hvL1Z0UGRIZTFES29QeWVrU3k1S0ZUUGRIcE9SNUhSMmJLWEwvZTVoSG56UmtPdC9rWEN0dmxhdnFOCmJ5U25PTG1wdUxtU3J4azVyNWJNK0hUSWs4ejBWUmI2aisrYmdFUlpkeVhtOHZFZWhNTTlFK1dnUDdHbFdiOTUKRm5hZm94QXhaTFFLcDVDZnVvZHNVQ3BZWUw1b2RKdTIvTy9RbFFzRnRsRUNnWUJJUExqWnpkdmhyc09aNjZLZwpyVlBkYThyV1Y5aEdBam1DUkFHdkVFUHcxcjJLOEdSd2dPSFU3djV0OVg5Y0Y5bTVBT1NkYmJMV2FxMXI5SnVvClkyMVF5c2pBaGJMdGVHclR4L1c1M3BCNXdOL0Z1Z2tSVWE1ejE3U1ViRmRRVHFpNkI0dUJFVFV0ZGZjclZEcmYKZXBTMTBBTjIvUnJRWjNvYUVaSHZtUWNvTnc9PQotLS0tLUVORCBQUklWQVRFIEtFWS0tLS0tCg=="
},
"kind": "Secret",
"metadata": {
"creationTimestamp": null,
"name": "logging-curator"
},
"type": "Opaque"
},
"state": "present"
}
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:75
ok: [openshift] => {
"ansible_facts": {
"curator_component": "curator-ops",
"curator_name": "logging-curator-ops"
},
"changed": false
}
TASK [openshift_logging_curator : Generate Curator deploymentconfig] ***********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:81
ok: [openshift] => {
"changed": false,
"checksum": "96c349b7f8a16c0be141e65654f5dd2401db0d04",
"dest": "/tmp/openshift-logging-ansible-rrg5iz/templates/curator-dc.yaml",
"gid": 0,
"group": "root",
"md5sum": "0241f1ecfa3fe4ffd914d6eccd650565",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 2364,
"src": "/root/.ansible/tmp/ansible-tmp-1496937682.27-137109705271937/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_curator : Set Curator DC] ******************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:99
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get dc logging-curator-ops -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "DeploymentConfig",
"metadata": {
"creationTimestamp": "2017-06-08T16:01:23Z",
"generation": 2,
"labels": {
"component": "curator-ops",
"logging-infra": "curator",
"provider": "openshift"
},
"name": "logging-curator-ops",
"namespace": "logging",
"resourceVersion": "1690",
"selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-curator-ops",
"uid": "b810e577-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"replicas": 1,
"selector": {
"component": "curator-ops",
"logging-infra": "curator",
"provider": "openshift"
},
"strategy": {
"activeDeadlineSeconds": 21600,
"recreateParams": {
"timeoutSeconds": 600
},
"resources": {},
"rollingParams": {
"intervalSeconds": 1,
"maxSurge": "25%",
"maxUnavailable": "25%",
"timeoutSeconds": 600,
"updatePeriodSeconds": 1
},
"type": "Recreate"
},
"template": {
"metadata": {
"creationTimestamp": null,
"labels": {
"component": "curator-ops",
"logging-infra": "curator",
"provider": "openshift"
},
"name": "logging-curator-ops"
},
"spec": {
"containers": [
{
"env": [
{
"name": "K8S_HOST_URL",
"value": "https://kubernetes.default.svc.cluster.local"
},
{
"name": "ES_HOST",
"value": "logging-es-ops"
},
{
"name": "ES_PORT",
"value": "9200"
},
{
"name": "ES_CLIENT_CERT",
"value": "/etc/curator/keys/cert"
},
{
"name": "ES_CLIENT_KEY",
"value": "/etc/curator/keys/key"
},
{
"name": "ES_CA",
"value": "/etc/curator/keys/ca"
},
{
"name": "CURATOR_DEFAULT_DAYS",
"value": "30"
},
{
"name": "CURATOR_RUN_HOUR",
"value": "0"
},
{
"name": "CURATOR_RUN_MINUTE",
"value": "0"
},
{
"name": "CURATOR_RUN_TIMEZONE",
"value": "UTC"
},
{
"name": "CURATOR_SCRIPT_LOG_LEVEL",
"value": "INFO"
},
{
"name": "CURATOR_LOG_LEVEL",
"value": "ERROR"
}
],
"image": "172.30.255.47:5000/logging/logging-curator:latest",
"imagePullPolicy": "Always",
"name": "curator",
"resources": {
"limits": {
"cpu": "100m"
}
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/etc/curator/keys",
"name": "certs",
"readOnly": true
},
{
"mountPath": "/etc/curator/settings",
"name": "config",
"readOnly": true
}
]
}
],
"dnsPolicy": "ClusterFirst",
"restartPolicy": "Always",
"schedulerName": "default-scheduler",
"securityContext": {},
"serviceAccount": "aggregated-logging-curator",
"serviceAccountName": "aggregated-logging-curator",
"terminationGracePeriodSeconds": 30,
"volumes": [
{
"name": "certs",
"secret": {
"defaultMode": 420,
"secretName": "logging-curator"
}
},
{
"configMap": {
"defaultMode": 420,
"name": "logging-curator"
},
"name": "config"
}
]
}
},
"test": false,
"triggers": [
{
"type": "ConfigChange"
}
]
},
"status": {
"availableReplicas": 0,
"conditions": [
{
"lastTransitionTime": "2017-06-08T16:01:23Z",
"lastUpdateTime": "2017-06-08T16:01:23Z",
"message": "Deployment config does not have minimum availability.",
"status": "False",
"type": "Available"
},
{
"lastTransitionTime": "2017-06-08T16:01:23Z",
"lastUpdateTime": "2017-06-08T16:01:23Z",
"message": "replication controller \"logging-curator-ops-1\" is waiting for pod \"logging-curator-ops-1-deploy\" to run",
"status": "Unknown",
"type": "Progressing"
}
],
"details": {
"causes": [
{
"type": "ConfigChange"
}
],
"message": "config change"
},
"latestVersion": 1,
"observedGeneration": 2,
"replicas": 0,
"unavailableReplicas": 0,
"updatedReplicas": 0
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_curator : Delete temp directory] ***********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:109
ok: [openshift] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-rrg5iz",
"state": "absent"
}
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:226
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:241
statically included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml
TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:2
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_fluentd_nodeselector.keys()
| count }} > 1
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
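The [WARNING] above is Ansible pointing out that the role writes its when: condition with templating braces, even though when: already evaluates its value as a raw Jinja2 expression. A minimal sketch of the two spellings, using the expression quoted in the warning:
  # flagged form (delimiters inside when:):
  #   when: "{{ openshift_logging_fluentd_nodeselector.keys() | count }} > 1"
  # recommended form (bare expression, no {{ }}):
  #   when: openshift_logging_fluentd_nodeselector.keys() | count > 1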
TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:6
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:10
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:14
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml:3
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml:7
ok: [openshift] => {
"ansible_facts": {
"fluentd_version": "3_5"
},
"changed": false
}
TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml:15
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:20
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:26
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : Create temp directory for doing work in] *****
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:33
ok: [openshift] => {
"changed": false,
"cmd": [
"mktemp",
"-d",
"/tmp/openshift-logging-ansible-XXXXXX"
],
"delta": "0:00:00.002034",
"end": "2017-06-08 12:01:26.895004",
"rc": 0,
"start": "2017-06-08 12:01:26.892970"
}
STDOUT:
/tmp/openshift-logging-ansible-93vigU
TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:38
ok: [openshift] => {
"ansible_facts": {
"tempdir": "/tmp/openshift-logging-ansible-93vigU"
},
"changed": false
}
TASK [openshift_logging_fluentd : Create templates subdirectory] ***************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:41
ok: [openshift] => {
"changed": false,
"gid": 0,
"group": "root",
"mode": "0755",
"owner": "root",
"path": "/tmp/openshift-logging-ansible-93vigU/templates",
"secontext": "unconfined_u:object_r:user_tmp_t:s0",
"size": 6,
"state": "directory",
"uid": 0
}
TASK [openshift_logging_fluentd : Create Fluentd service account] **************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:51
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : Create Fluentd service account] **************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:59
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get sa aggregated-logging-fluentd -o json -n logging",
"results": [
{
"apiVersion": "v1",
"imagePullSecrets": [
{
"name": "aggregated-logging-fluentd-dockercfg-l7k3x"
}
],
"kind": "ServiceAccount",
"metadata": {
"creationTimestamp": "2017-06-08T16:01:27Z",
"name": "aggregated-logging-fluentd",
"namespace": "logging",
"resourceVersion": "1704",
"selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-fluentd",
"uid": "bac2706a-4c63-11e7-94aa-0e1649350dc2"
},
"secrets": [
{
"name": "aggregated-logging-fluentd-dockercfg-l7k3x"
},
{
"name": "aggregated-logging-fluentd-token-zc0vw"
}
]
}
],
"returncode": 0
},
"state": "present"
}
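The task above follows the role's usual get-or-create pattern for the service account. A rough by-hand equivalent, with the name and namespace taken from the output:
  oc get sa aggregated-logging-fluentd -n logging -o yaml \
    || oc create serviceaccount aggregated-logging-fluentd -n logging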
TASK [openshift_logging_fluentd : Set privileged permissions for Fluentd] ******
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:68
changed: [openshift] => {
"changed": true,
"present": "present",
"results": {
"cmd": "/bin/oc adm policy add-scc-to-user privileged system:serviceaccount:logging:aggregated-logging-fluentd -n logging",
"results": "",
"returncode": 0
}
}
TASK [openshift_logging_fluentd : Set cluster-reader permissions for Fluentd] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:77
changed: [openshift] => {
"changed": true,
"present": "present",
"results": {
"cmd": "/bin/oc adm policy add-cluster-role-to-user cluster-reader system:serviceaccount:logging:aggregated-logging-fluentd -n logging",
"results": "",
"returncode": 0
}
}
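The two permission tasks above grant the Fluentd service account the privileged SCC (the fluentd daemonset created below runs privileged and mounts host paths) and the cluster-reader role (so the collector can look up metadata for the pods whose logs it ships). The same grants, runnable by hand with the commands shown in the task output:
  oc adm policy add-scc-to-user privileged \
      system:serviceaccount:logging:aggregated-logging-fluentd
  oc adm policy add-cluster-role-to-user cluster-reader \
      system:serviceaccount:logging:aggregated-logging-fluentd
  oc get scc privileged -o yaml        # the service account should now appear under users: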
TASK [openshift_logging_fluentd : template] ************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:86
ok: [openshift] => {
"changed": false,
"checksum": "a8c8596f5fc2c5dd7c8d33d244af17a2555be086",
"dest": "/tmp/openshift-logging-ansible-93vigU/fluent.conf",
"gid": 0,
"group": "root",
"md5sum": "579698b48ffce6276ee0e8d5ac71a338",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 1301,
"src": "/root/.ansible/tmp/ansible-tmp-1496937689.43-239761296344115/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:94
ok: [openshift] => {
"changed": false,
"checksum": "b3e75eddc4a0765edc77da092384c0c6f95440e1",
"dest": "/tmp/openshift-logging-ansible-93vigU/fluentd-throttle-config.yaml",
"gid": 0,
"group": "root",
"md5sum": "25871b8e0a9bedc166a6029872a6c336",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 133,
"src": "/root/.ansible/tmp/ansible-tmp-1496937689.91-23262744748646/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:100
ok: [openshift] => {
"changed": false,
"checksum": "a3aa36da13f3108aa4ad5b98d4866007b44e9798",
"dest": "/tmp/openshift-logging-ansible-93vigU/secure-forward.conf",
"gid": 0,
"group": "root",
"md5sum": "1084b00c427f4fa48dfc66d6ad6555d4",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 563,
"src": "/root/.ansible/tmp/ansible-tmp-1496937690.23-193864224121332/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:107
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:113
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:119
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : Set Fluentd configmap] ***********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:125
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get configmap logging-fluentd -o json -n logging",
"results": [
{
"apiVersion": "v1",
"data": {
"fluent.conf": "# This file is the fluentd configuration entrypoint. Edit with care.\n\n@include configs.d/openshift/system.conf\n\n# In each section below, pre- and post- includes don't include anything initially;\n# they exist to enable future additions to openshift conf as needed.\n\n## sources\n## ordered so that syslog always runs last...\n@include configs.d/openshift/input-pre-*.conf\n@include configs.d/dynamic/input-docker-*.conf\n@include configs.d/dynamic/input-syslog-*.conf\n@include configs.d/openshift/input-post-*.conf\n##\n\n<label @INGRESS>\n## filters\n @include configs.d/openshift/filter-pre-*.conf\n @include configs.d/openshift/filter-retag-journal.conf\n @include configs.d/openshift/filter-k8s-meta.conf\n @include configs.d/openshift/filter-kibana-transform.conf\n @include configs.d/openshift/filter-k8s-flatten-hash.conf\n @include configs.d/openshift/filter-k8s-record-transform.conf\n @include configs.d/openshift/filter-syslog-record-transform.conf\n @include configs.d/openshift/filter-viaq-data-model.conf\n @include configs.d/openshift/filter-post-*.conf\n##\n\n## matches\n @include configs.d/openshift/output-pre-*.conf\n @include configs.d/openshift/output-operations.conf\n @include configs.d/openshift/output-applications.conf\n # no post - applications.conf matches everything left\n##\n</label>\n",
"secure-forward.conf": "# @type secure_forward\n\n# self_hostname ${HOSTNAME}\n# shared_key <SECRET_STRING>\n\n# secure yes\n# enable_strict_verification yes\n\n# ca_cert_path /etc/fluent/keys/your_ca_cert\n# ca_private_key_path /etc/fluent/keys/your_private_key\n # for private CA secret key\n# ca_private_key_passphrase passphrase\n\n# <server>\n # or IP\n# host server.fqdn.example.com\n# port 24284\n# </server>\n# <server>\n # ip address to connect\n# host 203.0.113.8\n # specify hostlabel for FQDN verification if ipaddress is used for host\n# hostlabel server.fqdn.example.com\n# </server>\n",
"throttle-config.yaml": "# Logging example fluentd throttling config file\n\n#example-project:\n# read_lines_limit: 10\n#\n#.operations:\n# read_lines_limit: 100\n"
},
"kind": "ConfigMap",
"metadata": {
"creationTimestamp": "2017-06-08T16:01:31Z",
"name": "logging-fluentd",
"namespace": "logging",
"resourceVersion": "1728",
"selfLink": "/api/v1/namespaces/logging/configmaps/logging-fluentd",
"uid": "bccc79f7-4c63-11e7-94aa-0e1649350dc2"
}
}
],
"returncode": 0
},
"state": "present"
}
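The configmap above carries the rendered fluent.conf, secure-forward.conf and throttle-config.yaml as escaped JSON strings. To read them back out of the cluster in a more legible form, a minimal sketch:
  oc get configmap logging-fluentd -n logging -o yaml    # full object, data keys as block scalars
  oc describe configmap logging-fluentd -n logging       # same data, summarized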
TASK [openshift_logging_fluentd : Set logging-fluentd secret] ******************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:137
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc secrets new logging-fluentd ca=/etc/origin/logging/ca.crt key=/etc/origin/logging/system.logging.fluentd.key cert=/etc/origin/logging/system.logging.fluentd.crt -n logging",
"results": "",
"returncode": 0
},
"state": "present"
}
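`oc secrets new`, used above, is the older origin-specific syntax; on newer clients the same secret can be built with the generic secret command. A roughly equivalent invocation, using the file paths from the task output:
  oc create secret generic logging-fluentd -n logging \
      --from-file=ca=/etc/origin/logging/ca.crt \
      --from-file=key=/etc/origin/logging/system.logging.fluentd.key \
      --from-file=cert=/etc/origin/logging/system.logging.fluentd.crt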
TASK [openshift_logging_fluentd : Generate logging-fluentd daemonset definition] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:154
ok: [openshift] => {
"changed": false,
"checksum": "3d22736cd2f7d6f32c9aff1922f56b8d588ca2db",
"dest": "/tmp/openshift-logging-ansible-93vigU/templates/logging-fluentd.yaml",
"gid": 0,
"group": "root",
"md5sum": "6c0a7f6ddcbde86e304d77e889833a87",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 3414,
"src": "/root/.ansible/tmp/ansible-tmp-1496937692.0-213176700497659/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_fluentd : Set logging-fluentd daemonset] ***************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:172
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get daemonset logging-fluentd -o json -n logging",
"results": [
{
"apiVersion": "extensions/v1beta1",
"kind": "DaemonSet",
"metadata": {
"creationTimestamp": "2017-06-08T16:01:33Z",
"generation": 1,
"labels": {
"component": "fluentd",
"logging-infra": "fluentd",
"provider": "openshift"
},
"name": "logging-fluentd",
"namespace": "logging",
"resourceVersion": "1738",
"selfLink": "/apis/extensions/v1beta1/namespaces/logging/daemonsets/logging-fluentd",
"uid": "be081477-4c63-11e7-94aa-0e1649350dc2"
},
"spec": {
"selector": {
"matchLabels": {
"component": "fluentd",
"provider": "openshift"
}
},
"template": {
"metadata": {
"creationTimestamp": null,
"labels": {
"component": "fluentd",
"logging-infra": "fluentd",
"provider": "openshift"
},
"name": "fluentd-elasticsearch"
},
"spec": {
"containers": [
{
"env": [
{
"name": "K8S_HOST_URL",
"value": "https://kubernetes.default.svc.cluster.local"
},
{
"name": "ES_HOST",
"value": "logging-es"
},
{
"name": "ES_PORT",
"value": "9200"
},
{
"name": "ES_CLIENT_CERT",
"value": "/etc/fluent/keys/cert"
},
{
"name": "ES_CLIENT_KEY",
"value": "/etc/fluent/keys/key"
},
{
"name": "ES_CA",
"value": "/etc/fluent/keys/ca"
},
{
"name": "OPS_HOST",
"value": "logging-es-ops"
},
{
"name": "OPS_PORT",
"value": "9200"
},
{
"name": "OPS_CLIENT_CERT",
"value": "/etc/fluent/keys/cert"
},
{
"name": "OPS_CLIENT_KEY",
"value": "/etc/fluent/keys/key"
},
{
"name": "OPS_CA",
"value": "/etc/fluent/keys/ca"
},
{
"name": "ES_COPY",
"value": "false"
},
{
"name": "USE_JOURNAL",
"value": "true"
},
{
"name": "JOURNAL_SOURCE"
},
{
"name": "JOURNAL_READ_FROM_HEAD",
"value": "false"
}
],
"image": "172.30.255.47:5000/logging/logging-fluentd:latest",
"imagePullPolicy": "Always",
"name": "fluentd-elasticsearch",
"resources": {
"limits": {
"cpu": "100m",
"memory": "512Mi"
}
},
"securityContext": {
"privileged": true
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/run/log/journal",
"name": "runlogjournal"
},
{
"mountPath": "/var/log",
"name": "varlog"
},
{
"mountPath": "/var/lib/docker/containers",
"name": "varlibdockercontainers",
"readOnly": true
},
{
"mountPath": "/etc/fluent/configs.d/user",
"name": "config",
"readOnly": true
},
{
"mountPath": "/etc/fluent/keys",
"name": "certs",
"readOnly": true
},
{
"mountPath": "/etc/docker-hostname",
"name": "dockerhostname",
"readOnly": true
},
{
"mountPath": "/etc/localtime",
"name": "localtime",
"readOnly": true
},
{
"mountPath": "/etc/sysconfig/docker",
"name": "dockercfg",
"readOnly": true
},
{
"mountPath": "/etc/docker",
"name": "dockerdaemoncfg",
"readOnly": true
}
]
}
],
"dnsPolicy": "ClusterFirst",
"nodeSelector": {
"logging-infra-fluentd": "true"
},
"restartPolicy": "Always",
"schedulerName": "default-scheduler",
"securityContext": {},
"serviceAccount": "aggregated-logging-fluentd",
"serviceAccountName": "aggregated-logging-fluentd",
"terminationGracePeriodSeconds": 30,
"volumes": [
{
"hostPath": {
"path": "/run/log/journal"
},
"name": "runlogjournal"
},
{
"hostPath": {
"path": "/var/log"
},
"name": "varlog"
},
{
"hostPath": {
"path": "/var/lib/docker/containers"
},
"name": "varlibdockercontainers"
},
{
"configMap": {
"defaultMode": 420,
"name": "logging-fluentd"
},
"name": "config"
},
{
"name": "certs",
"secret": {
"defaultMode": 420,
"secretName": "logging-fluentd"
}
},
{
"hostPath": {
"path": "/etc/hostname"
},
"name": "dockerhostname"
},
{
"hostPath": {
"path": "/etc/localtime"
},
"name": "localtime"
},
{
"hostPath": {
"path": "/etc/sysconfig/docker"
},
"name": "dockercfg"
},
{
"hostPath": {
"path": "/etc/docker"
},
"name": "dockerdaemoncfg"
}
]
}
},
"templateGeneration": 1,
"updateStrategy": {
"rollingUpdate": {
"maxUnavailable": 1
},
"type": "RollingUpdate"
}
},
"status": {
"currentNumberScheduled": 0,
"desiredNumberScheduled": 0,
"numberMisscheduled": 0,
"numberReady": 0,
"observedGeneration": 1
}
}
],
"returncode": 0
},
"state": "present"
}
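In the daemonset above, spec.template.spec.nodeSelector is logging-infra-fluentd=true, which is why the status block still reports desiredNumberScheduled: 0 -- no node carries that label yet; the role labels the node a few tasks further down. A minimal sketch for watching the daemonset pick nodes up:
  oc get daemonset logging-fluentd -n logging             # DESIRED/CURRENT counts
  oc get nodes -l logging-infra-fluentd=true              # nodes matched by the selector
  oc get pods -l component=fluentd -n logging -o wide     # one fluentd pod per matched node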
TASK [openshift_logging_fluentd : Retrieve list of Fluentd hosts] **************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:183
ok: [openshift] => {
"changed": false,
"results": {
"cmd": "/bin/oc get node -o json -n default",
"results": [
{
"apiVersion": "v1",
"items": [
{
"apiVersion": "v1",
"kind": "Node",
"metadata": {
"annotations": {
"volumes.kubernetes.io/controller-managed-attach-detach": "true"
},
"creationTimestamp": "2017-06-08T15:38:33Z",
"labels": {
"beta.kubernetes.io/arch": "amd64",
"beta.kubernetes.io/os": "linux",
"kubernetes.io/hostname": "172.18.3.237"
},
"name": "172.18.3.237",
"namespace": "",
"resourceVersion": "1705",
"selfLink": "/api/v1/nodes/172.18.3.237",
"uid": "8768a5bc-4c60-11e7-94aa-0e1649350dc2"
},
"spec": {
"externalID": "172.18.3.237",
"providerID": "aws:////i-06878b3e9e9644cee"
},
"status": {
"addresses": [
{
"address": "172.18.3.237",
"type": "LegacyHostIP"
},
{
"address": "172.18.3.237",
"type": "InternalIP"
},
{
"address": "172.18.3.237",
"type": "Hostname"
}
],
"allocatable": {
"cpu": "4",
"memory": "7129288Ki",
"pods": "40"
},
"capacity": {
"cpu": "4",
"memory": "7231688Ki",
"pods": "40"
},
"conditions": [
{
"lastHeartbeatTime": "2017-06-08T16:01:28Z",
"lastTransitionTime": "2017-06-08T15:38:33Z",
"message": "kubelet has sufficient disk space available",
"reason": "KubeletHasSufficientDisk",
"status": "False",
"type": "OutOfDisk"
},
{
"lastHeartbeatTime": "2017-06-08T16:01:28Z",
"lastTransitionTime": "2017-06-08T15:38:33Z",
"message": "kubelet has sufficient memory available",
"reason": "KubeletHasSufficientMemory",
"status": "False",
"type": "MemoryPressure"
},
{
"lastHeartbeatTime": "2017-06-08T16:01:28Z",
"lastTransitionTime": "2017-06-08T15:38:33Z",
"message": "kubelet has no disk pressure",
"reason": "KubeletHasNoDiskPressure",
"status": "False",
"type": "DiskPressure"
},
{
"lastHeartbeatTime": "2017-06-08T16:01:28Z",
"lastTransitionTime": "2017-06-08T15:38:33Z",
"message": "kubelet is posting ready status",
"reason": "KubeletReady",
"status": "True",
"type": "Ready"
}
],
"daemonEndpoints": {
"kubeletEndpoint": {
"Port": 10250
}
},
"images": [
{
"names": [
"openshift/origin-federation:6acabdc",
"openshift/origin-federation:latest"
],
"sizeBytes": 1205885664
},
{
"names": [
"docker.io/openshift/origin-docker-registry@sha256:54f022c67562440fb5cc73421f32624747cd7836d45b9bb1f3e144eec437be12",
"docker.io/openshift/origin-docker-registry:latest"
],
"sizeBytes": 1100553091
},
{
"names": [
"openshift/origin-docker-registry:latest"
],
"sizeBytes": 1100164272
},
{
"names": [
"openshift/node:6acabdc",
"openshift/node:latest"
],
"sizeBytes": 1051721928
},
{
"names": [
"openshift/origin-haproxy-router:6acabdc",
"openshift/origin-haproxy-router:latest"
],
"sizeBytes": 1022758742
},
{
"names": [
"openshift/origin-docker-builder:latest"
],
"sizeBytes": 1001728427
},
{
"names": [
"openshift/origin-recycler:6acabdc",
"openshift/origin-recycler:latest"
],
"sizeBytes": 1001728427
},
{
"names": [
"openshift/origin-deployer:6acabdc",
"openshift/origin-deployer:latest"
],
"sizeBytes": 1001728427
},
{
"names": [
"openshift/origin:6acabdc",
"openshift/origin:latest"
],
"sizeBytes": 1001728427
},
{
"names": [
"openshift/origin-cluster-capacity:6acabdc",
"openshift/origin-cluster-capacity:latest"
],
"sizeBytes": 962455026
},
{
"names": [
"openshift/dind-master:latest"
],
"sizeBytes": 731456758
},
{
"names": [
"openshift/dind-node:latest"
],
"sizeBytes": 731453034
},
{
"names": [
"172.30.255.47:5000/logging/logging-auth-proxy@sha256:6ae7f9f4986fcacfc0397cb05b2c52234dfc214e82c1dfb3ed04fee27e471935",
"172.30.255.47:5000/logging/logging-auth-proxy:latest"
],
"sizeBytes": 715535980
},
{
"names": [
"<none>@<none>",
"<none>:<none>"
],
"sizeBytes": 709532011
},
{
"names": [
"docker.io/node@sha256:46db0dd19955beb87b841c30a6b9812ba626473283e84117d1c016deee5949a9",
"docker.io/node:0.10.36"
],
"sizeBytes": 697128386
},
{
"names": [
"docker.io/openshift/origin-logging-kibana@sha256:70ead525ed596b73301e8df3ac229e33dd7f8431ec1233b37e96544c556530e9",
"docker.io/openshift/origin-logging-kibana:latest"
],
"sizeBytes": 682851528
},
{
"names": [
"172.30.255.47:5000/logging/logging-kibana@sha256:56e34b1a2e934ab614299d4818546e7b0ad61fb03987188c80c880c86a59577a",
"172.30.255.47:5000/logging/logging-kibana:latest"
],
"sizeBytes": 682851513
},
{
"names": [
"openshift/dind:latest"
],
"sizeBytes": 640650210
},
{
"names": [
"172.30.255.47:5000/logging/logging-elasticsearch@sha256:a5bceb422ca90819dc7ac8847a8908fa1e033257961e387fb6e0967f9756bb7f",
"172.30.255.47:5000/logging/logging-elasticsearch:latest"
],
"sizeBytes": 623379762
},
{
"names": [
"172.30.255.47:5000/logging/logging-fluentd@sha256:e79d465f5fceb5b2fc70e3a3bf3d75dfb6d276919035f112cdf12f64d1653388",
"172.30.255.47:5000/logging/logging-fluentd:latest"
],
"sizeBytes": 472183180
},
{
"names": [
"172.30.255.47:5000/logging/logging-curator@sha256:4762bcdb87e470fc1246e0e2f320e5aa27bc077593da9c59f6dab4562f073fdf",
"172.30.255.47:5000/logging/logging-curator:latest"
],
"sizeBytes": 418288220
},
{
"names": [
"docker.io/openshift/base-centos7@sha256:aea292a3bddba020cde0ee83e6a45807931eb607c164ec6a3674f67039d8cd7c",
"docker.io/openshift/base-centos7:latest"
],
"sizeBytes": 383049978
},
{
"names": [
"rhel7.2:latest"
],
"sizeBytes": 377493597
},
{
"names": [
"openshift/origin-egress-router:6acabdc",
"openshift/origin-egress-router:latest"
],
"sizeBytes": 364745713
},
{
"names": [
"openshift/origin-base:latest"
],
"sizeBytes": 363070172
},
{
"names": [
"docker.io/openshift/origin-logging-fluentd@sha256:bc70848086a50bad58a2f41e166098e8ed351bf4dbe7af83caeb7a29f35b4395",
"docker.io/openshift/origin-logging-fluentd:latest"
],
"sizeBytes": 359217371
},
{
"names": [
"docker.io/fedora@sha256:69281ddd7b2600e5f2b17f1e12d7fba25207f459204fb2d15884f8432c479136",
"docker.io/fedora:25"
],
"sizeBytes": 230864375
},
{
"names": [
"docker.io/openshift/origin-logging-curator@sha256:e820338ca7fb0addfaec25d80d40a49f5ea25b24ff056ab6adbb42dd9eec94b4",
"docker.io/openshift/origin-logging-curator:latest"
],
"sizeBytes": 224977691
},
{
"names": [
"rhel7.3:latest",
"rhel7:latest"
],
"sizeBytes": 219121266
},
{
"names": [
"openshift/origin-pod:6acabdc",
"openshift/origin-pod:latest"
],
"sizeBytes": 213199843
},
{
"names": [
"registry.access.redhat.com/rhel7.2@sha256:98e6ca5d226c26e31a95cd67716afe22833c943e1926a21daf1a030906a02249",
"registry.access.redhat.com/rhel7.2:latest"
],
"sizeBytes": 201376319
},
{
"names": [
"registry.access.redhat.com/rhel7.3@sha256:1e232401d8e0ba53b36b757b4712fbcbd1dab9c21db039c45a84871a74e89e68",
"registry.access.redhat.com/rhel7.3:latest"
],
"sizeBytes": 192693772
},
{
"names": [
"docker.io/centos@sha256:bba1de7c9d900a898e3cadbae040dfe8a633c06bc104a0df76ae24483e03c077"
],
"sizeBytes": 192548999
},
{
"names": [
"openshift/origin-source:latest"
],
"sizeBytes": 192548894
},
{
"names": [
"docker.io/centos@sha256:aebf12af704307dfa0079b3babdca8d7e8ff6564696882bcb5d11f1d461f9ee9",
"docker.io/centos:7",
"docker.io/centos:centos7"
],
"sizeBytes": 192548537
},
{
"names": [
"registry.access.redhat.com/rhel7.1@sha256:1bc5a4c43bbb29a5a96a61896ff696933be3502e2f5fdc4cde02d9e101731fdd",
"registry.access.redhat.com/rhel7.1:latest"
],
"sizeBytes": 158229901
}
],
"nodeInfo": {
"architecture": "amd64",
"bootID": "c2cfc298-5593-4726-958d-742f09f4df0d",
"containerRuntimeVersion": "docker://1.12.6",
"kernelVersion": "3.10.0-327.22.2.el7.x86_64",
"kubeProxyVersion": "v1.6.1+5115d708d7",
"kubeletVersion": "v1.6.1+5115d708d7",
"machineID": "f9370ed252a14f73b014c1301a9b6d1b",
"operatingSystem": "linux",
"osImage": "Red Hat Enterprise Linux Server 7.3 (Maipo)",
"systemUUID": "EC2388AB-03B5-9846-ECC4-052DA3A164CF"
}
}
}
],
"kind": "List",
"metadata": {},
"resourceVersion": "",
"selfLink": ""
}
],
"returncode": 0
},
"state": "list"
}
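The node listing above is gathered only so the next task can derive openshift_logging_fluentd_hosts from it. Pulling just the node names by hand is shorter; a sketch:
  oc get nodes -o name
  # or, mirroring the JSON walk through items[].metadata.name:
  oc get nodes -o jsonpath='{.items[*].metadata.name}'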
TASK [openshift_logging_fluentd : Set openshift_logging_fluentd_hosts] *********
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:190
ok: [openshift] => {
"ansible_facts": {
"openshift_logging_fluentd_hosts": [
"172.18.3.237"
]
},
"changed": false
}
TASK [openshift_logging_fluentd : include] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:195
included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/label_and_wait.yaml for openshift
TASK [openshift_logging_fluentd : Label 172.18.3.237 for Fluentd deployment] ***
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/label_and_wait.yaml:2
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc label node 172.18.3.237 logging-infra-fluentd=true --overwrite",
"results": "",
"returncode": 0
},
"state": "add"
}
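Labelling the node is what finally lets the fluentd daemonset schedule a pod there. The same step by hand, including how the label would be removed again (node name taken from the task above):
  oc label node 172.18.3.237 logging-infra-fluentd=true --overwrite
  oc label node 172.18.3.237 logging-infra-fluentd-       # trailing '-' removes the label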
TASK [openshift_logging_fluentd : command] *************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/label_and_wait.yaml:10
changed: [openshift -> 127.0.0.1] => {
"changed": true,
"cmd": [
"sleep",
"0.5"
],
"delta": "0:00:00.502260",
"end": "2017-06-08 12:01:35.810608",
"rc": 0,
"start": "2017-06-08 12:01:35.308348"
}
TASK [openshift_logging_fluentd : Delete temp directory] ***********************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:202
ok: [openshift] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-93vigU",
"state": "absent"
}
TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:253
included: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/update_master_config.yaml for openshift
TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/main.yaml:36
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Cleaning up local temp dir] **************************
task path: /tmp/tmp.giX4WBVEkB/openhift-ansible/roles/openshift_logging/tasks/main.yaml:40
ok: [openshift -> 127.0.0.1] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-BE0YYi",
"state": "absent"
}
META: ran handlers
META: ran handlers
PLAY [Update Master configs] ***************************************************
skipping: no hosts matched
PLAY RECAP *********************************************************************
localhost                  : ok=2    changed=0    unreachable=0    failed=0
openshift                  : ok=207  changed=70   unreachable=0    failed=0
/data/src/github.com/openshift/origin-aggregated-logging
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:170: executing 'oc get pods -l component=es' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.291s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:170: executing 'oc get pods -l component=es' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME                                      READY     STATUS    RESTARTS   AGE
logging-es-data-master-9s2p0i8l-1-4pfln   1/1       Running   0          1m
There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:171: executing 'oc get pods -l component=kibana' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.232s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:171: executing 'oc get pods -l component=kibana' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME                     READY     STATUS    RESTARTS   AGE
logging-kibana-1-btlws   2/2       Running   0          36s
There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:172: executing 'oc get pods -l component=curator' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 13.883s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:172: executing 'oc get pods -l component=curator' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          9s
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          10s
... repeated 2 times
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          11s
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          12s
... repeated 3 times
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          13s
... repeated 2 times
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          14s
... repeated 2 times
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          15s
... repeated 2 times
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          16s
... repeated 3 times
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          17s
... repeated 2 times
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          18s
... repeated 2 times
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          19s
... repeated 2 times
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          20s
... repeated 2 times
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          21s
... repeated 2 times
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   0/1       ContainerCreating   0          22s
... repeated 2 times
NAME                      READY     STATUS              RESTARTS   AGE
logging-curator-1-f45s0   1/1       Running             0          23s
Standard error from the command:
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:175: executing 'oc get pods -l component=es-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.270s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:175: executing 'oc get pods -l component=es-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME                                          READY     STATUS    RESTARTS   AGE
logging-es-ops-data-master-obwim1kt-1-5j1nm   1/1       Running   0          1m
There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:176: executing 'oc get pods -l component=kibana-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.267s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:176: executing 'oc get pods -l component=kibana-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME                         READY     STATUS    RESTARTS   AGE
logging-kibana-ops-1-8g75g   1/2       Running   0          30s
There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:177: executing 'oc get pods -l component=curator-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 1.120s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:177: executing 'oc get pods -l component=curator-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME                          READY     STATUS              RESTARTS   AGE
logging-curator-ops-1-38441   0/1       ContainerCreating   0          13s
... repeated 2 times
NAME                          READY     STATUS              RESTARTS   AGE
logging-curator-ops-1-38441   1/1       Running             0          14s
Standard error from the command:
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:185: executing 'oc project logging > /dev/null' expecting success...
SUCCESS after 0.267s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:185: executing 'oc project logging > /dev/null' expecting success
There was no output from the command.
There was no error output from the command.
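Each of the checks above re-runs `oc get pods` with a component label until the output contains 'Running' or the 180.000s budget is exhausted. A plain-shell sketch of the same polling idiom (the label value is just an example):
  until oc get pods -l component=es 2>/dev/null | grep -q Running; do
      sleep 0.2
  done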
/data/src/github.com/openshift/origin-aggregated-logging/hack/testing /data/src/github.com/openshift/origin-aggregated-logging
--> Deploying template "logging/logging-fluentd-template-maker" for "-" to project logging
logging-fluentd-template-maker
---------
Template to create template for fluentd
* With parameters:
* MASTER_URL=https://kubernetes.default.svc.cluster.local
* ES_HOST=logging-es
* ES_PORT=9200
* ES_CLIENT_CERT=/etc/fluent/keys/cert
* ES_CLIENT_KEY=/etc/fluent/keys/key
* ES_CA=/etc/fluent/keys/ca
* OPS_HOST=logging-es-ops
* OPS_PORT=9200
* OPS_CLIENT_CERT=/etc/fluent/keys/cert
* OPS_CLIENT_KEY=/etc/fluent/keys/key
* OPS_CA=/etc/fluent/keys/ca
* ES_COPY=false
* ES_COPY_HOST=
* ES_COPY_PORT=
* ES_COPY_SCHEME=https
* ES_COPY_CLIENT_CERT=
* ES_COPY_CLIENT_KEY=
* ES_COPY_CA=
* ES_COPY_USERNAME=
* ES_COPY_PASSWORD=
* OPS_COPY_HOST=
* OPS_COPY_PORT=
* OPS_COPY_SCHEME=https
* OPS_COPY_CLIENT_CERT=
* OPS_COPY_CLIENT_KEY=
* OPS_COPY_CA=
* OPS_COPY_USERNAME=
* OPS_COPY_PASSWORD=
* IMAGE_PREFIX_DEFAULT=172.30.255.47:5000/logging/
* IMAGE_VERSION_DEFAULT=latest
* USE_JOURNAL=
* JOURNAL_SOURCE=
* JOURNAL_READ_FROM_HEAD=false
* USE_MUX=false
* USE_MUX_CLIENT=false
* MUX_ALLOW_EXTERNAL=false
* BUFFER_QUEUE_LIMIT=1024
* BUFFER_SIZE_LIMIT=16777216
--> Creating resources ...
template "logging-fluentd-template" created
--> Success
Run 'oc status' to view your app.
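The output above only registers logging-fluentd-template in the project; instantiating it is a separate process-and-create step. A rough sketch, with parameter names assumed from the maker's parameter list (they may not all be exposed by the generated template):
  oc process logging-fluentd-template \
      -p ES_HOST=logging-es -p OPS_HOST=logging-es-ops \
    | oc create -f -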
WARNING: bridge-nf-call-ip6tables is disabled
Error: timed out waiting for /var/log/journal.pos - check Fluentd pod log
[ERROR] PID 4246: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:615: `return 1` exited with status 1.
[INFO] Stack Trace:
[INFO] 1: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:615: `return 1`
[INFO] Exiting with code 1.
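The failure is the harness timing out while waiting for Fluentd to write its journald position file (/var/log/journal.pos), which usually means the fluentd pod never got far enough to start reading the journal. First debugging steps, sketched with the labels and namespace used earlier (the pod name is looked up, not assumed):
  oc get pods -l component=fluentd -n logging -o wide     # scheduled on the labeled node? Running?
  fpod=$(oc get pods -l component=fluentd -n logging -o name | head -1)
  oc describe "$fpod" -n logging                          # image pulls, mounts, scheduling events
  oc logs "$fpod" -n logging                              # journald read errors show up here
  ls -l /var/log/journal.pos                              # on the node itself, once fluentd is reading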
/data/src/github.com/openshift/origin-aggregated-logging/hack/lib/log/system.sh: line 31: 4649 Terminated sar -A -o "${binary_logfile}" 1 86400 > /dev/null 2> "${stderr_logfile}" (wd: /data/src/github.com/openshift/origin-aggregated-logging)
[INFO] [CLEANUP] Beginning cleanup routines...
[INFO] [CLEANUP] Dumping cluster events to /tmp/origin-aggregated-logging/artifacts/events.txt
[INFO] [CLEANUP] Dumping etcd contents to /tmp/origin-aggregated-logging/artifacts/etcd
[WARNING] No compiled `etcdhelper` binary was found. Attempting to build one using:
[WARNING] $ hack/build-go.sh tools/etcdhelper
Error while running ssh/sudo command:
set -e
pushd /data/src/github.com/openshift//origin-aggregated-logging/hack/testing >/dev/null
export PATH=$GOPATH/bin:$PATH
echo '***************************************************'
echo 'Running GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh...'
time GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh
echo 'Finished GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh'
echo '***************************************************'
popd >/dev/null
SIGTERM
==> openshiftdev: Downloading logs
Build was aborted
Aborted by Rich Megginson
Publish artifacts to S3 Bucket
Skipping publishing on S3 because build aborted
[description-setter] Could not determine description.
[PostBuildScript] - Execution post build scripts.
[workspace] $ /bin/sh -xe /tmp/hudson25446529649546822.sh
+ INSTANCE_NAME=origin_logging-rhel7-1628
+ pushd origin
~/jobs/test-origin-aggregated-logging/workspace/origin ~/jobs/test-origin-aggregated-logging/workspace
+ rc=0
+ '[' -f .vagrant-openshift.json ']'
++ /usr/bin/vagrant ssh -c 'sudo ausearch -m avc'
==> openshiftdev: Downloading artifacts from '/var/log/yum.log' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin/artifacts/yum.log'
==> openshiftdev: Downloading artifacts from '/var/log/secure' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin/artifacts/secure'
==> openshiftdev: Downloading artifacts from '/var/log/audit/audit.log' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin/artifacts/audit.log'
==> openshiftdev: Downloading artifacts from '/tmp/origin-aggregated-logging/' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin/artifacts'
+ ausearchresult='<no matches>'
+ rc=1
+ '[' '<no matches>' = '<no matches>' ']'
+ rc=0
+ /usr/bin/vagrant destroy -f
==> openshiftdev: Terminating the instance...
==> openshiftdev: Running cleanup tasks for 'shell' provisioner...
+ popd
~/jobs/test-origin-aggregated-logging/workspace
+ exit 0
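The post-build script above treats the literal string '<no matches>' from `ausearch -m avc` as a passing SELinux check. The same pattern as a standalone sketch:
  ausearchresult=$(sudo ausearch -m avc 2>&1 || true)
  if [ "$ausearchresult" = '<no matches>' ]; then
      echo "no AVC denials recorded"
  else
      echo "$ausearchresult"          # denials found; fail or report as needed
  fi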
[BFA] Scanning build for known causes...
[BFA] Found failure cause(s):
[BFA] Command Failure from category failure
[BFA] Done. 0s
Finished: ABORTED