Failed
Console Output

Started by upstream project "merge_pull_request_origin_aggregated_logging" build number 24
originally caused by:
 Started by remote host 50.17.198.52
[EnvInject] - Loading node environment variables.
Building in workspace /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content 
OS_ROOT=/data/src/github.com/openshift/origin
INSTANCE_TYPE=c4.xlarge
GITHUB_REPO=openshift
OS=rhel7
TESTNAME=logging

[EnvInject] - Variables injected successfully.
[workspace] $ /bin/sh -xe /tmp/hudson8222972581909836687.sh
+ false
+ unset GOPATH
+ REPO_NAME=origin-aggregated-logging
+ rm -rf origin-aggregated-logging
+ vagrant origin-local-checkout --replace --repo origin-aggregated-logging -b master
You don't seem to have the GOPATH environment variable set on your system.
See: 'go help gopath' for more details about GOPATH.
Waiting for the cloning process to finish
Cloning origin-aggregated-logging ...
Submodule 'deployer/common' (https://github.com/openshift/origin-integration-common) registered for path 'deployer/common'
Submodule 'kibana-proxy' (https://github.com/fabric8io/openshift-auth-proxy.git) registered for path 'kibana-proxy'
Cloning into 'deployer/common'...
Submodule path 'deployer/common': checked out '45bf993212cdcbab5cbce3b3fab74a72b851402e'
Cloning into 'kibana-proxy'...
Submodule path 'kibana-proxy': checked out '118dfb40f7a8082d370ba7f4805255c9ec7c8178'
Origin repositories cloned into /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace
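
A rough shell equivalent of what `vagrant origin-local-checkout` just did (a sketch; the subcommand comes from the vagrant-openshift plugin, so the exact steps below are assumptions inferred from the output above):

    # Fresh clone of the repo with its two submodules, on branch master
    rm -rf origin-aggregated-logging
    git clone --recurse-submodules -b master \
        https://github.com/openshift/origin-aggregated-logging origin-aggregated-logging
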
+ pushd origin-aggregated-logging
~/jobs/test-origin-aggregated-logging/workspace/origin-aggregated-logging ~/jobs/test-origin-aggregated-logging/workspace
+ git checkout master
Already on 'master'
+ popd
~/jobs/test-origin-aggregated-logging/workspace
+ '[' -n '' ']'
+ vagrant origin-local-checkout --replace
You don't seem to have the GOPATH environment variable set on your system.
See: 'go help gopath' for more details about GOPATH.
Waiting for the cloning process to finish
Checking repo integrity for /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin
~/jobs/test-origin-aggregated-logging/workspace/origin ~/jobs/test-origin-aggregated-logging/workspace
# On branch master
# Untracked files:
#   (use "git add <file>..." to include in what will be committed)
#
#	artifacts/
nothing added to commit but untracked files present (use "git add" to track)
~/jobs/test-origin-aggregated-logging/workspace
Replacing: /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin
~/jobs/test-origin-aggregated-logging/workspace/origin ~/jobs/test-origin-aggregated-logging/workspace
From https://github.com/openshift/origin
   2458531..cc2ed8f  master     -> origin/master
Already on 'master'
Your branch is behind 'origin/master' by 2 commits, and can be fast-forwarded.
  (use "git pull" to update your local branch)
HEAD is now at cc2ed8f Merge pull request #13586 from danwinship/egress-router-proxy
Removing .vagrant-openshift.json
Removing .vagrant/
Removing artifacts/
fatal: branch name required
~/jobs/test-origin-aggregated-logging/workspace
Origin repositories cloned into /var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace
+ pushd origin
~/jobs/test-origin-aggregated-logging/workspace/origin ~/jobs/test-origin-aggregated-logging/workspace
+ INSTANCE_NAME=origin_logging-rhel7-1652
+ GIT_URL=https://github.com/openshift/origin-aggregated-logging
++ echo https://github.com/openshift/origin-aggregated-logging
++ sed s,https://,,
+ OAL_LOCAL_PATH=github.com/openshift/origin-aggregated-logging
+ OS_O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging
+ env
+ sort
_=/bin/env
BRANCH=master
BUILD_CAUSE=UPSTREAMTRIGGER
BUILD_CAUSE_UPSTREAMTRIGGER=true
BUILD_DISPLAY_NAME=#1652
BUILD_ID=1652
BUILD_NUMBER=1652
BUILD_TAG=jenkins-test-origin-aggregated-logging-1652
BUILD_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/1652/
EXECUTOR_NUMBER=92
GITHUB_REPO=openshift
HOME=/var/lib/jenkins
HUDSON_COOKIE=7fbf95ce-0c08-4510-927a-57484115c25e
HUDSON_HOME=/var/lib/jenkins
HUDSON_SERVER_COOKIE=ec11f8b2841c966f
HUDSON_URL=https://ci.openshift.redhat.com/jenkins/
INSTANCE_TYPE=c4.xlarge
JENKINS_HOME=/var/lib/jenkins
JENKINS_SERVER_COOKIE=ec11f8b2841c966f
JENKINS_URL=https://ci.openshift.redhat.com/jenkins/
JOB_BASE_NAME=test-origin-aggregated-logging
JOB_DISPLAY_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/display/redirect
JOB_NAME=test-origin-aggregated-logging
JOB_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/
LANG=en_US.UTF-8
LOGNAME=jenkins
MERGE=false
MERGE_SEVERITY=none
NLSPATH=/usr/dt/lib/nls/msg/%L/%N.cat
NODE_LABELS=master
NODE_NAME=master
OLDPWD=/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace
OPENSHIFT_ANSIBLE_TARGET_BRANCH=master
ORIGIN_AGGREGATED_LOGGING_PULL_ID=421
ORIGIN_AGGREGATED_LOGGING_TARGET_BRANCH=master
OS_ANSIBLE_BRANCH=master
OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible 
OS=rhel7
OS_ROOT=/data/src/github.com/openshift/origin
PATH=/sbin:/usr/sbin:/bin:/usr/bin
PWD=/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin
ROOT_BUILD_CAUSE=REMOTECAUSE
ROOT_BUILD_CAUSE_REMOTECAUSE=true
RUN_CHANGES_DISPLAY_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/1652/display/redirect?page=changes
RUN_DISPLAY_URL=https://ci.openshift.redhat.com/jenkins/job/test-origin-aggregated-logging/1652/display/redirect
SHELL=/bin/bash
SHLVL=3
TESTNAME=logging
TEST_PERF=false
USER=jenkins
WORKSPACE=/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace
XFILESEARCHPATH=/usr/dt/app-defaults/%L/Dt
+ vagrant origin-init --stage inst --os rhel7 --instance-type c4.xlarge origin_logging-rhel7-1652
Reading AWS credentials from /var/lib/jenkins/.awscred
Searching devenv-rhel7_* for latest base AMI (required_name_tag=)
Found: ami-83a1fc95 (devenv-rhel7_6323)
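
The AMI lookup above can be reproduced by hand with the AWS CLI (a sketch, assuming the plugin simply picks the newest image whose name matches the pattern):

    # Newest AMI whose name matches devenv-rhel7_*
    aws ec2 describe-images \
        --filters "Name=name,Values=devenv-rhel7_*" \
        --query 'sort_by(Images, &CreationDate)[-1].[ImageId,Name]' \
        --output text
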
++ seq 0 2
+ for i in '$(seq 0 2)'
+ vagrant up --provider aws
Bringing machine 'openshiftdev' up with 'aws' provider...
==> openshiftdev: Warning! The AWS provider doesn't support any of the Vagrant
==> openshiftdev: high-level network configurations (`config.vm.network`). They
==> openshiftdev: will be silently ignored.
==> openshiftdev: Warning! You're launching this instance into a VPC without an
==> openshiftdev: elastic IP. Please verify you're properly connected to a VPN so
==> openshiftdev: you can access this machine, otherwise Vagrant will not be able
==> openshiftdev: to SSH into it.
==> openshiftdev: Launching an instance with the following settings...
==> openshiftdev:  -- Type: c4.xlarge
==> openshiftdev:  -- AMI: ami-83a1fc95
==> openshiftdev:  -- Region: us-east-1
==> openshiftdev:  -- Keypair: libra
==> openshiftdev:  -- Subnet ID: subnet-cf57c596
==> openshiftdev:  -- User Data: yes
==> openshiftdev:  -- User Data: 
==> openshiftdev: # cloud-config
==> openshiftdev: 
==> openshiftdev: growpart:
==> openshiftdev:   mode: auto
==> openshiftdev:   devices: ['/']
==> openshiftdev: runcmd:
==> openshiftdev: - [ sh, -xc, "sed -i s/^Defaults.*requiretty/#Defaults requiretty/g /etc/sudoers"]
==> openshiftdev:         
==> openshiftdev:  -- Block Device Mapping: [{"DeviceName"=>"/dev/sda1", "Ebs.VolumeSize"=>25, "Ebs.VolumeType"=>"gp2"}, {"DeviceName"=>"/dev/sdb", "Ebs.VolumeSize"=>35, "Ebs.VolumeType"=>"gp2"}]
==> openshiftdev:  -- Terminate On Shutdown: false
==> openshiftdev:  -- Monitoring: false
==> openshiftdev:  -- EBS optimized: false
==> openshiftdev:  -- Assigning a public IP address in a VPC: false
==> openshiftdev: Waiting for instance to become "ready"...
==> openshiftdev: Waiting for SSH to become available...
==> openshiftdev: Machine is booted and ready for use!
==> openshiftdev: Running provisioner: setup (shell)...
    openshiftdev: Running: /tmp/vagrant-shell20170609-24034-1hpbxhb.sh
==> openshiftdev: Host: ec2-54-90-235-203.compute-1.amazonaws.com
+ break
+ vagrant sync-origin-aggregated-logging -c -s
Running ssh/sudo command 'rm -rf /data/src/github.com/openshift/origin-aggregated-logging-bare; 
' with timeout 14400. Attempt #0
Running ssh/sudo command 'mkdir -p /ec2-user/.ssh;
mv /tmp/file20170609-27227-7pl83d /ec2-user/.ssh/config &&
chown ec2-user:ec2-user /ec2-user/.ssh/config &&
chmod 0600 /ec2-user/.ssh/config' with timeout 14400. Attempt #0
Running ssh/sudo command 'mkdir -p /data/src/github.com/openshift/' with timeout 14400. Attempt #0
Running ssh/sudo command 'mkdir -p /data/src/github.com/openshift/builder && chown -R ec2-user:ec2-user /data/src/github.com/openshift/' with timeout 14400. Attempt #0
Running ssh/sudo command 'set -e
rm -fr /data/src/github.com/openshift/origin-aggregated-logging-bare;

if [ ! -d /data/src/github.com/openshift/origin-aggregated-logging-bare ]; then
git clone --quiet --bare https://github.com/openshift/origin-aggregated-logging.git /data/src/github.com/openshift/origin-aggregated-logging-bare >/dev/null
fi
' with timeout 14400. Attempt #0
Synchronizing local sources
Synchronizing [origin-aggregated-logging@master] from origin-aggregated-logging...
Warning: Permanently added '54.90.235.203' (ECDSA) to the list of known hosts.
Running ssh/sudo command 'set -e

if [ -d /data/src/github.com/openshift/origin-aggregated-logging-bare ]; then
rm -rf /data/src/github.com/openshift/origin-aggregated-logging
echo 'Cloning origin-aggregated-logging ...'
git clone --quiet --recurse-submodules /data/src/github.com/openshift/origin-aggregated-logging-bare /data/src/github.com/openshift/origin-aggregated-logging

else
MISSING_REPO+='origin-aggregated-logging-bare'
fi

if [ -n "$MISSING_REPO" ]; then
echo 'Missing required upstream repositories:'
echo $MISSING_REPO
echo 'To fix, execute command: vagrant clone-upstream-repos'
fi
' with timeout 14400. Attempt #0
Cloning origin-aggregated-logging ...
Submodule 'deployer/common' (https://github.com/openshift/origin-integration-common) registered for path 'deployer/common'
Submodule 'kibana-proxy' (https://github.com/fabric8io/openshift-auth-proxy.git) registered for path 'kibana-proxy'
Cloning into 'deployer/common'...
Submodule path 'deployer/common': checked out '45bf993212cdcbab5cbce3b3fab74a72b851402e'
Cloning into 'kibana-proxy'...
Submodule path 'kibana-proxy': checked out '118dfb40f7a8082d370ba7f4805255c9ec7c8178'
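
The sync step above uses a two-stage pattern: mirror the remote once into a bare repository, then make cheap local clones from it on every run. Distilled from the ssh/sudo commands in the log (paths verbatim, structure lightly condensed):

    BARE=/data/src/github.com/openshift/origin-aggregated-logging-bare
    WORK=/data/src/github.com/openshift/origin-aggregated-logging
    # One-time bare mirror of upstream
    [ -d "$BARE" ] || git clone --quiet --bare \
        https://github.com/openshift/origin-aggregated-logging.git "$BARE"
    # Fast local clone (plus submodules) for this test run
    rm -rf "$WORK"
    git clone --quiet --recurse-submodules "$BARE" "$WORK"
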
+ vagrant ssh -c 'if [ ! -d /tmp/openshift ] ; then mkdir /tmp/openshift ; fi ; sudo chmod 777 /tmp/openshift'
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/base-centos7 ...
pulling image openshift/base-centos7 ...
+ vagrant ssh -c 'docker pull openshift/base-centos7' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/base-centos7 ... 
latest: Pulling from docker.io/openshift/base-centos7
45a2e645736c: Pulling fs layer
734fb161cf89: Pulling fs layer
78efc9e155c4: Pulling fs layer
8a3400b7e31a: Pulling fs layer
8a3400b7e31a: Waiting
734fb161cf89: Verifying Checksum
734fb161cf89: Download complete
8a3400b7e31a: Verifying Checksum
8a3400b7e31a: Download complete
45a2e645736c: Verifying Checksum
45a2e645736c: Download complete
78efc9e155c4: Verifying Checksum
78efc9e155c4: Download complete
45a2e645736c: Pull complete
734fb161cf89: Pull complete
78efc9e155c4: Pull complete
8a3400b7e31a: Pull complete
Digest: sha256:aea292a3bddba020cde0ee83e6a45807931eb607c164ec6a3674f67039d8cd7c
+ echo done with openshift/base-centos7
done with openshift/base-centos7
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image centos:centos7 ...
pulling image centos:centos7 ...
+ vagrant ssh -c 'docker pull centos:centos7' -- -n
Trying to pull repository docker.io/library/centos ... 
centos7: Pulling from docker.io/library/centos
Digest: sha256:aebf12af704307dfa0079b3babdca8d7e8ff6564696882bcb5d11f1d461f9ee9
+ echo done with centos:centos7
done with centos:centos7
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-elasticsearch ...
pulling image openshift/origin-logging-elasticsearch ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-elasticsearch' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-elasticsearch ... 
latest: Pulling from docker.io/openshift/origin-logging-elasticsearch
d5e46245fe40: Already exists
c6633336ebec: Pulling fs layer
c8922b1f556c: Pulling fs layer
8efa1082b536: Pulling fs layer
1b2efd2f86a8: Pulling fs layer
0c23282fef8c: Pulling fs layer
1181763a1fee: Pulling fs layer
4943cf20f63b: Pulling fs layer
3ca1d3f72a67: Pulling fs layer
ff2b5d1f1edd: Pulling fs layer
f633748cd255: Pulling fs layer
1181763a1fee: Waiting
ff2b5d1f1edd: Waiting
4943cf20f63b: Waiting
f633748cd255: Waiting
3ca1d3f72a67: Waiting
1b2efd2f86a8: Waiting
0c23282fef8c: Waiting
c6633336ebec: Download complete
8efa1082b536: Download complete
1b2efd2f86a8: Verifying Checksum
1b2efd2f86a8: Download complete
0c23282fef8c: Verifying Checksum
0c23282fef8c: Download complete
1181763a1fee: Verifying Checksum
1181763a1fee: Download complete
4943cf20f63b: Verifying Checksum
4943cf20f63b: Download complete
ff2b5d1f1edd: Verifying Checksum
ff2b5d1f1edd: Download complete
f633748cd255: Download complete
3ca1d3f72a67: Verifying Checksum
3ca1d3f72a67: Download complete
c8922b1f556c: Verifying Checksum
c8922b1f556c: Download complete
c6633336ebec: Pull complete
c8922b1f556c: Pull complete
8efa1082b536: Pull complete
1b2efd2f86a8: Pull complete
0c23282fef8c: Pull complete
1181763a1fee: Pull complete
4943cf20f63b: Pull complete
3ca1d3f72a67: Pull complete
ff2b5d1f1edd: Pull complete
f633748cd255: Pull complete
Digest: sha256:6296f1719676e970438cac4d912542b35ac786c14a15df892507007c4ecbe490
+ echo done with openshift/origin-logging-elasticsearch
done with openshift/origin-logging-elasticsearch
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-fluentd ...
pulling image openshift/origin-logging-fluentd ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-fluentd' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-fluentd ... 
latest: Pulling from docker.io/openshift/origin-logging-fluentd
d5e46245fe40: Already exists
d7c28cc24dc2: Pulling fs layer
9175f9d06c1f: Pulling fs layer
91e5bb34ef30: Pulling fs layer
0c100caa1a42: Pulling fs layer
19d549a53f32: Pulling fs layer
0c100caa1a42: Waiting
19d549a53f32: Waiting
91e5bb34ef30: Download complete
0c100caa1a42: Verifying Checksum
0c100caa1a42: Download complete
19d549a53f32: Verifying Checksum
9175f9d06c1f: Verifying Checksum
9175f9d06c1f: Download complete
d7c28cc24dc2: Download complete
d7c28cc24dc2: Pull complete
9175f9d06c1f: Pull complete
91e5bb34ef30: Pull complete
0c100caa1a42: Pull complete
19d549a53f32: Pull complete
Digest: sha256:cae7c21c9f111d4f5b481c14a65c597c67e715a8ffe3aee4c483100ee77296d7
+ echo done with openshift/origin-logging-fluentd
done with openshift/origin-logging-fluentd
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-curator ...
pulling image openshift/origin-logging-curator ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-curator' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-curator ... 
latest: Pulling from docker.io/openshift/origin-logging-curator
d5e46245fe40: Already exists
73c202020d66: Pulling fs layer
b330097dd4ed: Pulling fs layer
73c202020d66: Download complete
73c202020d66: Pull complete
b330097dd4ed: Verifying Checksum
b330097dd4ed: Download complete
b330097dd4ed: Pull complete
Digest: sha256:daded10ff4e08dfb6659c964e305f16679596312da558af095835202cf66f703
+ echo done with openshift/origin-logging-curator
done with openshift/origin-logging-curator
+ for image in openshift/base-centos7 centos:centos7 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd openshift/origin-logging-curator openshift/origin-logging-kibana
+ echo pulling image openshift/origin-logging-kibana ...
pulling image openshift/origin-logging-kibana ...
+ vagrant ssh -c 'docker pull openshift/origin-logging-kibana' -- -n
Using default tag: latest
Trying to pull repository docker.io/openshift/origin-logging-kibana ... 
latest: Pulling from docker.io/openshift/origin-logging-kibana
45a2e645736c: Already exists
734fb161cf89: Already exists
78efc9e155c4: Already exists
8a3400b7e31a: Already exists
4a36f2160feb: Pulling fs layer
c746963bb35a: Pulling fs layer
1a70ab5cdf21: Pulling fs layer
41752a3d77bd: Pulling fs layer
a8fda808a3c1: Pulling fs layer
3b950177e658: Pulling fs layer
41752a3d77bd: Waiting
a8fda808a3c1: Waiting
3b950177e658: Waiting
1a70ab5cdf21: Verifying Checksum
1a70ab5cdf21: Download complete
4a36f2160feb: Verifying Checksum
4a36f2160feb: Download complete
41752a3d77bd: Verifying Checksum
41752a3d77bd: Download complete
a8fda808a3c1: Verifying Checksum
a8fda808a3c1: Download complete
4a36f2160feb: Pull complete
3b950177e658: Verifying Checksum
3b950177e658: Download complete
c746963bb35a: Verifying Checksum
c746963bb35a: Download complete
c746963bb35a: Pull complete
1a70ab5cdf21: Pull complete
41752a3d77bd: Pull complete
a8fda808a3c1: Pull complete
3b950177e658: Pull complete
Digest: sha256:950568237cc7d0ff14ea9fe22c3967d888996db70c66181421ad68caeb5ba75f
+ echo done with openshift/origin-logging-kibana
done with openshift/origin-logging-kibana
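
Collecting the xtrace lines, the whole pre-pull phase is just this loop (reconstructed from the `+` trace above, with quoting added for readability):

    for image in openshift/base-centos7 centos:centos7 \
                 openshift/origin-logging-elasticsearch openshift/origin-logging-fluentd \
                 openshift/origin-logging-curator openshift/origin-logging-kibana; do
        echo "pulling image $image ..."
        vagrant ssh -c "docker pull $image" -- -n
        echo "done with $image"
    done
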
+ vagrant test-origin-aggregated-logging -d --env GIT_URL=https://github.com/openshift/origin-aggregated-logging --env GIT_BRANCH=master --env O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging --env OS_ROOT=/data/src/github.com/openshift/origin --env ENABLE_OPS_CLUSTER=true --env USE_LOCAL_SOURCE=true --env TEST_PERF=false --env VERBOSE=1 --env OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible --env OS_ANSIBLE_BRANCH=master
***************************************************
Running GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh...
/data/src/github.com/openshift/origin /data/src/github.com/openshift/origin-aggregated-logging/hack/testing
/data/src/github.com/openshift/origin-aggregated-logging/hack/testing
/data/src/github.com/openshift/origin-aggregated-logging /data/src/github.com/openshift/origin-aggregated-logging/hack/testing
/data/src/github.com/openshift/origin-aggregated-logging/hack/testing
Loaded plugins: amazon-id, rhui-lb, search-disabled-repos
Metadata Cache Created
Loaded plugins: amazon-id, rhui-lb, search-disabled-repos
Resolving Dependencies
--> Running transaction check
---> Package ansible.noarch 0:2.3.0.0-3.el7 will be installed
--> Processing Dependency: sshpass for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-paramiko for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-keyczar for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-httplib2 for package: ansible-2.3.0.0-3.el7.noarch
--> Processing Dependency: python-crypto for package: ansible-2.3.0.0-3.el7.noarch
---> Package python2-pip.noarch 0:8.1.2-5.el7 will be installed
---> Package python2-ruamel-yaml.x86_64 0:0.12.14-9.el7 will be installed
--> Processing Dependency: python2-typing for package: python2-ruamel-yaml-0.12.14-9.el7.x86_64
--> Processing Dependency: python2-ruamel-ordereddict for package: python2-ruamel-yaml-0.12.14-9.el7.x86_64
--> Running transaction check
---> Package python-httplib2.noarch 0:0.9.1-2.el7aos will be installed
---> Package python-keyczar.noarch 0:0.71c-2.el7aos will be installed
--> Processing Dependency: python-pyasn1 for package: python-keyczar-0.71c-2.el7aos.noarch
---> Package python-paramiko.noarch 0:2.1.1-1.el7 will be installed
--> Processing Dependency: python-cryptography for package: python-paramiko-2.1.1-1.el7.noarch
---> Package python2-crypto.x86_64 0:2.6.1-13.el7 will be installed
--> Processing Dependency: libtomcrypt.so.0()(64bit) for package: python2-crypto-2.6.1-13.el7.x86_64
---> Package python2-ruamel-ordereddict.x86_64 0:0.4.9-3.el7 will be installed
---> Package python2-typing.noarch 0:3.5.2.2-3.el7 will be installed
---> Package sshpass.x86_64 0:1.06-1.el7 will be installed
--> Running transaction check
---> Package libtomcrypt.x86_64 0:1.17-23.el7 will be installed
--> Processing Dependency: libtommath >= 0.42.0 for package: libtomcrypt-1.17-23.el7.x86_64
--> Processing Dependency: libtommath.so.0()(64bit) for package: libtomcrypt-1.17-23.el7.x86_64
---> Package python2-cryptography.x86_64 0:1.3.1-3.el7 will be installed
--> Processing Dependency: python-idna >= 2.0 for package: python2-cryptography-1.3.1-3.el7.x86_64
--> Processing Dependency: python-cffi >= 1.4.1 for package: python2-cryptography-1.3.1-3.el7.x86_64
--> Processing Dependency: python-ipaddress for package: python2-cryptography-1.3.1-3.el7.x86_64
--> Processing Dependency: python-enum34 for package: python2-cryptography-1.3.1-3.el7.x86_64
---> Package python2-pyasn1.noarch 0:0.1.9-7.el7 will be installed
--> Running transaction check
---> Package libtommath.x86_64 0:0.42.0-4.el7 will be installed
---> Package python-cffi.x86_64 0:1.6.0-5.el7 will be installed
--> Processing Dependency: python-pycparser for package: python-cffi-1.6.0-5.el7.x86_64
---> Package python-enum34.noarch 0:1.0.4-1.el7 will be installed
---> Package python-idna.noarch 0:2.0-1.el7 will be installed
---> Package python-ipaddress.noarch 0:1.0.16-2.el7 will be installed
--> Running transaction check
---> Package python-pycparser.noarch 0:2.14-1.el7 will be installed
--> Processing Dependency: python-ply for package: python-pycparser-2.14-1.el7.noarch
--> Running transaction check
---> Package python-ply.noarch 0:3.4-10.el7 will be installed
--> Finished Dependency Resolution

Dependencies Resolved

================================================================================
 Package              Arch   Version        Repository                     Size
================================================================================
Installing:
 ansible              noarch 2.3.0.0-3.el7  epel                          5.7 M
 python2-pip          noarch 8.1.2-5.el7    epel                          1.7 M
 python2-ruamel-yaml  x86_64 0.12.14-9.el7  li                            245 k
Installing for dependencies:
 libtomcrypt          x86_64 1.17-23.el7    epel                          224 k
 libtommath           x86_64 0.42.0-4.el7   epel                           35 k
 python-cffi          x86_64 1.6.0-5.el7    oso-rhui-rhel-server-releases 218 k
 python-enum34        noarch 1.0.4-1.el7    oso-rhui-rhel-server-releases  52 k
 python-httplib2      noarch 0.9.1-2.el7aos li                            115 k
 python-idna          noarch 2.0-1.el7      oso-rhui-rhel-server-releases  92 k
 python-ipaddress     noarch 1.0.16-2.el7   oso-rhui-rhel-server-releases  34 k
 python-keyczar       noarch 0.71c-2.el7aos rhel-7-server-ose-3.1-rpms    217 k
 python-paramiko      noarch 2.1.1-1.el7    rhel-7-server-ose-3.4-rpms    266 k
 python-ply           noarch 3.4-10.el7     oso-rhui-rhel-server-releases 123 k
 python-pycparser     noarch 2.14-1.el7     oso-rhui-rhel-server-releases 105 k
 python2-crypto       x86_64 2.6.1-13.el7   epel                          476 k
 python2-cryptography x86_64 1.3.1-3.el7    oso-rhui-rhel-server-releases 471 k
 python2-pyasn1       noarch 0.1.9-7.el7    oso-rhui-rhel-server-releases 100 k
 python2-ruamel-ordereddict
                      x86_64 0.4.9-3.el7    li                             38 k
 python2-typing       noarch 3.5.2.2-3.el7  epel                           39 k
 sshpass              x86_64 1.06-1.el7     epel                           21 k

Transaction Summary
================================================================================
Install  3 Packages (+17 Dependent packages)

Total download size: 10 M
Installed size: 47 M
Downloading packages:
--------------------------------------------------------------------------------
Total                                              5.2 MB/s |  10 MB  00:01     
Running transaction check
Running transaction test
Transaction test succeeded
Running transaction
  Installing : python2-pyasn1-0.1.9-7.el7.noarch                           1/20 
  Installing : sshpass-1.06-1.el7.x86_64                                   2/20 
  Installing : libtommath-0.42.0-4.el7.x86_64                              3/20 
  Installing : libtomcrypt-1.17-23.el7.x86_64                              4/20 
  Installing : python2-crypto-2.6.1-13.el7.x86_64                          5/20 
  Installing : python-keyczar-0.71c-2.el7aos.noarch                        6/20 
  Installing : python-enum34-1.0.4-1.el7.noarch                            7/20 
  Installing : python-ply-3.4-10.el7.noarch                                8/20 
  Installing : python-pycparser-2.14-1.el7.noarch                          9/20 
  Installing : python-cffi-1.6.0-5.el7.x86_64                             10/20 
  Installing : python-httplib2-0.9.1-2.el7aos.noarch                      11/20 
  Installing : python-idna-2.0-1.el7.noarch                               12/20 
  Installing : python2-ruamel-ordereddict-0.4.9-3.el7.x86_64              13/20 
  Installing : python2-typing-3.5.2.2-3.el7.noarch                        14/20 
  Installing : python-ipaddress-1.0.16-2.el7.noarch                       15/20 
  Installing : python2-cryptography-1.3.1-3.el7.x86_64                    16/20 
  Installing : python-paramiko-2.1.1-1.el7.noarch                         17/20 
  Installing : ansible-2.3.0.0-3.el7.noarch                               18/20 
  Installing : python2-ruamel-yaml-0.12.14-9.el7.x86_64                   19/20 
  Installing : python2-pip-8.1.2-5.el7.noarch                             20/20 
  Verifying  : python-pycparser-2.14-1.el7.noarch                          1/20 
  Verifying  : python-ipaddress-1.0.16-2.el7.noarch                        2/20 
  Verifying  : ansible-2.3.0.0-3.el7.noarch                                3/20 
  Verifying  : python2-typing-3.5.2.2-3.el7.noarch                         4/20 
  Verifying  : python2-pip-8.1.2-5.el7.noarch                              5/20 
  Verifying  : python2-pyasn1-0.1.9-7.el7.noarch                           6/20 
  Verifying  : libtomcrypt-1.17-23.el7.x86_64                              7/20 
  Verifying  : python-cffi-1.6.0-5.el7.x86_64                              8/20 
  Verifying  : python2-ruamel-yaml-0.12.14-9.el7.x86_64                    9/20 
  Verifying  : python2-ruamel-ordereddict-0.4.9-3.el7.x86_64              10/20 
  Verifying  : python-idna-2.0-1.el7.noarch                               11/20 
  Verifying  : python-httplib2-0.9.1-2.el7aos.noarch                      12/20 
  Verifying  : python-ply-3.4-10.el7.noarch                               13/20 
  Verifying  : python-enum34-1.0.4-1.el7.noarch                           14/20 
  Verifying  : python-keyczar-0.71c-2.el7aos.noarch                       15/20 
  Verifying  : libtommath-0.42.0-4.el7.x86_64                             16/20 
  Verifying  : sshpass-1.06-1.el7.x86_64                                  17/20 
  Verifying  : python2-cryptography-1.3.1-3.el7.x86_64                    18/20 
  Verifying  : python-paramiko-2.1.1-1.el7.noarch                         19/20 
  Verifying  : python2-crypto-2.6.1-13.el7.x86_64                         20/20 

Installed:
  ansible.noarch 0:2.3.0.0-3.el7              python2-pip.noarch 0:8.1.2-5.el7 
  python2-ruamel-yaml.x86_64 0:0.12.14-9.el7 

Dependency Installed:
  libtomcrypt.x86_64 0:1.17-23.el7                                              
  libtommath.x86_64 0:0.42.0-4.el7                                              
  python-cffi.x86_64 0:1.6.0-5.el7                                              
  python-enum34.noarch 0:1.0.4-1.el7                                            
  python-httplib2.noarch 0:0.9.1-2.el7aos                                       
  python-idna.noarch 0:2.0-1.el7                                                
  python-ipaddress.noarch 0:1.0.16-2.el7                                        
  python-keyczar.noarch 0:0.71c-2.el7aos                                        
  python-paramiko.noarch 0:2.1.1-1.el7                                          
  python-ply.noarch 0:3.4-10.el7                                                
  python-pycparser.noarch 0:2.14-1.el7                                          
  python2-crypto.x86_64 0:2.6.1-13.el7                                          
  python2-cryptography.x86_64 0:1.3.1-3.el7                                     
  python2-pyasn1.noarch 0:0.1.9-7.el7                                           
  python2-ruamel-ordereddict.x86_64 0:0.4.9-3.el7                               
  python2-typing.noarch 0:3.5.2.2-3.el7                                         
  sshpass.x86_64 0:1.06-1.el7                                                   

Complete!
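
Only yum's output is captured in the log; judging by the transaction summary (3 packages plus 17 dependencies), the command behind it was presumably along the lines of:

    # Assumed invocation; only its output appears above
    yum install -y ansible python2-pip python2-ruamel-yaml
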
Cloning into '/tmp/tmp.7yWYbvLCJ1/openhift-ansible'...
Copying oc from path to /usr/local/bin for use by openshift-ansible
Copying oc from path to /usr/bin for use by openshift-ansible
Copying oadm from path to /usr/local/bin for use by openshift-ansible
Copying oadm from path to /usr/bin for use by openshift-ansible
[INFO] Starting logging tests at Fri Jun  9 10:22:25 EDT 2017
Generated new key pair as /tmp/openshift/origin-aggregated-logging/openshift.local.config/master/serviceaccounts.public.key and /tmp/openshift/origin-aggregated-logging/openshift.local.config/master/serviceaccounts.private.key
Generating node credentials ...
Created node config for 172.18.4.93 in /tmp/openshift/origin-aggregated-logging/openshift.local.config/node-172.18.4.93
Wrote master config to: /tmp/openshift/origin-aggregated-logging/openshift.local.config/master/master-config.yaml
Running hack/lib/start.sh:352: executing 'oc get --raw /healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s...
SUCCESS after 36.275s: hack/lib/start.sh:352: executing 'oc get --raw /healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s
Standard output from the command:
ok
Standard error from the command:
The connection to the server 172.18.4.93:8443 was refused - did you specify the right host or port?
... repeated 77 times
Error from server (Forbidden): User "system:admin" cannot "get" on "/healthz"
... repeated 7 times
Running hack/lib/start.sh:353: executing 'oc get --raw https://172.18.4.93:10250/healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.5s until completion or 120.000s...
SUCCESS after 0.195s: hack/lib/start.sh:353: executing 'oc get --raw https://172.18.4.93:10250/healthz --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.5s until completion or 120.000s
Standard output from the command:
ok
There was no error output from the command.
Running hack/lib/start.sh:354: executing 'oc get --raw /healthz/ready --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s...
SUCCESS after 0.874s: hack/lib/start.sh:354: executing 'oc get --raw /healthz/ready --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting any result and text 'ok'; re-trying every 0.25s until completion or 80.000s
Standard output from the command:
ok
Standard error from the command:
Error from server (InternalError): an error on the server ("") has prevented the request from succeeding
Running hack/lib/start.sh:355: executing 'oc get service kubernetes --namespace default --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 160.000s...
SUCCESS after 0.340s: hack/lib/start.sh:355: executing 'oc get service kubernetes --namespace default --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 160.000s
Standard output from the command:
NAME         CLUSTER-IP   EXTERNAL-IP   PORT(S)                 AGE
kubernetes   172.30.0.1   <none>        443/TCP,53/UDP,53/TCP   5s

There was no error output from the command.
Running hack/lib/start.sh:356: executing 'oc get --raw /api/v1/nodes/172.18.4.93 --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 80.000s...
SUCCESS after 0.269s: hack/lib/start.sh:356: executing 'oc get --raw /api/v1/nodes/172.18.4.93 --config='/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig'' expecting success; re-trying every 0.25s until completion or 80.000s
Standard output from the command:
{"kind":"Node","apiVersion":"v1","metadata":{"name":"172.18.4.93","selfLink":"/api/v1/nodes/172.18.4.93","uid":"30393f5f-4d1f-11e7-83b0-0e6fb895db82","resourceVersion":"287","creationTimestamp":"2017-06-09T14:23:20Z","labels":{"beta.kubernetes.io/arch":"amd64","beta.kubernetes.io/os":"linux","kubernetes.io/hostname":"172.18.4.93"},"annotations":{"volumes.kubernetes.io/controller-managed-attach-detach":"true"}},"spec":{"externalID":"172.18.4.93","providerID":"aws:////i-08eae8de52d2e283e"},"status":{"capacity":{"cpu":"4","memory":"7231688Ki","pods":"40"},"allocatable":{"cpu":"4","memory":"7129288Ki","pods":"40"},"conditions":[{"type":"OutOfDisk","status":"False","lastHeartbeatTime":"2017-06-09T14:23:20Z","lastTransitionTime":"2017-06-09T14:23:20Z","reason":"KubeletHasSufficientDisk","message":"kubelet has sufficient disk space available"},{"type":"MemoryPressure","status":"False","lastHeartbeatTime":"2017-06-09T14:23:20Z","lastTransitionTime":"2017-06-09T14:23:20Z","reason":"KubeletHasSufficientMemory","message":"kubelet has sufficient memory available"},{"type":"DiskPressure","status":"False","lastHeartbeatTime":"2017-06-09T14:23:20Z","lastTransitionTime":"2017-06-09T14:23:20Z","reason":"KubeletHasNoDiskPressure","message":"kubelet has no disk pressure"},{"type":"Ready","status":"True","lastHeartbeatTime":"2017-06-09T14:23:20Z","lastTransitionTime":"2017-06-09T14:23:20Z","reason":"KubeletReady","message":"kubelet is posting ready status"}],"addresses":[{"type":"LegacyHostIP","address":"172.18.4.93"},{"type":"InternalIP","address":"172.18.4.93"},{"type":"Hostname","address":"172.18.4.93"}],"daemonEndpoints":{"kubeletEndpoint":{"Port":10250}},"nodeInfo":{"machineID":"f9370ed252a14f73b014c1301a9b6d1b","systemUUID":"EC20179D-CEE7-8FA3-53A5-5B49D0B44786","bootID":"9b91d16b-8962-41b1-a934-bcdeca0205d2","kernelVersion":"3.10.0-327.22.2.el7.x86_64","osImage":"Red Hat Enterprise Linux Server 7.3 
(Maipo)","containerRuntimeVersion":"docker://1.12.6","kubeletVersion":"v1.6.1+5115d708d7","kubeProxyVersion":"v1.6.1+5115d708d7","operatingSystem":"linux","architecture":"amd64"},"images":[{"names":["openshift/origin-federation:6acabdc","openshift/origin-federation:latest"],"sizeBytes":1205885664},{"names":["openshift/origin-docker-registry:6acabdc","openshift/origin-docker-registry:latest"],"sizeBytes":1100164272},{"names":["openshift/origin-gitserver:6acabdc","openshift/origin-gitserver:latest"],"sizeBytes":1086520226},{"names":["openshift/openvswitch:6acabdc","openshift/openvswitch:latest"],"sizeBytes":1053403667},{"names":["openshift/node:6acabdc","openshift/node:latest"],"sizeBytes":1051721928},{"names":["openshift/origin-keepalived-ipfailover:6acabdc","openshift/origin-keepalived-ipfailover:latest"],"sizeBytes":1028529711},{"names":["openshift/origin-haproxy-router:6acabdc","openshift/origin-haproxy-router:latest"],"sizeBytes":1022758742},{"names":["openshift/origin:6acabdc","openshift/origin:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-f5-router:6acabdc","openshift/origin-f5-router:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-sti-builder:6acabdc","openshift/origin-sti-builder:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-recycler:6acabdc","openshift/origin-recycler:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-deployer:6acabdc","openshift/origin-deployer:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-docker-builder:6acabdc","openshift/origin-docker-builder:latest"],"sizeBytes":1001728427},{"names":["openshift/origin-cluster-capacity:6acabdc","openshift/origin-cluster-capacity:latest"],"sizeBytes":962455026},{"names":["rhel7.1:latest"],"sizeBytes":765301508},{"names":["openshift/dind-master:latest"],"sizeBytes":731456758},{"names":["openshift/dind-node:latest"],"sizeBytes":731453034},{"names":["\u003cnone\u003e@\u003cnone\u003e","\u003cnone\u003e:\u003cnone\u003e"],"sizeBytes":709532011},{"names":["docker.io/openshift/origin-logging-kibana@sha256:950568237cc7d0ff14ea9fe22c3967d888996db70c66181421ad68caeb5ba75f","docker.io/openshift/origin-logging-kibana:latest"],"sizeBytes":682851513},{"names":["openshift/dind:latest"],"sizeBytes":640650210},{"names":["docker.io/openshift/origin-logging-elasticsearch@sha256:6296f1719676e970438cac4d912542b35ac786c14a15df892507007c4ecbe490","docker.io/openshift/origin-logging-elasticsearch:latest"],"sizeBytes":425567196},{"names":["docker.io/openshift/base-centos7@sha256:aea292a3bddba020cde0ee83e6a45807931eb607c164ec6a3674f67039d8cd7c","docker.io/openshift/base-centos7:latest"],"sizeBytes":383049978},{"names":["rhel7.2:latest"],"sizeBytes":377493597},{"names":["openshift/origin-egress-router:6acabdc","openshift/origin-egress-router:latest"],"sizeBytes":364745713},{"names":["openshift/origin-base:latest"],"sizeBytes":363070172},{"names":["\u003cnone\u003e@\u003cnone\u003e","\u003cnone\u003e:\u003cnone\u003e"],"sizeBytes":363024702},{"names":["docker.io/openshift/origin-logging-fluentd@sha256:cae7c21c9f111d4f5b481c14a65c597c67e715a8ffe3aee4c483100ee77296d7","docker.io/openshift/origin-logging-fluentd:latest"],"sizeBytes":359223728},{"names":["docker.io/fedora@sha256:69281ddd7b2600e5f2b17f1e12d7fba25207f459204fb2d15884f8432c479136","docker.io/fedora:25"],"sizeBytes":230864375},{"names":["docker.io/openshift/origin-logging-curator@sha256:daded10ff4e08dfb6659c964e305f16679596312da558af095835202cf66f703","docker.io/openshift/origin-logging-curator:latest"],"sizeBytes":224977669},{"nam
es":["rhel7.3:latest","rhel7:latest"],"sizeBytes":219121266},{"names":["openshift/origin-pod:6acabdc","openshift/origin-pod:latest"],"sizeBytes":213199843},{"names":["registry.access.redhat.com/rhel7.2@sha256:98e6ca5d226c26e31a95cd67716afe22833c943e1926a21daf1a030906a02249","registry.access.redhat.com/rhel7.2:latest"],"sizeBytes":201376319},{"names":["registry.access.redhat.com/rhel7.3@sha256:1e232401d8e0ba53b36b757b4712fbcbd1dab9c21db039c45a84871a74e89e68","registry.access.redhat.com/rhel7.3:latest"],"sizeBytes":192693772},{"names":["docker.io/centos@sha256:bba1de7c9d900a898e3cadbae040dfe8a633c06bc104a0df76ae24483e03c077"],"sizeBytes":192548999},{"names":["openshift/origin-source:latest"],"sizeBytes":192548894},{"names":["docker.io/centos@sha256:aebf12af704307dfa0079b3babdca8d7e8ff6564696882bcb5d11f1d461f9ee9","docker.io/centos:7","docker.io/centos:centos7"],"sizeBytes":192548537},{"names":["registry.access.redhat.com/rhel7.1@sha256:1bc5a4c43bbb29a5a96a61896ff696933be3502e2f5fdc4cde02d9e101731fdd","registry.access.redhat.com/rhel7.1:latest"],"sizeBytes":158229901},{"names":["openshift/hello-openshift:6acabdc","openshift/hello-openshift:latest"],"sizeBytes":5643318}]}}

There was no error output from the command.
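
The "re-trying every 0.25s until completion or 80.000s" checks above come from the retry helpers used by hack/lib/start.sh; a hand-rolled sketch of that polling pattern (illustrative only, not the helper's actual implementation):

    # Poll /healthz until it answers "ok" or the 80s budget is spent
    deadline=$(( $(date +%s) + 80 ))
    until oc get --raw /healthz \
          --config=/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig 2>/dev/null \
          | grep -q ok; do
        [ "$(date +%s)" -ge "$deadline" ] && { echo "timed out waiting for /healthz"; exit 1; }
        sleep 0.25
    done
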
serviceaccount "registry" created
clusterrolebinding "registry-registry-role" created
deploymentconfig "docker-registry" created
service "docker-registry" created
info: password for stats user admin has been set to j065EJWzyQ
--> Creating router router ...
    serviceaccount "router" created
    clusterrolebinding "router-router-role" created
    deploymentconfig "router" created
    service "router" created
--> Success
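
The registry and router objects above are what `oadm registry` and `oadm router` create; roughly (a sketch, flags assumed, since only the resulting resources appear in the log):

    oadm registry --config=/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig
    oadm router --config=/tmp/openshift/origin-aggregated-logging/openshift.local.config/master/admin.kubeconfig
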
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:162: executing 'oadm new-project logging --node-selector=''' expecting success...
SUCCESS after 0.773s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:162: executing 'oadm new-project logging --node-selector=''' expecting success
Standard output from the command:
Created project logging

There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:163: executing 'oc project logging > /dev/null' expecting success...
SUCCESS after 0.305s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:163: executing 'oc project logging > /dev/null' expecting success
There was no output from the command.
There was no error output from the command.
apiVersion: v1
items:
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-elasticsearch
      component: development
      logging-infra: development
      provider: openshift
    name: logging-elasticsearch
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-fluentd
      component: development
      logging-infra: development
      provider: openshift
    name: logging-fluentd
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-kibana
      component: development
      logging-infra: development
      provider: openshift
    name: logging-kibana
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-curator
      component: development
      logging-infra: development
      provider: openshift
    name: logging-curator
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-auth-proxy
      component: development
      logging-infra: development
      provider: openshift
    name: logging-auth-proxy
  spec: {}
- apiVersion: v1
  kind: ImageStream
  metadata:
    labels:
      build: logging-deployment
      component: development
      logging-infra: development
      provider: openshift
    name: origin
  spec:
    dockerImageRepository: openshift/origin
    tags:
    - from:
        kind: DockerImage
        name: openshift/origin:v1.5.0-alpha.2
      name: v1.5.0-alpha.2
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      app: logging-elasticsearch
      component: development
      logging-infra: development
      provider: openshift
    name: logging-elasticsearch
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-elasticsearch:latest
    resources: {}
    source:
      contextDir: elasticsearch
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-fluentd
      component: development
      logging-infra: development
      provider: openshift
    name: logging-fluentd
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-fluentd:latest
    resources: {}
    source:
      contextDir: fluentd
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-kibana
      component: development
      logging-infra: development
      provider: openshift
    name: logging-kibana
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-kibana:latest
    resources: {}
    source:
      contextDir: kibana
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-curator
      component: development
      logging-infra: development
      provider: openshift
    name: logging-curator
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-curator:latest
    resources: {}
    source:
      contextDir: curator
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: openshift/base-centos7
      type: Docker
- apiVersion: v1
  kind: BuildConfig
  metadata:
    labels:
      build: logging-auth-proxy
      component: development
      logging-infra: development
      provider: openshift
    name: logging-auth-proxy
  spec:
    output:
      to:
        kind: ImageStreamTag
        name: logging-auth-proxy:latest
    resources: {}
    source:
      contextDir: kibana-proxy
      git:
        ref: master
        uri: https://github.com/openshift/origin-aggregated-logging
      type: Git
    strategy:
      dockerStrategy:
        from:
          kind: DockerImage
          name: library/node:0.10.36
      type: Docker
kind: List
metadata: {}
Running hack/testing/build-images:31: executing 'oc process -o yaml    -f /data/src/github.com/openshift/origin-aggregated-logging/hack/templates/dev-builds-wo-deployer.yaml    -p LOGGING_FORK_URL=https://github.com/openshift/origin-aggregated-logging -p LOGGING_FORK_BRANCH=master    | build_filter | oc create -f -' expecting success...
SUCCESS after 0.404s: hack/testing/build-images:31: executing 'oc process -o yaml    -f /data/src/github.com/openshift/origin-aggregated-logging/hack/templates/dev-builds-wo-deployer.yaml    -p LOGGING_FORK_URL=https://github.com/openshift/origin-aggregated-logging -p LOGGING_FORK_BRANCH=master    | build_filter | oc create -f -' expecting success
Standard output from the command:
imagestream "logging-elasticsearch" created
imagestream "logging-fluentd" created
imagestream "logging-kibana" created
imagestream "logging-curator" created
imagestream "logging-auth-proxy" created
imagestream "origin" created
buildconfig "logging-elasticsearch" created
buildconfig "logging-fluentd" created
buildconfig "logging-kibana" created
buildconfig "logging-curator" created
buildconfig "logging-auth-proxy" created

There was no error output from the command.
Running hack/testing/build-images:9: executing 'oc get imagestreamtag origin:latest' expecting success; re-trying every 0.2s until completion or 60.000s...
SUCCESS after 1.024s: hack/testing/build-images:9: executing 'oc get imagestreamtag origin:latest' expecting success; re-trying every 0.2s until completion or 60.000s
Standard output from the command:
NAME            DOCKER REF                                                                                 UPDATED                  IMAGENAME
origin:latest   openshift/origin@sha256:c32d9de7ecabaee3cb2e9c253edfeb72c546a09a22906c80195258cff83ea77f   Less than a second ago   sha256:c32d9de7ecabaee3cb2e9c253edfeb72c546a09a22906c80195258cff83ea77f
Standard error from the command:
Error from server (NotFound): imagestreamtags.image.openshift.io "origin:latest" not found
... repeated 2 times
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-auth-proxy-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-curator-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-elasticsearch-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-fluentd-1" started
Uploading directory "/data/src/github.com/openshift/origin-aggregated-logging" as binary input for the build ...
build "logging-kibana-1" started
Running hack/testing/build-images:33: executing 'wait_for_builds_complete' expecting success...
SUCCESS after 60.624s: hack/testing/build-images:33: executing 'wait_for_builds_complete' expecting success
Standard output from the command:
Builds are complete

There was no error output from the command.
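
`wait_for_builds_complete` is a helper from hack/testing; a minimal sketch of the idea, assuming it simply polls build phases until none are still in flight (hypothetical body, not the real function):

    # Wait until no build is New, Pending, or Running
    while oc get builds -o jsonpath='{.items[*].status.phase}' \
          | grep -qE 'New|Pending|Running'; do
        sleep 5
    done
    echo "Builds are complete"
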
/tmp/tmp.7yWYbvLCJ1/openhift-ansible /data/src/github.com/openshift/origin-aggregated-logging
### Created host inventory file ###
[oo_first_master]
openshift

[oo_first_master:vars]
ansible_become=true
ansible_connection=local
containerized=true
docker_protect_installed_version=true
openshift_deployment_type=origin
deployment_type=origin
required_packages=[]


openshift_hosted_logging_hostname=kibana.127.0.0.1.xip.io
openshift_master_logging_public_url=https://kibana.127.0.0.1.xip.io
openshift_logging_master_public_url=https://172.18.4.93:8443

openshift_logging_image_prefix=172.30.224.2:5000/logging/
openshift_logging_use_ops=true

openshift_logging_fluentd_journal_read_from_head=False
openshift_logging_es_log_appenders=['console']
openshift_logging_use_mux=false
openshift_logging_mux_allow_external=false
openshift_logging_use_mux_client=false





###################################
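
Given the inventory above and the playbook named below, the deploy step boils down to an invocation like this (a sketch; the inventory path and verbosity flag are assumptions, the latter inferred from the per-task "task path" lines in the output):

    # $INVENTORY stands for the generated host file shown above (illustrative name)
    ansible-playbook -vv -i "$INVENTORY" \
        /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/byo/openshift-cluster/openshift-logging.yml
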
Running hack/testing/init-log-stack:58: executing 'oc login -u system:admin' expecting success...
SUCCESS after 0.214s: hack/testing/init-log-stack:58: executing 'oc login -u system:admin' expecting success
Standard output from the command:
Logged into "https://172.18.4.93:8443" as "system:admin" using existing credentials.

You have access to the following projects and can switch between them with 'oc project <projectname>':

    default
    kube-public
    kube-system
  * logging
    openshift
    openshift-infra

Using project "logging".

There was no error output from the command.
Using /tmp/tmp.7yWYbvLCJ1/openhift-ansible/ansible.cfg as config file

PLAYBOOK: openshift-logging.yml ************************************************
4 plays in /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/byo/openshift-cluster/openshift-logging.yml

PLAY [Create initial host groups for localhost] ********************************
META: ran handlers

TASK [include_vars] ************************************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/byo/openshift-cluster/initialize_groups.yml:10
ok: [localhost] => {
    "ansible_facts": {
        "g_all_hosts": "{{ g_master_hosts | union(g_node_hosts) | union(g_etcd_hosts) | union(g_lb_hosts) | union(g_nfs_hosts) | union(g_new_node_hosts)| union(g_new_master_hosts) | default([]) }}", 
        "g_etcd_hosts": "{{ groups.etcd | default([]) }}", 
        "g_glusterfs_hosts": "{{ groups.glusterfs | default([]) }}", 
        "g_glusterfs_registry_hosts": "{{ groups.glusterfs_registry | default(g_glusterfs_hosts) }}", 
        "g_lb_hosts": "{{ groups.lb | default([]) }}", 
        "g_master_hosts": "{{ groups.masters | default([]) }}", 
        "g_new_master_hosts": "{{ groups.new_masters | default([]) }}", 
        "g_new_node_hosts": "{{ groups.new_nodes | default([]) }}", 
        "g_nfs_hosts": "{{ groups.nfs | default([]) }}", 
        "g_node_hosts": "{{ groups.nodes | default([]) }}"
    }, 
    "changed": false
}
META: ran handlers
META: ran handlers

PLAY [Populate config host groups] *********************************************
META: ran handlers

TASK [Evaluate groups - g_etcd_hosts required] *********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:8
skipping: [localhost] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [Evaluate groups - g_master_hosts or g_new_master_hosts required] *********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:13
skipping: [localhost] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [Evaluate groups - g_node_hosts or g_new_node_hosts required] *************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:18
skipping: [localhost] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [Evaluate groups - g_lb_hosts required] ***********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:23
skipping: [localhost] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [Evaluate groups - g_nfs_hosts required] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:28
skipping: [localhost] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [Evaluate groups - g_nfs_hosts is single host] ****************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:33
skipping: [localhost] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [Evaluate groups - g_glusterfs_hosts required] ****************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:38
skipping: [localhost] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [Evaluate oo_all_hosts] ***************************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:43

TASK [Evaluate oo_masters] *****************************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:52

TASK [Evaluate oo_first_master] ************************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:61
skipping: [localhost] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [Evaluate oo_masters_to_config] *******************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:70

TASK [Evaluate oo_etcd_to_config] **********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:79

TASK [Evaluate oo_first_etcd] **************************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:88
skipping: [localhost] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [Evaluate oo_etcd_hosts_to_upgrade] ***************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:100

TASK [Evaluate oo_etcd_hosts_to_backup] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:107
creating host via 'add_host': hostname=openshift
ok: [localhost] => (item=openshift) => {
    "add_host": {
        "groups": [
            "oo_etcd_hosts_to_backup"
        ], 
        "host_name": "openshift", 
        "host_vars": {}
    }, 
    "changed": false, 
    "item": "openshift"
}

TASK [Evaluate oo_nodes_to_config] *********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:114

TASK [Add master to oo_nodes_to_config] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:124

TASK [Evaluate oo_lb_to_config] ************************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:134

TASK [Evaluate oo_nfs_to_config] ***********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:143

TASK [Evaluate oo_glusterfs_to_config] *****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:152
META: ran handlers
META: ran handlers
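
The play above (evaluate_groups.yml) derives the working oo_* groups from whatever g_* groups the inventory supplies. On this all-in-one test inventory only the single host "openshift" exists, so the required-group checks all skip and add_host simply drops that one host into oo_etcd_hosts_to_backup. A sketch of re-running just this play, assuming the same checkout and an inventory file named "inventory":

    # Hypothetical re-run of only the group-evaluation play:
    ansible-playbook -i inventory \
        /tmp/tmp.7yWYbvLCJ1/openhift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml -vv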

PLAY [OpenShift Aggregated Logging] ********************************************

TASK [Gathering Facts] *********************************************************
ok: [openshift]
META: ran handlers

TASK [openshift_sanitize_inventory : Abort when conflicting deployment type variables are set] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_sanitize_inventory/tasks/main.yml:2
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_sanitize_inventory : Standardize on latest variable names] *****
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_sanitize_inventory/tasks/main.yml:15
ok: [openshift] => {
    "ansible_facts": {
        "deployment_type": "origin", 
        "openshift_deployment_type": "origin"
    }, 
    "changed": false
}
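
Here the sanitize role collapses the legacy deployment_type variable and the newer openshift_deployment_type into a single value ("origin"), so later roles only have to consult the new name. A minimal sketch of pinning the variable explicitly so this normalization becomes a no-op (the playbook name is a placeholder, not taken from this log):

    # Hypothetical invocation; <logging-playbook>.yml is a placeholder
    ansible-playbook -i inventory <logging-playbook>.yml -e openshift_deployment_type=origin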

TASK [openshift_sanitize_inventory : Abort when deployment type is invalid] ****
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_sanitize_inventory/tasks/main.yml:23
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_sanitize_inventory : Normalize openshift_release] **************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_sanitize_inventory/tasks/main.yml:31
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_sanitize_inventory : Abort when openshift_release is invalid] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_sanitize_inventory/tasks/main.yml:41
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_facts : Detecting Operating System] ****************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:2
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_facts : set_fact] **********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:8
ok: [openshift] => {
    "ansible_facts": {
        "l_is_atomic": false
    }, 
    "changed": false
}

TASK [openshift_facts : set_fact] **********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:10
ok: [openshift] => {
    "ansible_facts": {
        "l_is_containerized": true, 
        "l_is_etcd_system_container": false, 
        "l_is_master_system_container": false, 
        "l_is_node_system_container": false, 
        "l_is_openvswitch_system_container": false
    }, 
    "changed": false
}

TASK [openshift_facts : set_fact] **********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:16
ok: [openshift] => {
    "ansible_facts": {
        "l_any_system_container": false
    }, 
    "changed": false
}

TASK [openshift_facts : set_fact] **********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:18
ok: [openshift] => {
    "ansible_facts": {
        "l_etcd_runtime": "docker"
    }, 
    "changed": false
}
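
The chain of set_fact tasks above pins down the runtime layout: the stat at main.yml:2 found no Atomic Host marker (l_is_atomic: false), yet the install is containerized, so every component, etcd included, will run as a Docker container (l_etcd_runtime: "docker") rather than as a system container. A manual equivalent of the Atomic check, assuming the conventional /run/ostree-booted marker file the role stats:

    # Sketch: detect Atomic Host the way the stat task does
    if [ -e /run/ostree-booted ]; then echo atomic; else echo "not atomic"; fi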

TASK [openshift_facts : Validate python version] *******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:22
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_facts : Validate python version] *******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:29
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_facts : Determine Atomic Host Docker Version] ******************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:42
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_facts : assert] ************************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:46
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_facts : Load variables] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:53
ok: [openshift] => (item=/tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/vars/default.yml) => {
    "ansible_facts": {
        "required_packages": [
            "iproute", 
            "python-dbus", 
            "PyYAML", 
            "yum-utils"
        ]
    }, 
    "item": "/tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/vars/default.yml"
}

TASK [openshift_facts : Ensure various deps are installed] *********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:59
ok: [openshift] => (item=iproute) => {
    "changed": false, 
    "item": "iproute", 
    "rc": 0, 
    "results": [
        "iproute-3.10.0-74.el7.x86_64 providing iproute is already installed"
    ]
}
ok: [openshift] => (item=python-dbus) => {
    "changed": false, 
    "item": "python-dbus", 
    "rc": 0, 
    "results": [
        "dbus-python-1.1.1-9.el7.x86_64 providing python-dbus is already installed"
    ]
}
ok: [openshift] => (item=PyYAML) => {
    "changed": false, 
    "item": "PyYAML", 
    "rc": 0, 
    "results": [
        "PyYAML-3.10-11.el7.x86_64 providing PyYAML is already installed"
    ]
}
ok: [openshift] => (item=yum-utils) => {
    "changed": false, 
    "item": "yum-utils", 
    "rc": 0, 
    "results": [
        "yum-utils-1.1.31-40.el7.noarch providing yum-utils is already installed"
    ]
}
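
All four dependency installs report "changed": false because the packages are already present; the yum loop is idempotent. The same check by hand (note that python-dbus is a virtual provide satisfied by dbus-python, per the output above):

    # Sketch: verify the packages the role loops over
    rpm -q iproute PyYAML yum-utils
    rpm -q --whatprovides python-dbus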

TASK [openshift_facts : Ensure various deps for running system containers are installed] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:64
skipping: [openshift] => (item=atomic)  => {
    "changed": false, 
    "item": "atomic", 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}
skipping: [openshift] => (item=ostree)  => {
    "changed": false, 
    "item": "ostree", 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}
skipping: [openshift] => (item=runc)  => {
    "changed": false, 
    "item": "runc", 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_facts : Gather Cluster facts and set is_containerized if needed] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:71
changed: [openshift] => {
    "ansible_facts": {
        "openshift": {
            "common": {
                "admin_binary": "/usr/local/bin/oadm", 
                "all_hostnames": [
                    "ip-172-18-4-93.ec2.internal", 
                    "54.90.235.203", 
                    "172.18.4.93", 
                    "ec2-54-90-235-203.compute-1.amazonaws.com"
                ], 
                "cli_image": "openshift/origin", 
                "client_binary": "/usr/local/bin/oc", 
                "cluster_id": "default", 
                "config_base": "/etc/origin", 
                "data_dir": "/var/lib/origin", 
                "debug_level": "2", 
                "deployer_image": "openshift/origin-deployer", 
                "deployment_subtype": "basic", 
                "deployment_type": "origin", 
                "dns_domain": "cluster.local", 
                "etcd_runtime": "docker", 
                "examples_content_version": "v3.6", 
                "generate_no_proxy_hosts": true, 
                "hostname": "ip-172-18-4-93.ec2.internal", 
                "install_examples": true, 
                "internal_hostnames": [
                    "ip-172-18-4-93.ec2.internal", 
                    "172.18.4.93"
                ], 
                "ip": "172.18.4.93", 
                "is_atomic": false, 
                "is_containerized": true, 
                "is_etcd_system_container": false, 
                "is_master_system_container": false, 
                "is_node_system_container": false, 
                "is_openvswitch_system_container": false, 
                "kube_svc_ip": "172.30.0.1", 
                "pod_image": "openshift/origin-pod", 
                "portal_net": "172.30.0.0/16", 
                "public_hostname": "ec2-54-90-235-203.compute-1.amazonaws.com", 
                "public_ip": "54.90.235.203", 
                "registry_image": "openshift/origin-docker-registry", 
                "router_image": "openshift/origin-haproxy-router", 
                "sdn_network_plugin_name": "redhat/openshift-ovs-subnet", 
                "service_type": "origin", 
                "use_calico": false, 
                "use_contiv": false, 
                "use_dnsmasq": true, 
                "use_flannel": false, 
                "use_manageiq": true, 
                "use_nuage": false, 
                "use_openshift_sdn": true, 
                "version_gte_3_1_1_or_1_1_1": true, 
                "version_gte_3_1_or_1_1": true, 
                "version_gte_3_2_or_1_2": true, 
                "version_gte_3_3_or_1_3": true, 
                "version_gte_3_4_or_1_4": true, 
                "version_gte_3_5_or_1_5": true, 
                "version_gte_3_6": true
            }, 
            "current_config": {
                "roles": [
                    "node", 
                    "docker"
                ]
            }, 
            "docker": {
                "api_version": 1.24, 
                "disable_push_dockerhub": false, 
                "gte_1_10": true, 
                "options": "--log-driver=journald", 
                "service_name": "docker", 
                "version": "1.12.6"
            }, 
            "hosted": {
                "logging": {
                    "selector": null
                }, 
                "metrics": {
                    "selector": null
                }, 
                "registry": {
                    "selector": "region=infra"
                }, 
                "router": {
                    "selector": "region=infra"
                }
            }, 
            "node": {
                "annotations": {}, 
                "iptables_sync_period": "30s", 
                "kubelet_args": {
                    "node-labels": []
                }, 
                "labels": {}, 
                "local_quota_per_fsgroup": "", 
                "node_image": "openshift/node", 
                "node_system_image": "openshift/node", 
                "nodename": "ip-172-18-4-93.ec2.internal", 
                "ovs_image": "openshift/openvswitch", 
                "ovs_system_image": "openshift/openvswitch", 
                "registry_url": "openshift/origin-${component}:${version}", 
                "schedulable": true, 
                "sdn_mtu": "8951", 
                "set_node_ip": false, 
                "storage_plugin_deps": [
                    "ceph", 
                    "glusterfs", 
                    "iscsi"
                ]
            }, 
            "provider": {
                "metadata": {
                    "ami-id": "ami-83a1fc95", 
                    "ami-launch-index": "0", 
                    "ami-manifest-path": "(unknown)", 
                    "block-device-mapping": {
                        "ami": "/dev/sda1", 
                        "ebs17": "sdb", 
                        "root": "/dev/sda1"
                    }, 
                    "hostname": "ip-172-18-4-93.ec2.internal", 
                    "instance-action": "none", 
                    "instance-id": "i-08eae8de52d2e283e", 
                    "instance-type": "c4.xlarge", 
                    "local-hostname": "ip-172-18-4-93.ec2.internal", 
                    "local-ipv4": "172.18.4.93", 
                    "mac": "0e:6f:b8:95:db:82", 
                    "metrics": {
                        "vhostmd": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
                    }, 
                    "network": {
                        "interfaces": {
                            "macs": {
                                "0e:6f:b8:95:db:82": {
                                    "device-number": "0", 
                                    "interface-id": "eni-5b534981", 
                                    "ipv4-associations": {
                                        "54.90.235.203": "172.18.4.93"
                                    }, 
                                    "local-hostname": "ip-172-18-4-93.ec2.internal", 
                                    "local-ipv4s": "172.18.4.93", 
                                    "mac": "0e:6f:b8:95:db:82", 
                                    "owner-id": "531415883065", 
                                    "public-hostname": "ec2-54-90-235-203.compute-1.amazonaws.com", 
                                    "public-ipv4s": "54.90.235.203", 
                                    "security-group-ids": "sg-7e73221a", 
                                    "security-groups": "default", 
                                    "subnet-id": "subnet-cf57c596", 
                                    "subnet-ipv4-cidr-block": "172.18.0.0/20", 
                                    "vpc-id": "vpc-69705d0c", 
                                    "vpc-ipv4-cidr-block": "172.18.0.0/16", 
                                    "vpc-ipv4-cidr-blocks": "172.18.0.0/16"
                                }
                            }
                        }
                    }, 
                    "placement": {
                        "availability-zone": "us-east-1d"
                    }, 
                    "profile": "default-hvm", 
                    "public-hostname": "ec2-54-90-235-203.compute-1.amazonaws.com", 
                    "public-ipv4": "54.90.235.203", 
                    "public-keys/": "0=libra", 
                    "reservation-id": "r-0202c1f380d829cf7", 
                    "security-groups": "default", 
                    "services": {
                        "domain": "amazonaws.com", 
                        "partition": "aws"
                    }
                }, 
                "name": "aws", 
                "network": {
                    "hostname": "ip-172-18-4-93.ec2.internal", 
                    "interfaces": [
                        {
                            "ips": [
                                "172.18.4.93"
                            ], 
                            "network_id": "subnet-cf57c596", 
                            "network_type": "vpc", 
                            "public_ips": [
                                "54.90.235.203"
                            ]
                        }
                    ], 
                    "ip": "172.18.4.93", 
                    "ipv6_enabled": false, 
                    "public_hostname": "ec2-54-90-235-203.compute-1.amazonaws.com", 
                    "public_ip": "54.90.235.203"
                }, 
                "zone": "us-east-1d"
            }
        }
    }, 
    "changed": true
}
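
This one changed task condenses the whole host profile into the openshift fact tree: a containerized Origin node (all v3.6-era version gates true) on an AWS c4.xlarge in us-east-1d, Docker 1.12.6 with journald logging, the redhat/openshift-ovs-subnet SDN plugin, and a 172.30.0.0/16 portal network. The provider.metadata block mirrors the EC2 instance metadata service; fetching the same fields by hand would look like this (the 169.254.169.254 endpoint is standard EC2, not shown in this log):

    # Sketch: the instance-metadata lookups behind provider.metadata
    curl -s http://169.254.169.254/latest/meta-data/local-ipv4
    curl -s http://169.254.169.254/latest/meta-data/public-hostname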

TASK [openshift_facts : Set repoquery command] *********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_facts/tasks/main.yml:99
ok: [openshift] => {
    "ansible_facts": {
        "repoquery_cmd": "repoquery --plugins"
    }, 
    "changed": false
}

TASK [openshift_logging : fail] ************************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/main.yaml:2
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Set default image variables based on deployment_type] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/main.yaml:6
ok: [openshift] => (item=/tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/vars/default_images.yml) => {
    "ansible_facts": {
        "__openshift_logging_image_prefix": "{{ openshift_hosted_logging_deployer_prefix | default('docker.io/openshift/origin-') }}", 
        "__openshift_logging_image_version": "{{ openshift_hosted_logging_deployer_version | default('latest') }}"
    }, 
    "item": "/tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/vars/default_images.yml"
}

TASK [openshift_logging : Set logging image facts] *****************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/main.yaml:12
ok: [openshift] => {
    "ansible_facts": {
        "openshift_logging_image_prefix": "172.30.224.2:5000/logging/", 
        "openshift_logging_image_version": "latest"
    }, 
    "changed": false
}
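
The defaults loaded a step earlier (docker.io/openshift/origin- and "latest") are overridden here so every logging image is pulled from the cluster's integrated registry under the logging namespace. Composition is plain concatenation; for illustration (the component name below is an example, not taken from this run):

    # Sketch: prefix + component + version -> image reference
    prefix="172.30.224.2:5000/logging/"; version="latest"
    echo "${prefix}logging-kibana:${version}"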

TASK [openshift_logging : Create temp directory for doing work in] *************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/main.yaml:17
ok: [openshift] => {
    "changed": false, 
    "cmd": [
        "mktemp", 
        "-d", 
        "/tmp/openshift-logging-ansible-XXXXXX"
    ], 
    "delta": "0:00:00.002138", 
    "end": "2017-06-09 10:37:44.295399", 
    "rc": 0, 
    "start": "2017-06-09 10:37:44.293261"
}

STDOUT:

/tmp/openshift-logging-ansible-O7IHcX

TASK [openshift_logging : debug] ***********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/main.yaml:24
ok: [openshift] => {
    "changed": false
}

MSG:

Created temp dir /tmp/openshift-logging-ansible-O7IHcX

TASK [openshift_logging : Create local temp directory for doing work in] *******
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/main.yaml:26
ok: [openshift -> 127.0.0.1] => {
    "changed": false, 
    "cmd": [
        "mktemp", 
        "-d", 
        "/tmp/openshift-logging-ansible-XXXXXX"
    ], 
    "delta": "0:00:00.001908", 
    "end": "2017-06-09 10:37:44.450072", 
    "rc": 0, 
    "start": "2017-06-09 10:37:44.448164"
}

STDOUT:

/tmp/openshift-logging-ansible-XvDfd8
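
Two scratch directories are created from the same template: one on the managed host (...-O7IHcX) and one on the Ansible control machine via delegation to 127.0.0.1 (...-XvDfd8); generated certificates are staged in the former and fetched into the latter. The underlying shell pattern, with a cleanup trap added for illustration (the trap is not part of the tasks shown here):

    # Sketch: the mktemp pattern both tasks use
    workdir=$(mktemp -d /tmp/openshift-logging-ansible-XXXXXX)
    trap 'rm -rf "$workdir"' EXIT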

TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/main.yaml:33
included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml for openshift

TASK [openshift_logging : Gather OpenShift Logging Facts] **********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:2
ok: [openshift] => {
    "ansible_facts": {
        "openshift_logging_facts": {
            "curator": {
                "clusterrolebindings": {}, 
                "configmaps": {}, 
                "daemonsets": {}, 
                "deploymentconfigs": {}, 
                "oauthclients": {}, 
                "pvcs": {}, 
                "rolebindings": {}, 
                "routes": {}, 
                "sccs": {}, 
                "secrets": {}, 
                "services": {}
            }, 
            "curator_ops": {
                "clusterrolebindings": {}, 
                "configmaps": {}, 
                "daemonsets": {}, 
                "deploymentconfigs": {}, 
                "oauthclients": {}, 
                "pvcs": {}, 
                "rolebindings": {}, 
                "routes": {}, 
                "sccs": {}, 
                "secrets": {}, 
                "services": {}
            }, 
            "elasticsearch": {
                "clusterrolebindings": {}, 
                "configmaps": {}, 
                "daemonsets": {}, 
                "deploymentconfigs": {}, 
                "oauthclients": {}, 
                "pvcs": {}, 
                "rolebindings": {}, 
                "routes": {}, 
                "sccs": {}, 
                "secrets": {}, 
                "services": {}
            }, 
            "elasticsearch_ops": {
                "clusterrolebindings": {}, 
                "configmaps": {}, 
                "daemonsets": {}, 
                "deploymentconfigs": {}, 
                "oauthclients": {}, 
                "pvcs": {}, 
                "rolebindings": {}, 
                "routes": {}, 
                "sccs": {}, 
                "secrets": {}, 
                "services": {}
            }, 
            "fluentd": {
                "clusterrolebindings": {}, 
                "configmaps": {}, 
                "daemonsets": {}, 
                "deploymentconfigs": {}, 
                "oauthclients": {}, 
                "pvcs": {}, 
                "rolebindings": {}, 
                "routes": {}, 
                "sccs": {}, 
                "secrets": {}, 
                "services": {}
            }, 
            "kibana": {
                "clusterrolebindings": {}, 
                "configmaps": {}, 
                "daemonsets": {}, 
                "deploymentconfigs": {}, 
                "oauthclients": {}, 
                "pvcs": {}, 
                "rolebindings": {}, 
                "routes": {}, 
                "sccs": {}, 
                "secrets": {}, 
                "services": {}
            }, 
            "kibana_ops": {
                "clusterrolebindings": {}, 
                "configmaps": {}, 
                "daemonsets": {}, 
                "deploymentconfigs": {}, 
                "oauthclients": {}, 
                "pvcs": {}, 
                "rolebindings": {}, 
                "routes": {}, 
                "sccs": {}, 
                "secrets": {}, 
                "services": {}
            }
        }
    }, 
    "changed": false
}
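
Every component map in openshift_logging_facts (curator, elasticsearch, fluentd, kibana, and their -ops twins) comes back empty: no pre-existing logging objects were found, so the role proceeds as a fresh install. A manual spot-check of the same state, using ordinary oc queries rather than the fact module:

    # Sketch: look for pre-existing logging API objects
    oc get dc,daemonset,configmap,secret,svc,route -n logging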

TASK [openshift_logging : Set logging project] *********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:7
ok: [openshift] => {
    "changed": false, 
    "results": {
        "cmd": "/bin/oc get namespace logging -o json", 
        "results": {
            "apiVersion": "v1", 
            "kind": "Namespace", 
            "metadata": {
                "annotations": {
                    "openshift.io/description": "", 
                    "openshift.io/display-name": "", 
                    "openshift.io/node-selector": "", 
                    "openshift.io/sa.scc.mcs": "s0:c7,c4", 
                    "openshift.io/sa.scc.supplemental-groups": "1000050000/10000", 
                    "openshift.io/sa.scc.uid-range": "1000050000/10000"
                }, 
                "creationTimestamp": "2017-06-09T14:23:22Z", 
                "name": "logging", 
                "resourceVersion": "675", 
                "selfLink": "/api/v1/namespaces/logging", 
                "uid": "319a45c2-4d1f-11e7-83b0-0e6fb895db82"
            }, 
            "spec": {
                "finalizers": [
                    "openshift.io/origin", 
                    "kubernetes"
                ]
            }, 
            "status": {
                "phase": "Active"
            }
        }, 
        "returncode": 0
    }, 
    "state": "present"
}
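
The logging namespace already exists (created earlier in this job, at 14:23:22Z), so the task merely records "state": "present" from the /bin/oc get shown above. Had it been absent, the usual bootstrap is a project with an empty node selector so logging pods can land on any node; the create command below is the commonly documented form, not part of this run:

    # Sketch: inspect, or if absent create, the logging project
    oc get namespace logging -o json
    oc adm new-project logging --node-selector=""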

TASK [openshift_logging : Labeling logging project] ****************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:13

TASK [openshift_logging : Labeling logging project] ****************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:26
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Create logging cert directory] ***********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:39
ok: [openshift] => {
    "changed": false, 
    "gid": 0, 
    "group": "root", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/etc/origin/logging", 
    "secontext": "unconfined_u:object_r:etc_t:s0", 
    "size": 6, 
    "state": "directory", 
    "uid": 0
}

TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:47
included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml for openshift

TASK [openshift_logging : Checking for ca.key] *********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:3
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Checking for ca.crt] *********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:8
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Checking for ca.serial.txt] **************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:13
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Generate certificates] *******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:18
changed: [openshift] => {
    "changed": true, 
    "cmd": [
        "/usr/local/bin/oc", 
        "adm", 
        "--config=/tmp/openshift-logging-ansible-O7IHcX/admin.kubeconfig", 
        "ca", 
        "create-signer-cert", 
        "--key=/etc/origin/logging/ca.key", 
        "--cert=/etc/origin/logging/ca.crt", 
        "--serial=/etc/origin/logging/ca.serial.txt", 
        "--name=logging-signer-test"
    ], 
    "delta": "0:00:00.414329", 
    "end": "2017-06-09 10:37:48.786629", 
    "rc": 0, 
    "start": "2017-06-09 10:37:48.372300"
}
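
Since none of ca.key, ca.crt, or ca.serial.txt existed, the role mints a fresh signing CA; everything generated under /etc/origin/logging from here on is signed by it. The same command in standalone form (identical flags to the task above, with the ansible-managed --config kubeconfig omitted):

    # Sketch: mint the logging signing CA
    oc adm ca create-signer-cert \
        --key=/etc/origin/logging/ca.key \
        --cert=/etc/origin/logging/ca.crt \
        --serial=/etc/origin/logging/ca.serial.txt \
        --name=logging-signer-test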

TASK [openshift_logging : Checking for signing.conf] ***************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:29
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : template] ********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:34
changed: [openshift] => {
    "changed": true, 
    "checksum": "a5a1bda430be44f982fa9097778b7d35d2e42780", 
    "dest": "/etc/origin/logging/signing.conf", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "449087446670073f2899aac33113350c", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 4263, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019068.94-14054982914710/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:39
included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml for openshift
included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml for openshift
included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml for openshift

TASK [openshift_logging : Checking for kibana.crt] *****************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:2
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Checking for kibana.key] *****************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:7
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Trying to discover server cert variable name for kibana] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:12
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Trying to discover the server key variable name for kibana] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:20
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Creating signed server cert and key for kibana] ******
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:28
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Copying server key for kibana to generated certs directory] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:40
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Copying Server cert for kibana to generated certs directory] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:50
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Checking for kibana-ops.crt] *************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:2
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Checking for kibana-ops.key] *************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:7
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Trying to discover server cert variable name for kibana-ops] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:12
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Trying to discover the server key variable name for kibana-ops] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:20
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Creating signed server cert and key for kibana-ops] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:28
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Copying server key for kibana-ops to generated certs directory] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:40
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Copying Server cert for kibana-ops to generated certs directory] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:50
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Checking for kibana-internal.crt] ********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:2
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Checking for kibana-internal.key] ********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:7
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Trying to discover server cert variable name for kibana-internal] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:12
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Trying to discover the server key variable name for kibana-internal] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:20
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Creating signed server cert and key for kibana-internal] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:28
changed: [openshift] => {
    "changed": true, 
    "cmd": [
        "/usr/local/bin/oc", 
        "adm", 
        "--config=/tmp/openshift-logging-ansible-O7IHcX/admin.kubeconfig", 
        "ca", 
        "create-server-cert", 
        "--key=/etc/origin/logging/kibana-internal.key", 
        "--cert=/etc/origin/logging/kibana-internal.crt", 
        "--hostnames=kibana, kibana-ops, kibana.127.0.0.1.xip.io, kibana-ops.router.default.svc.cluster.local", 
        "--signer-cert=/etc/origin/logging/ca.crt", 
        "--signer-key=/etc/origin/logging/ca.key", 
        "--signer-serial=/etc/origin/logging/ca.serial.txt"
    ], 
    "delta": "0:00:00.604959", 
    "end": "2017-06-09 10:37:51.229461", 
    "rc": 0, 
    "start": "2017-06-09 10:37:50.624502"
}
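
kibana-internal is the one server certificate actually generated at this stage; it is signed by the new CA and must cover all four names passed in the single comma-separated --hostnames argument. To confirm the names landed in the certificate (standard openssl inspection, not part of the run):

    # Sketch: verify the SANs on the generated server cert
    openssl x509 -in /etc/origin/logging/kibana-internal.crt -noout -text \
        | grep -A1 'Subject Alternative Name'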

TASK [openshift_logging : Copying server key for kibana-internal to generated certs directory] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:40
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Copying Server cert for kibana-internal to generated certs directory] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/procure_server_certs.yaml:50
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:48
skipping: [openshift] => (item={u'procure_component': u'mux', u'hostnames': u'logging-mux, mux.router.default.svc.cluster.local'})  => {
    "cert_info": {
        "hostnames": "logging-mux, mux.router.default.svc.cluster.local", 
        "procure_component": "mux"
    }, 
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:56
skipping: [openshift] => (item={u'procure_component': u'mux'})  => {
    "changed": false, 
    "shared_key_info": {
        "procure_component": "mux"
    }, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:63
skipping: [openshift] => (item={u'procure_component': u'es', u'hostnames': u'es, es.router.default.svc.cluster.local'})  => {
    "cert_info": {
        "hostnames": "es, es.router.default.svc.cluster.local", 
        "procure_component": "es"
    }, 
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:71
skipping: [openshift] => (item={u'procure_component': u'es-ops', u'hostnames': u'es-ops, es-ops.router.default.svc.cluster.local'})  => {
    "cert_info": {
        "hostnames": "es-ops, es-ops.router.default.svc.cluster.local", 
        "procure_component": "es-ops"
    }, 
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Copy proxy TLS configuration file] *******************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:81
changed: [openshift] => {
    "changed": true, 
    "checksum": "36991681e03970736a99be9f084773521c44db06", 
    "dest": "/etc/origin/logging/server-tls.json", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "2a954195add2b2fdde4ed09ff5c8e1c5", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 321, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019071.69-194503427198709/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging : Copy proxy TLS configuration file] *******************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:86
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Checking for ca.db] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:91
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : copy] ************************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:96
changed: [openshift] => {
    "changed": true, 
    "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", 
    "dest": "/etc/origin/logging/ca.db", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "d41d8cd98f00b204e9800998ecf8427e", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 0, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019072.05-245595904787721/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging : Checking for ca.crt.srl] *****************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:101
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : copy] ************************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:106
changed: [openshift] => {
    "changed": true, 
    "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", 
    "dest": "/etc/origin/logging/ca.crt.srl", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "d41d8cd98f00b204e9800998ecf8427e", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 0, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019072.38-216040601504620/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging : Generate PEM certs] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:111
included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml for openshift
included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml for openshift
included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml for openshift
included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml for openshift

TASK [openshift_logging : Checking for system.logging.fluentd.key] *************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:2
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Checking for system.logging.fluentd.crt] *************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:7
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Creating cert req for system.logging.fluentd] ********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:12
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Creating cert req for system.logging.fluentd] ********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:22
changed: [openshift] => {
    "changed": true, 
    "cmd": [
        "openssl", 
        "req", 
        "-out", 
        "/etc/origin/logging/system.logging.fluentd.csr", 
        "-new", 
        "-newkey", 
        "rsa:2048", 
        "-keyout", 
        "/etc/origin/logging/system.logging.fluentd.key", 
        "-subj", 
        "/CN=system.logging.fluentd/OU=OpenShift/O=Logging", 
        "-days", 
        "712", 
        "-nodes"
    ], 
    "delta": "0:00:00.393231", 
    "end": "2017-06-09 10:37:53.538711", 
    "rc": 0, 
    "start": "2017-06-09 10:37:53.145480"
}

STDERR:

Generating a 2048 bit RSA private key
................+++
.....................................................................+++
writing new private key to '/etc/origin/logging/system.logging.fluentd.key'
-----

TASK [openshift_logging : Sign cert request with CA for system.logging.fluentd] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:31
changed: [openshift] => {
    "changed": true, 
    "cmd": [
        "openssl", 
        "ca", 
        "-in", 
        "/etc/origin/logging/system.logging.fluentd.csr", 
        "-notext", 
        "-out", 
        "/etc/origin/logging/system.logging.fluentd.crt", 
        "-config", 
        "/etc/origin/logging/signing.conf", 
        "-extensions", 
        "v3_req", 
        "-batch", 
        "-extensions", 
        "server_ext"
    ], 
    "delta": "0:00:00.007506", 
    "end": "2017-06-09 10:37:53.665100", 
    "rc": 0, 
    "start": "2017-06-09 10:37:53.657594"
}

STDERR:

Using configuration from /etc/origin/logging/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
        Serial Number: 2 (0x2)
        Validity
            Not Before: Jun  9 14:37:53 2017 GMT
            Not After : Jun  9 14:37:53 2019 GMT
        Subject:
            organizationName          = Logging
            organizationalUnitName    = OpenShift
            commonName                = system.logging.fluentd
        X509v3 extensions:
            X509v3 Key Usage: critical
                Digital Signature, Key Encipherment
            X509v3 Basic Constraints: 
                CA:FALSE
            X509v3 Extended Key Usage: 
                TLS Web Server Authentication, TLS Web Client Authentication
            X509v3 Subject Key Identifier: 
                9B:C7:5D:18:5F:8E:18:98:4C:8B:2D:FA:D4:4D:6C:C5:1C:57:63:81
            X509v3 Authority Key Identifier: 
                0.
Certificate is to be certified until Jun  9 14:37:53 2019 GMT (730 days)

Write out database with 1 new entries
Data Base Updated
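
Each client identity follows the same two-step pattern: openssl req creates a 2048-bit key plus CSR, then openssl ca signs the CSR against signing.conf. Two details worth noting in the output above: the -days 712 given to req is ignored at signing time (openssl ca takes its validity from the config, hence the 730-day certificate), and -extensions is passed twice, so only the last value, server_ext, takes effect. A generalized sketch of the pattern, with the identity name as a variable:

    # Sketch: the req-then-sign pattern used for each system.logging.* identity
    name=system.logging.fluentd
    openssl req -new -newkey rsa:2048 -nodes \
        -keyout /etc/origin/logging/${name}.key \
        -out /etc/origin/logging/${name}.csr \
        -subj "/CN=${name}/OU=OpenShift/O=Logging"
    openssl ca -batch -notext \
        -config /etc/origin/logging/signing.conf -extensions server_ext \
        -in /etc/origin/logging/${name}.csr \
        -out /etc/origin/logging/${name}.crt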

TASK [openshift_logging : Checking for system.logging.kibana.key] **************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:2
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Checking for system.logging.kibana.crt] **************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:7
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Creating cert req for system.logging.kibana] *********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:12
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Creating cert req for system.logging.kibana] *********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:22
changed: [openshift] => {
    "changed": true, 
    "cmd": [
        "openssl", 
        "req", 
        "-out", 
        "/etc/origin/logging/system.logging.kibana.csr", 
        "-new", 
        "-newkey", 
        "rsa:2048", 
        "-keyout", 
        "/etc/origin/logging/system.logging.kibana.key", 
        "-subj", 
        "/CN=system.logging.kibana/OU=OpenShift/O=Logging", 
        "-days", 
        "712", 
        "-nodes"
    ], 
    "delta": "0:00:00.117385", 
    "end": "2017-06-09 10:37:54.155043", 
    "rc": 0, 
    "start": "2017-06-09 10:37:54.037658"
}

STDERR:

Generating a 2048 bit RSA private key
....+++
.............................................................................+++
writing new private key to '/etc/origin/logging/system.logging.kibana.key'
-----

TASK [openshift_logging : Sign cert request with CA for system.logging.kibana] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:31
changed: [openshift] => {
    "changed": true, 
    "cmd": [
        "openssl", 
        "ca", 
        "-in", 
        "/etc/origin/logging/system.logging.kibana.csr", 
        "-notext", 
        "-out", 
        "/etc/origin/logging/system.logging.kibana.crt", 
        "-config", 
        "/etc/origin/logging/signing.conf", 
        "-extensions", 
        "v3_req", 
        "-batch", 
        "-extensions", 
        "server_ext"
    ], 
    "delta": "0:00:00.007607", 
    "end": "2017-06-09 10:37:54.283559", 
    "rc": 0, 
    "start": "2017-06-09 10:37:54.275952"
}

STDERR:

Using configuration from /etc/origin/logging/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
        Serial Number: 3 (0x3)
        Validity
            Not Before: Jun  9 14:37:54 2017 GMT
            Not After : Jun  9 14:37:54 2019 GMT
        Subject:
            organizationName          = Logging
            organizationalUnitName    = OpenShift
            commonName                = system.logging.kibana
        X509v3 extensions:
            X509v3 Key Usage: critical
                Digital Signature, Key Encipherment
            X509v3 Basic Constraints: 
                CA:FALSE
            X509v3 Extended Key Usage: 
                TLS Web Server Authentication, TLS Web Client Authentication
            X509v3 Subject Key Identifier: 
                26:08:2E:D6:28:2F:D6:1F:2F:DB:3E:07:BE:02:39:C6:D4:81:D0:87
            X509v3 Authority Key Identifier: 
                0.
Certificate is to be certified until Jun  9 14:37:54 2019 GMT (730 days)

Write out database with 1 new entries
Data Base Updated

TASK [openshift_logging : Checking for system.logging.curator.key] *************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:2
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Checking for system.logging.curator.crt] *************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:7
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Creating cert req for system.logging.curator] ********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:12
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Creating cert req for system.logging.curator] ********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:22
changed: [openshift] => {
    "changed": true, 
    "cmd": [
        "openssl", 
        "req", 
        "-out", 
        "/etc/origin/logging/system.logging.curator.csr", 
        "-new", 
        "-newkey", 
        "rsa:2048", 
        "-keyout", 
        "/etc/origin/logging/system.logging.curator.key", 
        "-subj", 
        "/CN=system.logging.curator/OU=OpenShift/O=Logging", 
        "-days", 
        "712", 
        "-nodes"
    ], 
    "delta": "0:00:00.057359", 
    "end": "2017-06-09 10:37:54.721201", 
    "rc": 0, 
    "start": "2017-06-09 10:37:54.663842"
}

STDERR:

Generating a 2048 bit RSA private key
....................+++
................+++
writing new private key to '/etc/origin/logging/system.logging.curator.key'
-----

TASK [openshift_logging : Sign cert request with CA for system.logging.curator] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:31
changed: [openshift] => {
    "changed": true, 
    "cmd": [
        "openssl", 
        "ca", 
        "-in", 
        "/etc/origin/logging/system.logging.curator.csr", 
        "-notext", 
        "-out", 
        "/etc/origin/logging/system.logging.curator.crt", 
        "-config", 
        "/etc/origin/logging/signing.conf", 
        "-extensions", 
        "v3_req", 
        "-batch", 
        "-extensions", 
        "server_ext"
    ], 
    "delta": "0:00:00.007520", 
    "end": "2017-06-09 10:37:54.847826", 
    "rc": 0, 
    "start": "2017-06-09 10:37:54.840306"
}

STDERR:

Using configuration from /etc/origin/logging/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
        Serial Number: 4 (0x4)
        Validity
            Not Before: Jun  9 14:37:54 2017 GMT
            Not After : Jun  9 14:37:54 2019 GMT
        Subject:
            organizationName          = Logging
            organizationalUnitName    = OpenShift
            commonName                = system.logging.curator
        X509v3 extensions:
            X509v3 Key Usage: critical
                Digital Signature, Key Encipherment
            X509v3 Basic Constraints: 
                CA:FALSE
            X509v3 Extended Key Usage: 
                TLS Web Server Authentication, TLS Web Client Authentication
            X509v3 Subject Key Identifier: 
                00:EF:DD:74:0A:CD:C2:C2:83:78:37:44:DC:7C:BE:ED:1F:52:B0:05
            X509v3 Authority Key Identifier: 
                0.
Certificate is to be certified until Jun  9 14:37:54 2019 GMT (730 days)

Write out database with 1 new entries
Data Base Updated

TASK [openshift_logging : Checking for system.admin.key] ***********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:2
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Checking for system.admin.crt] ***********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:7
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Creating cert req for system.admin] ******************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:12
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Creating cert req for system.admin] ******************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:22
changed: [openshift] => {
    "changed": true, 
    "cmd": [
        "openssl", 
        "req", 
        "-out", 
        "/etc/origin/logging/system.admin.csr", 
        "-new", 
        "-newkey", 
        "rsa:2048", 
        "-keyout", 
        "/etc/origin/logging/system.admin.key", 
        "-subj", 
        "/CN=system.admin/OU=OpenShift/O=Logging", 
        "-days", 
        "712", 
        "-nodes"
    ], 
    "delta": "0:00:00.110565", 
    "end": "2017-06-09 10:37:55.333230", 
    "rc": 0, 
    "start": "2017-06-09 10:37:55.222665"
}

STDERR:

Generating a 2048 bit RSA private key
......................................................+++
......................+++
writing new private key to '/etc/origin/logging/system.admin.key'
-----

TASK [openshift_logging : Sign cert request with CA for system.admin] **********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_pems.yaml:31
changed: [openshift] => {
    "changed": true, 
    "cmd": [
        "openssl", 
        "ca", 
        "-in", 
        "/etc/origin/logging/system.admin.csr", 
        "-notext", 
        "-out", 
        "/etc/origin/logging/system.admin.crt", 
        "-config", 
        "/etc/origin/logging/signing.conf", 
        "-extensions", 
        "v3_req", 
        "-batch", 
        "-extensions", 
        "server_ext"
    ], 
    "delta": "0:00:00.007557", 
    "end": "2017-06-09 10:37:55.460448", 
    "rc": 0, 
    "start": "2017-06-09 10:37:55.452891"
}

STDERR:

Using configuration from /etc/origin/logging/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
        Serial Number: 5 (0x5)
        Validity
            Not Before: Jun  9 14:37:55 2017 GMT
            Not After : Jun  9 14:37:55 2019 GMT
        Subject:
            organizationName          = Logging
            organizationalUnitName    = OpenShift
            commonName                = system.admin
        X509v3 extensions:
            X509v3 Key Usage: critical
                Digital Signature, Key Encipherment
            X509v3 Basic Constraints: 
                CA:FALSE
            X509v3 Extended Key Usage: 
                TLS Web Server Authentication, TLS Web Client Authentication
            X509v3 Subject Key Identifier: 
                3E:F8:85:EA:DC:96:3E:44:C3:E6:62:35:BB:88:F6:1B:C3:32:EF:1A
            X509v3 Authority Key Identifier: 
                0.
Certificate is to be certified until Jun  9 14:37:55 2019 GMT (730 days)

Write out database with 1 new entries
Data Base Updated

TASK [openshift_logging : Generate PEM cert for mux] ***************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:121
skipping: [openshift] => (item=system.logging.mux)  => {
    "changed": false, 
    "node_name": "system.logging.mux", 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Generate PEM cert for Elasticsearch external route] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:129
skipping: [openshift] => (item=system.logging.es)  => {
    "changed": false, 
    "node_name": "system.logging.es", 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Creating necessary JKS certs] ************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:137
included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml for openshift

TASK [openshift_logging : Checking for elasticsearch.jks] **********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:3
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Checking for logging-es.jks] *************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:8
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Checking for system.admin.jks] ***********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:13
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging : Checking for truststore.jks] *************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:18
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}
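
All four stat checks come back exists: false, so every keystore is generated from scratch below. On a rerun where they do exist, their contents can be listed with keytool (store passwords kspass/tspass, hardcoded in the generation script later in this log):

    keytool -list -keystore /etc/origin/logging/system.admin.jks -storepass kspass
    keytool -list -keystore /etc/origin/logging/truststore.jks -storepass tspass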

TASK [openshift_logging : Create placeholder for previously created JKS certs to prevent recreating...] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:23
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Create placeholder for previously created JKS certs to prevent recreating...] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:28
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Create placeholder for previously created JKS certs to prevent recreating...] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:33
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Create placeholder for previously created JKS certs to prevent recreating...] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:38
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : pulling down signing items from host] ****************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:43
changed: [openshift] => (item=ca.crt) => {
    "changed": true, 
    "checksum": "b80ba48dc44466363425c42a70ff48cae22da625", 
    "dest": "/tmp/openshift-logging-ansible-XvDfd8/ca.crt", 
    "item": "ca.crt", 
    "md5sum": "ea775a66a21e1d8dd60a2680a2602bb6", 
    "remote_checksum": "b80ba48dc44466363425c42a70ff48cae22da625", 
    "remote_md5sum": null
}
changed: [openshift] => (item=ca.key) => {
    "changed": true, 
    "checksum": "10b9cec49318ede3145ebaa71246ea4bda6a82ec", 
    "dest": "/tmp/openshift-logging-ansible-XvDfd8/ca.key", 
    "item": "ca.key", 
    "md5sum": "d83062e5be0209c56e9e7a35ac48d3bf", 
    "remote_checksum": "10b9cec49318ede3145ebaa71246ea4bda6a82ec", 
    "remote_md5sum": null
}
changed: [openshift] => (item=ca.serial.txt) => {
    "changed": true, 
    "checksum": "b649682b92a811746098e5c91e891e5142a41950", 
    "dest": "/tmp/openshift-logging-ansible-XvDfd8/ca.serial.txt", 
    "item": "ca.serial.txt", 
    "md5sum": "76b01ce73ac53fdac1c67d27ac040473", 
    "remote_checksum": "b649682b92a811746098e5c91e891e5142a41950", 
    "remote_md5sum": null
}
ok: [openshift] => (item=ca.crl.srl) => {
    "changed": false, 
    "file": "/etc/origin/logging/ca.crl.srl", 
    "item": "ca.crl.srl"
}

MSG:

the remote file does not exist, not transferring, ignored
changed: [openshift] => (item=ca.db) => {
    "changed": true, 
    "checksum": "d6ceed77b5740a54cba4c6aadf2e55bcc5686a4e", 
    "dest": "/tmp/openshift-logging-ansible-XvDfd8/ca.db", 
    "item": "ca.db", 
    "md5sum": "24f1391ab48efccdd30fdd38fcb06e4c", 
    "remote_checksum": "d6ceed77b5740a54cba4c6aadf2e55bcc5686a4e", 
    "remote_md5sum": null
}
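
The checksum/remote_checksum pairs are SHA-1 digests (Ansible's fetch module compares them to decide whether anything changed); ca.crl.srl is absent on the host, so that item is reported ok and skipped rather than failing the task. The digests are easy to reproduce locally:

    # Matches the "checksum" and "md5sum" fields reported above
    sha1sum /tmp/openshift-logging-ansible-XvDfd8/ca.crt
    md5sum /tmp/openshift-logging-ansible-XvDfd8/ca.crt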

TASK [openshift_logging : template] ********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:56
changed: [openshift -> 127.0.0.1] => {
    "changed": true, 
    "checksum": "af26913146653cd6a77cd322375e9977f195bd71", 
    "dest": "/tmp/openshift-logging-ansible-XvDfd8/signing.conf", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "f936f44d8145b4a106021d03ad270cb6", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 4281, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019076.89-233986368192367/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging : Run JKS generation script] ***************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:61
changed: [openshift -> 127.0.0.1] => {
    "changed": true, 
    "rc": 0
}

STDOUT:

Generating keystore and certificate for node system.admin
Generating certificate signing request for node system.admin
Sign certificate request with CA
Import back to keystore (including CA chain)
All done for system.admin
Generating keystore and certificate for node elasticsearch
Generating certificate signing request for node elasticsearch
Sign certificate request with CA
Import back to keystore (including CA chain)
All done for elasticsearch
Generating keystore and certificate for node logging-es
Generating certificate signing request for node logging-es
Sign certificate request with CA
Import back to keystore (including CA chain)
All done for logging-es
Import CA to truststore for validating client certs

STDERR:

+ '[' 2 -lt 1 ']'
+ dir=/tmp/openshift-logging-ansible-XvDfd8
+ SCRATCH_DIR=/tmp/openshift-logging-ansible-XvDfd8
+ PROJECT=logging
+ [[ ! -f /tmp/openshift-logging-ansible-XvDfd8/system.admin.jks ]]
+ generate_JKS_client_cert system.admin
+ NODE_NAME=system.admin
+ ks_pass=kspass
+ ts_pass=tspass
+ dir=/tmp/openshift-logging-ansible-XvDfd8
+ echo Generating keystore and certificate for node system.admin
+ keytool -genkey -alias system.admin -keystore /tmp/openshift-logging-ansible-XvDfd8/system.admin.jks -keyalg RSA -keysize 2048 -validity 712 -keypass kspass -storepass kspass -dname 'CN=system.admin, OU=OpenShift, O=Logging'
+ echo Generating certificate signing request for node system.admin
+ keytool -certreq -alias system.admin -keystore /tmp/openshift-logging-ansible-XvDfd8/system.admin.jks -file /tmp/openshift-logging-ansible-XvDfd8/system.admin.jks.csr -keyalg rsa -keypass kspass -storepass kspass -dname 'CN=system.admin, OU=OpenShift, O=Logging'
+ echo Sign certificate request with CA
+ openssl ca -in /tmp/openshift-logging-ansible-XvDfd8/system.admin.jks.csr -notext -out /tmp/openshift-logging-ansible-XvDfd8/system.admin.jks.crt -config /tmp/openshift-logging-ansible-XvDfd8/signing.conf -extensions v3_req -batch -extensions server_ext
Using configuration from /tmp/openshift-logging-ansible-XvDfd8/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
        Serial Number: 6 (0x6)
        Validity
            Not Before: Jun  9 14:38:11 2017 GMT
            Not After : Jun  9 14:38:11 2019 GMT
        Subject:
            organizationName          = Logging
            organizationalUnitName    = OpenShift
            commonName                = system.admin
        X509v3 extensions:
            X509v3 Key Usage: critical
                Digital Signature, Key Encipherment
            X509v3 Basic Constraints: 
                CA:FALSE
            X509v3 Extended Key Usage: 
                TLS Web Server Authentication, TLS Web Client Authentication
            X509v3 Subject Key Identifier: 
                DC:01:22:04:C2:49:94:23:08:44:E7:D2:30:C4:F4:CB:AD:00:F0:8F
            X509v3 Authority Key Identifier: 
                0.
Certificate is to be certified until Jun  9 14:38:11 2019 GMT (730 days)

Write out database with 1 new entries
Data Base Updated
+ echo 'Import back to keystore (including CA chain)'
+ keytool -import -file /tmp/openshift-logging-ansible-XvDfd8/ca.crt -keystore /tmp/openshift-logging-ansible-XvDfd8/system.admin.jks -storepass kspass -noprompt -alias sig-ca
Certificate was added to keystore
+ keytool -import -file /tmp/openshift-logging-ansible-XvDfd8/system.admin.jks.crt -keystore /tmp/openshift-logging-ansible-XvDfd8/system.admin.jks -storepass kspass -noprompt -alias system.admin
Certificate reply was installed in keystore
+ echo All done for system.admin
+ [[ ! -f /tmp/openshift-logging-ansible-XvDfd8/elasticsearch.jks ]]
++ join , logging-es logging-es-ops
++ local IFS=,
++ shift
++ echo logging-es,logging-es-ops
+ generate_JKS_chain true elasticsearch logging-es,logging-es-ops
+ dir=/tmp/openshift-logging-ansible-XvDfd8
+ ADD_OID=true
+ NODE_NAME=elasticsearch
+ CERT_NAMES=logging-es,logging-es-ops
+ ks_pass=kspass
+ ts_pass=tspass
+ rm -rf elasticsearch
+ extension_names=
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es-ops
+ '[' true = true ']'
+ extension_names=,dns:logging-es,dns:logging-es-ops,oid:1.2.3.4.5.5
+ echo Generating keystore and certificate for node elasticsearch
+ keytool -genkey -alias elasticsearch -keystore /tmp/openshift-logging-ansible-XvDfd8/elasticsearch.jks -keypass kspass -storepass kspass -keyalg RSA -keysize 2048 -validity 712 -dname 'CN=elasticsearch, OU=OpenShift, O=Logging' -ext san=dns:localhost,ip:127.0.0.1,dns:logging-es,dns:logging-es-ops,oid:1.2.3.4.5.5
+ echo Generating certificate signing request for node elasticsearch
+ keytool -certreq -alias elasticsearch -keystore /tmp/openshift-logging-ansible-XvDfd8/elasticsearch.jks -storepass kspass -file /tmp/openshift-logging-ansible-XvDfd8/elasticsearch.csr -keyalg rsa -dname 'CN=elasticsearch, OU=OpenShift, O=Logging' -ext san=dns:localhost,ip:127.0.0.1,dns:logging-es,dns:logging-es-ops,oid:1.2.3.4.5.5
+ echo Sign certificate request with CA
+ openssl ca -in /tmp/openshift-logging-ansible-XvDfd8/elasticsearch.csr -notext -out /tmp/openshift-logging-ansible-XvDfd8/elasticsearch.crt -config /tmp/openshift-logging-ansible-XvDfd8/signing.conf -extensions v3_req -batch -extensions server_ext
Using configuration from /tmp/openshift-logging-ansible-XvDfd8/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
        Serial Number: 7 (0x7)
        Validity
            Not Before: Jun  9 14:38:12 2017 GMT
            Not After : Jun  9 14:38:12 2019 GMT
        Subject:
            organizationName          = Logging
            organizationalUnitName    = OpenShift
            commonName                = elasticsearch
        X509v3 extensions:
            X509v3 Key Usage: critical
                Digital Signature, Key Encipherment
            X509v3 Basic Constraints: 
                CA:FALSE
            X509v3 Extended Key Usage: 
                TLS Web Server Authentication, TLS Web Client Authentication
            X509v3 Subject Key Identifier: 
                4A:51:65:FC:34:BA:7F:48:C0:4F:8B:FA:B3:A1:F1:5B:22:6A:89:3B
            X509v3 Authority Key Identifier: 
                0.
            X509v3 Subject Alternative Name: 
                DNS:localhost, IP Address:127.0.0.1, DNS:logging-es, DNS:logging-es-ops, Registered ID:1.2.3.4.5.5
Certificate is to be certified until Jun  9 14:38:12 2019 GMT (730 days)

Write out database with 1 new entries
Data Base Updated
+ echo 'Import back to keystore (including CA chain)'
+ keytool -import -file /tmp/openshift-logging-ansible-XvDfd8/ca.crt -keystore /tmp/openshift-logging-ansible-XvDfd8/elasticsearch.jks -storepass kspass -noprompt -alias sig-ca
Certificate was added to keystore
+ keytool -import -file /tmp/openshift-logging-ansible-XvDfd8/elasticsearch.crt -keystore /tmp/openshift-logging-ansible-XvDfd8/elasticsearch.jks -storepass kspass -noprompt -alias elasticsearch
Certificate reply was installed in keystore
+ echo All done for elasticsearch
+ [[ ! -f /tmp/openshift-logging-ansible-XvDfd8/logging-es.jks ]]
++ join , logging-es logging-es.logging.svc.cluster.local logging-es-cluster logging-es-cluster.logging.svc.cluster.local logging-es-ops logging-es-ops.logging.svc.cluster.local logging-es-ops-cluster logging-es-ops-cluster.logging.svc.cluster.local
++ local IFS=,
++ shift
++ echo logging-es,logging-es.logging.svc.cluster.local,logging-es-cluster,logging-es-cluster.logging.svc.cluster.local,logging-es-ops,logging-es-ops.logging.svc.cluster.local,logging-es-ops-cluster,logging-es-ops-cluster.logging.svc.cluster.local
+ generate_JKS_chain false logging-es logging-es,logging-es.logging.svc.cluster.local,logging-es-cluster,logging-es-cluster.logging.svc.cluster.local,logging-es-ops,logging-es-ops.logging.svc.cluster.local,logging-es-ops-cluster,logging-es-ops-cluster.logging.svc.cluster.local
+ dir=/tmp/openshift-logging-ansible-XvDfd8
+ ADD_OID=false
+ NODE_NAME=logging-es
+ CERT_NAMES=logging-es,logging-es.logging.svc.cluster.local,logging-es-cluster,logging-es-cluster.logging.svc.cluster.local,logging-es-ops,logging-es-ops.logging.svc.cluster.local,logging-es-ops-cluster,logging-es-ops-cluster.logging.svc.cluster.local
+ ks_pass=kspass
+ ts_pass=tspass
+ rm -rf logging-es
+ extension_names=
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local,dns:logging-es-ops
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local,dns:logging-es-ops,dns:logging-es-ops.logging.svc.cluster.local
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local,dns:logging-es-ops,dns:logging-es-ops.logging.svc.cluster.local,dns:logging-es-ops-cluster
+ for name in '${CERT_NAMES//,/ }'
+ extension_names=,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local,dns:logging-es-ops,dns:logging-es-ops.logging.svc.cluster.local,dns:logging-es-ops-cluster,dns:logging-es-ops-cluster.logging.svc.cluster.local
+ '[' false = true ']'
+ echo Generating keystore and certificate for node logging-es
+ keytool -genkey -alias logging-es -keystore /tmp/openshift-logging-ansible-XvDfd8/logging-es.jks -keypass kspass -storepass kspass -keyalg RSA -keysize 2048 -validity 712 -dname 'CN=logging-es, OU=OpenShift, O=Logging' -ext san=dns:localhost,ip:127.0.0.1,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local,dns:logging-es-ops,dns:logging-es-ops.logging.svc.cluster.local,dns:logging-es-ops-cluster,dns:logging-es-ops-cluster.logging.svc.cluster.local
+ echo Generating certificate signing request for node logging-es
+ keytool -certreq -alias logging-es -keystore /tmp/openshift-logging-ansible-XvDfd8/logging-es.jks -storepass kspass -file /tmp/openshift-logging-ansible-XvDfd8/logging-es.csr -keyalg rsa -dname 'CN=logging-es, OU=OpenShift, O=Logging' -ext san=dns:localhost,ip:127.0.0.1,dns:logging-es,dns:logging-es.logging.svc.cluster.local,dns:logging-es-cluster,dns:logging-es-cluster.logging.svc.cluster.local,dns:logging-es-ops,dns:logging-es-ops.logging.svc.cluster.local,dns:logging-es-ops-cluster,dns:logging-es-ops-cluster.logging.svc.cluster.local
+ echo Sign certificate request with CA
+ openssl ca -in /tmp/openshift-logging-ansible-XvDfd8/logging-es.csr -notext -out /tmp/openshift-logging-ansible-XvDfd8/logging-es.crt -config /tmp/openshift-logging-ansible-XvDfd8/signing.conf -extensions v3_req -batch -extensions server_ext
Using configuration from /tmp/openshift-logging-ansible-XvDfd8/signing.conf
Check that the request matches the signature
Signature ok
Certificate Details:
        Serial Number: 8 (0x8)
        Validity
            Not Before: Jun  9 14:38:14 2017 GMT
            Not After : Jun  9 14:38:14 2019 GMT
        Subject:
            organizationName          = Logging
            organizationalUnitName    = OpenShift
            commonName                = logging-es
        X509v3 extensions:
            X509v3 Key Usage: critical
                Digital Signature, Key Encipherment
            X509v3 Basic Constraints: 
                CA:FALSE
            X509v3 Extended Key Usage: 
                TLS Web Server Authentication, TLS Web Client Authentication
            X509v3 Subject Key Identifier: 
                C8:79:25:42:27:DD:5C:BE:8B:A8:B9:EE:4C:C9:CE:62:ED:80:31:C9
            X509v3 Authority Key Identifier: 
                0.
            X509v3 Subject Alternative Name: 
                DNS:localhost, IP Address:127.0.0.1, DNS:logging-es, DNS:logging-es.logging.svc.cluster.local, DNS:logging-es-cluster, DNS:logging-es-cluster.logging.svc.cluster.local, DNS:logging-es-ops, DNS:logging-es-ops.logging.svc.cluster.local, DNS:logging-es-ops-cluster, DNS:logging-es-ops-cluster.logging.svc.cluster.local
Certificate is to be certified until Jun  9 14:38:14 2019 GMT (730 days)

Write out database with 1 new entries
Data Base Updated
+ echo 'Import back to keystore (including CA chain)'
+ keytool -import -file /tmp/openshift-logging-ansible-XvDfd8/ca.crt -keystore /tmp/openshift-logging-ansible-XvDfd8/logging-es.jks -storepass kspass -noprompt -alias sig-ca
Certificate was added to keystore
+ keytool -import -file /tmp/openshift-logging-ansible-XvDfd8/logging-es.crt -keystore /tmp/openshift-logging-ansible-XvDfd8/logging-es.jks -storepass kspass -noprompt -alias logging-es
Certificate reply was installed in keystore
+ echo All done for logging-es
+ '[' '!' -f /tmp/openshift-logging-ansible-XvDfd8/truststore.jks ']'
+ createTruststore
+ echo 'Import CA to truststore for validating client certs'
+ keytool -import -file /tmp/openshift-logging-ansible-XvDfd8/ca.crt -keystore /tmp/openshift-logging-ansible-XvDfd8/truststore.jks -storepass tspass -noprompt -alias sig-ca
Certificate was added to keystore
+ exit 0
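
The trace above is bash -x output from the role's JKS helper script. Condensed, the client-cert path it takes for system.admin looks like the sketch below (reconstructed from the trace itself; names and passwords are the script's own):

    generate_JKS_client_cert() {
        NODE_NAME=$1; dir=$SCRATCH_DIR; ks_pass=kspass
        # 1. Key pair generated straight into a JKS keystore
        keytool -genkey -alias $NODE_NAME -keystore $dir/$NODE_NAME.jks \
            -keyalg RSA -keysize 2048 -validity 712 \
            -keypass $ks_pass -storepass $ks_pass \
            -dname "CN=$NODE_NAME, OU=OpenShift, O=Logging"
        # 2. CSR exported from the keystore
        keytool -certreq -alias $NODE_NAME -keystore $dir/$NODE_NAME.jks \
            -file $dir/$NODE_NAME.jks.csr -keyalg rsa \
            -keypass $ks_pass -storepass $ks_pass \
            -dname "CN=$NODE_NAME, OU=OpenShift, O=Logging"
        # 3. CSR signed with the CA material fetched from the host
        openssl ca -in $dir/$NODE_NAME.jks.csr -notext \
            -out $dir/$NODE_NAME.jks.crt -config $dir/signing.conf \
            -extensions v3_req -batch -extensions server_ext
        # 4. CA cert, then the signed reply, imported back into the keystore
        keytool -import -file $dir/ca.crt -keystore $dir/$NODE_NAME.jks \
            -storepass $ks_pass -noprompt -alias sig-ca
        keytool -import -file $dir/$NODE_NAME.jks.crt -keystore $dir/$NODE_NAME.jks \
            -storepass $ks_pass -noprompt -alias $NODE_NAME
    }

generate_JKS_chain differs only in assembling a SAN list (dns:... entries, plus oid:1.2.3.4.5.5 when ADD_OID=true) and passing it to keytool via -ext san=..., as the elasticsearch and logging-es runs in the trace show.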


TASK [openshift_logging : Pushing locally generated JKS certs to remote host...] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:66
changed: [openshift] => {
    "changed": true, 
    "checksum": "052b8a7ac387cf8bf22c0b0eef589255df499047", 
    "dest": "/etc/origin/logging/elasticsearch.jks", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "2e42fac89f9a328c6803434eb597f13c", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 3768, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019094.63-154644430843149/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging : Pushing locally generated JKS certs to remote host...] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:72
changed: [openshift] => {
    "changed": true, 
    "checksum": "29d8794d732e110ef7d0df624c1f35fb88c68d2b", 
    "dest": "/etc/origin/logging/logging-es.jks", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "c5d9eed2a57f14071ac4ee8dd163e484", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 3983, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019094.85-26284317928805/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging : Pushing locally generated JKS certs to remote host...] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:78
changed: [openshift] => {
    "changed": true, 
    "checksum": "daf2c5d4b9b536c4a541f754fe6dc8bfc3a7dd76", 
    "dest": "/etc/origin/logging/system.admin.jks", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "bb83e44548a3ae06f400fd4d78eef433", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 3700, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019095.07-266309157050246/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging : Pushing locally generated JKS certs to remote host...] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_jks.yaml:84
changed: [openshift] => {
    "changed": true, 
    "checksum": "c2fa2fc2acacd44370e30a660fd0884b80125dee", 
    "dest": "/etc/origin/logging/truststore.jks", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "ce07b82515615a49d9860d98931aab75", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 797, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019095.29-58532735198273/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging : Generate proxy session] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:141
ok: [openshift] => {
    "ansible_facts": {
        "session_secret": "eu3HEGJPyrV5LzStPpHS9ooyJuPfoMdvFvsvzYQ7ZiCfRDMv0BJAJxTsU5afU3rHayydoRQcnzJ13tEvdqmtfQWqSMNWD7TC9H8B2BPjm1w2gpg9UQVsPC0EWXhvPI2ZZucFBE7TiER9h2MzrVIcpN1NchL7vBrFSmOLcf5zVIHnfn95M1OYFj83PYWWnwLcwZiEV280"
    }, 
    "changed": false
}

TASK [openshift_logging : Generate oauth client secret] ************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/generate_certs.yaml:146
ok: [openshift] => {
    "ansible_facts": {
        "oauth_secret": "SpbrdGHumtfpw1qNi4oM1zlvglb2S6CrjCIKknVGDHjjRnoFQl0PZssmZXrYJipu"
    }, 
    "changed": false
}
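
Both facts are random alphanumeric strings (64 characters for the OAuth secret, a longer one for the session), minted on the Ansible side. A shell one-liner of the same shape, purely for illustration:

    # 64-character alphanumeric secret, similar to oauth_secret above
    tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 64; echo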

TASK [openshift_logging : set_fact] ********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:53

TASK [openshift_logging : set_fact] ********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:57
ok: [openshift] => {
    "ansible_facts": {
        "es_indices": "[]"
    }, 
    "changed": false
}

TASK [openshift_logging : set_fact] ********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:60
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:64

TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:85
statically included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml

TASK [openshift_logging_elasticsearch : Validate Elasticsearch cluster size] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:2
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Validate Elasticsearch Ops cluster size] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:6
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : fail] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:10
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:14
ok: [openshift] => {
    "ansible_facts": {
        "elasticsearch_name": "logging-elasticsearch", 
        "es_component": "es"
    }, 
    "changed": false
}

TASK [openshift_logging_elasticsearch : fail] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:3
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:7
ok: [openshift] => {
    "ansible_facts": {
        "es_version": "3_5"
    }, 
    "changed": false
}

TASK [openshift_logging_elasticsearch : debug] *********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:11
ok: [openshift] => {
    "changed": false, 
    "openshift_logging_image_version": "latest"
}

TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:14
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : fail] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:17
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Create temp directory for doing work in] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:21
ok: [openshift] => {
    "changed": false, 
    "cmd": [
        "mktemp", 
        "-d", 
        "/tmp/openshift-logging-ansible-XXXXXX"
    ], 
    "delta": "0:00:00.002024", 
    "end": "2017-06-09 10:38:16.123208", 
    "rc": 0, 
    "start": "2017-06-09 10:38:16.121184"
}

STDOUT:

/tmp/openshift-logging-ansible-VTuztn

TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:26
ok: [openshift] => {
    "ansible_facts": {
        "tempdir": "/tmp/openshift-logging-ansible-VTuztn"
    }, 
    "changed": false
}

TASK [openshift_logging_elasticsearch : Create templates subdirectory] *********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:30
ok: [openshift] => {
    "changed": false, 
    "gid": 0, 
    "group": "root", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/tmp/openshift-logging-ansible-VTuztn/templates", 
    "secontext": "unconfined_u:object_r:user_tmp_t:s0", 
    "size": 6, 
    "state": "directory", 
    "uid": 0
}

TASK [openshift_logging_elasticsearch : Create ES service account] *************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:40
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Create ES service account] *************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:48
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get sa aggregated-logging-elasticsearch -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "imagePullSecrets": [
                    {
                        "name": "aggregated-logging-elasticsearch-dockercfg-8b8mr"
                    }
                ], 
                "kind": "ServiceAccount", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:16Z", 
                    "name": "aggregated-logging-elasticsearch", 
                    "namespace": "logging", 
                    "resourceVersion": "1287", 
                    "selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-elasticsearch", 
                    "uid": "46763e9a-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "secrets": [
                    {
                        "name": "aggregated-logging-elasticsearch-dockercfg-8b8mr"
                    }, 
                    {
                        "name": "aggregated-logging-elasticsearch-token-7s2ck"
                    }
                ]
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
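
The module ensures the service account exists and then reads it back with the oc get shown in cmd; the dockercfg and token secrets are attached automatically by OpenShift. A manual equivalent, for reference:

    oc create serviceaccount aggregated-logging-elasticsearch -n logging
    oc get sa aggregated-logging-elasticsearch -n logging -o json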

TASK [openshift_logging_elasticsearch : copy] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:57
changed: [openshift] => {
    "changed": true, 
    "checksum": "e5015364391ac609da8655a9a1224131599a5cea", 
    "dest": "/tmp/openshift-logging-ansible-VTuztn/rolebinding-reader.yml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "446fb96447527f48f97e69bb41bad7be", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 135, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019097.29-250855973411529/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_elasticsearch : Create rolebinding-reader role] ********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:61
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get clusterrole rolebinding-reader -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "ClusterRole", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:18Z", 
                    "name": "rolebinding-reader", 
                    "resourceVersion": "122", 
                    "selfLink": "/oapi/v1/clusterroles/rolebinding-reader", 
                    "uid": "471934a3-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "rules": [
                    {
                        "apiGroups": [
                            ""
                        ], 
                        "attributeRestrictions": null, 
                        "resources": [
                            "clusterrolebindings"
                        ], 
                        "verbs": [
                            "get"
                        ]
                    }
                ]
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}

TASK [openshift_logging_elasticsearch : Set rolebinding-reader permissions for ES] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:72
changed: [openshift] => {
    "changed": true, 
    "present": "present", 
    "results": {
        "cmd": "/bin/oc adm policy add-cluster-role-to-user rolebinding-reader system:serviceaccount:logging:aggregated-logging-elasticsearch -n logging", 
        "results": "", 
        "returncode": 0
    }
}
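
Per the rules above, rolebinding-reader grants only get on clusterrolebindings, and here it is bound cluster-wide to the logging Elasticsearch service account. The same grant and a quick check by hand:

    oc adm policy add-cluster-role-to-user rolebinding-reader \
        system:serviceaccount:logging:aggregated-logging-elasticsearch
    oc describe clusterrole rolebinding-reader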

TASK [openshift_logging_elasticsearch : Generate logging-elasticsearch-view-role] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:81
ok: [openshift] => {
    "changed": false, 
    "checksum": "d752c09323565f80ed14fa806d42284f0c5aef2a", 
    "dest": "/tmp/openshift-logging-ansible-VTuztn/logging-elasticsearch-view-role.yaml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "8299dca2fb036c06ba7c4f620680e0f6", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 183, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019098.99-80698130903945/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_elasticsearch : Set logging-elasticsearch-view-role role] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:94
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get rolebinding logging-elasticsearch-view-role -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "groupNames": null, 
                "kind": "RoleBinding", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:19Z", 
                    "name": "logging-elasticsearch-view-role", 
                    "namespace": "logging", 
                    "resourceVersion": "710", 
                    "selfLink": "/oapi/v1/namespaces/logging/rolebindings/logging-elasticsearch-view-role", 
                    "uid": "481c4373-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "roleRef": {
                    "name": "view"
                }, 
                "subjects": [
                    {
                        "kind": "ServiceAccount", 
                        "name": "aggregated-logging-elasticsearch", 
                        "namespace": "logging"
                    }
                ], 
                "userNames": [
                    "system:serviceaccount:logging:aggregated-logging-elasticsearch"
                ]
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}

TASK [openshift_logging_elasticsearch : template] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:105
ok: [openshift] => {
    "changed": false, 
    "checksum": "f91458d5dad42c496e2081ef872777a6f6eb9ff9", 
    "dest": "/tmp/openshift-logging-ansible-VTuztn/elasticsearch-logging.yml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "e4be7c33c1927bbdd8c909bfbe3d9f0b", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 2171, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019099.97-233911405499924/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_elasticsearch : template] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:111
ok: [openshift] => {
    "changed": false, 
    "checksum": "6d4f976f6e77a6e0c8dca7e01fb5bedb68678b1d", 
    "dest": "/tmp/openshift-logging-ansible-VTuztn/elasticsearch.yml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "75abfd3a190832e593a8e5e7c5695e8e", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 2454, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019100.2-141840426739911/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_elasticsearch : copy] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:121
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : copy] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:127
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Set ES configmap] **********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:133
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get configmap logging-elasticsearch -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "data": {
                    "elasticsearch.yml": "cluster:\n  name: ${CLUSTER_NAME}\n\nscript:\n  inline: on\n  indexed: on\n\nindex:\n  number_of_shards: 1\n  number_of_replicas: 0\n  unassigned.node_left.delayed_timeout: 2m\n  translog:\n    flush_threshold_size: 256mb\n    flush_threshold_period: 5m\n\nnode:\n  master: ${IS_MASTER}\n  data: ${HAS_DATA}\n\nnetwork:\n  host: 0.0.0.0\n\ncloud:\n  kubernetes:\n    service: ${SERVICE_DNS}\n    namespace: ${NAMESPACE}\n\ndiscovery:\n  type: kubernetes\n  zen.ping.multicast.enabled: false\n  zen.minimum_master_nodes: ${NODE_QUORUM}\n\ngateway:\n  recover_after_nodes: ${NODE_QUORUM}\n  expected_nodes: ${RECOVER_EXPECTED_NODES}\n  recover_after_time: ${RECOVER_AFTER_TIME}\n\nio.fabric8.elasticsearch.authentication.users: [\"system.logging.kibana\", \"system.logging.fluentd\", \"system.logging.curator\", \"system.admin\"]\nio.fabric8.elasticsearch.kibana.mapping.app: /usr/share/elasticsearch/index_patterns/com.redhat.viaq-openshift.index-pattern.json\nio.fabric8.elasticsearch.kibana.mapping.ops: /usr/share/elasticsearch/index_patterns/com.redhat.viaq-openshift.index-pattern.json\nio.fabric8.elasticsearch.kibana.mapping.empty: /usr/share/elasticsearch/index_patterns/com.redhat.viaq-openshift.index-pattern.json\n\nopenshift.config:\n  use_common_data_model: true\n  project_index_prefix: \"project\"\n  time_field_name: \"@timestamp\"\n\nopenshift.searchguard:\n  keystore.path: /etc/elasticsearch/secret/admin.jks\n  truststore.path: /etc/elasticsearch/secret/searchguard.truststore\n\nopenshift.operations.allow_cluster_reader: false\n\npath:\n  data: /elasticsearch/persistent/${CLUSTER_NAME}/data\n  logs: /elasticsearch/${CLUSTER_NAME}/logs\n  work: /elasticsearch/${CLUSTER_NAME}/work\n  scripts: /elasticsearch/${CLUSTER_NAME}/scripts\n\nsearchguard:\n  authcz.admin_dn:\n  - CN=system.admin,OU=OpenShift,O=Logging\n  config_index_name: \".searchguard.${HOSTNAME}\"\n  ssl:\n    transport:\n      enabled: true\n      enforce_hostname_verification: false\n      keystore_type: JKS\n      keystore_filepath: /etc/elasticsearch/secret/searchguard.key\n      keystore_password: kspass\n      truststore_type: JKS\n      truststore_filepath: /etc/elasticsearch/secret/searchguard.truststore\n      truststore_password: tspass\n    http:\n      enabled: true\n      keystore_type: JKS\n      keystore_filepath: /etc/elasticsearch/secret/key\n      keystore_password: kspass\n      clientauth_mode: OPTIONAL\n      truststore_type: JKS\n      truststore_filepath: /etc/elasticsearch/secret/truststore\n      truststore_password: tspass\n", 
                    "logging.yml": "# you can override this using by setting a system property, for example -Des.logger.level=DEBUG\nes.logger.level: INFO\nrootLogger: ${es.logger.level}, console, file\nlogger:\n  # log action execution errors for easier debugging\n  action: WARN\n  # reduce the logging for aws, too much is logged under the default INFO\n  com.amazonaws: WARN\n  io.fabric8.elasticsearch: ${PLUGIN_LOGLEVEL}\n  io.fabric8.kubernetes: ${PLUGIN_LOGLEVEL}\n\n  # gateway\n  #gateway: DEBUG\n  #index.gateway: DEBUG\n\n  # peer shard recovery\n  #indices.recovery: DEBUG\n\n  # discovery\n  #discovery: TRACE\n\n  index.search.slowlog: TRACE, index_search_slow_log_file\n  index.indexing.slowlog: TRACE, index_indexing_slow_log_file\n\n  # search-guard\n  com.floragunn.searchguard: WARN\n\nadditivity:\n  index.search.slowlog: false\n  index.indexing.slowlog: false\n\nappender:\n  console:\n    type: console\n    layout:\n      type: consolePattern\n      conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n  file:\n    type: dailyRollingFile\n    file: ${path.logs}/${cluster.name}.log\n    datePattern: \"'.'yyyy-MM-dd\"\n    layout:\n      type: pattern\n      conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n  # Use the following log4j-extras RollingFileAppender to enable gzip compression of log files.\n  # For more information see https://logging.apache.org/log4j/extras/apidocs/org/apache/log4j/rolling/RollingFileAppender.html\n  #file:\n    #type: extrasRollingFile\n    #file: ${path.logs}/${cluster.name}.log\n    #rollingPolicy: timeBased\n    #rollingPolicy.FileNamePattern: ${path.logs}/${cluster.name}.log.%d{yyyy-MM-dd}.gz\n    #layout:\n      #type: pattern\n      #conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n  index_search_slow_log_file:\n    type: dailyRollingFile\n    file: ${path.logs}/${cluster.name}_index_search_slowlog.log\n    datePattern: \"'.'yyyy-MM-dd\"\n    layout:\n      type: pattern\n      conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n  index_indexing_slow_log_file:\n    type: dailyRollingFile\n    file: ${path.logs}/${cluster.name}_index_indexing_slowlog.log\n    datePattern: \"'.'yyyy-MM-dd\"\n    layout:\n      type: pattern\n      conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n"
                }, 
                "kind": "ConfigMap", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:21Z", 
                    "name": "logging-elasticsearch", 
                    "namespace": "logging", 
                    "resourceVersion": "1295", 
                    "selfLink": "/api/v1/namespaces/logging/configmaps/logging-elasticsearch", 
                    "uid": "48f83e1b-4d21-11e7-83b0-0e6fb895db82"
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
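
The entire Elasticsearch configuration travels as a single ConfigMap with two keys, elasticsearch.yml and logging.yml. To read it back out of the cluster:

    # Writes each key out as a file in the current directory
    oc extract configmap/logging-elasticsearch -n logging --to=.
    # Or view the whole object
    oc get configmap logging-elasticsearch -n logging -o yaml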

TASK [openshift_logging_elasticsearch : Set ES secret] *************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:144
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc secrets new logging-elasticsearch key=/etc/origin/logging/logging-es.jks truststore=/etc/origin/logging/truststore.jks searchguard.key=/etc/origin/logging/elasticsearch.jks searchguard.truststore=/etc/origin/logging/truststore.jks admin-key=/etc/origin/logging/system.admin.key admin-cert=/etc/origin/logging/system.admin.crt admin-ca=/etc/origin/logging/ca.crt admin.jks=/etc/origin/logging/system.admin.jks -n logging", 
        "results": "", 
        "returncode": 0
    }, 
    "state": "present"
}
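
oc secrets new with key=path pairs is the older syntax; an equivalent with the generic secret creator in newer oc builds, key names matching the volume mounts in the DeploymentConfig below:

    oc create secret generic logging-elasticsearch -n logging \
        --from-file=key=/etc/origin/logging/logging-es.jks \
        --from-file=truststore=/etc/origin/logging/truststore.jks \
        --from-file=searchguard.key=/etc/origin/logging/elasticsearch.jks \
        --from-file=searchguard.truststore=/etc/origin/logging/truststore.jks \
        --from-file=admin-key=/etc/origin/logging/system.admin.key \
        --from-file=admin-cert=/etc/origin/logging/system.admin.crt \
        --from-file=admin-ca=/etc/origin/logging/ca.crt \
        --from-file=admin.jks=/etc/origin/logging/system.admin.jks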

TASK [openshift_logging_elasticsearch : Set logging-es-cluster service] ********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:168
changed: [openshift] => {
    "changed": true, 
    "results": {
        "clusterip": "172.30.159.212", 
        "cmd": "/bin/oc get service logging-es-cluster -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "Service", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:22Z", 
                    "name": "logging-es-cluster", 
                    "namespace": "logging", 
                    "resourceVersion": "1299", 
                    "selfLink": "/api/v1/namespaces/logging/services/logging-es-cluster", 
                    "uid": "4a060bcc-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "clusterIP": "172.30.159.212", 
                    "ports": [
                        {
                            "port": 9300, 
                            "protocol": "TCP", 
                            "targetPort": 9300
                        }
                    ], 
                    "selector": {
                        "component": "es", 
                        "provider": "openshift"
                    }, 
                    "sessionAffinity": "None", 
                    "type": "ClusterIP"
                }, 
                "status": {
                    "loadBalancer": {}
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}

TASK [openshift_logging_elasticsearch : Set logging-es service] ****************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:182
changed: [openshift] => {
    "changed": true, 
    "results": {
        "clusterip": "172.30.217.175", 
        "cmd": "/bin/oc get service logging-es -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "Service", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:23Z", 
                    "name": "logging-es", 
                    "namespace": "logging", 
                    "resourceVersion": "1302", 
                    "selfLink": "/api/v1/namespaces/logging/services/logging-es", 
                    "uid": "4aa07ff8-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "clusterIP": "172.30.217.175", 
                    "ports": [
                        {
                            "port": 9200, 
                            "protocol": "TCP", 
                            "targetPort": "restapi"
                        }
                    ], 
                    "selector": {
                        "component": "es", 
                        "provider": "openshift"
                    }, 
                    "sessionAffinity": "None", 
                    "type": "ClusterIP"
                }, 
                "status": {
                    "loadBalancer": {}
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
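
Two ClusterIP services now front the same pods (selector component=es, provider=openshift): logging-es-cluster on 9300 for inter-node transport and logging-es on 9200 for the REST API, where targetPort restapi resolves to the named container port in the DC below. Once a pod is running and ready, a sketch of a smoke test (assumes the cluster-internal DNS name resolves from where curl runs, e.g. inside a pod; the admin client cert is accepted because its DN is listed in searchguard's authcz.admin_dn above):

    oc get endpoints logging-es logging-es-cluster -n logging
    curl -s --cacert /etc/origin/logging/ca.crt \
         --cert /etc/origin/logging/system.admin.crt \
         --key /etc/origin/logging/system.admin.key \
         https://logging-es.logging.svc.cluster.local:9200/_cluster/health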

TASK [openshift_logging_elasticsearch : Creating ES storage template] **********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:197
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Creating ES storage template] **********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:210
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Set ES storage] ************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:225
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:237
ok: [openshift] => {
    "ansible_facts": {
        "es_deploy_name": "logging-es-data-master-nij68urm"
    }, 
    "changed": false
}

TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:241
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Set ES dc templates] *******************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:246
changed: [openshift] => {
    "changed": true, 
    "checksum": "30851121a6ba208a54e303f846b13bd34c8f1e22", 
    "dest": "/tmp/openshift-logging-ansible-VTuztn/templates/logging-es-dc.yml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "42bf136a901e5c4fefd1c6c9939eaec2", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 3137, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019104.44-263205401276353/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_elasticsearch : Set ES dc] *****************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:262
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get dc logging-es-data-master-nij68urm -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "DeploymentConfig", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:25Z", 
                    "generation": 2, 
                    "labels": {
                        "component": "es", 
                        "deployment": "logging-es-data-master-nij68urm", 
                        "logging-infra": "elasticsearch", 
                        "provider": "openshift"
                    }, 
                    "name": "logging-es-data-master-nij68urm", 
                    "namespace": "logging", 
                    "resourceVersion": "1316", 
                    "selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-es-data-master-nij68urm", 
                    "uid": "4b5d865c-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "replicas": 1, 
                    "selector": {
                        "component": "es", 
                        "deployment": "logging-es-data-master-nij68urm", 
                        "logging-infra": "elasticsearch", 
                        "provider": "openshift"
                    }, 
                    "strategy": {
                        "activeDeadlineSeconds": 21600, 
                        "recreateParams": {
                            "timeoutSeconds": 600
                        }, 
                        "resources": {}, 
                        "type": "Recreate"
                    }, 
                    "template": {
                        "metadata": {
                            "creationTimestamp": null, 
                            "labels": {
                                "component": "es", 
                                "deployment": "logging-es-data-master-nij68urm", 
                                "logging-infra": "elasticsearch", 
                                "provider": "openshift"
                            }, 
                            "name": "logging-es-data-master-nij68urm"
                        }, 
                        "spec": {
                            "containers": [
                                {
                                    "env": [
                                        {
                                            "name": "NAMESPACE", 
                                            "valueFrom": {
                                                "fieldRef": {
                                                    "apiVersion": "v1", 
                                                    "fieldPath": "metadata.namespace"
                                                }
                                            }
                                        }, 
                                        {
                                            "name": "KUBERNETES_TRUST_CERT", 
                                            "value": "true"
                                        }, 
                                        {
                                            "name": "SERVICE_DNS", 
                                            "value": "logging-es-cluster"
                                        }, 
                                        {
                                            "name": "CLUSTER_NAME", 
                                            "value": "logging-es"
                                        }, 
                                        {
                                            "name": "INSTANCE_RAM", 
                                            "value": "8Gi"
                                        }, 
                                        {
                                            "name": "NODE_QUORUM", 
                                            "value": "1"
                                        }, 
                                        {
                                            "name": "RECOVER_EXPECTED_NODES", 
                                            "value": "1"
                                        }, 
                                        {
                                            "name": "RECOVER_AFTER_TIME", 
                                            "value": "5m"
                                        }, 
                                        {
                                            "name": "READINESS_PROBE_TIMEOUT", 
                                            "value": "30"
                                        }, 
                                        {
                                            "name": "IS_MASTER", 
                                            "value": "true"
                                        }, 
                                        {
                                            "name": "HAS_DATA", 
                                            "value": "true"
                                        }
                                    ], 
                                    "image": "172.30.224.2:5000/logging/logging-elasticsearch:latest", 
                                    "imagePullPolicy": "Always", 
                                    "name": "elasticsearch", 
                                    "ports": [
                                        {
                                            "containerPort": 9200, 
                                            "name": "restapi", 
                                            "protocol": "TCP"
                                        }, 
                                        {
                                            "containerPort": 9300, 
                                            "name": "cluster", 
                                            "protocol": "TCP"
                                        }
                                    ], 
                                    "readinessProbe": {
                                        "exec": {
                                            "command": [
                                                "/usr/share/elasticsearch/probe/readiness.sh"
                                            ]
                                        }, 
                                        "failureThreshold": 3, 
                                        "initialDelaySeconds": 10, 
                                        "periodSeconds": 5, 
                                        "successThreshold": 1, 
                                        "timeoutSeconds": 30
                                    }, 
                                    "resources": {
                                        "limits": {
                                            "cpu": "1", 
                                            "memory": "8Gi"
                                        }, 
                                        "requests": {
                                            "memory": "512Mi"
                                        }
                                    }, 
                                    "terminationMessagePath": "/dev/termination-log", 
                                    "terminationMessagePolicy": "File", 
                                    "volumeMounts": [
                                        {
                                            "mountPath": "/etc/elasticsearch/secret", 
                                            "name": "elasticsearch", 
                                            "readOnly": true
                                        }, 
                                        {
                                            "mountPath": "/usr/share/java/elasticsearch/config", 
                                            "name": "elasticsearch-config", 
                                            "readOnly": true
                                        }, 
                                        {
                                            "mountPath": "/elasticsearch/persistent", 
                                            "name": "elasticsearch-storage"
                                        }
                                    ]
                                }
                            ], 
                            "dnsPolicy": "ClusterFirst", 
                            "restartPolicy": "Always", 
                            "schedulerName": "default-scheduler", 
                            "securityContext": {
                                "supplementalGroups": [
                                    65534
                                ]
                            }, 
                            "serviceAccount": "aggregated-logging-elasticsearch", 
                            "serviceAccountName": "aggregated-logging-elasticsearch", 
                            "terminationGracePeriodSeconds": 30, 
                            "volumes": [
                                {
                                    "name": "elasticsearch", 
                                    "secret": {
                                        "defaultMode": 420, 
                                        "secretName": "logging-elasticsearch"
                                    }
                                }, 
                                {
                                    "configMap": {
                                        "defaultMode": 420, 
                                        "name": "logging-elasticsearch"
                                    }, 
                                    "name": "elasticsearch-config"
                                }, 
                                {
                                    "emptyDir": {}, 
                                    "name": "elasticsearch-storage"
                                }
                            ]
                        }
                    }, 
                    "test": false, 
                    "triggers": [
                        {
                            "type": "ConfigChange"
                        }
                    ]
                }, 
                "status": {
                    "availableReplicas": 0, 
                    "conditions": [
                        {
                            "lastTransitionTime": "2017-06-09T14:38:25Z", 
                            "lastUpdateTime": "2017-06-09T14:38:25Z", 
                            "message": "Deployment config does not have minimum availability.", 
                            "status": "False", 
                            "type": "Available"
                        }, 
                        {
                            "lastTransitionTime": "2017-06-09T14:38:25Z", 
                            "lastUpdateTime": "2017-06-09T14:38:25Z", 
                            "message": "replication controller \"logging-es-data-master-nij68urm-1\" is waiting for pod \"logging-es-data-master-nij68urm-1-deploy\" to run", 
                            "status": "Unknown", 
                            "type": "Progressing"
                        }
                    ], 
                    "details": {
                        "causes": [
                            {
                                "type": "ConfigChange"
                            }
                        ], 
                        "message": "config change"
                    }, 
                    "latestVersion": 1, 
                    "observedGeneration": 2, 
                    "replicas": 0, 
                    "unavailableReplicas": 0, 
                    "updatedReplicas": 0
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
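
At this point the deployment config for the non-ops cluster exists but nothing is running yet: availableReplicas is 0 and the replication controller is still waiting for its deployer pod, per the Progressing condition above. Assuming the resource names shown in that output, one way to follow the rollout from the cluster host would be:

    # watch pods come up in the logging project
    oc get pods -n logging -w
    # inspect the deployment config and its events
    oc describe dc logging-es-data-master-nij68urm -n logging
    # tail the deployer pod named in the Progressing condition
    oc logs -f pod/logging-es-data-master-nij68urm-1-deploy -n logging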

TASK [openshift_logging_elasticsearch : Delete temp directory] *****************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:274
ok: [openshift] => {
    "changed": false, 
    "path": "/tmp/openshift-logging-ansible-VTuztn", 
    "state": "absent"
}

TASK [openshift_logging : set_fact] ********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:99

TASK [openshift_logging : set_fact] ********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:105
ok: [openshift] => {
    "ansible_facts": {
        "es_ops_indices": "[]"
    }, 
    "changed": false
}

TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:109

TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:132
statically included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml

TASK [openshift_logging_elasticsearch : Validate Elasticsearch cluster size] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:2
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Validate Elasticsearch Ops cluster size] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:6
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : fail] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:10
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:14
ok: [openshift] => {
    "ansible_facts": {
        "elasticsearch_name": "logging-elasticsearch-ops", 
        "es_component": "es-ops"
    }, 
    "changed": false
}

TASK [openshift_logging_elasticsearch : fail] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:3
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:7
ok: [openshift] => {
    "ansible_facts": {
        "es_version": "3_5"
    }, 
    "changed": false
}

TASK [openshift_logging_elasticsearch : debug] *********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:11
ok: [openshift] => {
    "changed": false, 
    "openshift_logging_image_version": "latest"
}

TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:14
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : fail] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/determine_version.yaml:17
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Create temp directory for doing work in] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:21
ok: [openshift] => {
    "changed": false, 
    "cmd": [
        "mktemp", 
        "-d", 
        "/tmp/openshift-logging-ansible-XXXXXX"
    ], 
    "delta": "0:00:00.002036", 
    "end": "2017-06-09 10:38:26.367797", 
    "rc": 0, 
    "start": "2017-06-09 10:38:26.365761"
}

STDOUT:

/tmp/openshift-logging-ansible-m9oYNc

TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:26
ok: [openshift] => {
    "ansible_facts": {
        "tempdir": "/tmp/openshift-logging-ansible-m9oYNc"
    }, 
    "changed": false
}

TASK [openshift_logging_elasticsearch : Create templates subdirectory] *********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:30
ok: [openshift] => {
    "changed": false, 
    "gid": 0, 
    "group": "root", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/tmp/openshift-logging-ansible-m9oYNc/templates", 
    "secontext": "unconfined_u:object_r:user_tmp_t:s0", 
    "size": 6, 
    "state": "directory", 
    "uid": 0
}

TASK [openshift_logging_elasticsearch : Create ES service account] *************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:40
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Create ES service account] *************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:48
ok: [openshift] => {
    "changed": false, 
    "results": {
        "cmd": "/bin/oc get sa aggregated-logging-elasticsearch -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "imagePullSecrets": [
                    {
                        "name": "aggregated-logging-elasticsearch-dockercfg-8b8mr"
                    }
                ], 
                "kind": "ServiceAccount", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:16Z", 
                    "name": "aggregated-logging-elasticsearch", 
                    "namespace": "logging", 
                    "resourceVersion": "1287", 
                    "selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-elasticsearch", 
                    "uid": "46763e9a-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "secrets": [
                    {
                        "name": "aggregated-logging-elasticsearch-dockercfg-8b8mr"
                    }, 
                    {
                        "name": "aggregated-logging-elasticsearch-token-7s2ck"
                    }
                ]
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
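
Note that this task reports ok rather than changed: the aggregated-logging-elasticsearch service account was created earlier in the run (creationTimestamp 14:38:16) for the first Elasticsearch cluster, so the ops pass reuses it. Assuming the secret names echoed above, the account's token and registry credentials could be inspected with:

    oc get sa aggregated-logging-elasticsearch -n logging -o yaml
    oc describe secret aggregated-logging-elasticsearch-token-7s2ck -n logging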

TASK [openshift_logging_elasticsearch : copy] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:57
changed: [openshift] => {
    "changed": true, 
    "checksum": "e5015364391ac609da8655a9a1224131599a5cea", 
    "dest": "/tmp/openshift-logging-ansible-m9oYNc/rolebinding-reader.yml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "446fb96447527f48f97e69bb41bad7be", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 135, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019107.06-68225171245733/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_elasticsearch : Create rolebinding-reader role] ********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:61
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get clusterrole rolebinding-reader -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "ClusterRole", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:18Z", 
                    "name": "rolebinding-reader", 
                    "resourceVersion": "122", 
                    "selfLink": "/oapi/v1/clusterroles/rolebinding-reader", 
                    "uid": "471934a3-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "rules": [
                    {
                        "apiGroups": [
                            ""
                        ], 
                        "attributeRestrictions": null, 
                        "resources": [
                            "clusterrolebindings"
                        ], 
                        "verbs": [
                            "get"
                        ]
                    }
                ]
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
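
The rolebinding-reader cluster role is deliberately narrow: it only allows get on clusterrolebindings, which the Elasticsearch auth plugin presumably needs in order to work out which users hold cluster-wide roles. The role, and which subjects can exercise that verb, can be checked with:

    oc get clusterrole rolebinding-reader -o yaml
    oc policy who-can get clusterrolebindings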

TASK [openshift_logging_elasticsearch : Set rolebinding-reader permissions for ES] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:72
ok: [openshift] => {
    "changed": false, 
    "present": "present"
}

TASK [openshift_logging_elasticsearch : Generate logging-elasticsearch-view-role] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:81
ok: [openshift] => {
    "changed": false, 
    "checksum": "d752c09323565f80ed14fa806d42284f0c5aef2a", 
    "dest": "/tmp/openshift-logging-ansible-m9oYNc/logging-elasticsearch-view-role.yaml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "8299dca2fb036c06ba7c4f620680e0f6", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 183, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019108.69-86030367841377/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_elasticsearch : Set logging-elasticsearch-view-role role] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:94
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get rolebinding logging-elasticsearch-view-role -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "groupNames": null, 
                "kind": "RoleBinding", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:19Z", 
                    "name": "logging-elasticsearch-view-role", 
                    "namespace": "logging", 
                    "resourceVersion": "1292", 
                    "selfLink": "/oapi/v1/namespaces/logging/rolebindings/logging-elasticsearch-view-role", 
                    "uid": "481c4373-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "roleRef": {
                    "name": "view"
                }, 
                "subjects": [
                    {
                        "kind": "ServiceAccount", 
                        "name": "aggregated-logging-elasticsearch", 
                        "namespace": "logging"
                    }
                ], 
                "userNames": [
                    "system:serviceaccount:logging:aggregated-logging-elasticsearch"
                ]
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
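
The rolebinding above (roleRef view, subject the aggregated-logging-elasticsearch service account) is roughly what you would get by granting the role by hand, assuming the same project:

    oc adm policy add-role-to-user view -z aggregated-logging-elasticsearch -n logging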

TASK [openshift_logging_elasticsearch : template] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:105
ok: [openshift] => {
    "changed": false, 
    "checksum": "f91458d5dad42c496e2081ef872777a6f6eb9ff9", 
    "dest": "/tmp/openshift-logging-ansible-m9oYNc/elasticsearch-logging.yml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "e4be7c33c1927bbdd8c909bfbe3d9f0b", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 2171, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019109.91-40225037035947/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_elasticsearch : template] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:111
ok: [openshift] => {
    "changed": false, 
    "checksum": "6d4f976f6e77a6e0c8dca7e01fb5bedb68678b1d", 
    "dest": "/tmp/openshift-logging-ansible-m9oYNc/elasticsearch.yml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "75abfd3a190832e593a8e5e7c5695e8e", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 2454, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019110.15-273360170374285/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_elasticsearch : copy] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:121
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : copy] **********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:127
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Set ES configmap] **********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:133
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get configmap logging-elasticsearch-ops -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "data": {
                    "elasticsearch.yml": "cluster:\n  name: ${CLUSTER_NAME}\n\nscript:\n  inline: on\n  indexed: on\n\nindex:\n  number_of_shards: 1\n  number_of_replicas: 0\n  unassigned.node_left.delayed_timeout: 2m\n  translog:\n    flush_threshold_size: 256mb\n    flush_threshold_period: 5m\n\nnode:\n  master: ${IS_MASTER}\n  data: ${HAS_DATA}\n\nnetwork:\n  host: 0.0.0.0\n\ncloud:\n  kubernetes:\n    service: ${SERVICE_DNS}\n    namespace: ${NAMESPACE}\n\ndiscovery:\n  type: kubernetes\n  zen.ping.multicast.enabled: false\n  zen.minimum_master_nodes: ${NODE_QUORUM}\n\ngateway:\n  recover_after_nodes: ${NODE_QUORUM}\n  expected_nodes: ${RECOVER_EXPECTED_NODES}\n  recover_after_time: ${RECOVER_AFTER_TIME}\n\nio.fabric8.elasticsearch.authentication.users: [\"system.logging.kibana\", \"system.logging.fluentd\", \"system.logging.curator\", \"system.admin\"]\nio.fabric8.elasticsearch.kibana.mapping.app: /usr/share/elasticsearch/index_patterns/com.redhat.viaq-openshift.index-pattern.json\nio.fabric8.elasticsearch.kibana.mapping.ops: /usr/share/elasticsearch/index_patterns/com.redhat.viaq-openshift.index-pattern.json\nio.fabric8.elasticsearch.kibana.mapping.empty: /usr/share/elasticsearch/index_patterns/com.redhat.viaq-openshift.index-pattern.json\n\nopenshift.config:\n  use_common_data_model: true\n  project_index_prefix: \"project\"\n  time_field_name: \"@timestamp\"\n\nopenshift.searchguard:\n  keystore.path: /etc/elasticsearch/secret/admin.jks\n  truststore.path: /etc/elasticsearch/secret/searchguard.truststore\n\nopenshift.operations.allow_cluster_reader: false\n\npath:\n  data: /elasticsearch/persistent/${CLUSTER_NAME}/data\n  logs: /elasticsearch/${CLUSTER_NAME}/logs\n  work: /elasticsearch/${CLUSTER_NAME}/work\n  scripts: /elasticsearch/${CLUSTER_NAME}/scripts\n\nsearchguard:\n  authcz.admin_dn:\n  - CN=system.admin,OU=OpenShift,O=Logging\n  config_index_name: \".searchguard.${HOSTNAME}\"\n  ssl:\n    transport:\n      enabled: true\n      enforce_hostname_verification: false\n      keystore_type: JKS\n      keystore_filepath: /etc/elasticsearch/secret/searchguard.key\n      keystore_password: kspass\n      truststore_type: JKS\n      truststore_filepath: /etc/elasticsearch/secret/searchguard.truststore\n      truststore_password: tspass\n    http:\n      enabled: true\n      keystore_type: JKS\n      keystore_filepath: /etc/elasticsearch/secret/key\n      keystore_password: kspass\n      clientauth_mode: OPTIONAL\n      truststore_type: JKS\n      truststore_filepath: /etc/elasticsearch/secret/truststore\n      truststore_password: tspass\n", 
                    "logging.yml": "# you can override this using by setting a system property, for example -Des.logger.level=DEBUG\nes.logger.level: INFO\nrootLogger: ${es.logger.level}, console, file\nlogger:\n  # log action execution errors for easier debugging\n  action: WARN\n  # reduce the logging for aws, too much is logged under the default INFO\n  com.amazonaws: WARN\n  io.fabric8.elasticsearch: ${PLUGIN_LOGLEVEL}\n  io.fabric8.kubernetes: ${PLUGIN_LOGLEVEL}\n\n  # gateway\n  #gateway: DEBUG\n  #index.gateway: DEBUG\n\n  # peer shard recovery\n  #indices.recovery: DEBUG\n\n  # discovery\n  #discovery: TRACE\n\n  index.search.slowlog: TRACE, index_search_slow_log_file\n  index.indexing.slowlog: TRACE, index_indexing_slow_log_file\n\n  # search-guard\n  com.floragunn.searchguard: WARN\n\nadditivity:\n  index.search.slowlog: false\n  index.indexing.slowlog: false\n\nappender:\n  console:\n    type: console\n    layout:\n      type: consolePattern\n      conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n  file:\n    type: dailyRollingFile\n    file: ${path.logs}/${cluster.name}.log\n    datePattern: \"'.'yyyy-MM-dd\"\n    layout:\n      type: pattern\n      conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n  # Use the following log4j-extras RollingFileAppender to enable gzip compression of log files.\n  # For more information see https://logging.apache.org/log4j/extras/apidocs/org/apache/log4j/rolling/RollingFileAppender.html\n  #file:\n    #type: extrasRollingFile\n    #file: ${path.logs}/${cluster.name}.log\n    #rollingPolicy: timeBased\n    #rollingPolicy.FileNamePattern: ${path.logs}/${cluster.name}.log.%d{yyyy-MM-dd}.gz\n    #layout:\n      #type: pattern\n      #conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n  index_search_slow_log_file:\n    type: dailyRollingFile\n    file: ${path.logs}/${cluster.name}_index_search_slowlog.log\n    datePattern: \"'.'yyyy-MM-dd\"\n    layout:\n      type: pattern\n      conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n\n  index_indexing_slow_log_file:\n    type: dailyRollingFile\n    file: ${path.logs}/${cluster.name}_index_indexing_slowlog.log\n    datePattern: \"'.'yyyy-MM-dd\"\n    layout:\n      type: pattern\n      conversionPattern: \"[%d{ISO8601}][%-5p][%-25c] %m%n\"\n"
                }, 
                "kind": "ConfigMap", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:30Z", 
                    "name": "logging-elasticsearch-ops", 
                    "namespace": "logging", 
                    "resourceVersion": "1338", 
                    "selfLink": "/api/v1/namespaces/logging/configmaps/logging-elasticsearch-ops", 
                    "uid": "4ecb46f4-4d21-11e7-83b0-0e6fb895db82"
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
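
The configmap stores each file as one long JSON string with embedded \n escapes, which is hard to read inline. One way to dump the rendered elasticsearch.yml back out as plain text is a Go template (used here because the key itself contains a dot):

    oc get configmap logging-elasticsearch-ops -n logging \
        -o go-template='{{index .data "elasticsearch.yml"}}'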

TASK [openshift_logging_elasticsearch : Set ES secret] *************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:144
ok: [openshift] => {
    "changed": false, 
    "results": {
        "apiVersion": "v1", 
        "data": {
            "admin-ca": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPVEUwTXpjME4xb1hEVEl5TURZd09ERTBNemMwT0ZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU4vVnhpck9Ub0Z1OUhrSUVtQVMwU0RlbXE0LzF1RSt3dWxoYWVMWFZXVVEKZ05QVSt3ZTlMRGw5bXJNYUN1VUFKall0YUNxdktxbGpLNUtSWVd4dldIK09lRkd0bmZXajN3RllUL0NIRXpIWAozclR3OVBzU1grelRabHlEUXl5ekZ6S29KNm15MGlsU2NwcXdMcEpIMk9tdjE2Sk12VytKck13Q2FmUURESTMvCkZPZWFSdndVcEUxVlFuK0wvRTJGcU02SlRwQ3ZKVTB3TGp4Rkg1b3ZpSDNDdXBFR29sYW1JNHhXaXp5dUR0amUKNnpuTGJPOGJ2a1BFZkk5MHN1TTR0TlpGOVlyd3VHaS9mUkN4U2c3dXc1dUlocU5xQ3RWSUVGSHN5Qng0Vm96eApTbGhSaHJzR1JuRkdDR2lWM3FVemI3aWZPN05jVUFBMzVzQUZWb0pzUUJzQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFISFoKaEtMM0cxdjZPMjhpcHByb2cxT1k3aDFIQWgrUDZzRmdEczlwbER2WnhnaER5SWdPUEt3R0FYZW9JTlkwZ1dSNApDdEJUeDJWWE82b0dxM1Yrc0pNR2ZjaVVBRzlCaDZneGh5UHIwTkY4aDlWcGQzdFJMVW9hNHVRZ0tSdHYrNlRBCnJISzdpVTF5RmhNSTNKSjF3dllaZjdVNWI4dTNrYnphbUl5WnNqMlJWOXo1cVh1U1lhY0gyd3lEWjh3UjJBdDQKRzZDT0VwNGsxMTg0TmpOZlBtTXAybi8xcFllUjJoTWtIUVlCSitzK2sxWTV5QVNJNEdBWDF6Skt6TWJ4eEhmTgpVYk1RcjBXdUdIMkRmRlVubFJwbWJaeGZWdEpZK1ppQXEwVHVyUnhmNnlpUXE3TDNJZ0YzMERWYWR6V2ZpOTdmCjltS2pEb0tOSUVOaG00Y1NVVnc9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K", 
            "admin-cert": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURQRENDQWlTZ0F3SUJBZ0lCQlRBTkJna3Foa2lHOXcwQkFRVUZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPVEUwTXpjMU5Wb1hEVEU1TURZd09URTBNemMxTlZvdwpQVEVRTUE0R0ExVUVDZ3dIVEc5bloybHVaekVTTUJBR0ExVUVDd3dKVDNCbGJsTm9hV1owTVJVd0V3WURWUVFECkRBeHplWE4wWlcwdVlXUnRhVzR3Z2dFaU1BMEdDU3FHU0liM0RRRUJBUVVBQTRJQkR3QXdnZ0VLQW9JQkFRRG0KeXpmYk1sMVdUMG8vS1dZNVlDcE92anYvT3hKbVVsMHBteXlpbVNKMUNCUjVqSXNhaGF0dXFtY05yY0NCeEl4aApyUWl6NnhNV1NHQm0xb0pVS1dCdXordm51SC9NMkhITFdFazNUWHd0N0pqQXhsbERWZVVOaFdTc2RiWFJEQXQ1ClBvcW95OGR4ZGlLUWw5QWZMMENEeFdvWGpQYWNTV045bVNPQ0laYjAxYWliOWw0VUlYUHc5WkcxRkV4L1lyeDEKbnAvZ2o1aEJKY2lZbXR1eThUSTgwVVl1R1B4aU1NZFpLZ0w5Sk9uK2VuY0JOUDhLRTFqRWxNVlltVGdaOHY4YwpHSk0vQUZTbTRWRHpoYndEUExHNkpzd3RlalV0L1JHRTRiTmZTeENTV2N3cExCVkxKSGVSZVdieVdET3hpWmlHCnE5b0xGMndhR2FHY0w0akI1ZjJ0QWdNQkFBR2paakJrTUE0R0ExVWREd0VCL3dRRUF3SUZvREFKQmdOVkhSTUUKQWpBQU1CMEdBMVVkSlFRV01CUUdDQ3NHQVFVRkJ3TUJCZ2dyQmdFRkJRY0RBakFkQmdOVkhRNEVGZ1FVUHZpRgo2dHlXUGtURDVtSTF1NGoyRzhNeTd4b3dDUVlEVlIwakJBSXdBREFOQmdrcWhraUc5dzBCQVFVRkFBT0NBUUVBClVhRzJ4cWYrdjVGbkZJMGMzSEFDOVpnS3pKS0V4NWRCUnh2bGg3dXVjd1EvOXpzRDFWN0pxQ0VjdzdJeUtOTjMKb0hlZkMzU295QmNLOG1zQ0JvNkdEVHlIWHBtRFF0eG5yUkt2U0c4V1kvZUhFYjdlUG1MWW1TYlF6cktxMFpYdwp5YzIwL2N1cHpHRzE3N04rNlNsbGFlZDJCVmRLRUk4NllZRSs0Ym5DMXFBK3ArTVpUNHBlMU1KWE9OZ0lyZnhpCjdiYVd1c1loclBOai9JRmcwV2tISU4zMTY2aHV6TjF3bXFjNzZJQzlUdVNMaTMrdlJkb0VNdUVjMG5KRDc4SE8KQnNIUC9qbnl4OStrWHZFNS9oTTQ2N3pFbkpZMURwTzdVT2tnYmtRRzAvd3BPbnhUb1luc3pwRVJTZW9BLzJsbgpNYzdzUTRWUDVKUXlSVUVCeW9ZY3V3PT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=", 
            "admin-key": "LS0tLS1CRUdJTiBQUklWQVRFIEtFWS0tLS0tCk1JSUV2QUlCQURBTkJna3Foa2lHOXcwQkFRRUZBQVNDQktZd2dnU2lBZ0VBQW9JQkFRRG15emZiTWwxV1Qwby8KS1dZNVlDcE92anYvT3hKbVVsMHBteXlpbVNKMUNCUjVqSXNhaGF0dXFtY05yY0NCeEl4aHJRaXo2eE1XU0dCbQoxb0pVS1dCdXordm51SC9NMkhITFdFazNUWHd0N0pqQXhsbERWZVVOaFdTc2RiWFJEQXQ1UG9xb3k4ZHhkaUtRCmw5QWZMMENEeFdvWGpQYWNTV045bVNPQ0laYjAxYWliOWw0VUlYUHc5WkcxRkV4L1lyeDFucC9najVoQkpjaVkKbXR1eThUSTgwVVl1R1B4aU1NZFpLZ0w5Sk9uK2VuY0JOUDhLRTFqRWxNVlltVGdaOHY4Y0dKTS9BRlNtNFZEegpoYndEUExHNkpzd3RlalV0L1JHRTRiTmZTeENTV2N3cExCVkxKSGVSZVdieVdET3hpWmlHcTlvTEYyd2FHYUdjCkw0akI1ZjJ0QWdNQkFBRUNnZ0VBQzdyNmZNNmpoZGFEM0kvN0hTRnhKUTF3T1RuRVpxVDdEVGxRM0hiU0ZJYXoKbTdYMEFBWVpQNGV6c3pOSW5JL2ozcmNQd1MxS2pCQmpyZlpySzJIcS85YmJrNnlCK21ScDdEN3hnYmtUZmJZUAowWVNEVnZQUHI0OC9IYkRoNjkwajNDYThuM1cwRG1WRnZubW8zMXprZDQ0YUpWWkoxb200WDlMZnhpWjA2TDYyCm0xbSszQVFOOEo1SFI1QjZEbXRWUEdZSXg1ME9VTGZmQ25vQWMxUCszNWsrL25aZUV3VFRqTHZUS1RWdXI3MngKNldkbENlZlJzSUxZYzBhYkFYQVoxczk0dDI5Mjc4eHBPMitHbTdnb08xNkJNVlY1blZzMnV5ZGJLNy9wNGRDOAo1OXQ5R0NoTExSelVhcEkvOHZLSG9KYzdTVGZUUGhOelZQdmR6TU1wUVFLQmdRRDREc0dESGN5dGEwOW1iVEdUCis5UlVZNTNJRkVkUlRLQ1hnU1cvSU9CM1RZZ2lvMHpua2FIUUd2MHdMRkNYdmNveUF3MUhLRURiS0xvWm1PNU8Kb1hhYzh2NUhQZG5icHM2NU5tQmJSbXlhdHpmcG5YRVpUNXhNbDdTdDVuajlRbmh5VUd1N3Y5WXk3cXBFdUd2dAp0T1hvSUlDODNiOTJEK3RKM09vWUlhYXozUUtCZ1FEdUx2VGxsaDNudjREVGN2RkdlSWxJcUdLcUJTcXhwVVVKCmorSlRrcEdzNXRLVnNVWHVJS1hURkJ0MmIxVktQeDlGNkU0Y2RlNHdiQlk4VFNhenEyNkhTYW8wSERCYnN1SjYKMm9YVGRpQVpNeVFHVCtOZ3FuQWlIek56djhBTENGdk03WXhtZTVrRmJsRklpNlRwdjBYMXllOThJa1NpcVRjUwpuMGpxUExsOEVRS0JnQ3FZcmhaTDZiZmw4NDZOTmJSSnpGNXBkTkRFMnJKVlRFWjRBUGxmTmV5R3ZkZmszdkh0CkNqSm9VQTRUcmd0QXBoWU1mQjA2QmxrY0Ywa0ovL0kwdTkrYndwcU5Qbm4vblFRbEdFclVQeFlhWEZtYmxhNVoKQTJnTHJSVW52S01RNVFvTVVWTytUNVFUYzkyMUphTitLdlMva0J2ZW9HQXZ2NkhCYzltTS9jbHBBb0dBUlp2QgpoSXRQcUJmbWZXVmRML3d3R3VUakdNOHp3QUVONFJLRTBXbkhXeUdHTDZ0RVJhaVM3ZEkxaUhKWEdjREMwZnBuClFaZ1JpV005U0ZnLzFFaG9uKzhlWitaSkY3TjZ6dTZvSjdtZy9keGRUZDd4RTZJYjdVenpDNFBoaXhmaFJFMjAKTjRzdG1GRzZQWnBhRXRvTWh4MkIrT1JHN01iSlNvNzBqMGc1NlJFQ2dZQWlFU1RlNVM5b3dFMTRodXZUOFcwegpGcWNLRWpyU0cxWVFhMkw4eDBweG9vT1NzWEZPbnlLOEtKYm1vSnFTUmJZTmw4UkVKQ3JSWTRKdnhLQ3kyOVE3CmQ0UnNZMHhzVkc1ZnFZSnBhMXVZdmZhNmtiaXE1ZElkKzBnK1ZOV0d3VHNYTGQydmljc0VxSEw1RzhTc0cwTHYKbHZoS01tdS9WK0ZpSjhjSVJpVXFqdz09Ci0tLS0tRU5EIFBSSVZBVEUgS0VZLS0tLS0K", 
            "admin.jks": "/u3+7QAAAAIAAAACAAAAAQAMc3lzdGVtLmFkbWluAAABXI1Ki94AAAUBMIIE/TAOBgorBgEEASoCEQEBBQAEggTpFesxiIaFBi0JKeRFKQgcGyGK94IiSNkPr4sCz/sN0GpEyCcCXMNB7vsqH+gk1WkNc/Ar3XmABEEGX6iFyMvYgNuKSORzqLD9mWPanc3GneI18gePyf0CBNsOlS+kk49J80u0T8iL4UsAL5FYMtpyKUCp5NZ/yrN6/c+gUSxvWd4RAsW80xDWmiK+wEkof3jRTzHG6mKVbxsffJ7Q1R3ue+uNolDPicM+ALwKEm/O1IzdQGS1Hs7UJ0l9kXymtm6Ofo01PZgSrpE3AO08Sh4p3HTmttvbbciPkdLPitGBnQOQ86oHXP+PYo78M6Sk3LIqMiLKKJPhBUv7UCZfHukt7bwfjx+gl5Hod3xrjJQtfyq9gk2Hi2MVb6rLTGU0Kiecj0ncL4YYqtCXpT1gkxMrokwPv2JVpFnJGNI5ID0GIEOHJbs2dj64cjSqae760zkkhoH/FY8yrRraQI6hf+lUfsF6XIld7PjJqmUsW64cA4GN5A1ta8+A0GwNXReAhM0zwMTbUX9vFw2CbbIujZjFbjrWvSKRnG5vD7pYH4SRNJ4cbG5jesYhOdKLDQk8TTT4LlacTI92b05BtOymLWE4sOQniA9DKPIy1OC2SwyWfXXFM7iG+mhsoX/MJGqct3VPodz/IX2008f0HlVAe/Zzpel+pTaSAPpqInGS3UkrkcK5bkAYqNJz9baYnNAW7DGVkg2RXHBjJ4Aqiod4/HNEqFS7jYSxGdfbCnUCcdZCeYvuSYLE0qJC/nDbg18OHL6BSWxXt+jHSQ9L1mue2hmAR+Eo3R38d+gcMeXvkikhCVu4w8W/u4Bw+Q2U4pe3cWEG/yTSDpIyiH39dQB1M8/NmxRSXgDWaAEBXq91xi6Bu27eZdVWtEEPAil9CSFl4XXGRdcs4NP93tLJm45LUNtvJ8ssnSZ+1qIo+YuMvoH9rAy8ExS/IcSdvIpPVfgfiUawkz3UXspz+a1hoTm2WYIyiop9StNleX5TUdT2gG5brXZ8BRiVRivb6kpszDvIfAFOl3dFhM39oRpgapBuXrdyleS9+VZA6M7K65xJhWb7i5x6rVJORB/yC2vyNm/IM+GIau9wMi+jzXlzn+ff/ywkBmLr41LwifqD/mT7yJ8poeWE8mV+/GcA6msUiQVVmI1pPexgDfHoqSBCmE8SltOgJ2vfRfe9OpZVEtct8RMMTl4Me4odYh/+fRpdr3Kt7UorY4ma9vHHQTZZMjTJ6F96CCgNWGuCIjaVngtMmpEp54Y+EZpqXxbzDTWAJLE3+Qh1FNcRNRuhLCmPtwvaY8TMAjvZwhMCi5bqv8rkTcxdCM3t1XKUeeey/d/lQ7OFWD1k/joJv6hvuoy1Oov4rytMijSefDl5HYAQCu5LD4jkDLOfx3WuFhmODiVK8WxBjn1zTdHGKoRV5Sx0K6MtNicsGlWLNAmtXEG3oQ99MpxKt+p/B/sKwADy3Jy7x3RdBpjwqHErcwZdHfjkjEL6w9oFfUMAb/giIAl2+MHUPsOYN7MzMm/dAwrHUQUzdi7N7+TULXI91U7dAYekjmHTWRgRa5sUfFl4r5IpHVjjLLHpYJDvXstsrBIws5eAIoajYCufnN0o3q+Ycy/U+Ue72wpwMYmOvj7nAFTihW+cdzoozuzeGsR4Fvm5eXTIBbNVutD2GpLwRb4RSjgdAAAAAgAFWC41MDkAAANAMIIDPDCCAiSgAwIBAgIBBjANBgkqhkiG9w0BAQUFADAeMRwwGgYDVQQDExNsb2dnaW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwOTE0MzgxMVoXDTE5MDYwOTE0MzgxMVowPTEQMA4GA1UEChMHTG9nZ2luZzESMBAGA1UECxMJT3BlblNoaWZ0MRUwEwYDVQQDEwxzeXN0ZW0uYWRtaW4wggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCEG3NMB3UM8+gI8/c/Xl/0V54BD6bNQvW9UqDyfcmYUaNFjI49IfEqRtiN7sKer6KgI+ocS5XTeD9nMWgIFxeNvYkXrovTLIhIowuc54pwcIc+pA0FXIzoo8JTUQII8/xwbfUvmtVRvW2gj8euTWwgs3Bid3l3keV2ja+POTiaPSt46PftSHlGumr5BAJD/83Zh+4IQWqNMQfSm+OQN9Af9pTI+tQWpr7Xr8hGtufuJUD5Tl4JYxeYfepeR90A+fGJi2pJdoLj5YnFj5Wa316k+m4DDosVP+jNdn7P8F7DBHxq4YyP0P/ml3Ro6IR2XEY/z5JWdzHCViwltNt5Ejp5AgMBAAGjZjBkMA4GA1UdDwEB/wQEAwIFoDAJBgNVHRMEAjAAMB0GA1UdJQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjAdBgNVHQ4EFgQU3AEiBMJJlCMIROfSMMT0y60A8I8wCQYDVR0jBAIwADANBgkqhkiG9w0BAQUFAAOCAQEAM8P+KFsHtIzy0BLF9ocL8uqK5ycwa+0F/wnRgkSC7PLZlVoVVZD3q3LCG4ZpFBVX4Hajl9ms8HB7h0wc+n+6+WC8J7nEIeyrjjIigxS/mKXKmHg7z93dNsa7xzywV1u+j4WwJrHO7Hcp3FlHWic39hdRkQiflL8UB07OqzTG+HxkbtuUgL26+VSn3LVrncSXoSnJIaPLcyzw21w5Hik4ZHo5n8/5sJvRQ56W5sor+pKtFcF3dNxWtYlSN45cZ2yjvSTKi2i3h8bE7I1du8BdwobA6eAzEHLnQVQTcgPebVU9h3Xv0Z5/mCjOqIIdE03P+uM3zaP1DWY3lo53dvnKWAAFWC41MDkAAALeMIIC2jCCAcKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAeMRwwGgYDVQQDExNsb2dnaW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwOTE0Mzc0N1oXDTIyMDYwODE0Mzc0OFowHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAN/VxirOToFu9HkIEmAS0SDemq4/1uE+wulhaeLXVWUQgNPU+we9LDl9mrMaCuUAJjYtaCqvKqljK5KRYWxvWH+OeFGtnfWj3wFYT/CHEzHX3rTw9PsSX+zTZlyDQyyzFzKoJ6my0ilScpqwLpJH2Omv16JMvW+JrMwCafQDDI3/FOeaRvwUpE1VQn+L/E2FqM6JTpCvJU0wLjxFH5oviH3CupEGolamI4xWizyuDtje6znLbO8bvkPEfI90suM4tNZF9YrwuGi/fRCxSg7uw5uIhqNqCtVIEFHsyBx4VozxSlhRhrsGRnFGCGiV3qUzb7ifO7NcUAA35sAFVoJsQBsCAwEAAaMjMCEwDgYDVR0PAQH/BAQDAgKkMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIh
vcNAQELBQADggEBAHHZhKL3G1v6O28ipprog1OY7h1HAh+P6sFgDs9plDvZxghDyIgOPKwGAXeoINY0gWR4CtBTx2VXO6oGq3V+sJMGfciUAG9Bh6gxhyPr0NF8h9Vpd3tRLUoa4uQgKRtv+6TArHK7iU1yFhMI3JJ1wvYZf7U5b8u3kbzamIyZsj2RV9z5qXuSYacH2wyDZ8wR2At4G6COEp4k1184NjNfPmMp2n/1pYeR2hMkHQYBJ+s+k1Y5yASI4GAX1zJKzMbxxHfNUbMQr0WuGH2DfFUnlRpmbZxfVtJY+ZiAq0TurRxf6yiQq7L3IgF30DVadzWfi97f9mKjDoKNIENhm4cSUVwAAAACAAZzaWctY2EAAAFcjUqLVQAFWC41MDkAAALeMIIC2jCCAcKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAeMRwwGgYDVQQDExNsb2dnaW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwOTE0Mzc0N1oXDTIyMDYwODE0Mzc0OFowHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAN/VxirOToFu9HkIEmAS0SDemq4/1uE+wulhaeLXVWUQgNPU+we9LDl9mrMaCuUAJjYtaCqvKqljK5KRYWxvWH+OeFGtnfWj3wFYT/CHEzHX3rTw9PsSX+zTZlyDQyyzFzKoJ6my0ilScpqwLpJH2Omv16JMvW+JrMwCafQDDI3/FOeaRvwUpE1VQn+L/E2FqM6JTpCvJU0wLjxFH5oviH3CupEGolamI4xWizyuDtje6znLbO8bvkPEfI90suM4tNZF9YrwuGi/fRCxSg7uw5uIhqNqCtVIEFHsyBx4VozxSlhRhrsGRnFGCGiV3qUzb7ifO7NcUAA35sAFVoJsQBsCAwEAAaMjMCEwDgYDVR0PAQH/BAQDAgKkMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBAHHZhKL3G1v6O28ipprog1OY7h1HAh+P6sFgDs9plDvZxghDyIgOPKwGAXeoINY0gWR4CtBTx2VXO6oGq3V+sJMGfciUAG9Bh6gxhyPr0NF8h9Vpd3tRLUoa4uQgKRtv+6TArHK7iU1yFhMI3JJ1wvYZf7U5b8u3kbzamIyZsj2RV9z5qXuSYacH2wyDZ8wR2At4G6COEp4k1184NjNfPmMp2n/1pYeR2hMkHQYBJ+s+k1Y5yASI4GAX1zJKzMbxxHfNUbMQr0WuGH2DfFUnlRpmbZxfVtJY+ZiAq0TurRxf6yiQq7L3IgF30DVadzWfi97f9mKjDoKNIENhm4cSUVyY+ovJVfsvWf+3yR2x3nvY5ctIqg==", 
            "key": "/u3+7QAAAAIAAAACAAAAAQAKbG9nZ2luZy1lcwAAAVyNSpViAAAFAjCCBP4wDgYKKwYBBAEqAhEBAQUABIIE6tQZ3im9rFgD83V0BY9DfJvk7JMzWBYm9LKLAeJdL+hAsEKR1aKZhQ1Kdij2mpsHT7MepTZnYX9GRbeICXcjvTuk84mTB7a0DNvsb5dyc901QQjixh31Ph053/um0vmat6RLBvbzOf+tM8MaytHprHqX2ycUBoaBb/OqDOQgWTKtoxBeHu2wXRMpEP4H6RtSDrHSU0MIfnmsIQ/fCopjF5cr+uDXpiopxwFvF2i5eKZCXvRl+2V6HSfOmrKKF1VYG1XG5na/u1y/HUkNkuVWCd5KI/jgEapx8XNjo9tt915diRkKIuPqXvlS2k829cZ996/7ynpg9tNypwhgBozpIe4eAYYvLY6kI8fXIEwLEZO5VuPs5AGvU+1XBHD8hoDqr5VjR1ON8tp4Gxr1x1E4mMQ5MxVU8IcOhCoW9ua7VERejMiK6HcwYRkpHWK35lQW71g6E6IQBQbfYj2IPPTxNCqTKTPGC7HTBSzuIqj1janUdf13zGla7SCNnq85TIichsx5Sa/iej19Ck6wX79j1mlRuY+8Rn2L8Ha7QU150EUeM6NCnvYNPlGHa6QqrHUNCCKo6o4NcEWCRtkgc0SOoc67mNYa1X/EamR/vvtHpTZKWjIb8D9G+6uk2LdkWi5ncb0yP+5Kq04i+3h5gL2lK3XAGHEruMbuhs+DCamjZdHA/Fb1P+Ez4MpD9NmcOgk8Lk3s1Bc+wE7moVrNeWtBfLmW0BOSaqLePVx9i03dtngmVVph0+sSR/43Vlxmn54jMkwGmJvFce38H4h5khOmQdQQOgeOLK6bxkkU3dd57t0nQNw+T4YKh827O5tTgbIesFbU+yulrvssDCWfm7c8sJmA8bKFnSgos7igoUWwfMeyQUilfWg1Wp2rpkDLuu6E2pD4hP4wBOYNNVzQ6jFMHQm8/1jCIF6WjkYXCzcCXCNAdfj5IPvUu06jekETTKKaZxD49SAeSBNF0GQDUP1qqVaEBE/97KSRVuxa/xZZKqzi6A/GReo3xMbXzil3scHDx1wdjh2AzcVN8hBEiz3vbBYyGFlsCXrc558MqOv1KUNYnlOPVKzu69GdF9vsRqIzkXatUxg7pHzx7IWP+GcmRg+i3LGwa2J73zsgy0Uf2GlyrBt5dkz3ypizd9GZJLueXVFr/MBwbZcnOvygcU2qQjQ7E8J6IKTuaj1vbeWHlkpD5sRj5TF8vPdB7FlhOJAyZZq0KReHfza6ErLmhe2UxC5rvgJleizhgWpw7Kt3RXSIsyo/K38/rtWD90NKf1ljHCKbrJBVF7x1GkFPOzCdPtPFvqe71AVnVAWwFVMNp4f1oiQFwE6nd+eoRa9641SIs6WLB4Nw3S1PdcK1pv4CL3IePVk1o1dqWSth1uwLy4c/CkXK+MVJtbtNmQXaD87mH3VBH21VxGK1NEX5bNNL2oAQEIz9XzpTje/Ii6D8baaal8ujQ28fcGuDi+yQBQPkEtw/6gNrx64BheMpCXwPPC1WrIN0fno97YEcWS3nUpv5fNST7vxL0dgMIGEsUUZHl3VS31YCK1Jk75Iuk8CHY4jhTdRCCzlGQCvtFceCn4aox4aj9OHee4GeTOVs6FZW8v0vzQTMZoSMk5fVhp5tdvnMcyn4x4c6uBsACTMpcw4e8M3N8lE8ngxeGMsm6YM/dBXEq2BiH6vTVmUAAAACAAVYLjUwOQAABFwwggRYMIIDQKADAgECAgEIMA0GCSqGSIb3DQEBBQUAMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwHhcNMTcwNjA5MTQzODE0WhcNMTkwNjA5MTQzODE0WjA7MRAwDgYDVQQKEwdMb2dnaW5nMRIwEAYDVQQLEwlPcGVuU2hpZnQxEzARBgNVBAMTCmxvZ2dpbmctZXMwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCC2K4B149+X2eHF088F4YqmkdPXOzOmHT3iwZ25fnBJ1UOOfEFDHGfenFwlJTzJ0IzjBTnrumtMYolCR7PY9GvXFgYY8m8+tYfe/cFKW4ngXlMYHlmi/5YwopcxOsmZyKbJyqnFXb2zi25b0TMPe/rBchBJliD/c1t2FGwzlj2RwTse3u4E1EKP2a1snPgCtJdCeOfCs94jrcxj+eDJhkH+D4Wbcp3JaR742aTXkkNQDz5N1rc1jnluS6sxY5jtuVcXcYZi+FKzLKpL2AtBUWbY2h9rXZ57C5XWezitJNLhetMD7Dd7hcOmhfnHWLu6m9D2utxxSeAdaThnS4uJguPAgMBAAGjggGCMIIBfjAOBgNVHQ8BAf8EBAMCBaAwCQYDVR0TBAIwADAdBgNVHSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwHQYDVR0OBBYEFMh5JUIn3Vy+i6i57kzJzmLtgDHJMAkGA1UdIwQCMAAwggEWBgNVHREEggENMIIBCYIJbG9jYWxob3N0hwR/AAABggpsb2dnaW5nLWVzgiRsb2dnaW5nLWVzLmxvZ2dpbmcuc3ZjLmNsdXN0ZXIubG9jYWyCEmxvZ2dpbmctZXMtY2x1c3RlcoIsbG9nZ2luZy1lcy1jbHVzdGVyLmxvZ2dpbmcuc3ZjLmNsdXN0ZXIubG9jYWyCDmxvZ2dpbmctZXMtb3Bzgihsb2dnaW5nLWVzLW9wcy5sb2dnaW5nLnN2Yy5jbHVzdGVyLmxvY2FsghZsb2dnaW5nLWVzLW9wcy1jbHVzdGVygjBsb2dnaW5nLWVzLW9wcy1jbHVzdGVyLmxvZ2dpbmcuc3ZjLmNsdXN0ZXIubG9jYWwwDQYJKoZIhvcNAQEFBQADggEBABQMRcZcJ39O+A3SHcSaFTvmJZhLrJSoObLG1U2bcR/J4huwC8C9mfvGSHlKIwaOzKaaNHODrppzOaQX4ljyL27gSlqdbDcoHkDwo22S/Dm4DEDfh/KVc8Ic2hiccpLVNLeZVSVOcahjzyjO0prxn++KDlCg6gNDJxidvV0AkMLqJXGesH9k1VSnsxXxJA+1t1F10ZrlUttuPr4KJMgMPUkR0DkUqxb7jVDMdoEMh9XtOqsQuSQVztiOUiHiaE1IGoxkR6yo1b+f1viGcwi+e5WDLdePUeE9vJk0bFOjfqTjmekCnxwz8+sTav+10M5ISLVwFuuLkH3inAM8riRk9k4ABVguNTA5AAAC3jCCAtowggHCoAMCAQICAQEwDQYJKoZIhvcNAQELBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDkxNDM3NDdaFw0yMjA2MDgxNDM3NDhaMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDf1cYqzk6BbvR5CBJgEtEg3pquP9bhPsLpYWn
i11VlEIDT1PsHvSw5fZqzGgrlACY2LWgqryqpYyuSkWFsb1h/jnhRrZ31o98BWE/whxMx19608PT7El/s02Zcg0MssxcyqCepstIpUnKasC6SR9jpr9eiTL1viazMAmn0AwyN/xTnmkb8FKRNVUJ/i/xNhajOiU6QryVNMC48RR+aL4h9wrqRBqJWpiOMVos8rg7Y3us5y2zvG75DxHyPdLLjOLTWRfWK8Lhov30QsUoO7sObiIajagrVSBBR7MgceFaM8UpYUYa7BkZxRghold6lM2+4nzuzXFAAN+bABVaCbEAbAgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQBx2YSi9xtb+jtvIqaa6INTmO4dRwIfj+rBYA7PaZQ72cYIQ8iIDjysBgF3qCDWNIFkeArQU8dlVzuqBqt1frCTBn3IlABvQYeoMYcj69DRfIfVaXd7US1KGuLkICkbb/ukwKxyu4lNchYTCNySdcL2GX+1OW/Lt5G82piMmbI9kVfc+al7kmGnB9sMg2fMEdgLeBugjhKeJNdfODYzXz5jKdp/9aWHkdoTJB0GASfrPpNWOcgEiOBgF9cySszG8cR3zVGzEK9Frhh9g3xVJ5UaZm2cX1bSWPmYgKtE7q0cX+sokKuy9yIBd9A1Wnc1n4ve3/Ziow6CjSBDYZuHElFcAAAAAgAGc2lnLWNhAAABXI1KlNoABVguNTA5AAAC3jCCAtowggHCoAMCAQICAQEwDQYJKoZIhvcNAQELBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDkxNDM3NDdaFw0yMjA2MDgxNDM3NDhaMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDf1cYqzk6BbvR5CBJgEtEg3pquP9bhPsLpYWni11VlEIDT1PsHvSw5fZqzGgrlACY2LWgqryqpYyuSkWFsb1h/jnhRrZ31o98BWE/whxMx19608PT7El/s02Zcg0MssxcyqCepstIpUnKasC6SR9jpr9eiTL1viazMAmn0AwyN/xTnmkb8FKRNVUJ/i/xNhajOiU6QryVNMC48RR+aL4h9wrqRBqJWpiOMVos8rg7Y3us5y2zvG75DxHyPdLLjOLTWRfWK8Lhov30QsUoO7sObiIajagrVSBBR7MgceFaM8UpYUYa7BkZxRghold6lM2+4nzuzXFAAN+bABVaCbEAbAgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQBx2YSi9xtb+jtvIqaa6INTmO4dRwIfj+rBYA7PaZQ72cYIQ8iIDjysBgF3qCDWNIFkeArQU8dlVzuqBqt1frCTBn3IlABvQYeoMYcj69DRfIfVaXd7US1KGuLkICkbb/ukwKxyu4lNchYTCNySdcL2GX+1OW/Lt5G82piMmbI9kVfc+al7kmGnB9sMg2fMEdgLeBugjhKeJNdfODYzXz5jKdp/9aWHkdoTJB0GASfrPpNWOcgEiOBgF9cySszG8cR3zVGzEK9Frhh9g3xVJ5UaZm2cX1bSWPmYgKtE7q0cX+sokKuy9yIBd9A1Wnc1n4ve3/Ziow6CjSBDYZuHElFcNsZ/R2iQjodvCrglzkSIsZM8OI4=", 
            "searchguard.key": "/u3+7QAAAAIAAAACAAAAAQANZWxhc3RpY3NlYXJjaAAAAVyNSpDlAAAFAjCCBP4wDgYKKwYBBAEqAhEBAQUABIIE6re1k2+uNYghyKOiNOh5BjRv8VSXpPoxEMTNNkCtaMeiX3LvqvmolY0joxjioUKdsWw6FT00hLQf2my6B+o15krwyPawkYNtiSqTzT+N60isYOxzK7gcGRP8SFxtk87UkaGuF0Z08B0HOSMv6nsm/cC8jzZ97e2y2qejYlMEFkbj7OdKdclJUSmOudsJlwI30sjz1sdc8ZT7EVSRqRbGmwXY6xoGda9S4+/E7dP4ZDxaMq2xZ2fDLUEYvfV1JUVRfcs7Q1PImozfd6hSheODMNyE9OPZT+2SLzoOgaZcoywjKlEdrSfLiLghxkLc5G/FCIaRp6PGN/HDkfmB34+ChV70gyrOoapRyr0WIPsZSG7YjiMHzJBPEuri3iHHHakgTTcvnzebkJqJhWlpZkAPQ222XDZdVLGyXTUmVAvO9KaXNP1cbFhdpjxRqJc5FzEzwMOAKAOri8lq9kKDB32l5rLpWIirzzIKN/2jb86SGOHY1QAl2GMjP0MWAVWHZq/wqE8hyc2+h53CCWZYgtzbVZ3xOeP96z6MbcFikt5kaGtki/LKIreTi6iMd8zrWHaXc4wToQuKlt6ThfkVZpTJWTjDyIhH+3BZ+AeVnPKFMG3USdxucOlKmxYRaFdj6iheV/enjofsfe7k6OP3Q3FGzYbQH9o8xKfYOvh3N8GH/CYb/KjWH7SnHovzksd26wbuq0fFV55Dq7idtLEbr2eta5S0eiF4b93vSDoLKcp38OObcduj3SGj8qGSXUr6L1SguydsJfAfPaQvAr5W6Ds73FgH5L9SXCCiROBQJJXCpzFxnY39bh8j8ImTZOXvlIec+Vs4aEVPgSQpQuTUs6I/ACXjbjNhHJvUTifP4de8ypqTlKpUhqRWSBhRPAf9glSFImtEp0CFiBKX1XW+x5u8sOKynkSk+H41sH2L0H6ziL7vK7MZKpVGI9V+l74P79HtHeh9KTNJ6dlv/wgDYdc5wrx4fgp1Oc9vvFv7rR/OgnKrex0+y3SAhRvVVrH3Pzo4EBkrRdwFZnhAET3dQZxg5NMOAm7BhsXYx/dLhu3Elld9UBBOWjmR5o7CI3u1tMeZQkrPWWn3m/YEIcLgUO7bnMIk6MvqAB+PAyYOF0Z8+4gjEtDNLnVXnOMcc+UBhOi0y771vp5r9DOjLGlUs8Rtot0EwJcELXYebjlP/sUtyPA6fqHTJ24ZKk/avhryIFMDI6JPPkY9n5WHWfsJIZ93j0FNMNsr6oPhuqcp8EvskEMAzUgLnnVr0eAlNStYlDed4ZejXzVc4CT4+T3kXwkzMngC3ruQU+itj73GI1QzpIZ+xbf9T8lweIvMSTwttUrsnae5YGtdMqua0vyip3MllZDA62l144QnAbsoR2x+poK0igOtp3hcsC0cU0L5pVbBTEZN/MkV6L4ozpOheHhXZoIrZT1KVot55SuM8BWtnPHSm+pH2auIGeUOfgqUuKpRSDYUDitkpp9k2cI80S6ZH798JauUyem75z24ndi/nBMvpAhPgVArzk3XI+SNisdfTrG69ILYeD60ibPo43mF4i3BnvPzlDvr7hSxLowQods/Kc6UgerytcPCJKfQF9qXRmNziDvokmQlG2PBCv628JkgRpxqYd+TAfNUc6tMizXKy3e+ArOkCIBsZUOrmv7kRDDulw1CC1mz3t4AAAACAAVYLjUwOQAAA4IwggN+MIICZqADAgECAgEHMA0GCSqGSIb3DQEBBQUAMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwHhcNMTcwNjA5MTQzODEyWhcNMTkwNjA5MTQzODEyWjA+MRAwDgYDVQQKEwdMb2dnaW5nMRIwEAYDVQQLEwlPcGVuU2hpZnQxFjAUBgNVBAMTDWVsYXN0aWNzZWFyY2gwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCdeDtRpJBPlCG9Ll20PqeKYAKSgjijjrl0aUkeea/MrcT5A7Yht4Q/Y4iFwRVq2ZBuM+6o5iaZYrf/ZfGz+PgOFEetmO7PCf9HsbKjCwmWlA1mGLZMAp2szc/e8eGnPtKXBTXN0NJP5/i70xkF0eVA4WcKLbnenJrinx0W5seUUkvI+UuRFjnb3h8MgI+sr1sgXBkWS/J2ZOav9PMUTMbNC7LM08ntIG3/hn3VLdtpmnzE60RVLRfy/bftiFKE1GfAJasRWMhtBc79mJ88cPx+PWzjPi4YCOVOraEtVYHTfG7o2QLLsy4SzI1R1fHUqgufqkkJMgRQUd7afXKvQzcrAgMBAAGjgaYwgaMwDgYDVR0PAQH/BAQDAgWgMAkGA1UdEwQCMAAwHQYDVR0lBBYwFAYIKwYBBQUHAwEGCCsGAQUFBwMCMB0GA1UdDgQWBBRKUWX8NLp/SMBPi/qzofFbImqJOzAJBgNVHSMEAjAAMD0GA1UdEQQ2MDSCCWxvY2FsaG9zdIcEfwAAAYIKbG9nZ2luZy1lc4IObG9nZ2luZy1lcy1vcHOIBSoDBAUFMA0GCSqGSIb3DQEBBQUAA4IBAQBkf+uzDU6AxaXasvX6DvsXySvaBZE+DKLE8JorBJFXLStquDDH+LWQ1gWDjK7GweMo36YMeyvGgqOdnIEvMGfZBSV82MIfHYKaal0YFunwXKMs8oPbv7SzXullMaLel82n6YzUNBNMqiSnzn0dnx4g7uujMcNbUMzbRfGCXYZPZTe9lsKaWtU674hs1z3xvHjxnTUaExydVABHthWtopARYzofXrmaiJQcVKvzomAr/owNqrrejiaPlsCmJ0Xs0kSit6rSiQLVob+zEtYvSx+3oBY3i9c2vbgEg9/7Q+3e4MHAoui+ZSZl9Bk4x3NUg0UlKtoZI0VG1k1C3eAjgbo8AAVYLjUwOQAAAt4wggLaMIIBwqADAgECAgEBMA0GCSqGSIb3DQEBCwUAMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwHhcNMTcwNjA5MTQzNzQ3WhcNMjIwNjA4MTQzNzQ4WjAeMRwwGgYDVQQDExNsb2dnaW5nLXNpZ25lci10ZXN0MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA39XGKs5OgW70eQgSYBLRIN6arj/W4T7C6WFp4tdVZRCA09T7B70sOX2asxoK5QAmNi1oKq8qqWMrkpFhbG9Yf454Ua2d9aPfAVhP8IcTMdfetPD0+xJf7NNmXINDLLMXMqgnqbLSKVJymrAukkfY6a/Xoky9b4mszAJp9AMMjf8U55pG/BSkTVVCf4v8TYWozolOkK8lTTAuPEUfmi+IfcK6kQaiVqYjjFaLPK4O2N7rOcts7xu+Q8R8j3Sy4zi01kX1ivC4aL99ELFKDu7Dm4iGo2oK1UgQUezIHHhWjPFKWFGGuwZGcUY
IaJXepTNvuJ87s1xQADfmwAVWgmxAGwIDAQABoyMwITAOBgNVHQ8BAf8EBAMCAqQwDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG9w0BAQsFAAOCAQEAcdmEovcbW/o7byKmmuiDU5juHUcCH4/qwWAOz2mUO9nGCEPIiA48rAYBd6gg1jSBZHgK0FPHZVc7qgardX6wkwZ9yJQAb0GHqDGHI+vQ0XyH1Wl3e1EtShri5CApG2/7pMCscruJTXIWEwjcknXC9hl/tTlvy7eRvNqYjJmyPZFX3Pmpe5JhpwfbDINnzBHYC3gboI4SniTXXzg2M18+Yynaf/Wlh5HaEyQdBgEn6z6TVjnIBIjgYBfXMkrMxvHEd81RsxCvRa4YfYN8VSeVGmZtnF9W0lj5mICrRO6tHF/rKJCrsvciAXfQNVp3NZ+L3t/2YqMOgo0gQ2GbhxJRXAAAAAIABnNpZy1jYQAAAVyNSpBbAAVYLjUwOQAAAt4wggLaMIIBwqADAgECAgEBMA0GCSqGSIb3DQEBCwUAMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwHhcNMTcwNjA5MTQzNzQ3WhcNMjIwNjA4MTQzNzQ4WjAeMRwwGgYDVQQDExNsb2dnaW5nLXNpZ25lci10ZXN0MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA39XGKs5OgW70eQgSYBLRIN6arj/W4T7C6WFp4tdVZRCA09T7B70sOX2asxoK5QAmNi1oKq8qqWMrkpFhbG9Yf454Ua2d9aPfAVhP8IcTMdfetPD0+xJf7NNmXINDLLMXMqgnqbLSKVJymrAukkfY6a/Xoky9b4mszAJp9AMMjf8U55pG/BSkTVVCf4v8TYWozolOkK8lTTAuPEUfmi+IfcK6kQaiVqYjjFaLPK4O2N7rOcts7xu+Q8R8j3Sy4zi01kX1ivC4aL99ELFKDu7Dm4iGo2oK1UgQUezIHHhWjPFKWFGGuwZGcUYIaJXepTNvuJ87s1xQADfmwAVWgmxAGwIDAQABoyMwITAOBgNVHQ8BAf8EBAMCAqQwDwYDVR0TAQH/BAUwAwEB/zANBgkqhkiG9w0BAQsFAAOCAQEAcdmEovcbW/o7byKmmuiDU5juHUcCH4/qwWAOz2mUO9nGCEPIiA48rAYBd6gg1jSBZHgK0FPHZVc7qgardX6wkwZ9yJQAb0GHqDGHI+vQ0XyH1Wl3e1EtShri5CApG2/7pMCscruJTXIWEwjcknXC9hl/tTlvy7eRvNqYjJmyPZFX3Pmpe5JhpwfbDINnzBHYC3gboI4SniTXXzg2M18+Yynaf/Wlh5HaEyQdBgEn6z6TVjnIBIjgYBfXMkrMxvHEd81RsxCvRa4YfYN8VSeVGmZtnF9W0lj5mICrRO6tHF/rKJCrsvciAXfQNVp3NZ+L3t/2YqMOgo0gQ2GbhxJRXMadXu5BN4PT8uVkqOJ+Xmnz0h01", 
            "searchguard.truststore": "/u3+7QAAAAIAAAABAAAAAgAGc2lnLWNhAAABXI1KleUABVguNTA5AAAC3jCCAtowggHCoAMCAQICAQEwDQYJKoZIhvcNAQELBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDkxNDM3NDdaFw0yMjA2MDgxNDM3NDhaMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDf1cYqzk6BbvR5CBJgEtEg3pquP9bhPsLpYWni11VlEIDT1PsHvSw5fZqzGgrlACY2LWgqryqpYyuSkWFsb1h/jnhRrZ31o98BWE/whxMx19608PT7El/s02Zcg0MssxcyqCepstIpUnKasC6SR9jpr9eiTL1viazMAmn0AwyN/xTnmkb8FKRNVUJ/i/xNhajOiU6QryVNMC48RR+aL4h9wrqRBqJWpiOMVos8rg7Y3us5y2zvG75DxHyPdLLjOLTWRfWK8Lhov30QsUoO7sObiIajagrVSBBR7MgceFaM8UpYUYa7BkZxRghold6lM2+4nzuzXFAAN+bABVaCbEAbAgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQBx2YSi9xtb+jtvIqaa6INTmO4dRwIfj+rBYA7PaZQ72cYIQ8iIDjysBgF3qCDWNIFkeArQU8dlVzuqBqt1frCTBn3IlABvQYeoMYcj69DRfIfVaXd7US1KGuLkICkbb/ukwKxyu4lNchYTCNySdcL2GX+1OW/Lt5G82piMmbI9kVfc+al7kmGnB9sMg2fMEdgLeBugjhKeJNdfODYzXz5jKdp/9aWHkdoTJB0GASfrPpNWOcgEiOBgF9cySszG8cR3zVGzEK9Frhh9g3xVJ5UaZm2cX1bSWPmYgKtE7q0cX+sokKuy9yIBd9A1Wnc1n4ve3/Ziow6CjSBDYZuHElFcZNJPe5RiuhqXYmjQoxrmkQdb130=", 
            "truststore": "/u3+7QAAAAIAAAABAAAAAgAGc2lnLWNhAAABXI1KleUABVguNTA5AAAC3jCCAtowggHCoAMCAQICAQEwDQYJKoZIhvcNAQELBQAwHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDAeFw0xNzA2MDkxNDM3NDdaFw0yMjA2MDgxNDM3NDhaMB4xHDAaBgNVBAMTE2xvZ2dpbmctc2lnbmVyLXRlc3QwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQDf1cYqzk6BbvR5CBJgEtEg3pquP9bhPsLpYWni11VlEIDT1PsHvSw5fZqzGgrlACY2LWgqryqpYyuSkWFsb1h/jnhRrZ31o98BWE/whxMx19608PT7El/s02Zcg0MssxcyqCepstIpUnKasC6SR9jpr9eiTL1viazMAmn0AwyN/xTnmkb8FKRNVUJ/i/xNhajOiU6QryVNMC48RR+aL4h9wrqRBqJWpiOMVos8rg7Y3us5y2zvG75DxHyPdLLjOLTWRfWK8Lhov30QsUoO7sObiIajagrVSBBR7MgceFaM8UpYUYa7BkZxRghold6lM2+4nzuzXFAAN+bABVaCbEAbAgMBAAGjIzAhMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQBx2YSi9xtb+jtvIqaa6INTmO4dRwIfj+rBYA7PaZQ72cYIQ8iIDjysBgF3qCDWNIFkeArQU8dlVzuqBqt1frCTBn3IlABvQYeoMYcj69DRfIfVaXd7US1KGuLkICkbb/ukwKxyu4lNchYTCNySdcL2GX+1OW/Lt5G82piMmbI9kVfc+al7kmGnB9sMg2fMEdgLeBugjhKeJNdfODYzXz5jKdp/9aWHkdoTJB0GASfrPpNWOcgEiOBgF9cySszG8cR3zVGzEK9Frhh9g3xVJ5UaZm2cX1bSWPmYgKtE7q0cX+sokKuy9yIBd9A1Wnc1n4ve3/Ziow6CjSBDYZuHElFcZNJPe5RiuhqXYmjQoxrmkQdb130="
        }, 
        "kind": "Secret", 
        "metadata": {
            "creationTimestamp": null, 
            "name": "logging-elasticsearch"
        }, 
        "type": "Opaque"
    }, 
    "state": "present"
}
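
Every value in this secret is base64-encoded: admin-ca, admin-cert and admin-key decode to PEM, while admin.jks, key, searchguard.key and the truststores are Java keystores (listable with keytool once decoded). For example, to check the admin certificate's subject and validity window, something like:

    oc get secret logging-elasticsearch -n logging \
        -o go-template='{{index .data "admin-cert"}}' \
        | base64 -d | openssl x509 -noout -subject -dates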

TASK [openshift_logging_elasticsearch : Set logging-es-ops-cluster service] ****
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:168
changed: [openshift] => {
    "changed": true, 
    "results": {
        "clusterip": "172.30.18.33", 
        "cmd": "/bin/oc get service logging-es-ops-cluster -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "Service", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:32Z", 
                    "name": "logging-es-ops-cluster", 
                    "namespace": "logging", 
                    "resourceVersion": "1343", 
                    "selfLink": "/api/v1/namespaces/logging/services/logging-es-ops-cluster", 
                    "uid": "4fc11ddd-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "clusterIP": "172.30.18.33", 
                    "ports": [
                        {
                            "port": 9300, 
                            "protocol": "TCP", 
                            "targetPort": 9300
                        }
                    ], 
                    "selector": {
                        "component": "es-ops", 
                        "provider": "openshift"
                    }, 
                    "sessionAffinity": "None", 
                    "type": "ClusterIP"
                }, 
                "status": {
                    "loadBalancer": {}
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}

TASK [openshift_logging_elasticsearch : Set logging-es-ops service] ************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:182
changed: [openshift] => {
    "changed": true, 
    "results": {
        "clusterip": "172.30.110.130", 
        "cmd": "/bin/oc get service logging-es-ops -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "Service", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:33Z", 
                    "name": "logging-es-ops", 
                    "namespace": "logging", 
                    "resourceVersion": "1352", 
                    "selfLink": "/api/v1/namespaces/logging/services/logging-es-ops", 
                    "uid": "507b76be-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "clusterIP": "172.30.110.130", 
                    "ports": [
                        {
                            "port": 9200, 
                            "protocol": "TCP", 
                            "targetPort": "restapi"
                        }
                    ], 
                    "selector": {
                        "component": "es-ops", 
                        "provider": "openshift"
                    }, 
                    "sessionAffinity": "None", 
                    "type": "ClusterIP"
                }, 
                "status": {
                    "loadBalancer": {}
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
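
Two services now front the ops cluster: logging-es-ops-cluster exposes the node-to-node transport port 9300, while logging-es-ops exposes 9200 (targetPort restapi) for clients. Both select pods with component=es-ops, and neither will list endpoints until a pod passes the readiness probe, which can be verified with:

    oc get endpoints logging-es-ops logging-es-ops-cluster -n logging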

TASK [openshift_logging_elasticsearch : Creating ES storage template] **********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:197
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Creating ES storage template] **********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:210
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Set ES storage] ************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:225
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:237
ok: [openshift] => {
    "ansible_facts": {
        "es_deploy_name": "logging-es-ops-data-master-1lbwcltv"
    }, 
    "changed": false
}

TASK [openshift_logging_elasticsearch : set_fact] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:241
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_elasticsearch : Set ES dc templates] *******************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:246
changed: [openshift] => {
    "changed": true, 
    "checksum": "1c700294b95dfa1776dcefafd78f6eca56d7d596", 
    "dest": "/tmp/openshift-logging-ansible-m9oYNc/templates/logging-es-dc.yml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "88c05b131e6d4e8e272ed3b5f0d8215e", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 3177, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019114.29-97465928700537/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_elasticsearch : Set ES dc] *****************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:262
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get dc logging-es-ops-data-master-1lbwcltv -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "DeploymentConfig", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:35Z", 
                    "generation": 2, 
                    "labels": {
                        "component": "es-ops", 
                        "deployment": "logging-es-ops-data-master-1lbwcltv", 
                        "logging-infra": "elasticsearch", 
                        "provider": "openshift"
                    }, 
                    "name": "logging-es-ops-data-master-1lbwcltv", 
                    "namespace": "logging", 
                    "resourceVersion": "1366", 
                    "selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-es-ops-data-master-1lbwcltv", 
                    "uid": "5151d7ba-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "replicas": 1, 
                    "selector": {
                        "component": "es-ops", 
                        "deployment": "logging-es-ops-data-master-1lbwcltv", 
                        "logging-infra": "elasticsearch", 
                        "provider": "openshift"
                    }, 
                    "strategy": {
                        "activeDeadlineSeconds": 21600, 
                        "recreateParams": {
                            "timeoutSeconds": 600
                        }, 
                        "resources": {}, 
                        "type": "Recreate"
                    }, 
                    "template": {
                        "metadata": {
                            "creationTimestamp": null, 
                            "labels": {
                                "component": "es-ops", 
                                "deployment": "logging-es-ops-data-master-1lbwcltv", 
                                "logging-infra": "elasticsearch", 
                                "provider": "openshift"
                            }, 
                            "name": "logging-es-ops-data-master-1lbwcltv"
                        }, 
                        "spec": {
                            "containers": [
                                {
                                    "env": [
                                        {
                                            "name": "NAMESPACE", 
                                            "valueFrom": {
                                                "fieldRef": {
                                                    "apiVersion": "v1", 
                                                    "fieldPath": "metadata.namespace"
                                                }
                                            }
                                        }, 
                                        {
                                            "name": "KUBERNETES_TRUST_CERT", 
                                            "value": "true"
                                        }, 
                                        {
                                            "name": "SERVICE_DNS", 
                                            "value": "logging-es-ops-cluster"
                                        }, 
                                        {
                                            "name": "CLUSTER_NAME", 
                                            "value": "logging-es-ops"
                                        }, 
                                        {
                                            "name": "INSTANCE_RAM", 
                                            "value": "8Gi"
                                        }, 
                                        {
                                            "name": "NODE_QUORUM", 
                                            "value": "1"
                                        }, 
                                        {
                                            "name": "RECOVER_EXPECTED_NODES", 
                                            "value": "1"
                                        }, 
                                        {
                                            "name": "RECOVER_AFTER_TIME", 
                                            "value": "5m"
                                        }, 
                                        {
                                            "name": "READINESS_PROBE_TIMEOUT", 
                                            "value": "30"
                                        }, 
                                        {
                                            "name": "IS_MASTER", 
                                            "value": "true"
                                        }, 
                                        {
                                            "name": "HAS_DATA", 
                                            "value": "true"
                                        }
                                    ], 
                                    "image": "172.30.224.2:5000/logging/logging-elasticsearch:latest", 
                                    "imagePullPolicy": "Always", 
                                    "name": "elasticsearch", 
                                    "ports": [
                                        {
                                            "containerPort": 9200, 
                                            "name": "restapi", 
                                            "protocol": "TCP"
                                        }, 
                                        {
                                            "containerPort": 9300, 
                                            "name": "cluster", 
                                            "protocol": "TCP"
                                        }
                                    ], 
                                    "readinessProbe": {
                                        "exec": {
                                            "command": [
                                                "/usr/share/elasticsearch/probe/readiness.sh"
                                            ]
                                        }, 
                                        "failureThreshold": 3, 
                                        "initialDelaySeconds": 10, 
                                        "periodSeconds": 5, 
                                        "successThreshold": 1, 
                                        "timeoutSeconds": 30
                                    }, 
                                    "resources": {
                                        "limits": {
                                            "cpu": "1", 
                                            "memory": "8Gi"
                                        }, 
                                        "requests": {
                                            "memory": "512Mi"
                                        }
                                    }, 
                                    "terminationMessagePath": "/dev/termination-log", 
                                    "terminationMessagePolicy": "File", 
                                    "volumeMounts": [
                                        {
                                            "mountPath": "/etc/elasticsearch/secret", 
                                            "name": "elasticsearch", 
                                            "readOnly": true
                                        }, 
                                        {
                                            "mountPath": "/usr/share/java/elasticsearch/config", 
                                            "name": "elasticsearch-config", 
                                            "readOnly": true
                                        }, 
                                        {
                                            "mountPath": "/elasticsearch/persistent", 
                                            "name": "elasticsearch-storage"
                                        }
                                    ]
                                }
                            ], 
                            "dnsPolicy": "ClusterFirst", 
                            "restartPolicy": "Always", 
                            "schedulerName": "default-scheduler", 
                            "securityContext": {
                                "supplementalGroups": [
                                    65534
                                ]
                            }, 
                            "serviceAccount": "aggregated-logging-elasticsearch", 
                            "serviceAccountName": "aggregated-logging-elasticsearch", 
                            "terminationGracePeriodSeconds": 30, 
                            "volumes": [
                                {
                                    "name": "elasticsearch", 
                                    "secret": {
                                        "defaultMode": 420, 
                                        "secretName": "logging-elasticsearch"
                                    }
                                }, 
                                {
                                    "configMap": {
                                        "defaultMode": 420, 
                                        "name": "logging-elasticsearch"
                                    }, 
                                    "name": "elasticsearch-config"
                                }, 
                                {
                                    "emptyDir": {}, 
                                    "name": "elasticsearch-storage"
                                }
                            ]
                        }
                    }, 
                    "test": false, 
                    "triggers": [
                        {
                            "type": "ConfigChange"
                        }
                    ]
                }, 
                "status": {
                    "availableReplicas": 0, 
                    "conditions": [
                        {
                            "lastTransitionTime": "2017-06-09T14:38:35Z", 
                            "lastUpdateTime": "2017-06-09T14:38:35Z", 
                            "message": "Deployment config does not have minimum availability.", 
                            "status": "False", 
                            "type": "Available"
                        }, 
                        {
                            "lastTransitionTime": "2017-06-09T14:38:35Z", 
                            "lastUpdateTime": "2017-06-09T14:38:35Z", 
                            "message": "replication controller \"logging-es-ops-data-master-1lbwcltv-1\" is waiting for pod \"logging-es-ops-data-master-1lbwcltv-1-deploy\" to run", 
                            "status": "Unknown", 
                            "type": "Progressing"
                        }
                    ], 
                    "details": {
                        "causes": [
                            {
                                "type": "ConfigChange"
                            }
                        ], 
                        "message": "config change"
                    }, 
                    "latestVersion": 1, 
                    "observedGeneration": 2, 
                    "replicas": 0, 
                    "unavailableReplicas": 0, 
                    "updatedReplicas": 0
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
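
The DC above configures Elasticsearch entirely through environment variables (CLUSTER_NAME, NODE_QUORUM, RECOVER_EXPECTED_NODES, and so on) and, notably, backs the elasticsearch-storage mount with an emptyDir volume, so index data lives only as long as the pod. For contrast, a minimal sketch of a persistent variant; the claim name is hypothetical, and this run really did deploy with emptyDir:

    # volumes section of the pod template, swapped for persistent storage
    volumes:
    - name: elasticsearch-storage
      persistentVolumeClaim:
        claimName: logging-es-ops-1   # hypothetical PVC name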

TASK [openshift_logging_elasticsearch : Delete temp directory] *****************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_elasticsearch/tasks/main.yaml:274
ok: [openshift] => {
    "changed": false, 
    "path": "/tmp/openshift-logging-ansible-m9oYNc", 
    "state": "absent"
}

TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:151
statically included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml

TASK [openshift_logging_kibana : fail] *****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:3
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:7
ok: [openshift] => {
    "ansible_facts": {
        "kibana_version": "3_5"
    }, 
    "changed": false
}

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:12
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : fail] *****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:15
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : Create temp directory for doing work in] ******
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:7
ok: [openshift] => {
    "changed": false, 
    "cmd": [
        "mktemp", 
        "-d", 
        "/tmp/openshift-logging-ansible-XXXXXX"
    ], 
    "delta": "0:00:00.002321", 
    "end": "2017-06-09 10:38:36.330302", 
    "rc": 0, 
    "start": "2017-06-09 10:38:36.327981"
}

STDOUT:

/tmp/openshift-logging-ansible-1KGJ33
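
From the cmd/rc fields above, the task behind this output is plausibly a plain command invocation with change reporting suppressed (the callback shows "changed": false despite running mktemp), feeding the set_fact that follows:

    - name: Create temp directory for doing work in
      command: mktemp -d /tmp/openshift-logging-ansible-XXXXXX
      register: mktemp
      changed_when: False

    - set_fact:
        tempdir: "{{ mktemp.stdout }}"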

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:12
ok: [openshift] => {
    "ansible_facts": {
        "tempdir": "/tmp/openshift-logging-ansible-1KGJ33"
    }, 
    "changed": false
}

TASK [openshift_logging_kibana : Create templates subdirectory] ****************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:16
ok: [openshift] => {
    "changed": false, 
    "gid": 0, 
    "group": "root", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/tmp/openshift-logging-ansible-1KGJ33/templates", 
    "secontext": "unconfined_u:object_r:user_tmp_t:s0", 
    "size": 6, 
    "state": "directory", 
    "uid": 0
}

TASK [openshift_logging_kibana : Create Kibana service account] ****************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:26
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : Create Kibana service account] ****************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:34
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get sa aggregated-logging-kibana -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "imagePullSecrets": [
                    {
                        "name": "aggregated-logging-kibana-dockercfg-tqm6f"
                    }
                ], 
                "kind": "ServiceAccount", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:37Z", 
                    "name": "aggregated-logging-kibana", 
                    "namespace": "logging", 
                    "resourceVersion": "1375", 
                    "selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-kibana", 
                    "uid": "5299210b-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "secrets": [
                    {
                        "name": "aggregated-logging-kibana-token-hgw85"
                    }, 
                    {
                        "name": "aggregated-logging-kibana-dockercfg-tqm6f"
                    }
                ]
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:42
ok: [openshift] => {
    "ansible_facts": {
        "kibana_component": "kibana", 
        "kibana_name": "logging-kibana"
    }, 
    "changed": false
}

TASK [openshift_logging_kibana : Checking for session_secret] ******************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:47
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging_kibana : Checking for oauth_secret] ********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:51
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "exists": false
    }
}

TASK [openshift_logging_kibana : Generate session secret] **********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:56
changed: [openshift] => {
    "changed": true, 
    "checksum": "adf4aeae989a1bbdf0b6fec127ee16e4fda68b6a", 
    "dest": "/etc/origin/logging/session_secret", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "956b384f3ada383be823578bd846d264", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 200, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019118.15-11382119112717/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_kibana : Generate oauth secret] ************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:64
changed: [openshift] => {
    "changed": true, 
    "checksum": "ba82e9e2e78f595b88c9297945184551ad704096", 
    "dest": "/etc/origin/logging/oauth_secret", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "5eae645a757d1c131dde4fd427bc7594", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "system_u:object_r:etc_t:s0", 
    "size": 64, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019118.46-36189428242533/source", 
    "state": "file", 
    "uid": 0
}
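
Both files land under /etc/origin/logging (200 and 64 bytes respectively) only because the stat checks above found nothing there. The actual generator is not visible in this log; one idiomatic way to write such a task, assuming Ansible's password lookup supplies the random content:

    - name: Generate session secret
      copy:
        dest: /etc/origin/logging/session_secret
        content: "{{ lookup('password', '/dev/null chars=ascii_letters,digits length=200') }}"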

TASK [openshift_logging_kibana : Retrieving the cert to use when generating secrets for the logging components] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:71
ok: [openshift] => (item={u'name': u'ca_file', u'file': u'ca.crt'}) => {
    "changed": false, 
    "content": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPVEUwTXpjME4xb1hEVEl5TURZd09ERTBNemMwT0ZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU4vVnhpck9Ub0Z1OUhrSUVtQVMwU0RlbXE0LzF1RSt3dWxoYWVMWFZXVVEKZ05QVSt3ZTlMRGw5bXJNYUN1VUFKall0YUNxdktxbGpLNUtSWVd4dldIK09lRkd0bmZXajN3RllUL0NIRXpIWAozclR3OVBzU1grelRabHlEUXl5ekZ6S29KNm15MGlsU2NwcXdMcEpIMk9tdjE2Sk12VytKck13Q2FmUURESTMvCkZPZWFSdndVcEUxVlFuK0wvRTJGcU02SlRwQ3ZKVTB3TGp4Rkg1b3ZpSDNDdXBFR29sYW1JNHhXaXp5dUR0amUKNnpuTGJPOGJ2a1BFZkk5MHN1TTR0TlpGOVlyd3VHaS9mUkN4U2c3dXc1dUlocU5xQ3RWSUVGSHN5Qng0Vm96eApTbGhSaHJzR1JuRkdDR2lWM3FVemI3aWZPN05jVUFBMzVzQUZWb0pzUUJzQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFISFoKaEtMM0cxdjZPMjhpcHByb2cxT1k3aDFIQWgrUDZzRmdEczlwbER2WnhnaER5SWdPUEt3R0FYZW9JTlkwZ1dSNApDdEJUeDJWWE82b0dxM1Yrc0pNR2ZjaVVBRzlCaDZneGh5UHIwTkY4aDlWcGQzdFJMVW9hNHVRZ0tSdHYrNlRBCnJISzdpVTF5RmhNSTNKSjF3dllaZjdVNWI4dTNrYnphbUl5WnNqMlJWOXo1cVh1U1lhY0gyd3lEWjh3UjJBdDQKRzZDT0VwNGsxMTg0TmpOZlBtTXAybi8xcFllUjJoTWtIUVlCSitzK2sxWTV5QVNJNEdBWDF6Skt6TWJ4eEhmTgpVYk1RcjBXdUdIMkRmRlVubFJwbWJaeGZWdEpZK1ppQXEwVHVyUnhmNnlpUXE3TDNJZ0YzMERWYWR6V2ZpOTdmCjltS2pEb0tOSUVOaG00Y1NVVnc9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K", 
    "encoding": "base64", 
    "item": {
        "file": "ca.crt", 
        "name": "ca_file"
    }, 
    "source": "/etc/origin/logging/ca.crt"
}
ok: [openshift] => (item={u'name': u'kibana_internal_key', u'file': u'kibana-internal.key'}) => {
    "changed": false, 
    "content": "LS0tLS1CRUdJTiBSU0EgUFJJVkFURSBLRVktLS0tLQpNSUlFb3dJQkFBS0NBUUVBeUNTL2hKbnEwa2pEd2xoZG9Gd3ZXRHZ3LytCd3o2Yml6UHllNTVDRmh2cVZtNE85CjNUY3hqb1p5cHNiZm9lTHZpK0FYVllWQmdBZDhPZG9IQzlhT1V1ckZBZVFNTFBBMHluTUxBTzRUd3pPU0p0TjEKa2x6aTlGa3U5YnNLbUNSbVFMYkZDaEVTbFBlS2s3RGZ2ak5zZXF6RGtlRTZYUHNVMTFpTFY4V2M3bDlhNmd6KwpWa1Jtc25NZE5ZeWlqb29jQjVuc2s0RjZzTW5zeDlRTmk2enlmSVJ1VkVrN3lKZDNxOUROckxxNDlSS3lWaUZpClhzSUM1WE80dUhNK0hVVGE5YkxxK0ZZenJDT2FZYk9JN1d5aXFBd1VxZ0N3S3FyMSs1RXphRTRvdWg3aDhBVUgKNkJsMElUSW0vejhxQVhvMDdBM0wycGkrdEYzUVhQTzBHZDRWMndJREFRQUJBb0lCQVFDUHhrNUtUR055MGxERwpRTUpwV3krcm04dkJsSktWcVJZT0dYOXhhOUZ3S0h6bXJabnIyeVZmZEZmU1ZOVDdyMUZUMHhRUUhGejBRdXhMCmhzTS9EczlJNDF2SXd2QzRLNHBRMEpuYi9pcjJOQXJPbDJORFZEUzVRWVBKaEtiVXFubEdEY1c0T0pGM3IrZTkKdHZiVDVJOE9CU09zblBaWEt3dEtzMUhPS0toV1Y2WXRhMzJiRXplVm9UclU1WDFMWkEvbms4V04zaWl5M0pSVQpTY1NEbkIrYmVFNE9uSXR0d1pMdnRwUlZ1UlBreWJXTEVPbmZOZFcrNEdDQjhQRy9QWmJzSXdHTmsxcDhwaE1sCnNjWWxHamkzVm9uZDBaT244ekxENjhlSHRHWHp1ZThBbGtuR1J0OTY5OTQ0eGxtNnY1K2gySGFBSmdCZWdnZXAKVWtWc1NpZ1pBb0dCQU5qWU1DcHZlT0g2K29zbjlsQ2NldzRHOEZUWEZhYnQyc0owUjNNc1NFYTRZOFJ4cnByTwpNR0Z4S1V2cnFGN1FNUmZxYkhsd3pmTE96WFlrMTFYOHV4a3NQZG1KSkEwejBxTVI3SEJBZXU0WDdJNjl0ZXpoCmJNdy95VW80UEc1dWdpZkNTRzBieTZNTFVJOEoyRC9abWZxZnRkT21aSjU2OEhCdm5rUm92Y1kzQW9HQkFPeEkKaTYwRGdjQzdxaGdqemNMenFwcGd3anJta2ZQNnd1S25Obkc1V0ZwOGJySmM2bWo2dVRhdlhEWVF6VTZEZk94SAoxbHQ1SzQ5WHBTNXEzTExCWUloN0dHOEUxMDdvOEM2ZWtlbFRORTZQWmdiZXROSnRXWWhJZGhmZzNyZ0VMRGt1CjlzMVZLcVJMMTc2ME5PbWQ3UDVKR2kzdmhld0NVZHFQbUJrYTE1dDlBb0dBUmliYW5qL2w2YVhhZkQ1M2IyalEKWHA5Y0RQWndhTXEyWlFaZFB5TnFWb2E4c0FiZkovSGdzUVY5Q0xTNmljSHN3QUgxQ2V5MmxBRFhjNHREcHV2VwpVN1IrWmV2Nkg5Tk9KN2RhdUk0RHR4ZENUb09OWVk4a05ZZkZSUitnWFZHZkJlSFNzSW0zZlkzaGlBVDFVdUxBCjc5WFBheU4vbGMzTUQzUHN6ZERjNUUwQ2dZQkxSYmtwZnVxQkNjZmdOTmZCK1hvcUFCVWdTbi9JcCtRWjdJY04KcDZ3YjkzUVVZa0ZTL0R5d3pTQ2xJS2tuRUFCbURXU2VjM1dMRHJMU25MeCtQZGlRNGhZZ2wvdzNhVUhLdUQxbgpoVmd1aHNSTC8vcSs1cE1WTlhCWm53dVV0OCtXei8xVDRJUGJIMFkxdkpiMnJaYm9VMFdCeU1Kek16SDhYSzVwCm9RRjZ5UUtCZ0F0VkUzRmVWSXNmMjRIOTZTZWo5N2ZnM25sZ203VElRaEV0U283U1NuUkhXV3VGTkVDK1JQNGkKWFVZQjNRNWJLMVFiSUpCTkU0cGNNNGJRbDBFcVZjbVVuckRzS3NzK0Y4RFk5Z2JSTzdobEcyb2hlRFh4alo3KwpwTWFWdXpQUlNrMy9uUTg4aGovVkpWZ01WK3VGanI4TzhOYndMTmRmMmZlbm1hdk1IUGFmCi0tLS0tRU5EIFJTQSBQUklWQVRFIEtFWS0tLS0tCg==", 
    "encoding": "base64", 
    "item": {
        "file": "kibana-internal.key", 
        "name": "kibana_internal_key"
    }, 
    "source": "/etc/origin/logging/kibana-internal.key"
}
ok: [openshift] => (item={u'name': u'kibana_internal_cert', u'file': u'kibana-internal.crt'}) => {
    "changed": false, 
    "content": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURUakNDQWphZ0F3SUJBZ0lCQWpBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPVEUwTXpjMU1Gb1hEVEU1TURZd09URTBNemMxTVZvdwpGakVVTUJJR0ExVUVBeE1MSUd0cFltRnVZUzF2Y0hNd2dnRWlNQTBHQ1NxR1NJYjNEUUVCQVFVQUE0SUJEd0F3CmdnRUtBb0lCQVFESUpMK0VtZXJTU01QQ1dGMmdYQzlZTy9ELzRIRFBwdUxNL0o3bmtJV0crcFdiZzczZE56R08KaG5LbXh0K2g0dStMNEJkVmhVR0FCM3c1MmdjTDFvNVM2c1VCNUF3czhEVEtjd3NBN2hQRE01SW0wM1dTWE9MMApXUzcxdXdxWUpHWkF0c1VLRVJLVTk0cVRzTisrTTJ4NnJNT1I0VHBjK3hUWFdJdFh4Wnp1WDFycURQNVdSR2F5CmN4MDFqS0tPaWh3SG1leVRnWHF3eWV6SDFBMkxyUEo4aEc1VVNUdklsM2VyME0yc3VyajFFckpXSVdKZXdnTGwKYzdpNGN6NGRSTnIxc3VyNFZqT3NJNXBoczRqdGJLS29EQlNxQUxBcXF2WDdrVE5vVGlpNkh1SHdCUWZvR1hRaApNaWIvUHlvQmVqVHNEY3ZhbUw2MFhkQmM4N1FaM2hYYkFnTUJBQUdqZ1o0d2dac3dEZ1lEVlIwUEFRSC9CQVFECkFnV2dNQk1HQTFVZEpRUU1NQW9HQ0NzR0FRVUZCd01CTUF3R0ExVWRFd0VCL3dRQ01BQXdaZ1lEVlIwUkJGOHcKWFlJTElHdHBZbUZ1WVMxdmNIT0NMQ0JyYVdKaGJtRXRiM0J6TG5KdmRYUmxjaTVrWldaaGRXeDBMbk4yWXk1agpiSFZ6ZEdWeUxteHZZMkZzZ2hnZ2EybGlZVzVoTGpFeU55NHdMakF1TVM1NGFYQXVhVytDQm10cFltRnVZVEFOCkJna3Foa2lHOXcwQkFRc0ZBQU9DQVFFQXVwSTMzejZzMG5aZ2FSK01Pb29BckhBdGo0Y21uL09laEZCYlVLSW0KejFqdzZuZzFINENxSXgzZ1JueXIzTlNXbVdxSTg0VDcyWkRhUkdkYjdLeXNEc0VHTDdPRFd3YzJGU0sxMGx0MAp3K2pCcTMzZjJUejhsWGMwbUd1QWFEVXIvaVF2eFVmSmp5Z01JUjZmQzRpQUc5dXI1cXBqanMzMHIzUE5CNlVMCmdTVktkWEw2UG85V0lOak9ZbjNrc2FlM0UzUk9uZFM5eUlwWnhUK0NZcnpZTmxwY2Mxd1pva0I5M0RvalM2TGYKaVdraWJVNEc0ZGQrcjB6WngzQWRrc2hFUS9JUHNCeWNkdXloYmFwUGg3M2tQRGc5WUU1U0ZzZHNsa1B0SC94TApETlZmOWJLUHozWGJRYVg2MjdLeHQ2L3krM1JNVFMwd2pHK0FTbU93clpiRW1RPT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQotLS0tLUJFR0lOIENFUlRJRklDQVRFLS0tLS0KTUlJQzJqQ0NBY0tnQXdJQkFnSUJBVEFOQmdrcWhraUc5dzBCQVFzRkFEQWVNUnd3R2dZRFZRUURFeE5zYjJkbgphVzVuTFhOcFoyNWxjaTEwWlhOME1CNFhEVEUzTURZd09URTBNemMwTjFvWERUSXlNRFl3T0RFME16YzBPRm93CkhqRWNNQm9HQTFVRUF4TVRiRzluWjJsdVp5MXphV2R1WlhJdGRHVnpkRENDQVNJd0RRWUpLb1pJaHZjTkFRRUIKQlFBRGdnRVBBRENDQVFvQ2dnRUJBTi9WeGlyT1RvRnU5SGtJRW1BUzBTRGVtcTQvMXVFK3d1bGhhZUxYVldVUQpnTlBVK3dlOUxEbDltck1hQ3VVQUpqWXRhQ3F2S3Fsaks1S1JZV3h2V0grT2VGR3RuZldqM3dGWVQvQ0hFekhYCjNyVHc5UHNTWCt6VFpseURReXl6RnpLb0o2bXkwaWxTY3Bxd0xwSkgyT212MTZKTXZXK0pyTXdDYWZRRERJMy8KRk9lYVJ2d1VwRTFWUW4rTC9FMkZxTTZKVHBDdkpVMHdManhGSDVvdmlIM0N1cEVHb2xhbUk0eFdpenl1RHRqZQo2em5MYk84YnZrUEVmSTkwc3VNNHROWkY5WXJ3dUdpL2ZSQ3hTZzd1dzV1SWhxTnFDdFZJRUZIc3lCeDRWb3p4ClNsaFJocnNHUm5GR0NHaVYzcVV6YjdpZk83TmNVQUEzNXNBRlZvSnNRQnNDQXdFQUFhTWpNQ0V3RGdZRFZSMFAKQVFIL0JBUURBZ0trTUE4R0ExVWRFd0VCL3dRRk1BTUJBZjh3RFFZSktvWklodmNOQVFFTEJRQURnZ0VCQUhIWgpoS0wzRzF2Nk8yOGlwcHJvZzFPWTdoMUhBaCtQNnNGZ0RzOXBsRHZaeGdoRHlJZ09QS3dHQVhlb0lOWTBnV1I0CkN0QlR4MlZYTzZvR3EzVitzSk1HZmNpVUFHOUJoNmd4aHlQcjBORjhoOVZwZDN0UkxVb2E0dVFnS1J0dis2VEEKckhLN2lVMXlGaE1JM0pKMXd2WVpmN1U1Yjh1M2tiemFtSXlac2oyUlY5ejVxWHVTWWFjSDJ3eURaOHdSMkF0NApHNkNPRXA0azExODROak5mUG1NcDJuLzFwWWVSMmhNa0hRWUJKK3MrazFZNXlBU0k0R0FYMXpKS3pNYnh4SGZOClViTVFyMFd1R0gyRGZGVW5sUnBtYlp4ZlZ0SlkrWmlBcTBUdXJSeGY2eWlRcTdMM0lnRjMwRFZhZHpXZmk5N2YKOW1LakRvS05JRU5obTRjU1VWdz0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=", 
    "encoding": "base64", 
    "item": {
        "file": "kibana-internal.crt", 
        "name": "kibana_internal_cert"
    }, 
    "source": "/etc/origin/logging/kibana-internal.crt"
}
ok: [openshift] => (item={u'name': u'server_tls', u'file': u'server-tls.json'}) => {
    "changed": false, 
    "content": "Ly8gU2VlIGZvciBhdmFpbGFibGUgb3B0aW9uczogaHR0cHM6Ly9ub2RlanMub3JnL2FwaS90bHMuaHRtbCN0bHNfdGxzX2NyZWF0ZXNlcnZlcl9vcHRpb25zX3NlY3VyZWNvbm5lY3Rpb25saXN0ZW5lcgp0bHNfb3B0aW9ucyA9IHsKCWNpcGhlcnM6ICdrRUVDREg6K2tFRUNESCtTSEE6a0VESDora0VESCtTSEE6K2tFREgrQ0FNRUxMSUE6a0VDREg6K2tFQ0RIK1NIQTprUlNBOitrUlNBK1NIQTora1JTQStDQU1FTExJQTohYU5VTEw6IWVOVUxMOiFTU0x2MjohUkM0OiFERVM6IUVYUDohU0VFRDohSURFQTorM0RFUycsCglob25vckNpcGhlck9yZGVyOiB0cnVlCn0K", 
    "encoding": "base64", 
    "item": {
        "file": "server-tls.json", 
        "name": "server_tls"
    }, 
    "source": "/etc/origin/logging/server-tls.json"
}
ok: [openshift] => (item={u'name': u'session_secret', u'file': u'session_secret'}) => {
    "changed": false, 
    "content": "M2oxZU5nTUtiNWNOWjJvV3QzdU1qUmpqenBoZHRrU3JHZXRNUUhEa1JKS2NJNUNlQkx3SmU3NDluc2pPRk5xOTNLVnBIY1FscnRvdHQ2VTN4ck1IRUg3Q3BRZnVpR2Q4bldZdWwxZEl6YWNnU3NaRW84UktkemdnM3YySXQ5TXNIY2FKS3hmRzV5Z2R5ZVg2TDVXYXNqSmxVbDFWUllGWmVGcXdxSmtXbHB5T2V4N0FFZjg5RGVoeERJUDZmWENZenhmNlRIaWc=", 
    "encoding": "base64", 
    "item": {
        "file": "session_secret", 
        "name": "session_secret"
    }, 
    "source": "/etc/origin/logging/session_secret"
}
ok: [openshift] => (item={u'name': u'oauth_secret', u'file': u'oauth_secret'}) => {
    "changed": false, 
    "content": "aHE2SGVLV3h3RHhHRWs4NzBxSHE5SHFma2lzWHdHRXo0SFpnU0QyaTNla1I1STR6b2p4UlNyTkV6NXF6VlBpWg==", 
    "encoding": "base64", 
    "item": {
        "file": "oauth_secret", 
        "name": "oauth_secret"
    }, 
    "source": "/etc/origin/logging/oauth_secret"
}
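
The content/encoding/source triples above have the shape of Ansible's slurp module, which returns file bodies base64-encoded so they survive JSON transport; consumers decode them with the b64decode filter. A sketch of the loop, with the item list shortened:

    - name: Retrieving the cert to use when generating secrets for the logging components
      slurp:
        src: "/etc/origin/logging/{{ item.file }}"
      register: key_pairs
      with_items:
      - { name: ca_file, file: ca.crt }
      - { name: kibana_internal_key, file: kibana-internal.key }
      # ... kibana-internal.crt, server-tls.json, session_secret, oauth_secret

    # later, e.g.: "{{ key_pairs.results[0].content | b64decode }}"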

TASK [openshift_logging_kibana : Set logging-kibana service] *******************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:84
changed: [openshift] => {
    "changed": true, 
    "results": {
        "clusterip": "172.30.30.149", 
        "cmd": "/bin/oc get service logging-kibana -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "Service", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:40Z", 
                    "name": "logging-kibana", 
                    "namespace": "logging", 
                    "resourceVersion": "1393", 
                    "selfLink": "/api/v1/namespaces/logging/services/logging-kibana", 
                    "uid": "546efeef-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "clusterIP": "172.30.30.149", 
                    "ports": [
                        {
                            "port": 443, 
                            "protocol": "TCP", 
                            "targetPort": "oaproxy"
                        }
                    ], 
                    "selector": {
                        "component": "kibana", 
                        "provider": "openshift"
                    }, 
                    "sessionAffinity": "None", 
                    "type": "ClusterIP"
                }, 
                "status": {
                    "loadBalancer": {}
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
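
Restated as YAML, the service deliberately fronts the auth proxy rather than Kibana itself: port 443 resolves to the named container port oaproxy (3000/TCP on the kibana-proxy sidecar created further down), so nothing reaches Kibana without passing the proxy:

    apiVersion: v1
    kind: Service
    metadata:
      name: logging-kibana
      namespace: logging
    spec:
      ports:
      - port: 443
        protocol: TCP
        targetPort: oaproxy   # named port on the kibana-proxy container
      selector:
        component: kibana
        provider: openshift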

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:101
 [WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_key | trim | length
> 0 }}
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:106
 [WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_cert | trim | length
> 0 }}
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:111
 [WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_ca | trim | length >
0 }}
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}
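
The three warnings above are Ansible flagging `when:` conditions wrapped in {{ }}. A `when:` clause is already evaluated as a raw Jinja2 expression, so the delimiters are redundant (they still evaluate here, hence only a warning followed by a normal skip). The fix is simply to drop them:

    # form the role uses (warned):
    when: "{{ openshift_logging_kibana_key | trim | length > 0 }}"
    # recommended form:
    when: openshift_logging_kibana_key | trim | length > 0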

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:116
ok: [openshift] => {
    "ansible_facts": {
        "kibana_ca": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPVEUwTXpjME4xb1hEVEl5TURZd09ERTBNemMwT0ZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU4vVnhpck9Ub0Z1OUhrSUVtQVMwU0RlbXE0LzF1RSt3dWxoYWVMWFZXVVEKZ05QVSt3ZTlMRGw5bXJNYUN1VUFKall0YUNxdktxbGpLNUtSWVd4dldIK09lRkd0bmZXajN3RllUL0NIRXpIWAozclR3OVBzU1grelRabHlEUXl5ekZ6S29KNm15MGlsU2NwcXdMcEpIMk9tdjE2Sk12VytKck13Q2FmUURESTMvCkZPZWFSdndVcEUxVlFuK0wvRTJGcU02SlRwQ3ZKVTB3TGp4Rkg1b3ZpSDNDdXBFR29sYW1JNHhXaXp5dUR0amUKNnpuTGJPOGJ2a1BFZkk5MHN1TTR0TlpGOVlyd3VHaS9mUkN4U2c3dXc1dUlocU5xQ3RWSUVGSHN5Qng0Vm96eApTbGhSaHJzR1JuRkdDR2lWM3FVemI3aWZPN05jVUFBMzVzQUZWb0pzUUJzQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFISFoKaEtMM0cxdjZPMjhpcHByb2cxT1k3aDFIQWgrUDZzRmdEczlwbER2WnhnaER5SWdPUEt3R0FYZW9JTlkwZ1dSNApDdEJUeDJWWE82b0dxM1Yrc0pNR2ZjaVVBRzlCaDZneGh5UHIwTkY4aDlWcGQzdFJMVW9hNHVRZ0tSdHYrNlRBCnJISzdpVTF5RmhNSTNKSjF3dllaZjdVNWI4dTNrYnphbUl5WnNqMlJWOXo1cVh1U1lhY0gyd3lEWjh3UjJBdDQKRzZDT0VwNGsxMTg0TmpOZlBtTXAybi8xcFllUjJoTWtIUVlCSitzK2sxWTV5QVNJNEdBWDF6Skt6TWJ4eEhmTgpVYk1RcjBXdUdIMkRmRlVubFJwbWJaeGZWdEpZK1ppQXEwVHVyUnhmNnlpUXE3TDNJZ0YzMERWYWR6V2ZpOTdmCjltS2pEb0tOSUVOaG00Y1NVVnc9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K"
    }, 
    "changed": false
}

TASK [openshift_logging_kibana : Generating Kibana route template] *************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:121
ok: [openshift] => {
    "changed": false, 
    "checksum": "e837f0412ad02af8b1b5e4c1fc94a925d406098c", 
    "dest": "/tmp/openshift-logging-ansible-1KGJ33/templates/kibana-route.yaml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "705edd86939da83185b13cc705ede6e5", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 2714, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019121.02-23785180962822/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_kibana : Setting Kibana route] *************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:141
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get route logging-kibana -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "Route", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:42Z", 
                    "labels": {
                        "component": "support", 
                        "logging-infra": "support", 
                        "provider": "openshift"
                    }, 
                    "name": "logging-kibana", 
                    "namespace": "logging", 
                    "resourceVersion": "1401", 
                    "selfLink": "/oapi/v1/namespaces/logging/routes/logging-kibana", 
                    "uid": "556da26d-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "host": "kibana.router.default.svc.cluster.local", 
                    "tls": {
                        "caCertificate": "-----BEGIN CERTIFICATE-----\nMIIC2jCCAcKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAeMRwwGgYDVQQDExNsb2dn\naW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwOTE0Mzc0N1oXDTIyMDYwODE0Mzc0OFow\nHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDCCASIwDQYJKoZIhvcNAQEB\nBQADggEPADCCAQoCggEBAN/VxirOToFu9HkIEmAS0SDemq4/1uE+wulhaeLXVWUQ\ngNPU+we9LDl9mrMaCuUAJjYtaCqvKqljK5KRYWxvWH+OeFGtnfWj3wFYT/CHEzHX\n3rTw9PsSX+zTZlyDQyyzFzKoJ6my0ilScpqwLpJH2Omv16JMvW+JrMwCafQDDI3/\nFOeaRvwUpE1VQn+L/E2FqM6JTpCvJU0wLjxFH5oviH3CupEGolamI4xWizyuDtje\n6znLbO8bvkPEfI90suM4tNZF9YrwuGi/fRCxSg7uw5uIhqNqCtVIEFHsyBx4Vozx\nSlhRhrsGRnFGCGiV3qUzb7ifO7NcUAA35sAFVoJsQBsCAwEAAaMjMCEwDgYDVR0P\nAQH/BAQDAgKkMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBAHHZ\nhKL3G1v6O28ipprog1OY7h1HAh+P6sFgDs9plDvZxghDyIgOPKwGAXeoINY0gWR4\nCtBTx2VXO6oGq3V+sJMGfciUAG9Bh6gxhyPr0NF8h9Vpd3tRLUoa4uQgKRtv+6TA\nrHK7iU1yFhMI3JJ1wvYZf7U5b8u3kbzamIyZsj2RV9z5qXuSYacH2wyDZ8wR2At4\nG6COEp4k1184NjNfPmMp2n/1pYeR2hMkHQYBJ+s+k1Y5yASI4GAX1zJKzMbxxHfN\nUbMQr0WuGH2DfFUnlRpmbZxfVtJY+ZiAq0TurRxf6yiQq7L3IgF30DVadzWfi97f\n9mKjDoKNIENhm4cSUVw=\n-----END CERTIFICATE-----\n", 
                        "destinationCACertificate": "-----BEGIN CERTIFICATE-----\nMIIC2jCCAcKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAeMRwwGgYDVQQDExNsb2dn\naW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwOTE0Mzc0N1oXDTIyMDYwODE0Mzc0OFow\nHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDCCASIwDQYJKoZIhvcNAQEB\nBQADggEPADCCAQoCggEBAN/VxirOToFu9HkIEmAS0SDemq4/1uE+wulhaeLXVWUQ\ngNPU+we9LDl9mrMaCuUAJjYtaCqvKqljK5KRYWxvWH+OeFGtnfWj3wFYT/CHEzHX\n3rTw9PsSX+zTZlyDQyyzFzKoJ6my0ilScpqwLpJH2Omv16JMvW+JrMwCafQDDI3/\nFOeaRvwUpE1VQn+L/E2FqM6JTpCvJU0wLjxFH5oviH3CupEGolamI4xWizyuDtje\n6znLbO8bvkPEfI90suM4tNZF9YrwuGi/fRCxSg7uw5uIhqNqCtVIEFHsyBx4Vozx\nSlhRhrsGRnFGCGiV3qUzb7ifO7NcUAA35sAFVoJsQBsCAwEAAaMjMCEwDgYDVR0P\nAQH/BAQDAgKkMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBAHHZ\nhKL3G1v6O28ipprog1OY7h1HAh+P6sFgDs9plDvZxghDyIgOPKwGAXeoINY0gWR4\nCtBTx2VXO6oGq3V+sJMGfciUAG9Bh6gxhyPr0NF8h9Vpd3tRLUoa4uQgKRtv+6TA\nrHK7iU1yFhMI3JJ1wvYZf7U5b8u3kbzamIyZsj2RV9z5qXuSYacH2wyDZ8wR2At4\nG6COEp4k1184NjNfPmMp2n/1pYeR2hMkHQYBJ+s+k1Y5yASI4GAX1zJKzMbxxHfN\nUbMQr0WuGH2DfFUnlRpmbZxfVtJY+ZiAq0TurRxf6yiQq7L3IgF30DVadzWfi97f\n9mKjDoKNIENhm4cSUVw=\n-----END CERTIFICATE-----\n", 
                        "insecureEdgeTerminationPolicy": "Redirect", 
                        "termination": "reencrypt"
                    }, 
                    "to": {
                        "kind": "Service", 
                        "name": "logging-kibana", 
                        "weight": 100
                    }, 
                    "wildcardPolicy": "None"
                }, 
                "status": {
                    "ingress": [
                        {
                            "conditions": [
                                {
                                    "lastTransitionTime": "2017-06-09T14:38:42Z", 
                                    "status": "True", 
                                    "type": "Admitted"
                                }
                            ], 
                            "host": "kibana.router.default.svc.cluster.local", 
                            "routerName": "router", 
                            "wildcardPolicy": "None"
                        }
                    ]
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
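
In YAML form, the route just admitted combines reencrypt termination with an edge redirect: the router serves the kibana-internal cert to browsers, bounces plain HTTP to HTTPS, and re-encrypts to the pod, verifying the proxy's serving cert against destinationCACertificate (both certificate bodies are the logging-signer-test CA shown above, elided here):

    apiVersion: v1
    kind: Route
    metadata:
      name: logging-kibana
      namespace: logging
    spec:
      host: kibana.router.default.svc.cluster.local
      tls:
        termination: reencrypt
        insecureEdgeTerminationPolicy: Redirect
        # caCertificate / destinationCACertificate elided (see JSON above)
      to:
        kind: Service
        name: logging-kibana
        weight: 100
      wildcardPolicy: None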

TASK [openshift_logging_kibana : Get current oauthclient hostnames] ************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:151
ok: [openshift] => {
    "changed": false, 
    "results": {
        "cmd": "/bin/oc get oauthclient kibana-proxy -o json -n logging", 
        "results": [
            {}
        ], 
        "returncode": 0, 
        "stderr": "Error from server (NotFound): oauthclients.oauth.openshift.io \"kibana-proxy\" not found\n", 
        "stdout": ""
    }, 
    "state": "list"
}
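
Note the wrapper reports returncode 0 and an empty results list even though oc itself wrote a NotFound error to stderr: for a "state: list" lookup, a missing object is an empty result, not a failure. A raw command task would need that tolerance spelled out, roughly:

    - name: Get current oauthclient hostnames
      command: /bin/oc get oauthclient kibana-proxy -o json -n logging
      register: oauthclient_out
      failed_when: oauthclient_out.rc != 0 and 'NotFound' not in oauthclient_out.stderr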

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:159
ok: [openshift] => {
    "ansible_facts": {
        "proxy_hostnames": [
            "https://kibana.router.default.svc.cluster.local"
        ]
    }, 
    "changed": false
}

TASK [openshift_logging_kibana : Create oauth-client template] *****************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:162
changed: [openshift] => {
    "changed": true, 
    "checksum": "2f8e721b4514e49e25a976593a771e11b43cbe9e", 
    "dest": "/tmp/openshift-logging-ansible-1KGJ33/templates/oauth-client.yml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "b9bf180e8c871b4d30eeafd9e9abdc1f", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 328, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019123.05-129456691361695/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_kibana : Set kibana-proxy oauth-client] ****************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:170
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get oauthclient kibana-proxy -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "OAuthClient", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:44Z", 
                    "labels": {
                        "logging-infra": "support"
                    }, 
                    "name": "kibana-proxy", 
                    "resourceVersion": "1407", 
                    "selfLink": "/oapi/v1/oauthclients/kibana-proxy", 
                    "uid": "56b36338-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "redirectURIs": [
                    "https://kibana.router.default.svc.cluster.local"
                ], 
                "scopeRestrictions": [
                    {
                        "literals": [
                            "user:info", 
                            "user:check-access", 
                            "user:list-projects"
                        ]
                    }
                ], 
                "secret": "hq6HeKWxwDxGEk870qHq9HqfkisXwGEz4HZgSD2i3ekR5I4zojxRSrNEz5qzVPiZ"
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
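
The resulting OAuthClient ties the route hostname to the proxy: redirectURIs must match where the browser lands after login, scopeRestrictions cap what the delegated tokens may do, and secret is the value generated into /etc/origin/logging/oauth_secret earlier. As YAML (secret elided):

    apiVersion: v1
    kind: OAuthClient
    metadata:
      name: kibana-proxy
      labels:
        logging-infra: support
    redirectURIs:
    - https://kibana.router.default.svc.cluster.local
    scopeRestrictions:
    - literals:
      - user:info
      - user:check-access
      - user:list-projects
    secret: "..."   # same value as oauth_secret above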

TASK [openshift_logging_kibana : Set Kibana secret] ****************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:181
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc secrets new logging-kibana ca=/etc/origin/logging/ca.crt key=/etc/origin/logging/system.logging.kibana.key cert=/etc/origin/logging/system.logging.kibana.crt -n logging", 
        "results": "", 
        "returncode": 0
    }, 
    "state": "present"
}

TASK [openshift_logging_kibana : Set Kibana Proxy secret] **********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:195
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc secrets new logging-kibana-proxy oauth-secret=/tmp/oauth-secret-qwLs5M session-secret=/tmp/session-secret-9fgfAx server-key=/tmp/server-key-OVMMdD server-cert=/tmp/server-cert-5Qpjcs server-tls.json=/tmp/server-tls.json-aIaE9A -n logging", 
        "results": "", 
        "returncode": 0
    }, 
    "state": "present"
}
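
Both secrets use the old `oc secrets new` form (later clients express this as `oc create secret generic`), where each key=path pair becomes one entry in the secret's data map. Reconstructed from the cmd fields above as a plain command task; the role itself drives oc through its own wrapper module:

    - name: Set Kibana secret
      command: >
        /bin/oc secrets new logging-kibana
        ca=/etc/origin/logging/ca.crt
        key=/etc/origin/logging/system.logging.kibana.key
        cert=/etc/origin/logging/system.logging.kibana.crt
        -n logging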

TASK [openshift_logging_kibana : Generate Kibana DC template] ******************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:221
changed: [openshift] => {
    "changed": true, 
    "checksum": "d798a7b25877426ba5194405c5d79538429fe6bf", 
    "dest": "/tmp/openshift-logging-ansible-1KGJ33/templates/kibana-dc.yaml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "add5ec292e970d3afe5a326fd4555438", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 3733, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019126.41-97585140752483/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_kibana : Set Kibana DC] ********************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:240
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get dc logging-kibana -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "DeploymentConfig", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:47Z", 
                    "generation": 2, 
                    "labels": {
                        "component": "kibana", 
                        "logging-infra": "kibana", 
                        "provider": "openshift"
                    }, 
                    "name": "logging-kibana", 
                    "namespace": "logging", 
                    "resourceVersion": "1422", 
                    "selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-kibana", 
                    "uid": "58aaa25c-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "replicas": 1, 
                    "selector": {
                        "component": "kibana", 
                        "logging-infra": "kibana", 
                        "provider": "openshift"
                    }, 
                    "strategy": {
                        "activeDeadlineSeconds": 21600, 
                        "resources": {}, 
                        "rollingParams": {
                            "intervalSeconds": 1, 
                            "maxSurge": "25%", 
                            "maxUnavailable": "25%", 
                            "timeoutSeconds": 600, 
                            "updatePeriodSeconds": 1
                        }, 
                        "type": "Rolling"
                    }, 
                    "template": {
                        "metadata": {
                            "creationTimestamp": null, 
                            "labels": {
                                "component": "kibana", 
                                "logging-infra": "kibana", 
                                "provider": "openshift"
                            }, 
                            "name": "logging-kibana"
                        }, 
                        "spec": {
                            "containers": [
                                {
                                    "env": [
                                        {
                                            "name": "ES_HOST", 
                                            "value": "logging-es"
                                        }, 
                                        {
                                            "name": "ES_PORT", 
                                            "value": "9200"
                                        }, 
                                        {
                                            "name": "KIBANA_MEMORY_LIMIT", 
                                            "valueFrom": {
                                                "resourceFieldRef": {
                                                    "containerName": "kibana", 
                                                    "divisor": "0", 
                                                    "resource": "limits.memory"
                                                }
                                            }
                                        }
                                    ], 
                                    "image": "172.30.224.2:5000/logging/logging-kibana:latest", 
                                    "imagePullPolicy": "Always", 
                                    "name": "kibana", 
                                    "readinessProbe": {
                                        "exec": {
                                            "command": [
                                                "/usr/share/kibana/probe/readiness.sh"
                                            ]
                                        }, 
                                        "failureThreshold": 3, 
                                        "initialDelaySeconds": 5, 
                                        "periodSeconds": 5, 
                                        "successThreshold": 1, 
                                        "timeoutSeconds": 4
                                    }, 
                                    "resources": {
                                        "limits": {
                                            "memory": "736Mi"
                                        }
                                    }, 
                                    "terminationMessagePath": "/dev/termination-log", 
                                    "terminationMessagePolicy": "File", 
                                    "volumeMounts": [
                                        {
                                            "mountPath": "/etc/kibana/keys", 
                                            "name": "kibana", 
                                            "readOnly": true
                                        }
                                    ]
                                }, 
                                {
                                    "env": [
                                        {
                                            "name": "OAP_BACKEND_URL", 
                                            "value": "http://localhost:5601"
                                        }, 
                                        {
                                            "name": "OAP_AUTH_MODE", 
                                            "value": "oauth2"
                                        }, 
                                        {
                                            "name": "OAP_TRANSFORM", 
                                            "value": "user_header,token_header"
                                        }, 
                                        {
                                            "name": "OAP_OAUTH_ID", 
                                            "value": "kibana-proxy"
                                        }, 
                                        {
                                            "name": "OAP_MASTER_URL", 
                                            "value": "https://kubernetes.default.svc.cluster.local"
                                        }, 
                                        {
                                            "name": "OAP_PUBLIC_MASTER_URL", 
                                            "value": "https://172.18.4.93:8443"
                                        }, 
                                        {
                                            "name": "OAP_LOGOUT_REDIRECT", 
                                            "value": "https://172.18.4.93:8443/console/logout"
                                        }, 
                                        {
                                            "name": "OAP_MASTER_CA_FILE", 
                                            "value": "/var/run/secrets/kubernetes.io/serviceaccount/ca.crt"
                                        }, 
                                        {
                                            "name": "OAP_DEBUG", 
                                            "value": "False"
                                        }, 
                                        {
                                            "name": "OAP_OAUTH_SECRET_FILE", 
                                            "value": "/secret/oauth-secret"
                                        }, 
                                        {
                                            "name": "OAP_SERVER_CERT_FILE", 
                                            "value": "/secret/server-cert"
                                        }, 
                                        {
                                            "name": "OAP_SERVER_KEY_FILE", 
                                            "value": "/secret/server-key"
                                        }, 
                                        {
                                            "name": "OAP_SERVER_TLS_FILE", 
                                            "value": "/secret/server-tls.json"
                                        }, 
                                        {
                                            "name": "OAP_SESSION_SECRET_FILE", 
                                            "value": "/secret/session-secret"
                                        }, 
                                        {
                                            "name": "OCP_AUTH_PROXY_MEMORY_LIMIT", 
                                            "valueFrom": {
                                                "resourceFieldRef": {
                                                    "containerName": "kibana-proxy", 
                                                    "divisor": "0", 
                                                    "resource": "limits.memory"
                                                }
                                            }
                                        }
                                    ], 
                                    "image": "172.30.224.2:5000/logging/logging-auth-proxy:latest", 
                                    "imagePullPolicy": "Always", 
                                    "name": "kibana-proxy", 
                                    "ports": [
                                        {
                                            "containerPort": 3000, 
                                            "name": "oaproxy", 
                                            "protocol": "TCP"
                                        }
                                    ], 
                                    "resources": {
                                        "limits": {
                                            "memory": "96Mi"
                                        }
                                    }, 
                                    "terminationMessagePath": "/dev/termination-log", 
                                    "terminationMessagePolicy": "File", 
                                    "volumeMounts": [
                                        {
                                            "mountPath": "/secret", 
                                            "name": "kibana-proxy", 
                                            "readOnly": true
                                        }
                                    ]
                                }
                            ], 
                            "dnsPolicy": "ClusterFirst", 
                            "restartPolicy": "Always", 
                            "schedulerName": "default-scheduler", 
                            "securityContext": {}, 
                            "serviceAccount": "aggregated-logging-kibana", 
                            "serviceAccountName": "aggregated-logging-kibana", 
                            "terminationGracePeriodSeconds": 30, 
                            "volumes": [
                                {
                                    "name": "kibana", 
                                    "secret": {
                                        "defaultMode": 420, 
                                        "secretName": "logging-kibana"
                                    }
                                }, 
                                {
                                    "name": "kibana-proxy", 
                                    "secret": {
                                        "defaultMode": 420, 
                                        "secretName": "logging-kibana-proxy"
                                    }
                                }
                            ]
                        }
                    }, 
                    "test": false, 
                    "triggers": [
                        {
                            "type": "ConfigChange"
                        }
                    ]
                }, 
                "status": {
                    "availableReplicas": 0, 
                    "conditions": [
                        {
                            "lastTransitionTime": "2017-06-09T14:38:47Z", 
                            "lastUpdateTime": "2017-06-09T14:38:47Z", 
                            "message": "Deployment config does not have minimum availability.", 
                            "status": "False", 
                            "type": "Available"
                        }, 
                        {
                            "lastTransitionTime": "2017-06-09T14:38:47Z", 
                            "lastUpdateTime": "2017-06-09T14:38:47Z", 
                            "message": "replication controller \"logging-kibana-1\" is waiting for pod \"logging-kibana-1-deploy\" to run", 
                            "status": "Unknown", 
                            "type": "Progressing"
                        }
                    ], 
                    "details": {
                        "causes": [
                            {
                                "type": "ConfigChange"
                            }
                        ], 
                        "message": "config change"
                    }, 
                    "latestVersion": 1, 
                    "observedGeneration": 2, 
                    "replicas": 0, 
                    "unavailableReplicas": 0, 
                    "updatedReplicas": 0
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
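
The DC above is the usual auth-proxy sidecar arrangement: the kibana container declares no ports and is reachable only as localhost:5601 inside the pod, while kibana-proxy owns the single exposed port (oaproxy/3000), runs the OAuth flow against the kibana-proxy client using the mounted logging-kibana-proxy secret, and forwards authenticated requests to OAP_BACKEND_URL. Stripped to the essentials:

    spec:
      containers:
      - name: kibana
        image: 172.30.224.2:5000/logging/logging-kibana:latest
        # no ports declared; only the proxy reaches it, via localhost:5601
      - name: kibana-proxy
        image: 172.30.224.2:5000/logging/logging-auth-proxy:latest
        ports:
        - containerPort: 3000
          name: oaproxy
        env:
        - name: OAP_BACKEND_URL
          value: http://localhost:5601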

TASK [openshift_logging_kibana : Delete temp directory] ************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:252
ok: [openshift] => {
    "changed": false, 
    "path": "/tmp/openshift-logging-ansible-1KGJ33", 
    "state": "absent"
}

TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:166
statically included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml

TASK [openshift_logging_kibana : fail] *****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:3
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:7
ok: [openshift] => {
    "ansible_facts": {
        "kibana_version": "3_5"
    }, 
    "changed": false
}

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:12
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : fail] *****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/determine_version.yaml:15
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : Create temp directory for doing work in] ******
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:7
ok: [openshift] => {
    "changed": false, 
    "cmd": [
        "mktemp", 
        "-d", 
        "/tmp/openshift-logging-ansible-XXXXXX"
    ], 
    "delta": "0:00:00.002067", 
    "end": "2017-06-09 10:38:49.148193", 
    "rc": 0, 
    "start": "2017-06-09 10:38:49.146126"
}

STDOUT:

/tmp/openshift-logging-ansible-aMED3y

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:12
ok: [openshift] => {
    "ansible_facts": {
        "tempdir": "/tmp/openshift-logging-ansible-aMED3y"
    }, 
    "changed": false
}

TASK [openshift_logging_kibana : Create templates subdirectory] ****************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:16
ok: [openshift] => {
    "changed": false, 
    "gid": 0, 
    "group": "root", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/tmp/openshift-logging-ansible-aMED3y/templates", 
    "secontext": "unconfined_u:object_r:user_tmp_t:s0", 
    "size": 6, 
    "state": "directory", 
    "uid": 0
}

TASK [openshift_logging_kibana : Create Kibana service account] ****************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:26
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : Create Kibana service account] ****************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:34
ok: [openshift] => {
    "changed": false, 
    "results": {
        "cmd": "/bin/oc get sa aggregated-logging-kibana -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "imagePullSecrets": [
                    {
                        "name": "aggregated-logging-kibana-dockercfg-tqm6f"
                    }
                ], 
                "kind": "ServiceAccount", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:37Z", 
                    "name": "aggregated-logging-kibana", 
                    "namespace": "logging", 
                    "resourceVersion": "1375", 
                    "selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-kibana", 
                    "uid": "5299210b-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "secrets": [
                    {
                        "name": "aggregated-logging-kibana-token-hgw85"
                    }, 
                    {
                        "name": "aggregated-logging-kibana-dockercfg-tqm6f"
                    }
                ]
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
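
Note: "changed": false here means the aggregated-logging-kibana service
account already existed from the earlier non-ops pass; the module only read
it back with the oc command shown in "cmd". A sketch of the call shape,
assuming openshift-ansible's oc_serviceaccount module from lib_openshift
(the argument names are an assumption):

    - name: Create Kibana service account
      oc_serviceaccount:
        state: present
        name: aggregated-logging-kibana
        namespace: logging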

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:42
ok: [openshift] => {
    "ansible_facts": {
        "kibana_component": "kibana-ops", 
        "kibana_name": "logging-kibana-ops"
    }, 
    "changed": false
}
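
Note: these facts mark the second, "ops" pass of the role: every generated
object switches from logging-kibana to logging-kibana-ops so one task file
can serve both the app and ops Elasticsearch clusters. Sketch, assuming the
openshift_logging_use_ops toggle drives the split:

    - set_fact:
        kibana_component: "kibana-ops"
        kibana_name: "logging-kibana-ops"
      when: openshift_logging_use_ops | default(false) | bool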

TASK [openshift_logging_kibana : Checking for session_secret] ******************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:47
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "atime": 1497019119.2705734, 
        "attr_flags": "", 
        "attributes": [], 
        "block_size": 4096, 
        "blocks": 8, 
        "charset": "us-ascii", 
        "checksum": "adf4aeae989a1bbdf0b6fec127ee16e4fda68b6a", 
        "ctime": 1497019118.2815607, 
        "dev": 51714, 
        "device_type": 0, 
        "executable": false, 
        "exists": true, 
        "gid": 0, 
        "gr_name": "root", 
        "inode": 110101646, 
        "isblk": false, 
        "ischr": false, 
        "isdir": false, 
        "isfifo": false, 
        "isgid": false, 
        "islnk": false, 
        "isreg": true, 
        "issock": false, 
        "isuid": false, 
        "md5": "956b384f3ada383be823578bd846d264", 
        "mimetype": "text/plain", 
        "mode": "0644", 
        "mtime": 1497019118.1685593, 
        "nlink": 1, 
        "path": "/etc/origin/logging/session_secret", 
        "pw_name": "root", 
        "readable": true, 
        "rgrp": true, 
        "roth": true, 
        "rusr": true, 
        "size": 200, 
        "uid": 0, 
        "version": "18446744071762072371", 
        "wgrp": false, 
        "woth": false, 
        "writeable": true, 
        "wusr": true, 
        "xgrp": false, 
        "xoth": false, 
        "xusr": false
    }
}

TASK [openshift_logging_kibana : Checking for oauth_secret] ********************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:51
ok: [openshift] => {
    "changed": false, 
    "stat": {
        "atime": 1497019119.3835747, 
        "attr_flags": "", 
        "attributes": [], 
        "block_size": 4096, 
        "blocks": 8, 
        "charset": "us-ascii", 
        "checksum": "ba82e9e2e78f595b88c9297945184551ad704096", 
        "ctime": 1497019118.5975647, 
        "dev": 51714, 
        "device_type": 0, 
        "executable": false, 
        "exists": true, 
        "gid": 0, 
        "gr_name": "root", 
        "inode": 134708086, 
        "isblk": false, 
        "ischr": false, 
        "isdir": false, 
        "isfifo": false, 
        "isgid": false, 
        "islnk": false, 
        "isreg": true, 
        "issock": false, 
        "isuid": false, 
        "md5": "5eae645a757d1c131dde4fd427bc7594", 
        "mimetype": "text/plain", 
        "mode": "0644", 
        "mtime": 1497019118.475563, 
        "nlink": 1, 
        "path": "/etc/origin/logging/oauth_secret", 
        "pw_name": "root", 
        "readable": true, 
        "rgrp": true, 
        "roth": true, 
        "rusr": true, 
        "size": 64, 
        "uid": 0, 
        "version": "18446744072472619233", 
        "wgrp": false, 
        "woth": false, 
        "writeable": true, 
        "wusr": true, 
        "xgrp": false, 
        "xoth": false, 
        "xusr": false
    }
}

TASK [openshift_logging_kibana : Generate session secret] **********************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:56
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : Generate oauth secret] ************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:64
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}
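
Note: both stat checks above report exists: true for
/etc/origin/logging/session_secret (200 bytes) and oauth_secret (64 bytes),
which is exactly why the two "Generate ..." tasks were skipped. A minimal
sketch of the stat-then-generate pattern; the lookup-based generator is an
illustration, not necessarily the role's exact recipe:

    - name: Checking for session_secret
      stat:
        path: /etc/origin/logging/session_secret
      register: session_secret_file

    - name: Generate session secret
      copy:
        dest: /etc/origin/logging/session_secret
        content: "{{ lookup('password', '/dev/null length=200 chars=ascii_letters') }}"
      when: not session_secret_file.stat.exists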

TASK [openshift_logging_kibana : Retrieving the cert to use when generating secrets for the logging components] ***
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:71
ok: [openshift] => (item={u'name': u'ca_file', u'file': u'ca.crt'}) => {
    "changed": false, 
    "content": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPVEUwTXpjME4xb1hEVEl5TURZd09ERTBNemMwT0ZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU4vVnhpck9Ub0Z1OUhrSUVtQVMwU0RlbXE0LzF1RSt3dWxoYWVMWFZXVVEKZ05QVSt3ZTlMRGw5bXJNYUN1VUFKall0YUNxdktxbGpLNUtSWVd4dldIK09lRkd0bmZXajN3RllUL0NIRXpIWAozclR3OVBzU1grelRabHlEUXl5ekZ6S29KNm15MGlsU2NwcXdMcEpIMk9tdjE2Sk12VytKck13Q2FmUURESTMvCkZPZWFSdndVcEUxVlFuK0wvRTJGcU02SlRwQ3ZKVTB3TGp4Rkg1b3ZpSDNDdXBFR29sYW1JNHhXaXp5dUR0amUKNnpuTGJPOGJ2a1BFZkk5MHN1TTR0TlpGOVlyd3VHaS9mUkN4U2c3dXc1dUlocU5xQ3RWSUVGSHN5Qng0Vm96eApTbGhSaHJzR1JuRkdDR2lWM3FVemI3aWZPN05jVUFBMzVzQUZWb0pzUUJzQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFISFoKaEtMM0cxdjZPMjhpcHByb2cxT1k3aDFIQWgrUDZzRmdEczlwbER2WnhnaER5SWdPUEt3R0FYZW9JTlkwZ1dSNApDdEJUeDJWWE82b0dxM1Yrc0pNR2ZjaVVBRzlCaDZneGh5UHIwTkY4aDlWcGQzdFJMVW9hNHVRZ0tSdHYrNlRBCnJISzdpVTF5RmhNSTNKSjF3dllaZjdVNWI4dTNrYnphbUl5WnNqMlJWOXo1cVh1U1lhY0gyd3lEWjh3UjJBdDQKRzZDT0VwNGsxMTg0TmpOZlBtTXAybi8xcFllUjJoTWtIUVlCSitzK2sxWTV5QVNJNEdBWDF6Skt6TWJ4eEhmTgpVYk1RcjBXdUdIMkRmRlVubFJwbWJaeGZWdEpZK1ppQXEwVHVyUnhmNnlpUXE3TDNJZ0YzMERWYWR6V2ZpOTdmCjltS2pEb0tOSUVOaG00Y1NVVnc9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K", 
    "encoding": "base64", 
    "item": {
        "file": "ca.crt", 
        "name": "ca_file"
    }, 
    "source": "/etc/origin/logging/ca.crt"
}
ok: [openshift] => (item={u'name': u'kibana_internal_key', u'file': u'kibana-internal.key'}) => {
    "changed": false, 
    "content": "LS0tLS1CRUdJTiBSU0EgUFJJVkFURSBLRVktLS0tLQpNSUlFb3dJQkFBS0NBUUVBeUNTL2hKbnEwa2pEd2xoZG9Gd3ZXRHZ3LytCd3o2Yml6UHllNTVDRmh2cVZtNE85CjNUY3hqb1p5cHNiZm9lTHZpK0FYVllWQmdBZDhPZG9IQzlhT1V1ckZBZVFNTFBBMHluTUxBTzRUd3pPU0p0TjEKa2x6aTlGa3U5YnNLbUNSbVFMYkZDaEVTbFBlS2s3RGZ2ak5zZXF6RGtlRTZYUHNVMTFpTFY4V2M3bDlhNmd6KwpWa1Jtc25NZE5ZeWlqb29jQjVuc2s0RjZzTW5zeDlRTmk2enlmSVJ1VkVrN3lKZDNxOUROckxxNDlSS3lWaUZpClhzSUM1WE80dUhNK0hVVGE5YkxxK0ZZenJDT2FZYk9JN1d5aXFBd1VxZ0N3S3FyMSs1RXphRTRvdWg3aDhBVUgKNkJsMElUSW0vejhxQVhvMDdBM0wycGkrdEYzUVhQTzBHZDRWMndJREFRQUJBb0lCQVFDUHhrNUtUR055MGxERwpRTUpwV3krcm04dkJsSktWcVJZT0dYOXhhOUZ3S0h6bXJabnIyeVZmZEZmU1ZOVDdyMUZUMHhRUUhGejBRdXhMCmhzTS9EczlJNDF2SXd2QzRLNHBRMEpuYi9pcjJOQXJPbDJORFZEUzVRWVBKaEtiVXFubEdEY1c0T0pGM3IrZTkKdHZiVDVJOE9CU09zblBaWEt3dEtzMUhPS0toV1Y2WXRhMzJiRXplVm9UclU1WDFMWkEvbms4V04zaWl5M0pSVQpTY1NEbkIrYmVFNE9uSXR0d1pMdnRwUlZ1UlBreWJXTEVPbmZOZFcrNEdDQjhQRy9QWmJzSXdHTmsxcDhwaE1sCnNjWWxHamkzVm9uZDBaT244ekxENjhlSHRHWHp1ZThBbGtuR1J0OTY5OTQ0eGxtNnY1K2gySGFBSmdCZWdnZXAKVWtWc1NpZ1pBb0dCQU5qWU1DcHZlT0g2K29zbjlsQ2NldzRHOEZUWEZhYnQyc0owUjNNc1NFYTRZOFJ4cnByTwpNR0Z4S1V2cnFGN1FNUmZxYkhsd3pmTE96WFlrMTFYOHV4a3NQZG1KSkEwejBxTVI3SEJBZXU0WDdJNjl0ZXpoCmJNdy95VW80UEc1dWdpZkNTRzBieTZNTFVJOEoyRC9abWZxZnRkT21aSjU2OEhCdm5rUm92Y1kzQW9HQkFPeEkKaTYwRGdjQzdxaGdqemNMenFwcGd3anJta2ZQNnd1S25Obkc1V0ZwOGJySmM2bWo2dVRhdlhEWVF6VTZEZk94SAoxbHQ1SzQ5WHBTNXEzTExCWUloN0dHOEUxMDdvOEM2ZWtlbFRORTZQWmdiZXROSnRXWWhJZGhmZzNyZ0VMRGt1CjlzMVZLcVJMMTc2ME5PbWQ3UDVKR2kzdmhld0NVZHFQbUJrYTE1dDlBb0dBUmliYW5qL2w2YVhhZkQ1M2IyalEKWHA5Y0RQWndhTXEyWlFaZFB5TnFWb2E4c0FiZkovSGdzUVY5Q0xTNmljSHN3QUgxQ2V5MmxBRFhjNHREcHV2VwpVN1IrWmV2Nkg5Tk9KN2RhdUk0RHR4ZENUb09OWVk4a05ZZkZSUitnWFZHZkJlSFNzSW0zZlkzaGlBVDFVdUxBCjc5WFBheU4vbGMzTUQzUHN6ZERjNUUwQ2dZQkxSYmtwZnVxQkNjZmdOTmZCK1hvcUFCVWdTbi9JcCtRWjdJY04KcDZ3YjkzUVVZa0ZTL0R5d3pTQ2xJS2tuRUFCbURXU2VjM1dMRHJMU25MeCtQZGlRNGhZZ2wvdzNhVUhLdUQxbgpoVmd1aHNSTC8vcSs1cE1WTlhCWm53dVV0OCtXei8xVDRJUGJIMFkxdkpiMnJaYm9VMFdCeU1Kek16SDhYSzVwCm9RRjZ5UUtCZ0F0VkUzRmVWSXNmMjRIOTZTZWo5N2ZnM25sZ203VElRaEV0U283U1NuUkhXV3VGTkVDK1JQNGkKWFVZQjNRNWJLMVFiSUpCTkU0cGNNNGJRbDBFcVZjbVVuckRzS3NzK0Y4RFk5Z2JSTzdobEcyb2hlRFh4alo3KwpwTWFWdXpQUlNrMy9uUTg4aGovVkpWZ01WK3VGanI4TzhOYndMTmRmMmZlbm1hdk1IUGFmCi0tLS0tRU5EIFJTQSBQUklWQVRFIEtFWS0tLS0tCg==", 
    "encoding": "base64", 
    "item": {
        "file": "kibana-internal.key", 
        "name": "kibana_internal_key"
    }, 
    "source": "/etc/origin/logging/kibana-internal.key"
}
ok: [openshift] => (item={u'name': u'kibana_internal_cert', u'file': u'kibana-internal.crt'}) => {
    "changed": false, 
    "content": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURUakNDQWphZ0F3SUJBZ0lCQWpBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPVEUwTXpjMU1Gb1hEVEU1TURZd09URTBNemMxTVZvdwpGakVVTUJJR0ExVUVBeE1MSUd0cFltRnVZUzF2Y0hNd2dnRWlNQTBHQ1NxR1NJYjNEUUVCQVFVQUE0SUJEd0F3CmdnRUtBb0lCQVFESUpMK0VtZXJTU01QQ1dGMmdYQzlZTy9ELzRIRFBwdUxNL0o3bmtJV0crcFdiZzczZE56R08KaG5LbXh0K2g0dStMNEJkVmhVR0FCM3c1MmdjTDFvNVM2c1VCNUF3czhEVEtjd3NBN2hQRE01SW0wM1dTWE9MMApXUzcxdXdxWUpHWkF0c1VLRVJLVTk0cVRzTisrTTJ4NnJNT1I0VHBjK3hUWFdJdFh4Wnp1WDFycURQNVdSR2F5CmN4MDFqS0tPaWh3SG1leVRnWHF3eWV6SDFBMkxyUEo4aEc1VVNUdklsM2VyME0yc3VyajFFckpXSVdKZXdnTGwKYzdpNGN6NGRSTnIxc3VyNFZqT3NJNXBoczRqdGJLS29EQlNxQUxBcXF2WDdrVE5vVGlpNkh1SHdCUWZvR1hRaApNaWIvUHlvQmVqVHNEY3ZhbUw2MFhkQmM4N1FaM2hYYkFnTUJBQUdqZ1o0d2dac3dEZ1lEVlIwUEFRSC9CQVFECkFnV2dNQk1HQTFVZEpRUU1NQW9HQ0NzR0FRVUZCd01CTUF3R0ExVWRFd0VCL3dRQ01BQXdaZ1lEVlIwUkJGOHcKWFlJTElHdHBZbUZ1WVMxdmNIT0NMQ0JyYVdKaGJtRXRiM0J6TG5KdmRYUmxjaTVrWldaaGRXeDBMbk4yWXk1agpiSFZ6ZEdWeUxteHZZMkZzZ2hnZ2EybGlZVzVoTGpFeU55NHdMakF1TVM1NGFYQXVhVytDQm10cFltRnVZVEFOCkJna3Foa2lHOXcwQkFRc0ZBQU9DQVFFQXVwSTMzejZzMG5aZ2FSK01Pb29BckhBdGo0Y21uL09laEZCYlVLSW0KejFqdzZuZzFINENxSXgzZ1JueXIzTlNXbVdxSTg0VDcyWkRhUkdkYjdLeXNEc0VHTDdPRFd3YzJGU0sxMGx0MAp3K2pCcTMzZjJUejhsWGMwbUd1QWFEVXIvaVF2eFVmSmp5Z01JUjZmQzRpQUc5dXI1cXBqanMzMHIzUE5CNlVMCmdTVktkWEw2UG85V0lOak9ZbjNrc2FlM0UzUk9uZFM5eUlwWnhUK0NZcnpZTmxwY2Mxd1pva0I5M0RvalM2TGYKaVdraWJVNEc0ZGQrcjB6WngzQWRrc2hFUS9JUHNCeWNkdXloYmFwUGg3M2tQRGc5WUU1U0ZzZHNsa1B0SC94TApETlZmOWJLUHozWGJRYVg2MjdLeHQ2L3krM1JNVFMwd2pHK0FTbU93clpiRW1RPT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQotLS0tLUJFR0lOIENFUlRJRklDQVRFLS0tLS0KTUlJQzJqQ0NBY0tnQXdJQkFnSUJBVEFOQmdrcWhraUc5dzBCQVFzRkFEQWVNUnd3R2dZRFZRUURFeE5zYjJkbgphVzVuTFhOcFoyNWxjaTEwWlhOME1CNFhEVEUzTURZd09URTBNemMwTjFvWERUSXlNRFl3T0RFME16YzBPRm93CkhqRWNNQm9HQTFVRUF4TVRiRzluWjJsdVp5MXphV2R1WlhJdGRHVnpkRENDQVNJd0RRWUpLb1pJaHZjTkFRRUIKQlFBRGdnRVBBRENDQVFvQ2dnRUJBTi9WeGlyT1RvRnU5SGtJRW1BUzBTRGVtcTQvMXVFK3d1bGhhZUxYVldVUQpnTlBVK3dlOUxEbDltck1hQ3VVQUpqWXRhQ3F2S3Fsaks1S1JZV3h2V0grT2VGR3RuZldqM3dGWVQvQ0hFekhYCjNyVHc5UHNTWCt6VFpseURReXl6RnpLb0o2bXkwaWxTY3Bxd0xwSkgyT212MTZKTXZXK0pyTXdDYWZRRERJMy8KRk9lYVJ2d1VwRTFWUW4rTC9FMkZxTTZKVHBDdkpVMHdManhGSDVvdmlIM0N1cEVHb2xhbUk0eFdpenl1RHRqZQo2em5MYk84YnZrUEVmSTkwc3VNNHROWkY5WXJ3dUdpL2ZSQ3hTZzd1dzV1SWhxTnFDdFZJRUZIc3lCeDRWb3p4ClNsaFJocnNHUm5GR0NHaVYzcVV6YjdpZk83TmNVQUEzNXNBRlZvSnNRQnNDQXdFQUFhTWpNQ0V3RGdZRFZSMFAKQVFIL0JBUURBZ0trTUE4R0ExVWRFd0VCL3dRRk1BTUJBZjh3RFFZSktvWklodmNOQVFFTEJRQURnZ0VCQUhIWgpoS0wzRzF2Nk8yOGlwcHJvZzFPWTdoMUhBaCtQNnNGZ0RzOXBsRHZaeGdoRHlJZ09QS3dHQVhlb0lOWTBnV1I0CkN0QlR4MlZYTzZvR3EzVitzSk1HZmNpVUFHOUJoNmd4aHlQcjBORjhoOVZwZDN0UkxVb2E0dVFnS1J0dis2VEEKckhLN2lVMXlGaE1JM0pKMXd2WVpmN1U1Yjh1M2tiemFtSXlac2oyUlY5ejVxWHVTWWFjSDJ3eURaOHdSMkF0NApHNkNPRXA0azExODROak5mUG1NcDJuLzFwWWVSMmhNa0hRWUJKK3MrazFZNXlBU0k0R0FYMXpKS3pNYnh4SGZOClViTVFyMFd1R0gyRGZGVW5sUnBtYlp4ZlZ0SlkrWmlBcTBUdXJSeGY2eWlRcTdMM0lnRjMwRFZhZHpXZmk5N2YKOW1LakRvS05JRU5obTRjU1VWdz0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=", 
    "encoding": "base64", 
    "item": {
        "file": "kibana-internal.crt", 
        "name": "kibana_internal_cert"
    }, 
    "source": "/etc/origin/logging/kibana-internal.crt"
}
ok: [openshift] => (item={u'name': u'server_tls', u'file': u'server-tls.json'}) => {
    "changed": false, 
    "content": "Ly8gU2VlIGZvciBhdmFpbGFibGUgb3B0aW9uczogaHR0cHM6Ly9ub2RlanMub3JnL2FwaS90bHMuaHRtbCN0bHNfdGxzX2NyZWF0ZXNlcnZlcl9vcHRpb25zX3NlY3VyZWNvbm5lY3Rpb25saXN0ZW5lcgp0bHNfb3B0aW9ucyA9IHsKCWNpcGhlcnM6ICdrRUVDREg6K2tFRUNESCtTSEE6a0VESDora0VESCtTSEE6K2tFREgrQ0FNRUxMSUE6a0VDREg6K2tFQ0RIK1NIQTprUlNBOitrUlNBK1NIQTora1JTQStDQU1FTExJQTohYU5VTEw6IWVOVUxMOiFTU0x2MjohUkM0OiFERVM6IUVYUDohU0VFRDohSURFQTorM0RFUycsCglob25vckNpcGhlck9yZGVyOiB0cnVlCn0K", 
    "encoding": "base64", 
    "item": {
        "file": "server-tls.json", 
        "name": "server_tls"
    }, 
    "source": "/etc/origin/logging/server-tls.json"
}
ok: [openshift] => (item={u'name': u'session_secret', u'file': u'session_secret'}) => {
    "changed": false, 
    "content": "M2oxZU5nTUtiNWNOWjJvV3QzdU1qUmpqenBoZHRrU3JHZXRNUUhEa1JKS2NJNUNlQkx3SmU3NDluc2pPRk5xOTNLVnBIY1FscnRvdHQ2VTN4ck1IRUg3Q3BRZnVpR2Q4bldZdWwxZEl6YWNnU3NaRW84UktkemdnM3YySXQ5TXNIY2FKS3hmRzV5Z2R5ZVg2TDVXYXNqSmxVbDFWUllGWmVGcXdxSmtXbHB5T2V4N0FFZjg5RGVoeERJUDZmWENZenhmNlRIaWc=", 
    "encoding": "base64", 
    "item": {
        "file": "session_secret", 
        "name": "session_secret"
    }, 
    "source": "/etc/origin/logging/session_secret"
}
ok: [openshift] => (item={u'name': u'oauth_secret', u'file': u'oauth_secret'}) => {
    "changed": false, 
    "content": "aHE2SGVLV3h3RHhHRWs4NzBxSHE5SHFma2lzWHdHRXo0SFpnU0QyaTNla1I1STR6b2p4UlNyTkV6NXF6VlBpWg==", 
    "encoding": "base64", 
    "item": {
        "file": "oauth_secret", 
        "name": "oauth_secret"
    }, 
    "source": "/etc/origin/logging/oauth_secret"
}
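
Note: the looped task above is Ansible's slurp module: each item names a file
under /etc/origin/logging and the module hands back its contents
base64-encoded ("encoding": "base64"), ready to be dropped into Secret data.
Sketch (the register name key_pairs is illustrative):

    - name: Retrieving the cert to use when generating secrets for the logging components
      slurp:
        src: "/etc/origin/logging/{{ item.file }}"
      register: key_pairs
      with_items:
        - { name: "ca_file", file: "ca.crt" }
        - { name: "kibana_internal_key", file: "kibana-internal.key" }
        - { name: "kibana_internal_cert", file: "kibana-internal.crt" }
        - { name: "server_tls", file: "server-tls.json" }
        - { name: "session_secret", file: "session_secret" }
        - { name: "oauth_secret", file: "oauth_secret" }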

TASK [openshift_logging_kibana : Set logging-kibana-ops service] ***************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:84
changed: [openshift] => {
    "changed": true, 
    "results": {
        "clusterip": "172.30.164.117", 
        "cmd": "/bin/oc get service logging-kibana-ops -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "Service", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:53Z", 
                    "name": "logging-kibana-ops", 
                    "namespace": "logging", 
                    "resourceVersion": "1443", 
                    "selfLink": "/api/v1/namespaces/logging/services/logging-kibana-ops", 
                    "uid": "5c260d94-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "clusterIP": "172.30.164.117", 
                    "ports": [
                        {
                            "port": 443, 
                            "protocol": "TCP", 
                            "targetPort": "oaproxy"
                        }
                    ], 
                    "selector": {
                        "component": "kibana-ops", 
                        "provider": "openshift"
                    }, 
                    "sessionAffinity": "None", 
                    "type": "ClusterIP"
                }, 
                "status": {
                    "loadBalancer": {}
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
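
Note: the new ClusterIP service exposes only port 443 and forwards to the
named container port "oaproxy", i.e. the auth proxy rather than Kibana
itself, selecting pods by component=kibana-ops. The JSON above, restated as
a plain manifest:

    apiVersion: v1
    kind: Service
    metadata:
      name: logging-kibana-ops
      namespace: logging
    spec:
      type: ClusterIP
      ports:
      - port: 443
        protocol: TCP
        targetPort: oaproxy
      selector:
        component: kibana-ops
        provider: openshift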

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:101
 [WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_key | trim | length
> 0 }}
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:106
 [WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_cert | trim | length
> 0 }}
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:111
 [WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_ca | trim | length >
0 }}
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}
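
Note: the three [WARNING] blocks above flag a real, if harmless, style
problem in the role: when: is already evaluated as a raw Jinja2 expression,
so wrapping it in {{ }} is redundant and deprecated by Ansible. The fix is to
drop the delimiters (the fact name kibana_key is illustrative; the
conditional is quoted verbatim from the warning):

    # warned form:
    #   when: "{{ openshift_logging_kibana_key | trim | length > 0 }}"
    # preferred form:
    - set_fact:
        kibana_key: "{{ openshift_logging_kibana_key }}"
      when: openshift_logging_kibana_key | trim | length > 0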

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:116
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_kibana : Generating Kibana route template] *************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:121
ok: [openshift] => {
    "changed": false, 
    "checksum": "7abe82a466d3682dd957d19927eb24c1836a91d2", 
    "dest": "/tmp/openshift-logging-ansible-aMED3y/templates/kibana-route.yaml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "391e2af7aa49c135d91430bb364f01a2", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 2726, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019134.23-10743612443536/source", 
    "state": "file", 
    "uid": 0
}
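
Note: the route definition is rendered from a Jinja2 template into the
scratch directory before being applied. Sketch of the task, with an assumed
template filename:

    - name: Generating Kibana route template
      template:
        src: kibana-route.j2   # assumed name
        dest: "{{ tempdir }}/templates/kibana-route.yaml"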

TASK [openshift_logging_kibana : Setting Kibana route] *************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:141
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get route logging-kibana-ops -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "Route", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:55Z", 
                    "labels": {
                        "component": "support", 
                        "logging-infra": "support", 
                        "provider": "openshift"
                    }, 
                    "name": "logging-kibana-ops", 
                    "namespace": "logging", 
                    "resourceVersion": "1455", 
                    "selfLink": "/oapi/v1/namespaces/logging/routes/logging-kibana-ops", 
                    "uid": "5d9c8ac5-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "host": "kibana-ops.router.default.svc.cluster.local", 
                    "tls": {
                        "caCertificate": "-----BEGIN CERTIFICATE-----\nMIIC2jCCAcKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAeMRwwGgYDVQQDExNsb2dn\naW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwOTE0Mzc0N1oXDTIyMDYwODE0Mzc0OFow\nHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDCCASIwDQYJKoZIhvcNAQEB\nBQADggEPADCCAQoCggEBAN/VxirOToFu9HkIEmAS0SDemq4/1uE+wulhaeLXVWUQ\ngNPU+we9LDl9mrMaCuUAJjYtaCqvKqljK5KRYWxvWH+OeFGtnfWj3wFYT/CHEzHX\n3rTw9PsSX+zTZlyDQyyzFzKoJ6my0ilScpqwLpJH2Omv16JMvW+JrMwCafQDDI3/\nFOeaRvwUpE1VQn+L/E2FqM6JTpCvJU0wLjxFH5oviH3CupEGolamI4xWizyuDtje\n6znLbO8bvkPEfI90suM4tNZF9YrwuGi/fRCxSg7uw5uIhqNqCtVIEFHsyBx4Vozx\nSlhRhrsGRnFGCGiV3qUzb7ifO7NcUAA35sAFVoJsQBsCAwEAAaMjMCEwDgYDVR0P\nAQH/BAQDAgKkMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBAHHZ\nhKL3G1v6O28ipprog1OY7h1HAh+P6sFgDs9plDvZxghDyIgOPKwGAXeoINY0gWR4\nCtBTx2VXO6oGq3V+sJMGfciUAG9Bh6gxhyPr0NF8h9Vpd3tRLUoa4uQgKRtv+6TA\nrHK7iU1yFhMI3JJ1wvYZf7U5b8u3kbzamIyZsj2RV9z5qXuSYacH2wyDZ8wR2At4\nG6COEp4k1184NjNfPmMp2n/1pYeR2hMkHQYBJ+s+k1Y5yASI4GAX1zJKzMbxxHfN\nUbMQr0WuGH2DfFUnlRpmbZxfVtJY+ZiAq0TurRxf6yiQq7L3IgF30DVadzWfi97f\n9mKjDoKNIENhm4cSUVw=\n-----END CERTIFICATE-----\n", 
                        "destinationCACertificate": "-----BEGIN CERTIFICATE-----\nMIIC2jCCAcKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAeMRwwGgYDVQQDExNsb2dn\naW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwOTE0Mzc0N1oXDTIyMDYwODE0Mzc0OFow\nHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDCCASIwDQYJKoZIhvcNAQEB\nBQADggEPADCCAQoCggEBAN/VxirOToFu9HkIEmAS0SDemq4/1uE+wulhaeLXVWUQ\ngNPU+we9LDl9mrMaCuUAJjYtaCqvKqljK5KRYWxvWH+OeFGtnfWj3wFYT/CHEzHX\n3rTw9PsSX+zTZlyDQyyzFzKoJ6my0ilScpqwLpJH2Omv16JMvW+JrMwCafQDDI3/\nFOeaRvwUpE1VQn+L/E2FqM6JTpCvJU0wLjxFH5oviH3CupEGolamI4xWizyuDtje\n6znLbO8bvkPEfI90suM4tNZF9YrwuGi/fRCxSg7uw5uIhqNqCtVIEFHsyBx4Vozx\nSlhRhrsGRnFGCGiV3qUzb7ifO7NcUAA35sAFVoJsQBsCAwEAAaMjMCEwDgYDVR0P\nAQH/BAQDAgKkMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBAHHZ\nhKL3G1v6O28ipprog1OY7h1HAh+P6sFgDs9plDvZxghDyIgOPKwGAXeoINY0gWR4\nCtBTx2VXO6oGq3V+sJMGfciUAG9Bh6gxhyPr0NF8h9Vpd3tRLUoa4uQgKRtv+6TA\nrHK7iU1yFhMI3JJ1wvYZf7U5b8u3kbzamIyZsj2RV9z5qXuSYacH2wyDZ8wR2At4\nG6COEp4k1184NjNfPmMp2n/1pYeR2hMkHQYBJ+s+k1Y5yASI4GAX1zJKzMbxxHfN\nUbMQr0WuGH2DfFUnlRpmbZxfVtJY+ZiAq0TurRxf6yiQq7L3IgF30DVadzWfi97f\n9mKjDoKNIENhm4cSUVw=\n-----END CERTIFICATE-----\n", 
                        "insecureEdgeTerminationPolicy": "Redirect", 
                        "termination": "reencrypt"
                    }, 
                    "to": {
                        "kind": "Service", 
                        "name": "logging-kibana-ops", 
                        "weight": 100
                    }, 
                    "wildcardPolicy": "None"
                }, 
                "status": {
                    "ingress": [
                        {
                            "conditions": [
                                {
                                    "lastTransitionTime": "2017-06-09T14:38:55Z", 
                                    "status": "True", 
                                    "type": "Admitted"
                                }
                            ], 
                            "host": "kibana-ops.router.default.svc.cluster.local", 
                            "routerName": "router", 
                            "wildcardPolicy": "None"
                        }
                    ]
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
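
Note: the route uses "reencrypt" termination: the router presents the
logging signer CA's certificate to browsers, validates the proxy's serving
cert against destinationCACertificate on the way back in, and bounces plain
HTTP to HTTPS via insecureEdgeTerminationPolicy. Condensed manifest (PEM
blocks elided; see the JSON above):

    apiVersion: v1
    kind: Route
    metadata:
      name: logging-kibana-ops
      namespace: logging
    spec:
      host: kibana-ops.router.default.svc.cluster.local
      to:
        kind: Service
        name: logging-kibana-ops
      tls:
        termination: reencrypt
        insecureEdgeTerminationPolicy: Redirect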

TASK [openshift_logging_kibana : Get current oauthclient hostnames] ************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:151
ok: [openshift] => {
    "changed": false, 
    "results": {
        "cmd": "/bin/oc get oauthclient kibana-proxy -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "OAuthClient", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:44Z", 
                    "labels": {
                        "logging-infra": "support"
                    }, 
                    "name": "kibana-proxy", 
                    "resourceVersion": "1407", 
                    "selfLink": "/oapi/v1/oauthclients/kibana-proxy", 
                    "uid": "56b36338-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "redirectURIs": [
                    "https://kibana.router.default.svc.cluster.local"
                ], 
                "scopeRestrictions": [
                    {
                        "literals": [
                            "user:info", 
                            "user:check-access", 
                            "user:list-projects"
                        ]
                    }
                ], 
                "secret": "hq6HeKWxwDxGEk870qHq9HqfkisXwGEz4HZgSD2i3ekR5I4zojxRSrNEz5qzVPiZ"
            }
        ], 
        "returncode": 0
    }, 
    "state": "list"
}

TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:159
ok: [openshift] => {
    "ansible_facts": {
        "proxy_hostnames": [
            "https://kibana.router.default.svc.cluster.local", 
            "https://kibana-ops.router.default.svc.cluster.local"
        ]
    }, 
    "changed": false
}

TASK [openshift_logging_kibana : Create oauth-client template] *****************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:162
changed: [openshift] => {
    "changed": true, 
    "checksum": "66774b83423565cc5220b359aa296e8da50c8eeb", 
    "dest": "/tmp/openshift-logging-ansible-aMED3y/templates/oauth-client.yml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "1e46c851749e74474452530b4d86cd4c", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 382, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019136.92-51857510141092/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_kibana : Set kibana-proxy oauth-client] ****************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:170
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get oauthclient kibana-proxy -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "OAuthClient", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:38:44Z", 
                    "labels": {
                        "logging-infra": "support"
                    }, 
                    "name": "kibana-proxy", 
                    "resourceVersion": "1465", 
                    "selfLink": "/oapi/v1/oauthclients/kibana-proxy", 
                    "uid": "56b36338-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "redirectURIs": [
                    "https://kibana.router.default.svc.cluster.local", 
                    "https://kibana-ops.router.default.svc.cluster.local"
                ], 
                "scopeRestrictions": [
                    {
                        "literals": [
                            "user:info", 
                            "user:check-access", 
                            "user:list-projects"
                        ]
                    }
                ], 
                "secret": "hq6HeKWxwDxGEk870qHq9HqfkisXwGEz4HZgSD2i3ekR5I4zojxRSrNEz5qzVPiZ"
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
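
Note: the role read the existing kibana-proxy OAuthClient, merged the new ops
route host into proxy_hostnames (the set_fact two tasks up), and reapplied
the client, so the OAuth flow now accepts redirects back to either Kibana.
Condensed manifest:

    apiVersion: v1
    kind: OAuthClient
    metadata:
      name: kibana-proxy
      labels:
        logging-infra: support
    redirectURIs:
    - https://kibana.router.default.svc.cluster.local
    - https://kibana-ops.router.default.svc.cluster.local
    scopeRestrictions:
    - literals: ["user:info", "user:check-access", "user:list-projects"]
    secret: "<contents of /etc/origin/logging/oauth_secret>"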

TASK [openshift_logging_kibana : Set Kibana secret] ****************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:181
ok: [openshift] => {
    "changed": false, 
    "results": {
        "apiVersion": "v1", 
        "data": {
            "ca": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPVEUwTXpjME4xb1hEVEl5TURZd09ERTBNemMwT0ZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU4vVnhpck9Ub0Z1OUhrSUVtQVMwU0RlbXE0LzF1RSt3dWxoYWVMWFZXVVEKZ05QVSt3ZTlMRGw5bXJNYUN1VUFKall0YUNxdktxbGpLNUtSWVd4dldIK09lRkd0bmZXajN3RllUL0NIRXpIWAozclR3OVBzU1grelRabHlEUXl5ekZ6S29KNm15MGlsU2NwcXdMcEpIMk9tdjE2Sk12VytKck13Q2FmUURESTMvCkZPZWFSdndVcEUxVlFuK0wvRTJGcU02SlRwQ3ZKVTB3TGp4Rkg1b3ZpSDNDdXBFR29sYW1JNHhXaXp5dUR0amUKNnpuTGJPOGJ2a1BFZkk5MHN1TTR0TlpGOVlyd3VHaS9mUkN4U2c3dXc1dUlocU5xQ3RWSUVGSHN5Qng0Vm96eApTbGhSaHJzR1JuRkdDR2lWM3FVemI3aWZPN05jVUFBMzVzQUZWb0pzUUJzQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFISFoKaEtMM0cxdjZPMjhpcHByb2cxT1k3aDFIQWgrUDZzRmdEczlwbER2WnhnaER5SWdPUEt3R0FYZW9JTlkwZ1dSNApDdEJUeDJWWE82b0dxM1Yrc0pNR2ZjaVVBRzlCaDZneGh5UHIwTkY4aDlWcGQzdFJMVW9hNHVRZ0tSdHYrNlRBCnJISzdpVTF5RmhNSTNKSjF3dllaZjdVNWI4dTNrYnphbUl5WnNqMlJWOXo1cVh1U1lhY0gyd3lEWjh3UjJBdDQKRzZDT0VwNGsxMTg0TmpOZlBtTXAybi8xcFllUjJoTWtIUVlCSitzK2sxWTV5QVNJNEdBWDF6Skt6TWJ4eEhmTgpVYk1RcjBXdUdIMkRmRlVubFJwbWJaeGZWdEpZK1ppQXEwVHVyUnhmNnlpUXE3TDNJZ0YzMERWYWR6V2ZpOTdmCjltS2pEb0tOSUVOaG00Y1NVVnc9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K", 
            "cert": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURSVENDQWkyZ0F3SUJBZ0lCQXpBTkJna3Foa2lHOXcwQkFRVUZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPVEUwTXpjMU5Gb1hEVEU1TURZd09URTBNemMxTkZvdwpSakVRTUE0R0ExVUVDZ3dIVEc5bloybHVaekVTTUJBR0ExVUVDd3dKVDNCbGJsTm9hV1owTVI0d0hBWURWUVFECkRCVnplWE4wWlcwdWJHOW5aMmx1Wnk1cmFXSmhibUV3Z2dFaU1BMEdDU3FHU0liM0RRRUJBUVVBQTRJQkR3QXcKZ2dFS0FvSUJBUURZVzh5cTBEM1lFVm41eGhlUm9KYVpNcTRwOCs5bUEwWENWc1I2eExvZVZZaE9YMVZVVEQyaQpyNHM4MEp4RmhhTUJqRk40ai9VeExnVW1Eb3J5YTkrMXlDRUcwWEFrMS9uc1Z5ZlNLU1VQeEp6eTJTZ1phak83CndOelVKNTBCbVdMcVZqVEVENmJtYThUWW5aN09mUTF1dEhpV3YrN3BzT1pWa2ZiMEhHKy9BamFlRHQ4MGJVWVMKZHJDQllCRXc3cTJRc0dZR0Z4d0MvVS9vUHp1THdGcHVDcHVIZ0Fra2QzVURXeGZoVzZFNXlBdk4vVGw3cWtWagpWRWZ1cndESWltcTdRT3NWZ0hVMThFQkpoMlNoSS9Zbk9WYUg2K05mS1RORjZhK3RVQ0lPOHZ4VGUxV0ZLWi84CkxVd0xrL0NoZXhaVndYSHBUTnAxRjQ1L2dSZ2l5SENMQWdNQkFBR2paakJrTUE0R0ExVWREd0VCL3dRRUF3SUYKb0RBSkJnTlZIUk1FQWpBQU1CMEdBMVVkSlFRV01CUUdDQ3NHQVFVRkJ3TUJCZ2dyQmdFRkJRY0RBakFkQmdOVgpIUTRFRmdRVUpnZ3UxaWd2MWg4djJ6NEh2Z0k1eHRTQjBJY3dDUVlEVlIwakJBSXdBREFOQmdrcWhraUc5dzBCCkFRVUZBQU9DQVFFQWF3ZHVGY0Z3dk5SZ3JzajdPbGt1WE9rYzJ3NzVGb2FySDdlTXU2cHhtQnhwc2V0M0xBRXgKeWcvMk9YRm8vREZuK3hRTll0WGxMOEJGQ3lscG9PSHFmZStCeGRSdFovM2dCd3R6dUlEZndaL2dWM2tYRmF2Ywo4SGFpU3NUMTZIMmxGanh2cmoyNGtYUkhOQXFHK0tva3ZCNENmMGlod1dabmpQNnJaWkI2OVd2UWI3dGlCMVJHCjJMZ1ZZWlZzVDd0WlgyQ3ZQcldnZDBrY3RnVzROT0lmcDMzWU82NHpVRlp5K0kwaUFTZFNoZHlxNzh5QjVyc1MKWjhXakFRcFVnMktDWmxRKzB6SWhGWHBtdXRnV0dKcUhHdC9NVzdJbzUvNkRUUlR0Q2N3Q1p0YzNmR0pTSmpURApGSjBRNEhML2p0NTR0Uno3eXA3QklxRXp0cjZaMXJyTXlBPT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=", 
            "key": "LS0tLS1CRUdJTiBQUklWQVRFIEtFWS0tLS0tCk1JSUV2QUlCQURBTkJna3Foa2lHOXcwQkFRRUZBQVNDQktZd2dnU2lBZ0VBQW9JQkFRRFlXOHlxMEQzWUVWbjUKeGhlUm9KYVpNcTRwOCs5bUEwWENWc1I2eExvZVZZaE9YMVZVVEQyaXI0czgwSnhGaGFNQmpGTjRqL1V4TGdVbQpEb3J5YTkrMXlDRUcwWEFrMS9uc1Z5ZlNLU1VQeEp6eTJTZ1phak83d056VUo1MEJtV0xxVmpURUQ2Ym1hOFRZCm5aN09mUTF1dEhpV3YrN3BzT1pWa2ZiMEhHKy9BamFlRHQ4MGJVWVNkckNCWUJFdzdxMlFzR1lHRnh3Qy9VL28KUHp1THdGcHVDcHVIZ0Fra2QzVURXeGZoVzZFNXlBdk4vVGw3cWtWalZFZnVyd0RJaW1xN1FPc1ZnSFUxOEVCSgpoMlNoSS9Zbk9WYUg2K05mS1RORjZhK3RVQ0lPOHZ4VGUxV0ZLWi84TFV3TGsvQ2hleFpWd1hIcFROcDFGNDUvCmdSZ2l5SENMQWdNQkFBRUNnZ0VBQmhxbzdwVExlZUliY01HVy9xUFNPK1pESmFuZE1qcWJhRDA4Yzk1REJnSVIKdzJ4TEl3SzNwblJmY0VyT2JlTzBVcUhiNVFYaXZBMTVWYmFKVXdlYUd5M1hTTEwyQUFRYjZBQVpmb05zcVVJNQp6MXd6Rm14NW95MXd6WWVFbFh2M1gzY3BLL0xwR1kwbTA1bTIxa3FPNjNXcUJVN0s5Y3JvNUxjbWlZS1g0SUtwCk1WeFFRTFdtbTlCQXFNb0N0Zm9Wb3JnOEVNT2N3R29BSTFvN1lqczUrOXVZRjgvelJFai9oSTlrdnhRMHY5S1IKaTJIemYxM0lieWptcTc2UUR5MHFHTU1BRUZzbjgxb21yYlJlMFlQQnVsUUdHR3U4cUFYZXRtditZd2pLdS84TApCQUNNbllzM3lGcEw1MWFNNlRnRkJmT244T0JTQ0Yzb0RYMVdzQTZMNlFLQmdRRHdqS3pJM0h3bWZrcXg1M1VPCkZMMEFoWHd1bWhadkFZeEthRm16eGUzUkx4SGYzeStRZGl5YXpLM21YOUdXN3V0aGpZdmJLcmZUL0pVbEwxWDgKbGNkY085cWdJbEdGVFhKdDJ5Z2ZiR3B3MVg1eUE0a3RsbnBZQUt5dTF4dzRjaGdSWG9rM3FiRXdERkdpelV3NgoyQ1ZwdW1la3NnWFI4ZWdBMFNaaGF4TnNod0tCZ1FEbVFWczMySlFKNm5LS1MzWHpNSk16WHRrQmJPRENtY1k3CmNvM0puTUY2QVNRMjB4Y2tEYVpsc25kSzZuNUxvMjJUSXdXckhTQnY0dVNLb3U2MkpaUHgyUnFWUC82VEV2dUkKaEdFaTRxNUZ3ZWxhc2RKSzdtaHJvd01Fa2pLUFh0dmlwTFFCQzlCUFVrbU90ZnF5RklZODRRbURmRERrc2tTdAp2MVIxYzQ1QTNRS0JnSHNPRzkzdEhQMjhJWTM4Nzk2eU1UV3hFNlByWTRxSjZqQ3VUU1V3SGNhakk2dTR5dVpGCjdlK0t3WE5XbFN0M01hYUgrSUczeXE2QlNmTHAwajBTaWhmZTdxcDRvK2RqaFNRNXRyN01nVzhUYUVuL0VRL24KS05NWUZUNmtBR1hIeUJRcW5sdUJUWWxjNCs0Nnc4SmVqRlVYN0tsT0FMZ3ozaEY4N1B0dEFWVEZBb0dBQy8rRgoyRkFoenE1M3ozc0ZKMm4xa2F5MGxCMXZUTlNrZnI2R1l1WVhNdHRTWHNUN0pMYk5YK2svekJpU0FqRCs2YmJlCmViOG56SXkvS2ZZNWlDWkhJa2dVMllnZkpOek5YSnpCVHdjTzI3V3lDamJQNVhXczNVYitSa096L01LTnBLNisKYm51R0hFd0daemhXV3VjMDQ0RTZkSzZKQWJIRVlHVmk2a2ZlTnNVQ2dZQi9KczlpRmhnVmhDcjVOZk5wUXRVawplYnVGTVd3eGhaQ1Y4V1hhckRNeUU4bHlVOVNkRFBUdEsyNjRPVzhsZW95WHA4TnN0dmlhajZ2NzVGamhyTmFjCmtjNk1HL2JCWTRwRkhsYlB1OTVyTUhKcCsweGtwdjNHaWh2eUpnWnB6UkNvUTU2K0RsRFgwS1QrN0xJU3ZWQTUKRFU0VjFKUUd0Yi91SkxCejJwa1RSZz09Ci0tLS0tRU5EIFBSSVZBVEUgS0VZLS0tLS0K"
        }, 
        "kind": "Secret", 
        "metadata": {
            "creationTimestamp": null, 
            "name": "logging-kibana"
        }, 
        "type": "Opaque"
    }, 
    "state": "present"
}

TASK [openshift_logging_kibana : Set Kibana Proxy secret] **********************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:195
ok: [openshift] => {
    "changed": false, 
    "results": {
        "apiVersion": "v1", 
        "data": {
            "oauth-secret": "aHE2SGVLV3h3RHhHRWs4NzBxSHE5SHFma2lzWHdHRXo0SFpnU0QyaTNla1I1STR6b2p4UlNyTkV6NXF6VlBpWg==", 
            "server-cert": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURUakNDQWphZ0F3SUJBZ0lCQWpBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPVEUwTXpjMU1Gb1hEVEU1TURZd09URTBNemMxTVZvdwpGakVVTUJJR0ExVUVBeE1MSUd0cFltRnVZUzF2Y0hNd2dnRWlNQTBHQ1NxR1NJYjNEUUVCQVFVQUE0SUJEd0F3CmdnRUtBb0lCQVFESUpMK0VtZXJTU01QQ1dGMmdYQzlZTy9ELzRIRFBwdUxNL0o3bmtJV0crcFdiZzczZE56R08KaG5LbXh0K2g0dStMNEJkVmhVR0FCM3c1MmdjTDFvNVM2c1VCNUF3czhEVEtjd3NBN2hQRE01SW0wM1dTWE9MMApXUzcxdXdxWUpHWkF0c1VLRVJLVTk0cVRzTisrTTJ4NnJNT1I0VHBjK3hUWFdJdFh4Wnp1WDFycURQNVdSR2F5CmN4MDFqS0tPaWh3SG1leVRnWHF3eWV6SDFBMkxyUEo4aEc1VVNUdklsM2VyME0yc3VyajFFckpXSVdKZXdnTGwKYzdpNGN6NGRSTnIxc3VyNFZqT3NJNXBoczRqdGJLS29EQlNxQUxBcXF2WDdrVE5vVGlpNkh1SHdCUWZvR1hRaApNaWIvUHlvQmVqVHNEY3ZhbUw2MFhkQmM4N1FaM2hYYkFnTUJBQUdqZ1o0d2dac3dEZ1lEVlIwUEFRSC9CQVFECkFnV2dNQk1HQTFVZEpRUU1NQW9HQ0NzR0FRVUZCd01CTUF3R0ExVWRFd0VCL3dRQ01BQXdaZ1lEVlIwUkJGOHcKWFlJTElHdHBZbUZ1WVMxdmNIT0NMQ0JyYVdKaGJtRXRiM0J6TG5KdmRYUmxjaTVrWldaaGRXeDBMbk4yWXk1agpiSFZ6ZEdWeUxteHZZMkZzZ2hnZ2EybGlZVzVoTGpFeU55NHdMakF1TVM1NGFYQXVhVytDQm10cFltRnVZVEFOCkJna3Foa2lHOXcwQkFRc0ZBQU9DQVFFQXVwSTMzejZzMG5aZ2FSK01Pb29BckhBdGo0Y21uL09laEZCYlVLSW0KejFqdzZuZzFINENxSXgzZ1JueXIzTlNXbVdxSTg0VDcyWkRhUkdkYjdLeXNEc0VHTDdPRFd3YzJGU0sxMGx0MAp3K2pCcTMzZjJUejhsWGMwbUd1QWFEVXIvaVF2eFVmSmp5Z01JUjZmQzRpQUc5dXI1cXBqanMzMHIzUE5CNlVMCmdTVktkWEw2UG85V0lOak9ZbjNrc2FlM0UzUk9uZFM5eUlwWnhUK0NZcnpZTmxwY2Mxd1pva0I5M0RvalM2TGYKaVdraWJVNEc0ZGQrcjB6WngzQWRrc2hFUS9JUHNCeWNkdXloYmFwUGg3M2tQRGc5WUU1U0ZzZHNsa1B0SC94TApETlZmOWJLUHozWGJRYVg2MjdLeHQ2L3krM1JNVFMwd2pHK0FTbU93clpiRW1RPT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQotLS0tLUJFR0lOIENFUlRJRklDQVRFLS0tLS0KTUlJQzJqQ0NBY0tnQXdJQkFnSUJBVEFOQmdrcWhraUc5dzBCQVFzRkFEQWVNUnd3R2dZRFZRUURFeE5zYjJkbgphVzVuTFhOcFoyNWxjaTEwWlhOME1CNFhEVEUzTURZd09URTBNemMwTjFvWERUSXlNRFl3T0RFME16YzBPRm93CkhqRWNNQm9HQTFVRUF4TVRiRzluWjJsdVp5MXphV2R1WlhJdGRHVnpkRENDQVNJd0RRWUpLb1pJaHZjTkFRRUIKQlFBRGdnRVBBRENDQVFvQ2dnRUJBTi9WeGlyT1RvRnU5SGtJRW1BUzBTRGVtcTQvMXVFK3d1bGhhZUxYVldVUQpnTlBVK3dlOUxEbDltck1hQ3VVQUpqWXRhQ3F2S3Fsaks1S1JZV3h2V0grT2VGR3RuZldqM3dGWVQvQ0hFekhYCjNyVHc5UHNTWCt6VFpseURReXl6RnpLb0o2bXkwaWxTY3Bxd0xwSkgyT212MTZKTXZXK0pyTXdDYWZRRERJMy8KRk9lYVJ2d1VwRTFWUW4rTC9FMkZxTTZKVHBDdkpVMHdManhGSDVvdmlIM0N1cEVHb2xhbUk0eFdpenl1RHRqZQo2em5MYk84YnZrUEVmSTkwc3VNNHROWkY5WXJ3dUdpL2ZSQ3hTZzd1dzV1SWhxTnFDdFZJRUZIc3lCeDRWb3p4ClNsaFJocnNHUm5GR0NHaVYzcVV6YjdpZk83TmNVQUEzNXNBRlZvSnNRQnNDQXdFQUFhTWpNQ0V3RGdZRFZSMFAKQVFIL0JBUURBZ0trTUE4R0ExVWRFd0VCL3dRRk1BTUJBZjh3RFFZSktvWklodmNOQVFFTEJRQURnZ0VCQUhIWgpoS0wzRzF2Nk8yOGlwcHJvZzFPWTdoMUhBaCtQNnNGZ0RzOXBsRHZaeGdoRHlJZ09QS3dHQVhlb0lOWTBnV1I0CkN0QlR4MlZYTzZvR3EzVitzSk1HZmNpVUFHOUJoNmd4aHlQcjBORjhoOVZwZDN0UkxVb2E0dVFnS1J0dis2VEEKckhLN2lVMXlGaE1JM0pKMXd2WVpmN1U1Yjh1M2tiemFtSXlac2oyUlY5ejVxWHVTWWFjSDJ3eURaOHdSMkF0NApHNkNPRXA0azExODROak5mUG1NcDJuLzFwWWVSMmhNa0hRWUJKK3MrazFZNXlBU0k0R0FYMXpKS3pNYnh4SGZOClViTVFyMFd1R0gyRGZGVW5sUnBtYlp4ZlZ0SlkrWmlBcTBUdXJSeGY2eWlRcTdMM0lnRjMwRFZhZHpXZmk5N2YKOW1LakRvS05JRU5obTRjU1VWdz0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=", 
            "server-key": "LS0tLS1CRUdJTiBSU0EgUFJJVkFURSBLRVktLS0tLQpNSUlFb3dJQkFBS0NBUUVBeUNTL2hKbnEwa2pEd2xoZG9Gd3ZXRHZ3LytCd3o2Yml6UHllNTVDRmh2cVZtNE85CjNUY3hqb1p5cHNiZm9lTHZpK0FYVllWQmdBZDhPZG9IQzlhT1V1ckZBZVFNTFBBMHluTUxBTzRUd3pPU0p0TjEKa2x6aTlGa3U5YnNLbUNSbVFMYkZDaEVTbFBlS2s3RGZ2ak5zZXF6RGtlRTZYUHNVMTFpTFY4V2M3bDlhNmd6KwpWa1Jtc25NZE5ZeWlqb29jQjVuc2s0RjZzTW5zeDlRTmk2enlmSVJ1VkVrN3lKZDNxOUROckxxNDlSS3lWaUZpClhzSUM1WE80dUhNK0hVVGE5YkxxK0ZZenJDT2FZYk9JN1d5aXFBd1VxZ0N3S3FyMSs1RXphRTRvdWg3aDhBVUgKNkJsMElUSW0vejhxQVhvMDdBM0wycGkrdEYzUVhQTzBHZDRWMndJREFRQUJBb0lCQVFDUHhrNUtUR055MGxERwpRTUpwV3krcm04dkJsSktWcVJZT0dYOXhhOUZ3S0h6bXJabnIyeVZmZEZmU1ZOVDdyMUZUMHhRUUhGejBRdXhMCmhzTS9EczlJNDF2SXd2QzRLNHBRMEpuYi9pcjJOQXJPbDJORFZEUzVRWVBKaEtiVXFubEdEY1c0T0pGM3IrZTkKdHZiVDVJOE9CU09zblBaWEt3dEtzMUhPS0toV1Y2WXRhMzJiRXplVm9UclU1WDFMWkEvbms4V04zaWl5M0pSVQpTY1NEbkIrYmVFNE9uSXR0d1pMdnRwUlZ1UlBreWJXTEVPbmZOZFcrNEdDQjhQRy9QWmJzSXdHTmsxcDhwaE1sCnNjWWxHamkzVm9uZDBaT244ekxENjhlSHRHWHp1ZThBbGtuR1J0OTY5OTQ0eGxtNnY1K2gySGFBSmdCZWdnZXAKVWtWc1NpZ1pBb0dCQU5qWU1DcHZlT0g2K29zbjlsQ2NldzRHOEZUWEZhYnQyc0owUjNNc1NFYTRZOFJ4cnByTwpNR0Z4S1V2cnFGN1FNUmZxYkhsd3pmTE96WFlrMTFYOHV4a3NQZG1KSkEwejBxTVI3SEJBZXU0WDdJNjl0ZXpoCmJNdy95VW80UEc1dWdpZkNTRzBieTZNTFVJOEoyRC9abWZxZnRkT21aSjU2OEhCdm5rUm92Y1kzQW9HQkFPeEkKaTYwRGdjQzdxaGdqemNMenFwcGd3anJta2ZQNnd1S25Obkc1V0ZwOGJySmM2bWo2dVRhdlhEWVF6VTZEZk94SAoxbHQ1SzQ5WHBTNXEzTExCWUloN0dHOEUxMDdvOEM2ZWtlbFRORTZQWmdiZXROSnRXWWhJZGhmZzNyZ0VMRGt1CjlzMVZLcVJMMTc2ME5PbWQ3UDVKR2kzdmhld0NVZHFQbUJrYTE1dDlBb0dBUmliYW5qL2w2YVhhZkQ1M2IyalEKWHA5Y0RQWndhTXEyWlFaZFB5TnFWb2E4c0FiZkovSGdzUVY5Q0xTNmljSHN3QUgxQ2V5MmxBRFhjNHREcHV2VwpVN1IrWmV2Nkg5Tk9KN2RhdUk0RHR4ZENUb09OWVk4a05ZZkZSUitnWFZHZkJlSFNzSW0zZlkzaGlBVDFVdUxBCjc5WFBheU4vbGMzTUQzUHN6ZERjNUUwQ2dZQkxSYmtwZnVxQkNjZmdOTmZCK1hvcUFCVWdTbi9JcCtRWjdJY04KcDZ3YjkzUVVZa0ZTL0R5d3pTQ2xJS2tuRUFCbURXU2VjM1dMRHJMU25MeCtQZGlRNGhZZ2wvdzNhVUhLdUQxbgpoVmd1aHNSTC8vcSs1cE1WTlhCWm53dVV0OCtXei8xVDRJUGJIMFkxdkpiMnJaYm9VMFdCeU1Kek16SDhYSzVwCm9RRjZ5UUtCZ0F0VkUzRmVWSXNmMjRIOTZTZWo5N2ZnM25sZ203VElRaEV0U283U1NuUkhXV3VGTkVDK1JQNGkKWFVZQjNRNWJLMVFiSUpCTkU0cGNNNGJRbDBFcVZjbVVuckRzS3NzK0Y4RFk5Z2JSTzdobEcyb2hlRFh4alo3KwpwTWFWdXpQUlNrMy9uUTg4aGovVkpWZ01WK3VGanI4TzhOYndMTmRmMmZlbm1hdk1IUGFmCi0tLS0tRU5EIFJTQSBQUklWQVRFIEtFWS0tLS0tCg==", 
            "server-tls.json": "Ly8gU2VlIGZvciBhdmFpbGFibGUgb3B0aW9uczogaHR0cHM6Ly9ub2RlanMub3JnL2FwaS90bHMuaHRtbCN0bHNfdGxzX2NyZWF0ZXNlcnZlcl9vcHRpb25zX3NlY3VyZWNvbm5lY3Rpb25saXN0ZW5lcgp0bHNfb3B0aW9ucyA9IHsKCWNpcGhlcnM6ICdrRUVDREg6K2tFRUNESCtTSEE6a0VESDora0VESCtTSEE6K2tFREgrQ0FNRUxMSUE6a0VDREg6K2tFQ0RIK1NIQTprUlNBOitrUlNBK1NIQTora1JTQStDQU1FTExJQTohYU5VTEw6IWVOVUxMOiFTU0x2MjohUkM0OiFERVM6IUVYUDohU0VFRDohSURFQTorM0RFUycsCglob25vckNpcGhlck9yZGVyOiB0cnVlCn0K", 
            "session-secret": "M2oxZU5nTUtiNWNOWjJvV3QzdU1qUmpqenBoZHRrU3JHZXRNUUhEa1JKS2NJNUNlQkx3SmU3NDluc2pPRk5xOTNLVnBIY1FscnRvdHQ2VTN4ck1IRUg3Q3BRZnVpR2Q4bldZdWwxZEl6YWNnU3NaRW84UktkemdnM3YySXQ5TXNIY2FKS3hmRzV5Z2R5ZVg2TDVXYXNqSmxVbDFWUllGWmVGcXdxSmtXbHB5T2V4N0FFZjg5RGVoeERJUDZmWENZenhmNlRIaWc="
        }, 
        "kind": "Secret", 
        "metadata": {
            "creationTimestamp": null, 
            "name": "logging-kibana-proxy"
        }, 
        "type": "Opaque"
    }, 
    "state": "present"
}
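
Note: the two Secrets package the material slurped earlier, and
"changed": false on both means the non-ops pass already created identical
content. logging-kibana carries the CA plus a client cert/key for Kibana's
TLS connection to Elasticsearch; logging-kibana-proxy carries everything the
auth proxy mounts under /secret. Condensed shape of the latter (base64
payloads elided):

    apiVersion: v1
    kind: Secret
    type: Opaque
    metadata:
      name: logging-kibana-proxy
    data:
      oauth-secret: <base64 of oauth_secret>
      server-cert: <base64 of kibana-internal.crt>
      server-key: <base64 of kibana-internal.key>
      server-tls.json: <base64 of server-tls.json>
      session-secret: <base64 of session_secret>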

TASK [openshift_logging_kibana : Generate Kibana DC template] ******************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:221
changed: [openshift] => {
    "changed": true, 
    "checksum": "5d3e0ca245ba76c18712bbe9899ec3e4e221c78e", 
    "dest": "/tmp/openshift-logging-ansible-aMED3y/templates/kibana-dc.yaml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "da5a6cd62f48d11460b17a0258dd52a8", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 3757, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019140.87-119432478238231/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_kibana : Set Kibana DC] ********************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:240
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get dc logging-kibana-ops -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "DeploymentConfig", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:39:01Z", 
                    "generation": 2, 
                    "labels": {
                        "component": "kibana-ops", 
                        "logging-infra": "kibana", 
                        "provider": "openshift"
                    }, 
                    "name": "logging-kibana-ops", 
                    "namespace": "logging", 
                    "resourceVersion": "1493", 
                    "selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-kibana-ops", 
                    "uid": "614f24f2-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "replicas": 1, 
                    "selector": {
                        "component": "kibana-ops", 
                        "logging-infra": "kibana", 
                        "provider": "openshift"
                    }, 
                    "strategy": {
                        "activeDeadlineSeconds": 21600, 
                        "resources": {}, 
                        "rollingParams": {
                            "intervalSeconds": 1, 
                            "maxSurge": "25%", 
                            "maxUnavailable": "25%", 
                            "timeoutSeconds": 600, 
                            "updatePeriodSeconds": 1
                        }, 
                        "type": "Rolling"
                    }, 
                    "template": {
                        "metadata": {
                            "creationTimestamp": null, 
                            "labels": {
                                "component": "kibana-ops", 
                                "logging-infra": "kibana", 
                                "provider": "openshift"
                            }, 
                            "name": "logging-kibana-ops"
                        }, 
                        "spec": {
                            "containers": [
                                {
                                    "env": [
                                        {
                                            "name": "ES_HOST", 
                                            "value": "logging-es-ops"
                                        }, 
                                        {
                                            "name": "ES_PORT", 
                                            "value": "9200"
                                        }, 
                                        {
                                            "name": "KIBANA_MEMORY_LIMIT", 
                                            "valueFrom": {
                                                "resourceFieldRef": {
                                                    "containerName": "kibana", 
                                                    "divisor": "0", 
                                                    "resource": "limits.memory"
                                                }
                                            }
                                        }
                                    ], 
                                    "image": "172.30.224.2:5000/logging/logging-kibana:latest", 
                                    "imagePullPolicy": "Always", 
                                    "name": "kibana", 
                                    "readinessProbe": {
                                        "exec": {
                                            "command": [
                                                "/usr/share/kibana/probe/readiness.sh"
                                            ]
                                        }, 
                                        "failureThreshold": 3, 
                                        "initialDelaySeconds": 5, 
                                        "periodSeconds": 5, 
                                        "successThreshold": 1, 
                                        "timeoutSeconds": 4
                                    }, 
                                    "resources": {
                                        "limits": {
                                            "memory": "736Mi"
                                        }
                                    }, 
                                    "terminationMessagePath": "/dev/termination-log", 
                                    "terminationMessagePolicy": "File", 
                                    "volumeMounts": [
                                        {
                                            "mountPath": "/etc/kibana/keys", 
                                            "name": "kibana", 
                                            "readOnly": true
                                        }
                                    ]
                                }, 
                                {
                                    "env": [
                                        {
                                            "name": "OAP_BACKEND_URL", 
                                            "value": "http://localhost:5601"
                                        }, 
                                        {
                                            "name": "OAP_AUTH_MODE", 
                                            "value": "oauth2"
                                        }, 
                                        {
                                            "name": "OAP_TRANSFORM", 
                                            "value": "user_header,token_header"
                                        }, 
                                        {
                                            "name": "OAP_OAUTH_ID", 
                                            "value": "kibana-proxy"
                                        }, 
                                        {
                                            "name": "OAP_MASTER_URL", 
                                            "value": "https://kubernetes.default.svc.cluster.local"
                                        }, 
                                        {
                                            "name": "OAP_PUBLIC_MASTER_URL", 
                                            "value": "https://172.18.4.93:8443"
                                        }, 
                                        {
                                            "name": "OAP_LOGOUT_REDIRECT", 
                                            "value": "https://172.18.4.93:8443/console/logout"
                                        }, 
                                        {
                                            "name": "OAP_MASTER_CA_FILE", 
                                            "value": "/var/run/secrets/kubernetes.io/serviceaccount/ca.crt"
                                        }, 
                                        {
                                            "name": "OAP_DEBUG", 
                                            "value": "False"
                                        }, 
                                        {
                                            "name": "OAP_OAUTH_SECRET_FILE", 
                                            "value": "/secret/oauth-secret"
                                        }, 
                                        {
                                            "name": "OAP_SERVER_CERT_FILE", 
                                            "value": "/secret/server-cert"
                                        }, 
                                        {
                                            "name": "OAP_SERVER_KEY_FILE", 
                                            "value": "/secret/server-key"
                                        }, 
                                        {
                                            "name": "OAP_SERVER_TLS_FILE", 
                                            "value": "/secret/server-tls.json"
                                        }, 
                                        {
                                            "name": "OAP_SESSION_SECRET_FILE", 
                                            "value": "/secret/session-secret"
                                        }, 
                                        {
                                            "name": "OCP_AUTH_PROXY_MEMORY_LIMIT", 
                                            "valueFrom": {
                                                "resourceFieldRef": {
                                                    "containerName": "kibana-proxy", 
                                                    "divisor": "0", 
                                                    "resource": "limits.memory"
                                                }
                                            }
                                        }
                                    ], 
                                    "image": "172.30.224.2:5000/logging/logging-auth-proxy:latest", 
                                    "imagePullPolicy": "Always", 
                                    "name": "kibana-proxy", 
                                    "ports": [
                                        {
                                            "containerPort": 3000, 
                                            "name": "oaproxy", 
                                            "protocol": "TCP"
                                        }
                                    ], 
                                    "resources": {
                                        "limits": {
                                            "memory": "96Mi"
                                        }
                                    }, 
                                    "terminationMessagePath": "/dev/termination-log", 
                                    "terminationMessagePolicy": "File", 
                                    "volumeMounts": [
                                        {
                                            "mountPath": "/secret", 
                                            "name": "kibana-proxy", 
                                            "readOnly": true
                                        }
                                    ]
                                }
                            ], 
                            "dnsPolicy": "ClusterFirst", 
                            "restartPolicy": "Always", 
                            "schedulerName": "default-scheduler", 
                            "securityContext": {}, 
                            "serviceAccount": "aggregated-logging-kibana", 
                            "serviceAccountName": "aggregated-logging-kibana", 
                            "terminationGracePeriodSeconds": 30, 
                            "volumes": [
                                {
                                    "name": "kibana", 
                                    "secret": {
                                        "defaultMode": 420, 
                                        "secretName": "logging-kibana"
                                    }
                                }, 
                                {
                                    "name": "kibana-proxy", 
                                    "secret": {
                                        "defaultMode": 420, 
                                        "secretName": "logging-kibana-proxy"
                                    }
                                }
                            ]
                        }
                    }, 
                    "test": false, 
                    "triggers": [
                        {
                            "type": "ConfigChange"
                        }
                    ]
                }, 
                "status": {
                    "availableReplicas": 0, 
                    "conditions": [
                        {
                            "lastTransitionTime": "2017-06-09T14:39:01Z", 
                            "lastUpdateTime": "2017-06-09T14:39:01Z", 
                            "message": "Deployment config does not have minimum availability.", 
                            "status": "False", 
                            "type": "Available"
                        }, 
                        {
                            "lastTransitionTime": "2017-06-09T14:39:02Z", 
                            "lastUpdateTime": "2017-06-09T14:39:02Z", 
                            "message": "replication controller \"logging-kibana-ops-1\" is waiting for pod \"logging-kibana-ops-1-deploy\" to run", 
                            "status": "Unknown", 
                            "type": "Progressing"
                        }
                    ], 
                    "details": {
                        "causes": [
                            {
                                "type": "ConfigChange"
                            }
                        ], 
                        "message": "config change"
                    }, 
                    "latestVersion": 1, 
                    "observedGeneration": 2, 
                    "replicas": 0, 
                    "unavailableReplicas": 0, 
                    "updatedReplicas": 0
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
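
Note: the DeploymentConfig runs two containers per pod: kibana, pointed at
the ops cluster via ES_HOST=logging-es-ops / ES_PORT=9200, and kibana-proxy,
the oauth2 proxy that owns the "oaproxy" port (3000) the service targets and
that mounts the logging-kibana-proxy secret under /secret. The status stanza
merely shows the first rollout still in flight (the deployer pod has not run
yet). Skeleton of the spec (full JSON above):

    apiVersion: v1
    kind: DeploymentConfig
    metadata:
      name: logging-kibana-ops
      namespace: logging
    spec:
      replicas: 1
      selector:
        component: kibana-ops
        logging-infra: kibana
        provider: openshift
      template:
        spec:
          serviceAccountName: aggregated-logging-kibana
          containers:
          - name: kibana
            image: "172.30.224.2:5000/logging/logging-kibana:latest"
            env:
            - { name: ES_HOST, value: logging-es-ops }
            - { name: ES_PORT, value: "9200" }
          - name: kibana-proxy
            image: "172.30.224.2:5000/logging/logging-auth-proxy:latest"
            ports:
            - { containerPort: 3000, name: oaproxy, protocol: TCP }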

TASK [openshift_logging_kibana : Delete temp directory] ************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:252
ok: [openshift] => {
    "changed": false, 
    "path": "/tmp/openshift-logging-ansible-aMED3y", 
    "state": "absent"
}

TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging/tasks/install_logging.yaml:195
statically included: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml

TASK [openshift_logging_curator : fail] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:3
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:7
ok: [openshift] => {
    "ansible_facts": {
        "curator_version": "3_5"
    }, 
    "changed": false
}

TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:12
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_curator : fail] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openshift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:15
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_curator : Create temp directory for doing work in] *****
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:5
ok: [openshift] => {
    "changed": false, 
    "cmd": [
        "mktemp", 
        "-d", 
        "/tmp/openshift-logging-ansible-XXXXXX"
    ], 
    "delta": "0:00:00.009591", 
    "end": "2017-06-09 10:39:04.206757", 
    "rc": 0, 
    "start": "2017-06-09 10:39:04.197166"
}

STDOUT:

/tmp/openshift-logging-ansible-iHgrng
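
NOTE: the role stages all generated artifacts in a throwaway workspace and removes it in
its final "Delete temp directory" task. A minimal shell sketch of the same pattern
(names illustrative only):

    workdir=$(mktemp -d /tmp/openshift-logging-ansible-XXXXXX)
    mkdir -p "$workdir/templates"
    # ... render templates into $workdir, create objects from them with oc ...
    rm -rf "$workdir"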

TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:10
ok: [openshift] => {
    "ansible_facts": {
        "tempdir": "/tmp/openshift-logging-ansible-iHgrng"
    }, 
    "changed": false
}

TASK [openshift_logging_curator : Create templates subdirectory] ***************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:14
ok: [openshift] => {
    "changed": false, 
    "gid": 0, 
    "group": "root", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/tmp/openshift-logging-ansible-iHgrng/templates", 
    "secontext": "unconfined_u:object_r:user_tmp_t:s0", 
    "size": 6, 
    "state": "directory", 
    "uid": 0
}

TASK [openshift_logging_curator : Create Curator service account] **************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:24
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_curator : Create Curator service account] **************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:32
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get sa aggregated-logging-curator -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "imagePullSecrets": [
                    {
                        "name": "aggregated-logging-curator-dockercfg-kmsqd"
                    }
                ], 
                "kind": "ServiceAccount", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:39:05Z", 
                    "name": "aggregated-logging-curator", 
                    "namespace": "logging", 
                    "resourceVersion": "1504", 
                    "selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-curator", 
                    "uid": "633a2289-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "secrets": [
                    {
                        "name": "aggregated-logging-curator-token-mdrm0"
                    }, 
                    {
                        "name": "aggregated-logging-curator-dockercfg-kmsqd"
                    }
                ]
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
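
NOTE: creating a ServiceAccount also triggers generation of its token and dockercfg
secrets, which is why two entries appear under "secrets" above. To inspect the account
by hand in the same cluster context:

    oc get sa aggregated-logging-curator -n logging -o json
    oc describe sa aggregated-logging-curator -n logging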

TASK [openshift_logging_curator : copy] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:41
ok: [openshift] => {
    "changed": false, 
    "checksum": "9008efd9a8892dcc42c28c6dfb6708527880a6d8", 
    "dest": "/tmp/openshift-logging-ansible-iHgrng/curator.yml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "5498c5fd98f3dd06e34b20eb1f55dc12", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 320, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019145.67-36923368276207/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_curator : copy] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:47
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_curator : Set Curator configmap] ***********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:53
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get configmap logging-curator -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "data": {
                    "config.yaml": "# Logging example curator config file\n\n# uncomment and use this to override the defaults from env vars\n#.defaults:\n#  delete:\n#    days: 30\n#  runhour: 0\n#  runminute: 0\n\n# to keep ops logs for a different duration:\n#.operations:\n#  delete:\n#    weeks: 8\n\n# example for a normal project\n#myapp:\n#  delete:\n#    weeks: 1\n"
                }, 
                "kind": "ConfigMap", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:39:06Z", 
                    "name": "logging-curator", 
                    "namespace": "logging", 
                    "resourceVersion": "1520", 
                    "selfLink": "/api/v1/namespaces/logging/configmaps/logging-curator", 
                    "uid": "6411aa37-4d21-11e7-83b0-0e6fb895db82"
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
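
NOTE: the config.yaml payload above is stored with escaped newlines. To read it as the
pod will see it, a jsonpath query works; the backslash escapes the dot inside the key
name:

    oc get configmap logging-curator -n logging -o jsonpath='{.data.config\.yaml}'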

TASK [openshift_logging_curator : Set Curator secret] **************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:62
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc secrets new logging-curator ca=/etc/origin/logging/ca.crt key=/etc/origin/logging/system.logging.curator.key cert=/etc/origin/logging/system.logging.curator.crt -n logging", 
        "results": "", 
        "returncode": 0
    }, 
    "state": "present"
}
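
NOTE: "oc secrets new NAME key=file ..." builds a secret whose keys are the left-hand
names and whose values are the file contents. On newer clients the same secret can be
created with the generic form; a hedged equivalent, not taken from this run:

    oc create secret generic logging-curator \
      --from-file=ca=/etc/origin/logging/ca.crt \
      --from-file=key=/etc/origin/logging/system.logging.curator.key \
      --from-file=cert=/etc/origin/logging/system.logging.curator.crt \
      -n logging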

TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:75
ok: [openshift] => {
    "ansible_facts": {
        "curator_component": "curator", 
        "curator_name": "logging-curator"
    }, 
    "changed": false
}

TASK [openshift_logging_curator : Generate Curator deploymentconfig] ***********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:81
ok: [openshift] => {
    "changed": false, 
    "checksum": "674dfe0c6e9c0d9b4e3c74b9072d086f7985adde", 
    "dest": "/tmp/openshift-logging-ansible-iHgrng/templates/curator-dc.yaml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "c6ded9fd24af49942a05ed8e78bf3968", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 2339, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019147.98-54781495807966/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_curator : Set Curator DC] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:99
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get dc logging-curator -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "DeploymentConfig", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:39:08Z", 
                    "generation": 2, 
                    "labels": {
                        "component": "curator", 
                        "logging-infra": "curator", 
                        "provider": "openshift"
                    }, 
                    "name": "logging-curator", 
                    "namespace": "logging", 
                    "resourceVersion": "1540", 
                    "selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-curator", 
                    "uid": "6570c974-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "replicas": 1, 
                    "selector": {
                        "component": "curator", 
                        "logging-infra": "curator", 
                        "provider": "openshift"
                    }, 
                    "strategy": {
                        "activeDeadlineSeconds": 21600, 
                        "recreateParams": {
                            "timeoutSeconds": 600
                        }, 
                        "resources": {}, 
                        "rollingParams": {
                            "intervalSeconds": 1, 
                            "maxSurge": "25%", 
                            "maxUnavailable": "25%", 
                            "timeoutSeconds": 600, 
                            "updatePeriodSeconds": 1
                        }, 
                        "type": "Recreate"
                    }, 
                    "template": {
                        "metadata": {
                            "creationTimestamp": null, 
                            "labels": {
                                "component": "curator", 
                                "logging-infra": "curator", 
                                "provider": "openshift"
                            }, 
                            "name": "logging-curator"
                        }, 
                        "spec": {
                            "containers": [
                                {
                                    "env": [
                                        {
                                            "name": "K8S_HOST_URL", 
                                            "value": "https://kubernetes.default.svc.cluster.local"
                                        }, 
                                        {
                                            "name": "ES_HOST", 
                                            "value": "logging-es"
                                        }, 
                                        {
                                            "name": "ES_PORT", 
                                            "value": "9200"
                                        }, 
                                        {
                                            "name": "ES_CLIENT_CERT", 
                                            "value": "/etc/curator/keys/cert"
                                        }, 
                                        {
                                            "name": "ES_CLIENT_KEY", 
                                            "value": "/etc/curator/keys/key"
                                        }, 
                                        {
                                            "name": "ES_CA", 
                                            "value": "/etc/curator/keys/ca"
                                        }, 
                                        {
                                            "name": "CURATOR_DEFAULT_DAYS", 
                                            "value": "30"
                                        }, 
                                        {
                                            "name": "CURATOR_RUN_HOUR", 
                                            "value": "0"
                                        }, 
                                        {
                                            "name": "CURATOR_RUN_MINUTE", 
                                            "value": "0"
                                        }, 
                                        {
                                            "name": "CURATOR_RUN_TIMEZONE", 
                                            "value": "UTC"
                                        }, 
                                        {
                                            "name": "CURATOR_SCRIPT_LOG_LEVEL", 
                                            "value": "INFO"
                                        }, 
                                        {
                                            "name": "CURATOR_LOG_LEVEL", 
                                            "value": "ERROR"
                                        }
                                    ], 
                                    "image": "172.30.224.2:5000/logging/logging-curator:latest", 
                                    "imagePullPolicy": "Always", 
                                    "name": "curator", 
                                    "resources": {
                                        "limits": {
                                            "cpu": "100m"
                                        }
                                    }, 
                                    "terminationMessagePath": "/dev/termination-log", 
                                    "terminationMessagePolicy": "File", 
                                    "volumeMounts": [
                                        {
                                            "mountPath": "/etc/curator/keys", 
                                            "name": "certs", 
                                            "readOnly": true
                                        }, 
                                        {
                                            "mountPath": "/etc/curator/settings", 
                                            "name": "config", 
                                            "readOnly": true
                                        }
                                    ]
                                }
                            ], 
                            "dnsPolicy": "ClusterFirst", 
                            "restartPolicy": "Always", 
                            "schedulerName": "default-scheduler", 
                            "securityContext": {}, 
                            "serviceAccount": "aggregated-logging-curator", 
                            "serviceAccountName": "aggregated-logging-curator", 
                            "terminationGracePeriodSeconds": 30, 
                            "volumes": [
                                {
                                    "name": "certs", 
                                    "secret": {
                                        "defaultMode": 420, 
                                        "secretName": "logging-curator"
                                    }
                                }, 
                                {
                                    "configMap": {
                                        "defaultMode": 420, 
                                        "name": "logging-curator"
                                    }, 
                                    "name": "config"
                                }
                            ]
                        }
                    }, 
                    "test": false, 
                    "triggers": [
                        {
                            "type": "ConfigChange"
                        }
                    ]
                }, 
                "status": {
                    "availableReplicas": 0, 
                    "conditions": [
                        {
                            "lastTransitionTime": "2017-06-09T14:39:08Z", 
                            "lastUpdateTime": "2017-06-09T14:39:08Z", 
                            "message": "Deployment config does not have minimum availability.", 
                            "status": "False", 
                            "type": "Available"
                        }, 
                        {
                            "lastTransitionTime": "2017-06-09T14:39:09Z", 
                            "lastUpdateTime": "2017-06-09T14:39:09Z", 
                            "message": "replication controller \"logging-curator-1\" is waiting for pod \"logging-curator-1-deploy\" to run", 
                            "status": "Unknown", 
                            "type": "Progressing"
                        }
                    ], 
                    "details": {
                        "causes": [
                            {
                                "type": "ConfigChange"
                            }
                        ], 
                        "message": "config change"
                    }, 
                    "latestVersion": 1, 
                    "observedGeneration": 2, 
                    "replicas": 0, 
                    "unavailableReplicas": 0, 
                    "updatedReplicas": 0
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
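
NOTE: the curator schedule and retention come entirely from the env block above
(CURATOR_DEFAULT_DAYS, CURATOR_RUN_HOUR, CURATOR_RUN_MINUTE, CURATOR_RUN_TIMEZONE).
A hedged example of overriding one value after install, which also kicks off a new
deployment through the ConfigChange trigger:

    oc set env dc/logging-curator CURATOR_DEFAULT_DAYS=14 -n logging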

TASK [openshift_logging_curator : Delete temp directory] ***********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:109
ok: [openshift] => {
    "changed": false, 
    "path": "/tmp/openshift-logging-ansible-iHgrng", 
    "state": "absent"
}

TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:207
statically included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml

TASK [openshift_logging_curator : fail] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:3
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:7
ok: [openshift] => {
    "ansible_facts": {
        "curator_version": "3_5"
    }, 
    "changed": false
}

TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:12
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_curator : fail] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:15
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_curator : Create temp directory for doing work in] *****
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:5
ok: [openshift] => {
    "changed": false, 
    "cmd": [
        "mktemp", 
        "-d", 
        "/tmp/openshift-logging-ansible-XXXXXX"
    ], 
    "delta": "0:00:00.004080", 
    "end": "2017-06-09 10:39:11.954336", 
    "rc": 0, 
    "start": "2017-06-09 10:39:11.950256"
}

STDOUT:

/tmp/openshift-logging-ansible-vywoUO

TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:10
ok: [openshift] => {
    "ansible_facts": {
        "tempdir": "/tmp/openshift-logging-ansible-vywoUO"
    }, 
    "changed": false
}

TASK [openshift_logging_curator : Create templates subdirectory] ***************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:14
ok: [openshift] => {
    "changed": false, 
    "gid": 0, 
    "group": "root", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/tmp/openshift-logging-ansible-vywoUO/templates", 
    "secontext": "unconfined_u:object_r:user_tmp_t:s0", 
    "size": 6, 
    "state": "directory", 
    "uid": 0
}

TASK [openshift_logging_curator : Create Curator service account] **************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:24
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_curator : Create Curator service account] **************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:32
ok: [openshift] => {
    "changed": false, 
    "results": {
        "cmd": "/bin/oc get sa aggregated-logging-curator -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "imagePullSecrets": [
                    {
                        "name": "aggregated-logging-curator-dockercfg-kmsqd"
                    }
                ], 
                "kind": "ServiceAccount", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:39:05Z", 
                    "name": "aggregated-logging-curator", 
                    "namespace": "logging", 
                    "resourceVersion": "1504", 
                    "selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-curator", 
                    "uid": "633a2289-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "secrets": [
                    {
                        "name": "aggregated-logging-curator-token-mdrm0"
                    }, 
                    {
                        "name": "aggregated-logging-curator-dockercfg-kmsqd"
                    }
                ]
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
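
NOTE: this is the second pass over the curator role (the -ops iteration), so the
service-account task reports "changed": false and returns the object created earlier;
the unchanged resourceVersion (1504) confirms nothing was modified:

    oc get sa aggregated-logging-curator -n logging \
      -o jsonpath='{.metadata.resourceVersion}'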

TASK [openshift_logging_curator : copy] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:41
ok: [openshift] => {
    "changed": false, 
    "checksum": "9008efd9a8892dcc42c28c6dfb6708527880a6d8", 
    "dest": "/tmp/openshift-logging-ansible-vywoUO/curator.yml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "5498c5fd98f3dd06e34b20eb1f55dc12", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 320, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019153.02-200314772942038/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_curator : copy] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:47
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_curator : Set Curator configmap] ***********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:53
ok: [openshift] => {
    "changed": false, 
    "results": {
        "cmd": "/bin/oc get configmap logging-curator -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "data": {
                    "config.yaml": "# Logging example curator config file\n\n# uncomment and use this to override the defaults from env vars\n#.defaults:\n#  delete:\n#    days: 30\n#  runhour: 0\n#  runminute: 0\n\n# to keep ops logs for a different duration:\n#.operations:\n#  delete:\n#    weeks: 8\n\n# example for a normal project\n#myapp:\n#  delete:\n#    weeks: 1\n"
                }, 
                "kind": "ConfigMap", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:39:06Z", 
                    "name": "logging-curator", 
                    "namespace": "logging", 
                    "resourceVersion": "1520", 
                    "selfLink": "/api/v1/namespaces/logging/configmaps/logging-curator", 
                    "uid": "6411aa37-4d21-11e7-83b0-0e6fb895db82"
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}

TASK [openshift_logging_curator : Set Curator secret] **************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:62
ok: [openshift] => {
    "changed": false, 
    "results": {
        "apiVersion": "v1", 
        "data": {
            "ca": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPVEUwTXpjME4xb1hEVEl5TURZd09ERTBNemMwT0ZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU4vVnhpck9Ub0Z1OUhrSUVtQVMwU0RlbXE0LzF1RSt3dWxoYWVMWFZXVVEKZ05QVSt3ZTlMRGw5bXJNYUN1VUFKall0YUNxdktxbGpLNUtSWVd4dldIK09lRkd0bmZXajN3RllUL0NIRXpIWAozclR3OVBzU1grelRabHlEUXl5ekZ6S29KNm15MGlsU2NwcXdMcEpIMk9tdjE2Sk12VytKck13Q2FmUURESTMvCkZPZWFSdndVcEUxVlFuK0wvRTJGcU02SlRwQ3ZKVTB3TGp4Rkg1b3ZpSDNDdXBFR29sYW1JNHhXaXp5dUR0amUKNnpuTGJPOGJ2a1BFZkk5MHN1TTR0TlpGOVlyd3VHaS9mUkN4U2c3dXc1dUlocU5xQ3RWSUVGSHN5Qng0Vm96eApTbGhSaHJzR1JuRkdDR2lWM3FVemI3aWZPN05jVUFBMzVzQUZWb0pzUUJzQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFISFoKaEtMM0cxdjZPMjhpcHByb2cxT1k3aDFIQWgrUDZzRmdEczlwbER2WnhnaER5SWdPUEt3R0FYZW9JTlkwZ1dSNApDdEJUeDJWWE82b0dxM1Yrc0pNR2ZjaVVBRzlCaDZneGh5UHIwTkY4aDlWcGQzdFJMVW9hNHVRZ0tSdHYrNlRBCnJISzdpVTF5RmhNSTNKSjF3dllaZjdVNWI4dTNrYnphbUl5WnNqMlJWOXo1cVh1U1lhY0gyd3lEWjh3UjJBdDQKRzZDT0VwNGsxMTg0TmpOZlBtTXAybi8xcFllUjJoTWtIUVlCSitzK2sxWTV5QVNJNEdBWDF6Skt6TWJ4eEhmTgpVYk1RcjBXdUdIMkRmRlVubFJwbWJaeGZWdEpZK1ppQXEwVHVyUnhmNnlpUXE3TDNJZ0YzMERWYWR6V2ZpOTdmCjltS2pEb0tOSUVOaG00Y1NVVnc9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K", 
            "cert": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURSakNDQWk2Z0F3SUJBZ0lCQkRBTkJna3Foa2lHOXcwQkFRVUZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPVEUwTXpjMU5Gb1hEVEU1TURZd09URTBNemMxTkZvdwpSekVRTUE0R0ExVUVDZ3dIVEc5bloybHVaekVTTUJBR0ExVUVDd3dKVDNCbGJsTm9hV1owTVI4d0hRWURWUVFECkRCWnplWE4wWlcwdWJHOW5aMmx1Wnk1amRYSmhkRzl5TUlJQklqQU5CZ2txaGtpRzl3MEJBUUVGQUFPQ0FROEEKTUlJQkNnS0NBUUVBdy9qamxTSjB3RTFrZDVTeXQwUXhiNHdsalZwdVYvWmtGZkVjRUR4T014NXVqSmpHeGlpbwpqd0IrUG11Mkx2cmxNbDJWU29IM2VMWUIwc004R2VWV25GS1FNeXh6cUkrZEVUTThjekthK1F5WlpIc2k5dGJnClJCeElVZCtqOStCU3dMdDE0Mm8xWnlsYlQvTFloYiszdXdsRk1NMVE0YlZDb3FFTFBCVi9HUjFySFZxS2dUOW0Kci8vazA3N1ZvNm95T0RSeE5kQW1LQjNWckxCZ243VTFYRTloWkVxUGtTRnQ4UDFqVFNtTzkwd1BsaUQ5MG9RcgpCR3loQktLSFFsb3EvMzRNdmt0VldUcGpoWUordGlhckRrT0dPZUVoM3RKbzhxb2lFNE9QVHRvenpleU00cjNvCmtzeDhXYVFrSU5sVk1QK0F3eThONzRHMldSZE5KQm1rVHdJREFRQUJvMll3WkRBT0JnTlZIUThCQWY4RUJBTUMKQmFBd0NRWURWUjBUQkFJd0FEQWRCZ05WSFNVRUZqQVVCZ2dyQmdFRkJRY0RBUVlJS3dZQkJRVUhBd0l3SFFZRApWUjBPQkJZRUZBRHYzWFFLemNMQ2czZzNSTng4dnUwZlVyQUZNQWtHQTFVZEl3UUNNQUF3RFFZSktvWklodmNOCkFRRUZCUUFEZ2dFQkFONnlKZWY5ckFOSXUvRXh4NzFUNnRTMEQ1VTl1VHdqZXZTYW9QRVluWERWbElZbnc5Vi8KamNFakYrN1lKTnpOZkRRMGpETVZBR2ZjSlJScVl2eWh5YllKRitJYkNkSDA1aHhwK2wrN1Bpd1YybU03aXh4TwprcnFnWDk1NjJTOWk2R29pYnZXa3IxUy9rSWxuWkFCRzI4Tmx5T3M0Qm0zY3JrZHJlY1dZTVZPZTFPemRtZTUwCjl1RGFoMHR5dlV4dHlaMHdUaXlRbHVvcGFneDNETHJiMUpCdFBwMGhDM3pYNGlXYWRlU0hTLzhNdERDeWdTVkkKalA4dlBKTVdUQ292RC9mQWxFeSs3TCttV3lyVkFpNHkyQ2NnU0JwNlhEdzBSNVpyZkc0UUZ5M0hhUFpDa2drRgp5ZEcrTnhOM1lRanN3TmtIR0Y2Ky9Qd05kaUowOGxDMHprZz0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=", 
            "key": "LS0tLS1CRUdJTiBQUklWQVRFIEtFWS0tLS0tCk1JSUV2QUlCQURBTkJna3Foa2lHOXcwQkFRRUZBQVNDQktZd2dnU2lBZ0VBQW9JQkFRREQrT09WSW5UQVRXUjMKbExLM1JERnZqQ1dOV201WDltUVY4UndRUEU0ekhtNk1tTWJHS0tpUEFINCthN1l1K3VVeVhaVktnZmQ0dGdIUwp3endaNVZhY1VwQXpMSE9vajUwUk16eHpNcHI1REpsa2V5TDIxdUJFSEVoUjM2UDM0RkxBdTNYamFqVm5LVnRQCjh0aUZ2N2U3Q1VVd3pWRGh0VUtpb1FzOEZYOFpIV3NkV29xQlAyYXYvK1RUdnRXanFqSTROSEUxMENZb0hkV3MKc0dDZnRUVmNUMkZrU28rUklXM3cvV05OS1k3M1RBK1dJUDNTaENzRWJLRUVvb2RDV2lyL2ZneStTMVZaT21PRgpnbjYySnFzT1E0WTU0U0hlMG1qeXFpSVRnNDlPMmpQTjdJeml2ZWlTekh4WnBDUWcyVlV3LzREREx3M3ZnYlpaCkYwMGtHYVJQQWdNQkFBRUNnZ0VBZTgxdUtMYmR2dWFsZzQzaTRUZ3BhdWpFaUdvS3IzTXVnMVlRZm9rNmRielIKNWV4V0ZyVjAxdkplenB4Sk9hQ3l6b0NrWWE5OVlUcktLQlhDa0RGNzU4R1k2MXMzcmRNY1ROTnJhdk1iU0I2WApnUXp4WjdNVGRyUFBWRG5PWWpmS3o0c2R6STg2TVhQRkJkckt3cVA5TkNHRkhuRjJtVUJqV2s0V3hOeG5zTHJ6CjRvWDJlWFB5R0FWQklFdmwzbkVsU2dzREY2NDZzSW5vRVJYT091Q1YxWFNVMzVZTStDTkVMd1UxWiswWkplOVQKdFdHcGhicEtBMFhoWU9KL0diN2dXb056bUk1NWRwalllY3hOM3BTck9ocXpyRHZaMGdBRWVCZzJnSWhIdEJlRwpSbkF1S1FmRGkwdUVDNVMwdnFwQ1Z0QlRjaVp4eWJ3MTFWb2ltVWNkb1FLQmdRRG1ybVlRMStQSWEwRnNIRHNOCjJiVTNPeWJUYTJFV2xMNUR1MlBUdXIrVzBEc1VKZG91ZVFIUGErOHFVaDkwZnVFalNGUEVHTEdNUW9VcEtObEkKRXZDK1ZEVkNMWE1yVkY3ZzFuZTRyZnBUL0ttYzd4b3o0eHdZcHdwVmdKTGp4MTdIbkszQVEwdGQwTVV2c1dyYgpBQWx1dnNnMFI3bVhyeVpycU1veDEwS3g1d0tCZ1FEWmV6K1BEQVFCQnl6TWdNWXB2Z1JYWWs4RUczbmQ3bnlFCk5hWlY5VHFTSVlPZmwzd2NtRUI4em9kc3Jadi9VY1R0MVQ1NUNMWExtZXg4S0VsUW50djgxR3FzeVg3QSs3Uk4KUisvejIxMm91NUZMRG1nUXdraHI2KzE0UVJUQkVjZzBTN2dpM3o4M241ZnVLMzExMzg2cmNFNjFBRnpqMUk2dwpsMzU4a1JSOVdRS0JnQ1hUOVlMUGxGZmFWc2tldkFSaWJoZ1hpQjlsWFc5eGh0M2VqZGs3cDQxWXFrZDhpWkhUCllCdWVqSUs1SXRWY3RSaXZGS1Ywa3pEMys1UXJVYTVEQk4yQk81YVZrMnhJa3FKMktQM004ZDd4OFBKK055TEoKSEdOaWlReEtZRXdmOW9mdlJvTGZ4aTBsUGlDN0RGWExaTUNYWW9kSU44UUxBZGJudi9oNlgwS0RBb0dBSUljSQppSktaZnJYZUEzQTNNME1TVHllSy9laEdIK1Z0aGpDb0FpODlaV0hSMStyR1MxaVNQSmgvTk1mNmp3TmgydGlNCkVYbzJCbyt0Q1J6VmFsTjRQNlpkbGg1RThRUTBnV0tEQnd3R1JCaFpzTW9rMG16Y2ZCZTdXc01SSVpkUlJGRVcKNTFCUW1qY3FlQnJWRnU0MGNQdkpLZFUyRDNWWktoeGVkSzhXUzNFQ2dZQTdjVjVycWNMazFPTERWQWNwUkg4MApJS21hbUxKVXpJbllQMkN1dVJhOHFJMVFwQjcwbjF5cVJKbnBqNDdCVUZXQnpDcWxVNUVzZVp0MnJVZm43dURyCmhnSlBlYWQ5OFBubEM0TXdYdkh6TURhRTZoVHdna1lid0lEYjhpVkhCMnJvNE1OMDNiLzJaUFpVZTB0VXB6WE0KUmNMV0ozMU5VN2xaUlhtT05xSHJrZz09Ci0tLS0tRU5EIFBSSVZBVEUgS0VZLS0tLS0K"
        }, 
        "kind": "Secret", 
        "metadata": {
            "creationTimestamp": null, 
            "name": "logging-curator"
        }, 
        "type": "Opaque"
    }, 
    "state": "present"
}
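
NOTE: secret values are base64-encoded in the API object. To check the client
certificate curator will present to Elasticsearch (assuming openssl is available where
oc runs):

    oc get secret logging-curator -n logging -o jsonpath='{.data.cert}' \
      | base64 -d | openssl x509 -noout -subject -dates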

TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:75
ok: [openshift] => {
    "ansible_facts": {
        "curator_component": "curator-ops", 
        "curator_name": "logging-curator-ops"
    }, 
    "changed": false
}

TASK [openshift_logging_curator : Generate Curator deploymentconfig] ***********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:81
ok: [openshift] => {
    "changed": false, 
    "checksum": "704ab874d5e53eb04fb70ac51b0b9d90cfedd08f", 
    "dest": "/tmp/openshift-logging-ansible-vywoUO/templates/curator-dc.yaml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "d5ffd5de466b7f6cdb2903f417adf34f", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 2363, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019154.7-81526706715204/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_curator : Set Curator DC] ******************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:99
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get dc logging-curator-ops -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "DeploymentConfig", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:39:15Z", 
                    "generation": 2, 
                    "labels": {
                        "component": "curator-ops", 
                        "logging-infra": "curator", 
                        "provider": "openshift"
                    }, 
                    "name": "logging-curator-ops", 
                    "namespace": "logging", 
                    "resourceVersion": "1586", 
                    "selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-curator-ops", 
                    "uid": "696d60a3-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "replicas": 1, 
                    "selector": {
                        "component": "curator-ops", 
                        "logging-infra": "curator", 
                        "provider": "openshift"
                    }, 
                    "strategy": {
                        "activeDeadlineSeconds": 21600, 
                        "recreateParams": {
                            "timeoutSeconds": 600
                        }, 
                        "resources": {}, 
                        "rollingParams": {
                            "intervalSeconds": 1, 
                            "maxSurge": "25%", 
                            "maxUnavailable": "25%", 
                            "timeoutSeconds": 600, 
                            "updatePeriodSeconds": 1
                        }, 
                        "type": "Recreate"
                    }, 
                    "template": {
                        "metadata": {
                            "creationTimestamp": null, 
                            "labels": {
                                "component": "curator-ops", 
                                "logging-infra": "curator", 
                                "provider": "openshift"
                            }, 
                            "name": "logging-curator-ops"
                        }, 
                        "spec": {
                            "containers": [
                                {
                                    "env": [
                                        {
                                            "name": "K8S_HOST_URL", 
                                            "value": "https://kubernetes.default.svc.cluster.local"
                                        }, 
                                        {
                                            "name": "ES_HOST", 
                                            "value": "logging-es-ops"
                                        }, 
                                        {
                                            "name": "ES_PORT", 
                                            "value": "9200"
                                        }, 
                                        {
                                            "name": "ES_CLIENT_CERT", 
                                            "value": "/etc/curator/keys/cert"
                                        }, 
                                        {
                                            "name": "ES_CLIENT_KEY", 
                                            "value": "/etc/curator/keys/key"
                                        }, 
                                        {
                                            "name": "ES_CA", 
                                            "value": "/etc/curator/keys/ca"
                                        }, 
                                        {
                                            "name": "CURATOR_DEFAULT_DAYS", 
                                            "value": "30"
                                        }, 
                                        {
                                            "name": "CURATOR_RUN_HOUR", 
                                            "value": "0"
                                        }, 
                                        {
                                            "name": "CURATOR_RUN_MINUTE", 
                                            "value": "0"
                                        }, 
                                        {
                                            "name": "CURATOR_RUN_TIMEZONE", 
                                            "value": "UTC"
                                        }, 
                                        {
                                            "name": "CURATOR_SCRIPT_LOG_LEVEL", 
                                            "value": "INFO"
                                        }, 
                                        {
                                            "name": "CURATOR_LOG_LEVEL", 
                                            "value": "ERROR"
                                        }
                                    ], 
                                    "image": "172.30.224.2:5000/logging/logging-curator:latest", 
                                    "imagePullPolicy": "Always", 
                                    "name": "curator", 
                                    "resources": {
                                        "limits": {
                                            "cpu": "100m"
                                        }
                                    }, 
                                    "terminationMessagePath": "/dev/termination-log", 
                                    "terminationMessagePolicy": "File", 
                                    "volumeMounts": [
                                        {
                                            "mountPath": "/etc/curator/keys", 
                                            "name": "certs", 
                                            "readOnly": true
                                        }, 
                                        {
                                            "mountPath": "/etc/curator/settings", 
                                            "name": "config", 
                                            "readOnly": true
                                        }
                                    ]
                                }
                            ], 
                            "dnsPolicy": "ClusterFirst", 
                            "restartPolicy": "Always", 
                            "schedulerName": "default-scheduler", 
                            "securityContext": {}, 
                            "serviceAccount": "aggregated-logging-curator", 
                            "serviceAccountName": "aggregated-logging-curator", 
                            "terminationGracePeriodSeconds": 30, 
                            "volumes": [
                                {
                                    "name": "certs", 
                                    "secret": {
                                        "defaultMode": 420, 
                                        "secretName": "logging-curator"
                                    }
                                }, 
                                {
                                    "configMap": {
                                        "defaultMode": 420, 
                                        "name": "logging-curator"
                                    }, 
                                    "name": "config"
                                }
                            ]
                        }
                    }, 
                    "test": false, 
                    "triggers": [
                        {
                            "type": "ConfigChange"
                        }
                    ]
                }, 
                "status": {
                    "availableReplicas": 0, 
                    "conditions": [
                        {
                            "lastTransitionTime": "2017-06-09T14:39:15Z", 
                            "lastUpdateTime": "2017-06-09T14:39:15Z", 
                            "message": "Deployment config does not have minimum availability.", 
                            "status": "False", 
                            "type": "Available"
                        }, 
                        {
                            "lastTransitionTime": "2017-06-09T14:39:15Z", 
                            "lastUpdateTime": "2017-06-09T14:39:15Z", 
                            "message": "replication controller \"logging-curator-ops-1\" is waiting for pod \"logging-curator-ops-1-deploy\" to run", 
                            "status": "Unknown", 
                            "type": "Progressing"
                        }
                    ], 
                    "details": {
                        "causes": [
                            {
                                "type": "ConfigChange"
                            }
                        ], 
                        "message": "config change"
                    }, 
                    "latestVersion": 1, 
                    "observedGeneration": 2, 
                    "replicas": 0, 
                    "unavailableReplicas": 0, 
                    "updatedReplicas": 0
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
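
NOTE: the ops DeploymentConfig differs from the non-ops one only in its names/labels
and in ES_HOST=logging-es-ops. A quick way to confirm that, using bash process
substitution:

    diff <(oc get dc logging-curator -n logging -o yaml) \
         <(oc get dc logging-curator-ops -n logging -o yaml)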

TASK [openshift_logging_curator : Delete temp directory] ***********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:109
ok: [openshift] => {
    "changed": false, 
    "path": "/tmp/openshift-logging-ansible-vywoUO", 
    "state": "absent"
}

TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:226
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:241
statically included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml

TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:2
 [WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_fluentd_nodeselector.keys()
| count }} > 1
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}
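
NOTE: the [WARNING] above is cosmetic, not a failure. Ansible expects bare Jinja2 in
"when:" conditions, so the conventional fix in the role would be to drop the
delimiters, i.e. when: openshift_logging_fluentd_nodeselector.keys() | count > 1
rather than wrapping the expression in {{ }}.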

TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:6
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:10
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:14
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml:3
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml:7
ok: [openshift] => {
    "ansible_facts": {
        "fluentd_version": "3_5"
    }, 
    "changed": false
}

TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml:12
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml:15
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:20
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:26
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_fluentd : Create temp directory for doing work in] *****
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:33
ok: [openshift] => {
    "changed": false, 
    "cmd": [
        "mktemp", 
        "-d", 
        "/tmp/openshift-logging-ansible-XXXXXX"
    ], 
    "delta": "0:00:00.002006", 
    "end": "2017-06-09 10:39:19.619753", 
    "rc": 0, 
    "start": "2017-06-09 10:39:19.617747"
}

STDOUT:

/tmp/openshift-logging-ansible-hFvrLb

TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:38
ok: [openshift] => {
    "ansible_facts": {
        "tempdir": "/tmp/openshift-logging-ansible-hFvrLb"
    }, 
    "changed": false
}

TASK [openshift_logging_fluentd : Create templates subdirectory] ***************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:41
ok: [openshift] => {
    "changed": false, 
    "gid": 0, 
    "group": "root", 
    "mode": "0755", 
    "owner": "root", 
    "path": "/tmp/openshift-logging-ansible-hFvrLb/templates", 
    "secontext": "unconfined_u:object_r:user_tmp_t:s0", 
    "size": 6, 
    "state": "directory", 
    "uid": 0
}

TASK [openshift_logging_fluentd : Create Fluentd service account] **************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:51
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_fluentd : Create Fluentd service account] **************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:59
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get sa aggregated-logging-fluentd -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "kind": "ServiceAccount", 
                "metadata": {
                    "annotations": {
                        "openshift.io/create-dockercfg-secrets.pending-token": "aggregated-logging-fluentd-token-6sxnn"
                    }, 
                    "creationTimestamp": "2017-06-09T14:39:20Z", 
                    "name": "aggregated-logging-fluentd", 
                    "namespace": "logging", 
                    "resourceVersion": "1624", 
                    "selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-fluentd", 
                    "uid": "6c61f6fc-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "secrets": [
                    {
                        "name": "aggregated-logging-fluentd-token-3n7cl"
                    }
                ]
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
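
NOTE: the openshift.io/create-dockercfg-secrets.pending-token annotation indicates the
token controller had not yet finished minting this account's dockercfg secret at read
time, which is why only the token secret is listed. A follow-up query should show both:

    oc get sa aggregated-logging-fluentd -n logging -o jsonpath='{.secrets[*].name}'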

TASK [openshift_logging_fluentd : Set privileged permissions for Fluentd] ******
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:68
changed: [openshift] => {
    "changed": true, 
    "present": "present", 
    "results": {
        "cmd": "/bin/oc adm policy add-scc-to-user privileged system:serviceaccount:logging:aggregated-logging-fluentd -n logging", 
        "results": "", 
        "returncode": 0
    }
}
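
NOTE: fluentd runs as a privileged pod so that it can read log files directly from each
node, hence the privileged SCC grant. To verify the grant took effect:

    oc get scc privileged -o jsonpath='{.users}'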

TASK [openshift_logging_fluentd : Set cluster-reader permissions for Fluentd] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:77
changed: [openshift] => {
    "changed": true, 
    "present": "present", 
    "results": {
        "cmd": "/bin/oc adm policy add-cluster-role-to-user cluster-reader system:serviceaccount:logging:aggregated-logging-fluentd -n logging", 
        "results": "", 
        "returncode": 0
    }
}
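
NOTE: cluster-reader lets fluentd resolve pod and namespace metadata for records from
any project. Listing which cluster role bindings mention the service account (output
format varies by client version):

    oc get clusterrolebindings -o wide | grep aggregated-logging-fluentd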

TASK [openshift_logging_fluentd : template] ************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:86
ok: [openshift] => {
    "changed": false, 
    "checksum": "a8c8596f5fc2c5dd7c8d33d244af17a2555be086", 
    "dest": "/tmp/openshift-logging-ansible-hFvrLb/fluent.conf", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "579698b48ffce6276ee0e8d5ac71a338", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 1301, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019162.78-163081589611217/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:94
ok: [openshift] => {
    "changed": false, 
    "checksum": "b3e75eddc4a0765edc77da092384c0c6f95440e1", 
    "dest": "/tmp/openshift-logging-ansible-hFvrLb/fluentd-throttle-config.yaml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "25871b8e0a9bedc166a6029872a6c336", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 133, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019163.2-113128984754267/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:100
ok: [openshift] => {
    "changed": false, 
    "checksum": "a3aa36da13f3108aa4ad5b98d4866007b44e9798", 
    "dest": "/tmp/openshift-logging-ansible-hFvrLb/secure-forward.conf", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "1084b00c427f4fa48dfc66d6ad6555d4", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 563, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019163.5-246373269277544/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:107
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:113
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:119
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging_fluentd : Set Fluentd configmap] ***********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:125
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get configmap logging-fluentd -o json -n logging", 
        "results": [
            {
                "apiVersion": "v1", 
                "data": {
                    "fluent.conf": "# This file is the fluentd configuration entrypoint. Edit with care.\n\n@include configs.d/openshift/system.conf\n\n# In each section below, pre- and post- includes don't include anything initially;\n# they exist to enable future additions to openshift conf as needed.\n\n## sources\n## ordered so that syslog always runs last...\n@include configs.d/openshift/input-pre-*.conf\n@include configs.d/dynamic/input-docker-*.conf\n@include configs.d/dynamic/input-syslog-*.conf\n@include configs.d/openshift/input-post-*.conf\n##\n\n<label @INGRESS>\n## filters\n  @include configs.d/openshift/filter-pre-*.conf\n  @include configs.d/openshift/filter-retag-journal.conf\n  @include configs.d/openshift/filter-k8s-meta.conf\n  @include configs.d/openshift/filter-kibana-transform.conf\n  @include configs.d/openshift/filter-k8s-flatten-hash.conf\n  @include configs.d/openshift/filter-k8s-record-transform.conf\n  @include configs.d/openshift/filter-syslog-record-transform.conf\n  @include configs.d/openshift/filter-viaq-data-model.conf\n  @include configs.d/openshift/filter-post-*.conf\n##\n\n## matches\n  @include configs.d/openshift/output-pre-*.conf\n  @include configs.d/openshift/output-operations.conf\n  @include configs.d/openshift/output-applications.conf\n  # no post - applications.conf matches everything left\n##\n</label>\n", 
                    "secure-forward.conf": "# @type secure_forward\n\n# self_hostname ${HOSTNAME}\n# shared_key <SECRET_STRING>\n\n# secure yes\n# enable_strict_verification yes\n\n# ca_cert_path /etc/fluent/keys/your_ca_cert\n# ca_private_key_path /etc/fluent/keys/your_private_key\n  # for private CA secret key\n# ca_private_key_passphrase passphrase\n\n# <server>\n  # or IP\n#   host server.fqdn.example.com\n#   port 24284\n# </server>\n# <server>\n  # ip address to connect\n#   host 203.0.113.8\n  # specify hostlabel for FQDN verification if ipaddress is used for host\n#   hostlabel server.fqdn.example.com\n# </server>\n", 
                    "throttle-config.yaml": "# Logging example fluentd throttling config file\n\n#example-project:\n#  read_lines_limit: 10\n#\n#.operations:\n#  read_lines_limit: 100\n"
                }, 
                "kind": "ConfigMap", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:39:24Z", 
                    "name": "logging-fluentd", 
                    "namespace": "logging", 
                    "resourceVersion": "1651", 
                    "selfLink": "/api/v1/namespaces/logging/configmaps/logging-fluentd", 
                    "uid": "6ead881a-4d21-11e7-83b0-0e6fb895db82"
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
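
NOTE: the fluent.conf above is only an include skeleton; the actual pipeline lives in
the configs.d files baked into the fluentd image. Any of the three configmap keys can
be read back as mounted into the pod, e.g.:

    oc get configmap logging-fluentd -n logging -o jsonpath='{.data.fluent\.conf}'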

TASK [openshift_logging_fluentd : Set logging-fluentd secret] ******************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:137
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc secrets new logging-fluentd ca=/etc/origin/logging/ca.crt key=/etc/origin/logging/system.logging.fluentd.key cert=/etc/origin/logging/system.logging.fluentd.crt -n logging", 
        "results": "", 
        "returncode": 0
    }, 
    "state": "present"
}

TASK [openshift_logging_fluentd : Generate logging-fluentd daemonset definition] ***
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:154
ok: [openshift] => {
    "changed": false, 
    "checksum": "51309307ea0991a129258ea566d883d9b663beb2", 
    "dest": "/tmp/openshift-logging-ansible-hFvrLb/templates/logging-fluentd.yaml", 
    "gid": 0, 
    "group": "root", 
    "md5sum": "dc3385ee3b7b11ac7dad4a71b34cbdb3", 
    "mode": "0644", 
    "owner": "root", 
    "secontext": "unconfined_u:object_r:admin_home_t:s0", 
    "size": 3413, 
    "src": "/root/.ansible/tmp/ansible-tmp-1497019165.39-234714267889408/source", 
    "state": "file", 
    "uid": 0
}

TASK [openshift_logging_fluentd : Set logging-fluentd daemonset] ***************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:172
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc get daemonset logging-fluentd -o json -n logging", 
        "results": [
            {
                "apiVersion": "extensions/v1beta1", 
                "kind": "DaemonSet", 
                "metadata": {
                    "creationTimestamp": "2017-06-09T14:39:26Z", 
                    "generation": 1, 
                    "labels": {
                        "component": "fluentd", 
                        "logging-infra": "fluentd", 
                        "provider": "openshift"
                    }, 
                    "name": "logging-fluentd", 
                    "namespace": "logging", 
                    "resourceVersion": "1654", 
                    "selfLink": "/apis/extensions/v1beta1/namespaces/logging/daemonsets/logging-fluentd", 
                    "uid": "6fc4234a-4d21-11e7-83b0-0e6fb895db82"
                }, 
                "spec": {
                    "selector": {
                        "matchLabels": {
                            "component": "fluentd", 
                            "provider": "openshift"
                        }
                    }, 
                    "template": {
                        "metadata": {
                            "creationTimestamp": null, 
                            "labels": {
                                "component": "fluentd", 
                                "logging-infra": "fluentd", 
                                "provider": "openshift"
                            }, 
                            "name": "fluentd-elasticsearch"
                        }, 
                        "spec": {
                            "containers": [
                                {
                                    "env": [
                                        {
                                            "name": "K8S_HOST_URL", 
                                            "value": "https://kubernetes.default.svc.cluster.local"
                                        }, 
                                        {
                                            "name": "ES_HOST", 
                                            "value": "logging-es"
                                        }, 
                                        {
                                            "name": "ES_PORT", 
                                            "value": "9200"
                                        }, 
                                        {
                                            "name": "ES_CLIENT_CERT", 
                                            "value": "/etc/fluent/keys/cert"
                                        }, 
                                        {
                                            "name": "ES_CLIENT_KEY", 
                                            "value": "/etc/fluent/keys/key"
                                        }, 
                                        {
                                            "name": "ES_CA", 
                                            "value": "/etc/fluent/keys/ca"
                                        }, 
                                        {
                                            "name": "OPS_HOST", 
                                            "value": "logging-es-ops"
                                        }, 
                                        {
                                            "name": "OPS_PORT", 
                                            "value": "9200"
                                        }, 
                                        {
                                            "name": "OPS_CLIENT_CERT", 
                                            "value": "/etc/fluent/keys/cert"
                                        }, 
                                        {
                                            "name": "OPS_CLIENT_KEY", 
                                            "value": "/etc/fluent/keys/key"
                                        }, 
                                        {
                                            "name": "OPS_CA", 
                                            "value": "/etc/fluent/keys/ca"
                                        }, 
                                        {
                                            "name": "ES_COPY", 
                                            "value": "false"
                                        }, 
                                        {
                                            "name": "USE_JOURNAL", 
                                            "value": "true"
                                        }, 
                                        {
                                            "name": "JOURNAL_SOURCE"
                                        }, 
                                        {
                                            "name": "JOURNAL_READ_FROM_HEAD", 
                                            "value": "false"
                                        }
                                    ], 
                                    "image": "172.30.224.2:5000/logging/logging-fluentd:latest", 
                                    "imagePullPolicy": "Always", 
                                    "name": "fluentd-elasticsearch", 
                                    "resources": {
                                        "limits": {
                                            "cpu": "100m", 
                                            "memory": "512Mi"
                                        }
                                    }, 
                                    "securityContext": {
                                        "privileged": true
                                    }, 
                                    "terminationMessagePath": "/dev/termination-log", 
                                    "terminationMessagePolicy": "File", 
                                    "volumeMounts": [
                                        {
                                            "mountPath": "/run/log/journal", 
                                            "name": "runlogjournal"
                                        }, 
                                        {
                                            "mountPath": "/var/log", 
                                            "name": "varlog"
                                        }, 
                                        {
                                            "mountPath": "/var/lib/docker/containers", 
                                            "name": "varlibdockercontainers", 
                                            "readOnly": true
                                        }, 
                                        {
                                            "mountPath": "/etc/fluent/configs.d/user", 
                                            "name": "config", 
                                            "readOnly": true
                                        }, 
                                        {
                                            "mountPath": "/etc/fluent/keys", 
                                            "name": "certs", 
                                            "readOnly": true
                                        }, 
                                        {
                                            "mountPath": "/etc/docker-hostname", 
                                            "name": "dockerhostname", 
                                            "readOnly": true
                                        }, 
                                        {
                                            "mountPath": "/etc/localtime", 
                                            "name": "localtime", 
                                            "readOnly": true
                                        }, 
                                        {
                                            "mountPath": "/etc/sysconfig/docker", 
                                            "name": "dockercfg", 
                                            "readOnly": true
                                        }, 
                                        {
                                            "mountPath": "/etc/docker", 
                                            "name": "dockerdaemoncfg", 
                                            "readOnly": true
                                        }
                                    ]
                                }
                            ], 
                            "dnsPolicy": "ClusterFirst", 
                            "nodeSelector": {
                                "logging-infra-fluentd": "true"
                            }, 
                            "restartPolicy": "Always", 
                            "schedulerName": "default-scheduler", 
                            "securityContext": {}, 
                            "serviceAccount": "aggregated-logging-fluentd", 
                            "serviceAccountName": "aggregated-logging-fluentd", 
                            "terminationGracePeriodSeconds": 30, 
                            "volumes": [
                                {
                                    "hostPath": {
                                        "path": "/run/log/journal"
                                    }, 
                                    "name": "runlogjournal"
                                }, 
                                {
                                    "hostPath": {
                                        "path": "/var/log"
                                    }, 
                                    "name": "varlog"
                                }, 
                                {
                                    "hostPath": {
                                        "path": "/var/lib/docker/containers"
                                    }, 
                                    "name": "varlibdockercontainers"
                                }, 
                                {
                                    "configMap": {
                                        "defaultMode": 420, 
                                        "name": "logging-fluentd"
                                    }, 
                                    "name": "config"
                                }, 
                                {
                                    "name": "certs", 
                                    "secret": {
                                        "defaultMode": 420, 
                                        "secretName": "logging-fluentd"
                                    }
                                }, 
                                {
                                    "hostPath": {
                                        "path": "/etc/hostname"
                                    }, 
                                    "name": "dockerhostname"
                                }, 
                                {
                                    "hostPath": {
                                        "path": "/etc/localtime"
                                    }, 
                                    "name": "localtime"
                                }, 
                                {
                                    "hostPath": {
                                        "path": "/etc/sysconfig/docker"
                                    }, 
                                    "name": "dockercfg"
                                }, 
                                {
                                    "hostPath": {
                                        "path": "/etc/docker"
                                    }, 
                                    "name": "dockerdaemoncfg"
                                }
                            ]
                        }
                    }, 
                    "templateGeneration": 1, 
                    "updateStrategy": {
                        "rollingUpdate": {
                            "maxUnavailable": 1
                        }, 
                        "type": "RollingUpdate"
                    }
                }, 
                "status": {
                    "currentNumberScheduled": 0, 
                    "desiredNumberScheduled": 0, 
                    "numberMisscheduled": 0, 
                    "numberReady": 0, 
                    "observedGeneration": 1
                }
            }
        ], 
        "returncode": 0
    }, 
    "state": "present"
}
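
Note the status block: desiredNumberScheduled and numberReady are still 0 because the pod template's nodeSelector (logging-infra-fluentd=true) matches no node yet; the label task further down is what actually schedules Fluentd. One way to watch the daemonset converge after labeling (a sketch, not part of the playbook):

    oc get daemonset logging-fluentd -n logging \
        -o jsonpath='{.status.desiredNumberScheduled} {.status.numberReady}'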

TASK [openshift_logging_fluentd : Retrieve list of Fluentd hosts] **************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:183
ok: [openshift] => {
    "changed": false, 
    "results": {
        "cmd": "/bin/oc get node -o json -n default", 
        "results": [
            {
                "apiVersion": "v1", 
                "items": [
                    {
                        "apiVersion": "v1", 
                        "kind": "Node", 
                        "metadata": {
                            "annotations": {
                                "volumes.kubernetes.io/controller-managed-attach-detach": "true"
                            }, 
                            "creationTimestamp": "2017-06-09T14:23:20Z", 
                            "labels": {
                                "beta.kubernetes.io/arch": "amd64", 
                                "beta.kubernetes.io/os": "linux", 
                                "kubernetes.io/hostname": "172.18.4.93"
                            }, 
                            "name": "172.18.4.93", 
                            "namespace": "", 
                            "resourceVersion": "1628", 
                            "selfLink": "/api/v1/nodes/172.18.4.93", 
                            "uid": "30393f5f-4d1f-11e7-83b0-0e6fb895db82"
                        }, 
                        "spec": {
                            "externalID": "172.18.4.93", 
                            "providerID": "aws:////i-08eae8de52d2e283e"
                        }, 
                        "status": {
                            "addresses": [
                                {
                                    "address": "172.18.4.93", 
                                    "type": "LegacyHostIP"
                                }, 
                                {
                                    "address": "172.18.4.93", 
                                    "type": "InternalIP"
                                }, 
                                {
                                    "address": "172.18.4.93", 
                                    "type": "Hostname"
                                }
                            ], 
                            "allocatable": {
                                "cpu": "4", 
                                "memory": "7129288Ki", 
                                "pods": "40"
                            }, 
                            "capacity": {
                                "cpu": "4", 
                                "memory": "7231688Ki", 
                                "pods": "40"
                            }, 
                            "conditions": [
                                {
                                    "lastHeartbeatTime": "2017-06-09T14:39:21Z", 
                                    "lastTransitionTime": "2017-06-09T14:23:20Z", 
                                    "message": "kubelet has sufficient disk space available", 
                                    "reason": "KubeletHasSufficientDisk", 
                                    "status": "False", 
                                    "type": "OutOfDisk"
                                }, 
                                {
                                    "lastHeartbeatTime": "2017-06-09T14:39:21Z", 
                                    "lastTransitionTime": "2017-06-09T14:23:20Z", 
                                    "message": "kubelet has sufficient memory available", 
                                    "reason": "KubeletHasSufficientMemory", 
                                    "status": "False", 
                                    "type": "MemoryPressure"
                                }, 
                                {
                                    "lastHeartbeatTime": "2017-06-09T14:39:21Z", 
                                    "lastTransitionTime": "2017-06-09T14:23:20Z", 
                                    "message": "kubelet has no disk pressure", 
                                    "reason": "KubeletHasNoDiskPressure", 
                                    "status": "False", 
                                    "type": "DiskPressure"
                                }, 
                                {
                                    "lastHeartbeatTime": "2017-06-09T14:39:21Z", 
                                    "lastTransitionTime": "2017-06-09T14:23:20Z", 
                                    "message": "kubelet is posting ready status", 
                                    "reason": "KubeletReady", 
                                    "status": "True", 
                                    "type": "Ready"
                                }
                            ], 
                            "daemonEndpoints": {
                                "kubeletEndpoint": {
                                    "Port": 10250
                                }
                            }, 
                            "images": [
                                {
                                    "names": [
                                        "openshift/origin-federation:6acabdc", 
                                        "openshift/origin-federation:latest"
                                    ], 
                                    "sizeBytes": 1205885664
                                }, 
                                {
                                    "names": [
                                        "docker.io/openshift/origin-docker-registry@sha256:0601ffd0ff2b7258926bde100b285cf824e012438e15e1ad808ea5e3bbdecc12", 
                                        "docker.io/openshift/origin-docker-registry:latest"
                                    ], 
                                    "sizeBytes": 1100570695
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-docker-registry:6acabdc", 
                                        "openshift/origin-docker-registry:latest"
                                    ], 
                                    "sizeBytes": 1100164272
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-gitserver:6acabdc", 
                                        "openshift/origin-gitserver:latest"
                                    ], 
                                    "sizeBytes": 1086520226
                                }, 
                                {
                                    "names": [
                                        "openshift/node:6acabdc", 
                                        "openshift/node:latest"
                                    ], 
                                    "sizeBytes": 1051721928
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-keepalived-ipfailover:6acabdc", 
                                        "openshift/origin-keepalived-ipfailover:latest"
                                    ], 
                                    "sizeBytes": 1028529711
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-haproxy-router:latest"
                                    ], 
                                    "sizeBytes": 1022758742
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-docker-builder:6acabdc", 
                                        "openshift/origin-docker-builder:latest"
                                    ], 
                                    "sizeBytes": 1001728427
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-deployer:6acabdc", 
                                        "openshift/origin-deployer:latest"
                                    ], 
                                    "sizeBytes": 1001728427
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-f5-router:6acabdc", 
                                        "openshift/origin-f5-router:latest"
                                    ], 
                                    "sizeBytes": 1001728427
                                }, 
                                {
                                    "names": [
                                        "openshift/origin:6acabdc", 
                                        "openshift/origin:latest"
                                    ], 
                                    "sizeBytes": 1001728427
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-sti-builder:6acabdc", 
                                        "openshift/origin-sti-builder:latest"
                                    ], 
                                    "sizeBytes": 1001728427
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-recycler:6acabdc", 
                                        "openshift/origin-recycler:latest"
                                    ], 
                                    "sizeBytes": 1001728427
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-cluster-capacity:6acabdc", 
                                        "openshift/origin-cluster-capacity:latest"
                                    ], 
                                    "sizeBytes": 962455026
                                }, 
                                {
                                    "names": [
                                        "rhel7.1:latest"
                                    ], 
                                    "sizeBytes": 765301508
                                }, 
                                {
                                    "names": [
                                        "openshift/dind-master:latest"
                                    ], 
                                    "sizeBytes": 731456758
                                }, 
                                {
                                    "names": [
                                        "openshift/dind-node:latest"
                                    ], 
                                    "sizeBytes": 731453034
                                }, 
                                {
                                    "names": [
                                        "172.30.224.2:5000/logging/logging-auth-proxy@sha256:63567bf13e7d4ad50117140426e98a4dcf59048ec0bf0e28f4ed074f8cda8155", 
                                        "172.30.224.2:5000/logging/logging-auth-proxy:latest"
                                    ], 
                                    "sizeBytes": 715536092
                                }, 
                                {
                                    "names": [
                                        "docker.io/node@sha256:46db0dd19955beb87b841c30a6b9812ba626473283e84117d1c016deee5949a9", 
                                        "docker.io/node:0.10.36"
                                    ], 
                                    "sizeBytes": 697128386
                                }, 
                                {
                                    "names": [
                                        "docker.io/openshift/origin-logging-kibana@sha256:950568237cc7d0ff14ea9fe22c3967d888996db70c66181421ad68caeb5ba75f", 
                                        "docker.io/openshift/origin-logging-kibana:latest"
                                    ], 
                                    "sizeBytes": 682851513
                                }, 
                                {
                                    "names": [
                                        "172.30.224.2:5000/logging/logging-kibana@sha256:bcb762f029371abc58677ae894e2e1bd96d2509b05c9c863d3ecd17b05d07272", 
                                        "172.30.224.2:5000/logging/logging-kibana:latest"
                                    ], 
                                    "sizeBytes": 682851459
                                }, 
                                {
                                    "names": [
                                        "openshift/dind:latest"
                                    ], 
                                    "sizeBytes": 640650210
                                }, 
                                {
                                    "names": [
                                        "172.30.224.2:5000/logging/logging-elasticsearch@sha256:16dcdc717d95c0fabe6ec7713bfe6c9261d4de9a56962894965ae45c63a347ec", 
                                        "172.30.224.2:5000/logging/logging-elasticsearch:latest"
                                    ], 
                                    "sizeBytes": 623513030
                                }, 
                                {
                                    "names": [
                                        "172.30.224.2:5000/logging/logging-fluentd@sha256:130794deff858df95acbfe5e44921ee244118b1973ebce37aa6587765155b940", 
                                        "172.30.224.2:5000/logging/logging-fluentd:latest"
                                    ], 
                                    "sizeBytes": 472184910
                                }, 
                                {
                                    "names": [
                                        "docker.io/openshift/origin-logging-elasticsearch@sha256:6296f1719676e970438cac4d912542b35ac786c14a15df892507007c4ecbe490", 
                                        "docker.io/openshift/origin-logging-elasticsearch:latest"
                                    ], 
                                    "sizeBytes": 425567196
                                }, 
                                {
                                    "names": [
                                        "172.30.224.2:5000/logging/logging-curator@sha256:c31caeff56c054df608838150e51377d6a79e2e8a33f48750d025d7690812a65", 
                                        "172.30.224.2:5000/logging/logging-curator:latest"
                                    ], 
                                    "sizeBytes": 418288265
                                }, 
                                {
                                    "names": [
                                        "docker.io/openshift/base-centos7@sha256:aea292a3bddba020cde0ee83e6a45807931eb607c164ec6a3674f67039d8cd7c", 
                                        "docker.io/openshift/base-centos7:latest"
                                    ], 
                                    "sizeBytes": 383049978
                                }, 
                                {
                                    "names": [
                                        "rhel7.2:latest"
                                    ], 
                                    "sizeBytes": 377493597
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-egress-router:6acabdc", 
                                        "openshift/origin-egress-router:latest"
                                    ], 
                                    "sizeBytes": 364745713
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-base:latest"
                                    ], 
                                    "sizeBytes": 363070172
                                }, 
                                {
                                    "names": [
                                        "<none>@<none>", 
                                        "<none>:<none>"
                                    ], 
                                    "sizeBytes": 363024702
                                }, 
                                {
                                    "names": [
                                        "docker.io/openshift/origin-logging-fluentd@sha256:cae7c21c9f111d4f5b481c14a65c597c67e715a8ffe3aee4c483100ee77296d7", 
                                        "docker.io/openshift/origin-logging-fluentd:latest"
                                    ], 
                                    "sizeBytes": 359223728
                                }, 
                                {
                                    "names": [
                                        "docker.io/fedora@sha256:69281ddd7b2600e5f2b17f1e12d7fba25207f459204fb2d15884f8432c479136", 
                                        "docker.io/fedora:25"
                                    ], 
                                    "sizeBytes": 230864375
                                }, 
                                {
                                    "names": [
                                        "docker.io/openshift/origin-logging-curator@sha256:daded10ff4e08dfb6659c964e305f16679596312da558af095835202cf66f703", 
                                        "docker.io/openshift/origin-logging-curator:latest"
                                    ], 
                                    "sizeBytes": 224977669
                                }, 
                                {
                                    "names": [
                                        "rhel7.3:latest", 
                                        "rhel7:latest"
                                    ], 
                                    "sizeBytes": 219121266
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-pod:6acabdc", 
                                        "openshift/origin-pod:latest"
                                    ], 
                                    "sizeBytes": 213199843
                                }, 
                                {
                                    "names": [
                                        "registry.access.redhat.com/rhel7.2@sha256:98e6ca5d226c26e31a95cd67716afe22833c943e1926a21daf1a030906a02249", 
                                        "registry.access.redhat.com/rhel7.2:latest"
                                    ], 
                                    "sizeBytes": 201376319
                                }, 
                                {
                                    "names": [
                                        "registry.access.redhat.com/rhel7.3@sha256:1e232401d8e0ba53b36b757b4712fbcbd1dab9c21db039c45a84871a74e89e68", 
                                        "registry.access.redhat.com/rhel7.3:latest"
                                    ], 
                                    "sizeBytes": 192693772
                                }, 
                                {
                                    "names": [
                                        "docker.io/centos@sha256:bba1de7c9d900a898e3cadbae040dfe8a633c06bc104a0df76ae24483e03c077"
                                    ], 
                                    "sizeBytes": 192548999
                                }, 
                                {
                                    "names": [
                                        "openshift/origin-source:latest"
                                    ], 
                                    "sizeBytes": 192548894
                                }, 
                                {
                                    "names": [
                                        "docker.io/centos@sha256:aebf12af704307dfa0079b3babdca8d7e8ff6564696882bcb5d11f1d461f9ee9", 
                                        "docker.io/centos:7", 
                                        "docker.io/centos:centos7"
                                    ], 
                                    "sizeBytes": 192548537
                                }, 
                                {
                                    "names": [
                                        "registry.access.redhat.com/rhel7.1@sha256:1bc5a4c43bbb29a5a96a61896ff696933be3502e2f5fdc4cde02d9e101731fdd", 
                                        "registry.access.redhat.com/rhel7.1:latest"
                                    ], 
                                    "sizeBytes": 158229901
                                }, 
                                {
                                    "names": [
                                        "openshift/hello-openshift:6acabdc", 
                                        "openshift/hello-openshift:latest"
                                    ], 
                                    "sizeBytes": 5643318
                                }
                            ], 
                            "nodeInfo": {
                                "architecture": "amd64", 
                                "bootID": "9b91d16b-8962-41b1-a934-bcdeca0205d2", 
                                "containerRuntimeVersion": "docker://1.12.6", 
                                "kernelVersion": "3.10.0-327.22.2.el7.x86_64", 
                                "kubeProxyVersion": "v1.6.1+5115d708d7", 
                                "kubeletVersion": "v1.6.1+5115d708d7", 
                                "machineID": "f9370ed252a14f73b014c1301a9b6d1b", 
                                "operatingSystem": "linux", 
                                "osImage": "Red Hat Enterprise Linux Server 7.3 (Maipo)", 
                                "systemUUID": "EC20179D-CEE7-8FA3-53A5-5B49D0B44786"
                            }
                        }
                    }
                ], 
                "kind": "List", 
                "metadata": {}, 
                "resourceVersion": "", 
                "selfLink": ""
            }
        ], 
        "returncode": 0
    }, 
    "state": "list"
}
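
The role only needs the node names out of this List object. The equivalent extraction with jq (a sketch, assuming jq is available on the host):

    oc get node -o json -n default | jq -r '.items[].metadata.name'
    # prints: 172.18.4.93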

TASK [openshift_logging_fluentd : Set openshift_logging_fluentd_hosts] *********
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:190
ok: [openshift] => {
    "ansible_facts": {
        "openshift_logging_fluentd_hosts": [
            "172.18.4.93"
        ]
    }, 
    "changed": false
}

TASK [openshift_logging_fluentd : include] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:195
included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/label_and_wait.yaml for openshift

TASK [openshift_logging_fluentd : Label 172.18.4.93 for Fluentd deployment] ****
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/label_and_wait.yaml:2
changed: [openshift] => {
    "changed": true, 
    "results": {
        "cmd": "/bin/oc label node 172.18.4.93 logging-infra-fluentd=true --overwrite", 
        "results": "", 
        "returncode": 0
    }, 
    "state": "add"
}
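
This single label is what triggers the Fluentd rollout, since it satisfies the daemonset's nodeSelector shown earlier. A quick check that it took effect (a sketch):

    oc get nodes -l logging-infra-fluentd=true -o name
    # lists 172.18.4.93 once the label has been applied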

TASK [openshift_logging_fluentd : command] *************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/label_and_wait.yaml:10
changed: [openshift -> 127.0.0.1] => {
    "changed": true, 
    "cmd": [
        "sleep", 
        "0.5"
    ], 
    "delta": "0:00:00.502438", 
    "end": "2017-06-09 10:39:28.473332", 
    "rc": 0, 
    "start": "2017-06-09 10:39:27.970894"
}

TASK [openshift_logging_fluentd : Delete temp directory] ***********************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:202
ok: [openshift] => {
    "changed": false, 
    "path": "/tmp/openshift-logging-ansible-hFvrLb", 
    "state": "absent"
}

TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:253
included: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/update_master_config.yaml for openshift

TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/main.yaml:36
skipping: [openshift] => {
    "changed": false, 
    "skip_reason": "Conditional result was False", 
    "skipped": true
}

TASK [openshift_logging : Cleaning up local temp dir] **************************
task path: /tmp/tmp.7yWYbvLCJ1/openhift-ansible/roles/openshift_logging/tasks/main.yaml:40
ok: [openshift -> 127.0.0.1] => {
    "changed": false, 
    "path": "/tmp/openshift-logging-ansible-XvDfd8", 
    "state": "absent"
}
META: ran handlers
META: ran handlers

PLAY [Update Master configs] ***************************************************
skipping: no hosts matched

PLAY RECAP *********************************************************************
localhost                  : ok=2    changed=0    unreachable=0    failed=0   
openshift                  : ok=213  changed=71   unreachable=0    failed=0   

/data/src/github.com/openshift/origin-aggregated-logging
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:170: executing 'oc get pods -l component=es' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.282s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:170: executing 'oc get pods -l component=es' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME                                      READY     STATUS    RESTARTS   AGE
logging-es-data-master-nij68urm-1-7f3ck   1/1       Running   0          1m

There was no error output from the command.
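
Each of these "expecting any result and text ...; re-trying every 0.2s" checks comes from the repository's retry helpers; stripped of the bookkeeping, the behavior is roughly this poll loop (a simplified sketch, not the harness code itself):

    # retry until the pod listing mentions Running, as the helper does for up to 180s
    until oc get pods -l component=es 2>/dev/null | grep -q Running; do
        sleep 0.2
    done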
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:171: executing 'oc get pods -l component=kibana' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.280s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:171: executing 'oc get pods -l component=kibana' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME                     READY     STATUS    RESTARTS   AGE
logging-kibana-1-q2g1v   2/2       Running   0          39s

There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:172: executing 'oc get pods -l component=curator' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.526s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:172: executing 'oc get pods -l component=curator' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME                      READY     STATUS    RESTARTS   AGE
logging-curator-1-819xv   1/1       Running   0          18s

There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:175: executing 'oc get pods -l component=es-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.239s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:175: executing 'oc get pods -l component=es-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME                                          READY     STATUS    RESTARTS   AGE
logging-es-ops-data-master-1lbwcltv-1-w0wm7   1/1       Running   0          53s

There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:176: executing 'oc get pods -l component=kibana-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.238s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:176: executing 'oc get pods -l component=kibana-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME                         READY     STATUS    RESTARTS   AGE
logging-kibana-ops-1-mq6b8   2/2       Running   0          26s

There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:177: executing 'oc get pods -l component=curator-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.213s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:177: executing 'oc get pods -l component=curator-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME                          READY     STATUS    RESTARTS   AGE
logging-curator-ops-1-qxmbj   1/1       Running   0          14s

There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:185: executing 'oc project logging > /dev/null' expecting success...
SUCCESS after 0.224s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:185: executing 'oc project logging > /dev/null' expecting success
There was no output from the command.
There was no error output from the command.
/data/src/github.com/openshift/origin-aggregated-logging/hack/testing /data/src/github.com/openshift/origin-aggregated-logging
--> Deploying template "logging/logging-fluentd-template-maker" for "-" to project logging

     logging-fluentd-template-maker
     ---------
     Template to create template for fluentd

     * With parameters:
        * MASTER_URL=https://kubernetes.default.svc.cluster.local
        * ES_HOST=logging-es
        * ES_PORT=9200
        * ES_CLIENT_CERT=/etc/fluent/keys/cert
        * ES_CLIENT_KEY=/etc/fluent/keys/key
        * ES_CA=/etc/fluent/keys/ca
        * OPS_HOST=logging-es-ops
        * OPS_PORT=9200
        * OPS_CLIENT_CERT=/etc/fluent/keys/cert
        * OPS_CLIENT_KEY=/etc/fluent/keys/key
        * OPS_CA=/etc/fluent/keys/ca
        * ES_COPY=false
        * ES_COPY_HOST=
        * ES_COPY_PORT=
        * ES_COPY_SCHEME=https
        * ES_COPY_CLIENT_CERT=
        * ES_COPY_CLIENT_KEY=
        * ES_COPY_CA=
        * ES_COPY_USERNAME=
        * ES_COPY_PASSWORD=
        * OPS_COPY_HOST=
        * OPS_COPY_PORT=
        * OPS_COPY_SCHEME=https
        * OPS_COPY_CLIENT_CERT=
        * OPS_COPY_CLIENT_KEY=
        * OPS_COPY_CA=
        * OPS_COPY_USERNAME=
        * OPS_COPY_PASSWORD=
        * IMAGE_PREFIX_DEFAULT=172.30.224.2:5000/logging/
        * IMAGE_VERSION_DEFAULT=latest
        * USE_JOURNAL=
        * JOURNAL_SOURCE=
        * JOURNAL_READ_FROM_HEAD=false
        * USE_MUX=false
        * USE_MUX_CLIENT=false
        * MUX_ALLOW_EXTERNAL=false
        * BUFFER_QUEUE_LIMIT=1024
        * BUFFER_SIZE_LIMIT=16777216

--> Creating resources ...
    template "logging-fluentd-template" created
--> Success
    Run 'oc status' to view your app.
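
This banner is ordinary 'oc new-app' template output: logging-fluentd-template-maker is processed with parameter overrides and creates logging-fluentd-template as its only resource. A hedged sketch of an equivalent invocation (parameter values here are illustrative, not the ones the harness used):

    oc new-app logging-fluentd-template-maker \
        -p USE_JOURNAL=true -p JOURNAL_READ_FROM_HEAD=false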
WARNING: bridge-nf-call-ip6tables is disabled
START wait_for_fluentd_to_catch_up at 2017-06-09 14:39:43.181917679+00:00
added es message 8a328d17-4cf5-4e14-bf64-53ef5ff1d7c7
added es-ops message eeb4b9b4-0450-4a55-90ed-ceae8170de69
good - wait_for_fluentd_to_catch_up: found 1 record project logging for 8a328d17-4cf5-4e14-bf64-53ef5ff1d7c7
good - wait_for_fluentd_to_catch_up: found 1 record project .operations for eeb4b9b4-0450-4a55-90ed-ceae8170de69
END wait_for_fluentd_to_catch_up took 11 seconds at 2017-06-09 14:39:54.843647473+00:00
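
wait_for_fluentd_to_catch_up writes one uniquely tagged message per cluster and polls Elasticsearch until each shows up in the right place: .operations.* for the ops cluster, project.logging.* for the app cluster. Conceptually it reduces to something like the sketch below; the variable names are illustrative, and the secret paths are assumed from the ES pod's admin-cert mount:

    uuid=$(uuidgen)
    logger -p local6.info "$uuid"    # journald record, indexed under .operations.*
    until oc exec "$es_ops_pod" -- curl -s \
            --cacert /etc/elasticsearch/secret/admin-ca \
            --cert /etc/elasticsearch/secret/admin-cert \
            --key /etc/elasticsearch/secret/admin-key \
            "https://localhost:9200/.operations.*/_count?q=message:$uuid" \
            | grep -q '"count":1'; do
        sleep 1
    done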
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:223: executing 'oc login --username=admin --password=admin' expecting success...
SUCCESS after 0.227s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:223: executing 'oc login --username=admin --password=admin' expecting success
Standard output from the command:
Login successful.

You don't have any projects. You can try to create a new project, by running

    oc new-project <projectname>


There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:224: executing 'oc login --username=system:admin' expecting success...
SUCCESS after 0.245s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:224: executing 'oc login --username=system:admin' expecting success
Standard output from the command:
Logged into "https://172.18.4.93:8443" as "system:admin" using existing credentials.

You have access to the following projects and can switch between them with 'oc project <projectname>':

  * default
    kube-public
    kube-system
    logging
    openshift
    openshift-infra

Using project "default".

There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:225: executing 'oadm policy add-cluster-role-to-user cluster-admin admin' expecting success...
SUCCESS after 0.292s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:225: executing 'oadm policy add-cluster-role-to-user cluster-admin admin' expecting success
Standard output from the command:
cluster role "cluster-admin" added: "admin"

There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:226: executing 'oc login --username=loguser --password=loguser' expecting success...
SUCCESS after 0.231s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:226: executing 'oc login --username=loguser --password=loguser' expecting success
Standard output from the command:
Login successful.

You don't have any projects. You can try to create a new project, by running

    oc new-project <projectname>


There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:227: executing 'oc login --username=system:admin' expecting success...
SUCCESS after 0.667s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:227: executing 'oc login --username=system:admin' expecting success
Standard output from the command:
Logged into "https://172.18.4.93:8443" as "system:admin" using existing credentials.

You have access to the following projects and can switch between them with 'oc project <projectname>':

  * default
    kube-public
    kube-system
    logging
    openshift
    openshift-infra

Using project "default".

There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:228: executing 'oc project logging > /dev/null' expecting success...
SUCCESS after 0.336s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:228: executing 'oc project logging > /dev/null' expecting success
There was no output from the command.
There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:229: executing 'oadm policy add-role-to-user view loguser' expecting success...
SUCCESS after 0.244s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:229: executing 'oadm policy add-role-to-user view loguser' expecting success
Standard output from the command:
role "view" added: "loguser"

There was no error output from the command.
Checking if Elasticsearch logging-es-data-master-nij68urm-1-7f3ck is ready
{
    "_id": "0",
    "_index": ".searchguard.logging-es-data-master-nij68urm-1-7f3ck",
    "_shards": {
        "failed": 0,
        "successful": 1,
        "total": 1
    },
    "_type": "rolesmapping",
    "_version": 2,
    "created": false
}
Checking if Elasticsearch logging-es-ops-data-master-1lbwcltv-1-w0wm7 is ready
{
    "_id": "0",
    "_index": ".searchguard.logging-es-ops-data-master-1lbwcltv-1-w0wm7",
    "_shards": {
        "failed": 0,
        "successful": 1,
        "total": 1
    },
    "_type": "rolesmapping",
    "_version": 2,
    "created": false
}
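
Each readiness block above is an Elasticsearch index-API response from seeding the per-pod .searchguard.<pod> index with its ACL documents: "_version": 2 together with "created": false means an existing rolesmapping document was overwritten rather than created. A call producing a response of this shape looks roughly like this (a sketch; secret paths assumed as above, file name hypothetical):

    curl -s --cacert /etc/elasticsearch/secret/admin-ca \
         --cert /etc/elasticsearch/secret/admin-cert \
         --key /etc/elasticsearch/secret/admin-key \
         -XPUT "https://localhost:9200/.searchguard.$es_pod/rolesmapping/0" \
         -d @rolesmapping.json
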
------------------------------------------
     Test 'admin' user can access cluster stats
------------------------------------------
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:265: executing 'test 200 = 200' expecting success...
SUCCESS after 0.009s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:265: executing 'test 200 = 200' expecting success
There was no output from the command.
There was no error output from the command.
------------------------------------------
     Test 'admin' user can access cluster stats for OPS cluster
------------------------------------------
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:274: executing 'test 200 = 200' expecting success...
SUCCESS after 0.010s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:274: executing 'test 200 = 200' expecting success
There was no output from the command.
There was no error output from the command.
Running e2e tests
Checking installation of the EFK stack...
Running test/cluster/rollout.sh:20: executing 'oc project logging' expecting success...
SUCCESS after 0.236s: test/cluster/rollout.sh:20: executing 'oc project logging' expecting success
Standard output from the command:
Already on project "logging" on server "https://172.18.4.93:8443".

There was no error output from the command.
[INFO] Checking for DeploymentConfigurations...
Running test/cluster/rollout.sh:24: executing 'oc get deploymentconfig logging-kibana' expecting success...
SUCCESS after 0.238s: test/cluster/rollout.sh:24: executing 'oc get deploymentconfig logging-kibana' expecting success
Standard output from the command:
NAME             REVISION   DESIRED   CURRENT   TRIGGERED BY
logging-kibana   1          1         1         config

There was no error output from the command.
Running test/cluster/rollout.sh:25: executing 'oc rollout status deploymentconfig/logging-kibana' expecting success...
SUCCESS after 0.225s: test/cluster/rollout.sh:25: executing 'oc rollout status deploymentconfig/logging-kibana' expecting success
Standard output from the command:
replication controller "logging-kibana-1" successfully rolled out

There was no error output from the command.
Running test/cluster/rollout.sh:24: executing 'oc get deploymentconfig logging-curator' expecting success...
SUCCESS after 0.224s: test/cluster/rollout.sh:24: executing 'oc get deploymentconfig logging-curator' expecting success
Standard output from the command:
NAME              REVISION   DESIRED   CURRENT   TRIGGERED BY
logging-curator   1          1         1         config

There was no error output from the command.
Running test/cluster/rollout.sh:25: executing 'oc rollout status deploymentconfig/logging-curator' expecting success...
SUCCESS after 0.215s: test/cluster/rollout.sh:25: executing 'oc rollout status deploymentconfig/logging-curator' expecting success
Standard output from the command:
replication controller "logging-curator-1" successfully rolled out

There was no error output from the command.
Running test/cluster/rollout.sh:24: executing 'oc get deploymentconfig logging-kibana-ops' expecting success...
SUCCESS after 0.242s: test/cluster/rollout.sh:24: executing 'oc get deploymentconfig logging-kibana-ops' expecting success
Standard output from the command:
NAME                 REVISION   DESIRED   CURRENT   TRIGGERED BY
logging-kibana-ops   1          1         1         config

There was no error output from the command.
Running test/cluster/rollout.sh:25: executing 'oc rollout status deploymentconfig/logging-kibana-ops' expecting success...
SUCCESS after 0.236s: test/cluster/rollout.sh:25: executing 'oc rollout status deploymentconfig/logging-kibana-ops' expecting success
Standard output from the command:
replication controller "logging-kibana-ops-1" successfully rolled out

There was no error output from the command.
Running test/cluster/rollout.sh:24: executing 'oc get deploymentconfig logging-curator-ops' expecting success...
SUCCESS after 0.212s: test/cluster/rollout.sh:24: executing 'oc get deploymentconfig logging-curator-ops' expecting success
Standard output from the command:
NAME                  REVISION   DESIRED   CURRENT   TRIGGERED BY
logging-curator-ops   1          1         1         config

There was no error output from the command.
Running test/cluster/rollout.sh:25: executing 'oc rollout status deploymentconfig/logging-curator-ops' expecting success...
SUCCESS after 0.215s: test/cluster/rollout.sh:25: executing 'oc rollout status deploymentconfig/logging-curator-ops' expecting success
Standard output from the command:
replication controller "logging-curator-ops-1" successfully rolled out

There was no error output from the command.
Running test/cluster/rollout.sh:24: executing 'oc get deploymentconfig logging-es-data-master-nij68urm' expecting success...
SUCCESS after 0.260s: test/cluster/rollout.sh:24: executing 'oc get deploymentconfig logging-es-data-master-nij68urm' expecting success
Standard output from the command:
NAME                              REVISION   DESIRED   CURRENT   TRIGGERED BY
logging-es-data-master-nij68urm   1          1         1         config

There was no error output from the command.
Running test/cluster/rollout.sh:25: executing 'oc rollout status deploymentconfig/logging-es-data-master-nij68urm' expecting success...
SUCCESS after 0.224s: test/cluster/rollout.sh:25: executing 'oc rollout status deploymentconfig/logging-es-data-master-nij68urm' expecting success
Standard output from the command:
replication controller "logging-es-data-master-nij68urm-1" successfully rolled out

There was no error output from the command.
Running test/cluster/rollout.sh:24: executing 'oc get deploymentconfig logging-es-ops-data-master-1lbwcltv' expecting success...
SUCCESS after 0.218s: test/cluster/rollout.sh:24: executing 'oc get deploymentconfig logging-es-ops-data-master-1lbwcltv' expecting success
Standard output from the command:
NAME                                  REVISION   DESIRED   CURRENT   TRIGGERED BY
logging-es-ops-data-master-1lbwcltv   1          1         1         config

There was no error output from the command.
Running test/cluster/rollout.sh:25: executing 'oc rollout status deploymentconfig/logging-es-ops-data-master-1lbwcltv' expecting success...
SUCCESS after 0.207s: test/cluster/rollout.sh:25: executing 'oc rollout status deploymentconfig/logging-es-ops-data-master-1lbwcltv' expecting success
Standard output from the command:
replication controller "logging-es-ops-data-master-1lbwcltv-1" successfully rolled out

There was no error output from the command.
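
rollout.sh runs the same get/rollout-status pair for every DeploymentConfig in the project; the pattern generalizes to a loop like this (a sketch, label selector assumed):

    for dc in $(oc get dc -l logging-infra -o name); do
        oc rollout status "$dc"
    done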
[INFO] Checking for Routes...
Running test/cluster/rollout.sh:30: executing 'oc get route logging-kibana' expecting success...
SUCCESS after 0.223s: test/cluster/rollout.sh:30: executing 'oc get route logging-kibana' expecting success
Standard output from the command:
NAME             HOST/PORT                                 PATH      SERVICES         PORT      TERMINATION          WILDCARD
logging-kibana   kibana.router.default.svc.cluster.local             logging-kibana   <all>     reencrypt/Redirect   None

There was no error output from the command.
Running test/cluster/rollout.sh:30: executing 'oc get route logging-kibana-ops' expecting success...
SUCCESS after 0.208s: test/cluster/rollout.sh:30: executing 'oc get route logging-kibana-ops' expecting success
Standard output from the command:
NAME                 HOST/PORT                                     PATH      SERVICES             PORT      TERMINATION          WILDCARD
logging-kibana-ops   kibana-ops.router.default.svc.cluster.local             logging-kibana-ops   <all>     reencrypt/Redirect   None

There was no error output from the command.
[INFO] Checking for Services...
Running test/cluster/rollout.sh:35: executing 'oc get service logging-es' expecting success...
SUCCESS after 0.206s: test/cluster/rollout.sh:35: executing 'oc get service logging-es' expecting success
Standard output from the command:
NAME         CLUSTER-IP       EXTERNAL-IP   PORT(S)    AGE
logging-es   172.30.217.175   <none>        9200/TCP   1m

There was no error output from the command.
Running test/cluster/rollout.sh:35: executing 'oc get service logging-es-cluster' expecting success...
SUCCESS after 0.233s: test/cluster/rollout.sh:35: executing 'oc get service logging-es-cluster' expecting success
Standard output from the command:
NAME                 CLUSTER-IP       EXTERNAL-IP   PORT(S)    AGE
logging-es-cluster   172.30.159.212   <none>        9300/TCP   1m

There was no error output from the command.
Running test/cluster/rollout.sh:35: executing 'oc get service logging-kibana' expecting success...
SUCCESS after 0.213s: test/cluster/rollout.sh:35: executing 'oc get service logging-kibana' expecting success
Standard output from the command:
NAME             CLUSTER-IP      EXTERNAL-IP   PORT(S)   AGE
logging-kibana   172.30.30.149   <none>        443/TCP   1m

There was no error output from the command.
Running test/cluster/rollout.sh:35: executing 'oc get service logging-es-ops' expecting success...
SUCCESS after 0.222s: test/cluster/rollout.sh:35: executing 'oc get service logging-es-ops' expecting success
Standard output from the command:
NAME             CLUSTER-IP       EXTERNAL-IP   PORT(S)    AGE
logging-es-ops   172.30.110.130   <none>        9200/TCP   1m

There was no error output from the command.
Running test/cluster/rollout.sh:35: executing 'oc get service logging-es-ops-cluster' expecting success...
SUCCESS after 0.238s: test/cluster/rollout.sh:35: executing 'oc get service logging-es-ops-cluster' expecting success
Standard output from the command:
NAME                     CLUSTER-IP     EXTERNAL-IP   PORT(S)    AGE
logging-es-ops-cluster   172.30.18.33   <none>        9300/TCP   1m

There was no error output from the command.
Running test/cluster/rollout.sh:35: executing 'oc get service logging-kibana-ops' expecting success...
SUCCESS after 0.246s: test/cluster/rollout.sh:35: executing 'oc get service logging-kibana-ops' expecting success
Standard output from the command:
NAME                 CLUSTER-IP       EXTERNAL-IP   PORT(S)   AGE
logging-kibana-ops   172.30.164.117   <none>        443/TCP   1m

There was no error output from the command.
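The route and service checks are plain existence assertions: each expected API object must come back from 'oc get'. A sketch of the same assertions, assuming the object names listed above:

    # Sketch: assert the expected routes and services exist; any miss fails fast.
    for route in logging-kibana logging-kibana-ops; do
        oc get route "${route}" || exit 1
    done
    for svc in logging-es logging-es-cluster logging-kibana \
               logging-es-ops logging-es-ops-cluster logging-kibana-ops; do
        oc get service "${svc}" || exit 1
    done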
[INFO] Checking for OAuthClients...
Running test/cluster/rollout.sh:40: executing 'oc get oauthclient kibana-proxy' expecting success...
SUCCESS after 0.208s: test/cluster/rollout.sh:40: executing 'oc get oauthclient kibana-proxy' expecting success
Standard output from the command:
NAME           SECRET                                                             WWW-CHALLENGE   REDIRECT URIS
kibana-proxy   hq6HeKWxwDxGEk870qHq9HqfkisXwGEz4HZgSD2i3ekR5I4zojxRSrNEz5qzVPiZ   FALSE           https://kibana.router.default.svc.cluster.local,https://kibana-ops.router.default.svc.cluster.local

There was no error output from the command.
[INFO] Checking for DaemonSets...
Running test/cluster/rollout.sh:45: executing 'oc get daemonset logging-fluentd' expecting success...
SUCCESS after 0.216s: test/cluster/rollout.sh:45: executing 'oc get daemonset logging-fluentd' expecting success
Standard output from the command:
NAME              DESIRED   CURRENT   READY     UP-TO-DATE   AVAILABLE   NODE-SELECTOR                AGE
logging-fluentd   1         1         1         1            1           logging-infra-fluentd=true   53s

There was no error output from the command.
Running test/cluster/rollout.sh:47: executing 'oc get daemonset logging-fluentd -o jsonpath='{ .status.numberReady }'' expecting any result and text '1'; re-trying every 0.2s until completion or 60.000s...
SUCCESS after 0.214s: test/cluster/rollout.sh:47: executing 'oc get daemonset logging-fluentd -o jsonpath='{ .status.numberReady }'' expecting any result and text '1'; re-trying every 0.2s until completion or 60.000s
Standard output from the command:
1
There was no error output from the command.
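The daemonset check polls a jsonpath projection of the object until the expected text appears, re-trying every 0.2s with a 60s budget (the logged phrasing matches the os::cmd::try_until_text helper from the origin test library). A rough standalone equivalent:

    # Rough equivalent of the retry-until-text poll above (an approximation of
    # what os::cmd::try_until_text does; timings match the logged 0.2s/60s).
    deadline=$(( $(date +%s) + 60 ))
    while [ "$(oc get daemonset logging-fluentd -o jsonpath='{ .status.numberReady }')" != "1" ]; do
        if [ "$(date +%s)" -ge "${deadline}" ]; then echo "timed out" >&2; exit 1; fi
        sleep 0.2
    done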
Checking that log entries in ES match their sources...
WARNING: bridge-nf-call-ip6tables is disabled
Running test/cluster/functionality.sh:40: executing 'oc login --username=admin --password=admin' expecting success...
SUCCESS after 0.356s: test/cluster/functionality.sh:40: executing 'oc login --username=admin --password=admin' expecting success
Standard output from the command:
Login successful.

You have access to the following projects and can switch between them with 'oc project <projectname>':

    default
    kube-public
    kube-system
  * logging
    openshift
    openshift-infra

Using project "logging".

There was no error output from the command.
Running test/cluster/functionality.sh:44: executing 'oc login --username=system:admin' expecting success...
SUCCESS after 0.248s: test/cluster/functionality.sh:44: executing 'oc login --username=system:admin' expecting success
Standard output from the command:
Logged into "https://172.18.4.93:8443" as "system:admin" using existing credentials.

You have access to the following projects and can switch between them with 'oc project <projectname>':

    default
    kube-public
    kube-system
  * logging
    openshift
    openshift-infra

Using project "logging".

There was no error output from the command.
Running test/cluster/functionality.sh:45: executing 'oc project logging' expecting success...
SUCCESS after 0.234s: test/cluster/functionality.sh:45: executing 'oc project logging' expecting success
Standard output from the command:
Already on project "logging" on server "https://172.18.4.93:8443".

There was no error output from the command.
[INFO] Testing Kibana pod logging-kibana-1-q2g1v for a successful start...
Running test/cluster/functionality.sh:52: executing 'oc exec logging-kibana-1-q2g1v -c kibana -- curl -s --request HEAD --write-out '%{response_code}' http://localhost:5601/' expecting any result and text '200'; re-trying every 0.2s until completion or 600.000s...
SUCCESS after 120.304s: test/cluster/functionality.sh:52: executing 'oc exec logging-kibana-1-q2g1v -c kibana -- curl -s --request HEAD --write-out '%{response_code}' http://localhost:5601/' expecting any result and text '200'; re-trying every 0.2s until completion or 600.000s
Standard output from the command:
200
There was no error output from the command.
Running test/cluster/functionality.sh:53: executing 'oc get pod logging-kibana-1-q2g1v -o jsonpath='{ .status.containerStatuses[?(@.name=="kibana")].ready }'' expecting any result and text 'true'; re-trying every 0.2s until completion or 60.000s...
SUCCESS after 0.238s: test/cluster/functionality.sh:53: executing 'oc get pod logging-kibana-1-q2g1v -o jsonpath='{ .status.containerStatuses[?(@.name=="kibana")].ready }'' expecting any result and text 'true'; re-trying every 0.2s until completion or 60.000s
Standard output from the command:
true
There was no error output from the command.
Running test/cluster/functionality.sh:54: executing 'oc get pod logging-kibana-1-q2g1v -o jsonpath='{ .status.containerStatuses[?(@.name=="kibana-proxy")].ready }'' expecting any result and text 'true'; re-trying every 0.2s until completion or 60.000s...
SUCCESS after 0.208s: test/cluster/functionality.sh:54: executing 'oc get pod logging-kibana-1-q2g1v -o jsonpath='{ .status.containerStatuses[?(@.name=="kibana-proxy")].ready }'' expecting any result and text 'true'; re-trying every 0.2s until completion or 60.000s
Standard output from the command:
true
There was no error output from the command.
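Both containers of the Kibana pod are checked individually; the jsonpath filter expression selects one entry of .status.containerStatuses by name and prints its ready flag. The same probe, looped over both containers:

    # Sketch: per-container readiness of a multi-container pod via a jsonpath
    # filter; pod and container names are the ones from this run.
    pod=logging-kibana-1-q2g1v
    for c in kibana kibana-proxy; do
        oc get pod "${pod}" -o jsonpath="{ .status.containerStatuses[?(@.name==\"${c}\")].ready }"
        echo    # jsonpath output has no trailing newline
    done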
[INFO] Testing Elasticsearch pod logging-es-data-master-nij68urm-1-7f3ck for a successful start...
Running test/cluster/functionality.sh:59: executing 'curl_es 'logging-es-data-master-nij68urm-1-7f3ck' '/' -X HEAD -w '%{response_code}'' expecting any result and text '200'; re-trying every 0.2s until completion or 600.000s...
SUCCESS after 0.361s: test/cluster/functionality.sh:59: executing 'curl_es 'logging-es-data-master-nij68urm-1-7f3ck' '/' -X HEAD -w '%{response_code}'' expecting any result and text '200'; re-trying every 0.2s until completion or 600.000s
Standard output from the command:
200
There was no error output from the command.
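'curl_es' is a helper from the suite's test utilities, not an oc subcommand. A plausible reconstruction (an assumption; the real definition lives in the repo) is an 'oc exec' that runs curl inside the Elasticsearch pod against localhost:9200 with the admin client certificates the pod mounts:

    # Assumed shape of the curl_es helper used throughout these checks; the
    # /etc/elasticsearch/secret cert paths are an assumption based on the deployer.
    curl_es() {
        local pod=$1 endpoint=$2; shift 2
        oc exec "${pod}" -- curl --silent \
            --key  /etc/elasticsearch/secret/admin-key \
            --cert /etc/elasticsearch/secret/admin-cert \
            --cacert /etc/elasticsearch/secret/admin-ca \
            "$@" "https://localhost:9200${endpoint}"
    }
    # usage, as in the check above:
    #   curl_es logging-es-data-master-nij68urm-1-7f3ck '/' -X HEAD -w '%{response_code}'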
Running test/cluster/functionality.sh:60: executing 'oc get pod logging-es-data-master-nij68urm-1-7f3ck -o jsonpath='{ .status.containerStatuses[?(@.name=="elasticsearch")].ready }'' expecting any result and text 'true'; re-trying every 0.2s until completion or 60.000s...
SUCCESS after 0.247s: test/cluster/functionality.sh:60: executing 'oc get pod logging-es-data-master-nij68urm-1-7f3ck -o jsonpath='{ .status.containerStatuses[?(@.name=="elasticsearch")].ready }'' expecting any result and text 'true'; re-trying every 0.2s until completion or 60.000s
Standard output from the command:
true
There was no error output from the command.
[INFO] Checking that Elasticsearch pod logging-es-data-master-nij68urm-1-7f3ck recovered its indices after starting...
Running test/cluster/functionality.sh:63: executing 'curl_es 'logging-es-data-master-nij68urm-1-7f3ck' '/_cluster/state/master_node' -w '%{response_code}'' expecting any result and text '}200$'; re-trying every 0.2s until completion or 600.000s...
SUCCESS after 0.435s: test/cluster/functionality.sh:63: executing 'curl_es 'logging-es-data-master-nij68urm-1-7f3ck' '/_cluster/state/master_node' -w '%{response_code}'' expecting any result and text '}200$'; re-trying every 0.2s until completion or 600.000s
Standard output from the command:
{"cluster_name":"logging-es","master_node":"m8K0X_TASYaiQ9JFtFjmOA"}200
There was no error output from the command.
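The master-node check relies on curl's -w '%{response_code}' appending the HTTP status to the body, so a healthy reply ends in '}200'; the expected text '}200$' therefore asserts a complete JSON body and a 200 status in one regex. As a one-liner, using curl_es as reconstructed above:

    # Sketch: body ends in '}' and status is 200, verified together.
    curl_es logging-es-data-master-nij68urm-1-7f3ck \
        '/_cluster/state/master_node' -w '%{response_code}' | grep -q '}200$'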
[INFO] Elasticsearch pod logging-es-data-master-nij68urm-1-7f3ck is the master
[INFO] Checking that Elasticsearch pod logging-es-data-master-nij68urm-1-7f3ck has persisted indices created by Fluentd...
Running test/cluster/functionality.sh:76: executing 'curl_es 'logging-es-data-master-nij68urm-1-7f3ck' '/_cat/indices?h=index'' expecting any result and text '^(project|\.operations)\.'; re-trying every 0.2s until completion or 600.000s...
SUCCESS after 0.360s: test/cluster/functionality.sh:76: executing 'curl_es 'logging-es-data-master-nij68urm-1-7f3ck' '/_cat/indices?h=index'' expecting any result and text '^(project|\.operations)\.'; re-trying every 0.2s until completion or 600.000s
Standard output from the command:
.kibana.d033e22ae348aeb5660fc2140aec35850c4da997                
.kibana                                                         
project.default.2d141b54-4d1f-11e7-83b0-0e6fb895db82.2017.06.09 
.searchguard.logging-es-data-master-nij68urm-1-7f3ck            
project.logging.319a45c2-4d1f-11e7-83b0-0e6fb895db82.2017.06.09 

There was no error output from the command.
[INFO] Checking for index project.default.2d141b54-4d1f-11e7-83b0-0e6fb895db82 with Kibana pod logging-kibana-1-q2g1v...
Running test/cluster/functionality.sh:100: executing 'sudo -E VERBOSE=true go run '/data/src/github.com/openshift/origin-aggregated-logging/hack/testing/check-logs.go' 'logging-kibana-1-q2g1v' 'logging-es:9200' 'project.default.2d141b54-4d1f-11e7-83b0-0e6fb895db82' '/var/log/containers/*_2d141b54-4d1f-11e7-83b0-0e6fb895db82_*.log' '500' 'admin' '7NE-Tuce7Q8xnKgAeYP8mX4EvOadsadfvOZSCmilRuw' '127.0.0.1'' expecting success...
SUCCESS after 11.901s: test/cluster/functionality.sh:100: executing 'sudo -E VERBOSE=true go run '/data/src/github.com/openshift/origin-aggregated-logging/hack/testing/check-logs.go' 'logging-kibana-1-q2g1v' 'logging-es:9200' 'project.default.2d141b54-4d1f-11e7-83b0-0e6fb895db82' '/var/log/containers/*_2d141b54-4d1f-11e7-83b0-0e6fb895db82_*.log' '500' 'admin' '7NE-Tuce7Q8xnKgAeYP8mX4EvOadsadfvOZSCmilRuw' '127.0.0.1'' expecting success
Standard output from the command:
Executing command [oc exec logging-kibana-1-q2g1v -- curl -s --key /etc/kibana/keys/key --cert /etc/kibana/keys/cert --cacert /etc/kibana/keys/ca -H 'X-Proxy-Remote-User: admin' -H 'Authorization: Bearer 7NE-Tuce7Q8xnKgAeYP8mX4EvOadsadfvOZSCmilRuw' -H 'X-Forwarded-For: 127.0.0.1' -XGET "https://logging-es:9200/project.default.2d141b54-4d1f-11e7-83b0-0e6fb895db82.*/_search?q=hostname:ip-172-18-4-93&fields=message&size=500"]
Failure - no log entries found in Elasticsearch logging-es:9200 for index project.default.2d141b54-4d1f-11e7-83b0-0e6fb895db82

There was no error output from the command.
[INFO] Checking for index project.logging.319a45c2-4d1f-11e7-83b0-0e6fb895db82 with Kibana pod logging-kibana-1-q2g1v...
Running test/cluster/functionality.sh:100: executing 'sudo -E VERBOSE=true go run '/data/src/github.com/openshift/origin-aggregated-logging/hack/testing/check-logs.go' 'logging-kibana-1-q2g1v' 'logging-es:9200' 'project.logging.319a45c2-4d1f-11e7-83b0-0e6fb895db82' '/var/log/containers/*_319a45c2-4d1f-11e7-83b0-0e6fb895db82_*.log' '500' 'admin' '7NE-Tuce7Q8xnKgAeYP8mX4EvOadsadfvOZSCmilRuw' '127.0.0.1'' expecting success...
SUCCESS after 0.616s: test/cluster/functionality.sh:100: executing 'sudo -E VERBOSE=true go run '/data/src/github.com/openshift/origin-aggregated-logging/hack/testing/check-logs.go' 'logging-kibana-1-q2g1v' 'logging-es:9200' 'project.logging.319a45c2-4d1f-11e7-83b0-0e6fb895db82' '/var/log/containers/*_319a45c2-4d1f-11e7-83b0-0e6fb895db82_*.log' '500' 'admin' '7NE-Tuce7Q8xnKgAeYP8mX4EvOadsadfvOZSCmilRuw' '127.0.0.1'' expecting success
Standard output from the command:
Executing command [oc exec logging-kibana-1-q2g1v -- curl -s --key /etc/kibana/keys/key --cert /etc/kibana/keys/cert --cacert /etc/kibana/keys/ca -H 'X-Proxy-Remote-User: admin' -H 'Authorization: Bearer 7NE-Tuce7Q8xnKgAeYP8mX4EvOadsadfvOZSCmilRuw' -H 'X-Forwarded-For: 127.0.0.1' -XGET "https://logging-es:9200/project.logging.319a45c2-4d1f-11e7-83b0-0e6fb895db82.*/_search?q=hostname:ip-172-18-4-93&fields=message&size=500"]
Failure - no log entries found in Elasticsearch logging-es:9200 for index project.logging.319a45c2-4d1f-11e7-83b0-0e6fb895db82

There was no error output from the command.
[INFO] Checking that Elasticsearch pod logging-es-data-master-nij68urm-1-7f3ck contains common data model index templates...
Running test/cluster/functionality.sh:105: executing 'oc exec logging-es-data-master-nij68urm-1-7f3ck -- ls -1 /usr/share/elasticsearch/index_templates' expecting success...
SUCCESS after 0.312s: test/cluster/functionality.sh:105: executing 'oc exec logging-es-data-master-nij68urm-1-7f3ck -- ls -1 /usr/share/elasticsearch/index_templates' expecting success
Standard output from the command:
com.redhat.viaq-openshift-operations.template.json
com.redhat.viaq-openshift-project.template.json
org.ovirt.viaq-collectd.template.json

There was no error output from the command.
Running test/cluster/functionality.sh:107: executing 'curl_es 'logging-es-data-master-nij68urm-1-7f3ck' '/_template/com.redhat.viaq-openshift-operations.template.json' -X HEAD -w '%{response_code}'' expecting success and text '200'...
SUCCESS after 0.365s: test/cluster/functionality.sh:107: executing 'curl_es 'logging-es-data-master-nij68urm-1-7f3ck' '/_template/com.redhat.viaq-openshift-operations.template.json' -X HEAD -w '%{response_code}'' expecting success and text '200'
Standard output from the command:
200
There was no error output from the command.
Running test/cluster/functionality.sh:107: executing 'curl_es 'logging-es-data-master-nij68urm-1-7f3ck' '/_template/com.redhat.viaq-openshift-project.template.json' -X HEAD -w '%{response_code}'' expecting success and text '200'...
SUCCESS after 0.584s: test/cluster/functionality.sh:107: executing 'curl_es 'logging-es-data-master-nij68urm-1-7f3ck' '/_template/com.redhat.viaq-openshift-project.template.json' -X HEAD -w '%{response_code}'' expecting success and text '200'
Standard output from the command:
200
There was no error output from the command.
Running test/cluster/functionality.sh:107: executing 'curl_es 'logging-es-data-master-nij68urm-1-7f3ck' '/_template/org.ovirt.viaq-collectd.template.json' -X HEAD -w '%{response_code}'' expecting success and text '200'...
SUCCESS after 0.402s: test/cluster/functionality.sh:107: executing 'curl_es 'logging-es-data-master-nij68urm-1-7f3ck' '/_template/org.ovirt.viaq-collectd.template.json' -X HEAD -w '%{response_code}'' expecting success and text '200'
Standard output from the command:
200
There was no error output from the command.
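The template checks pair the on-disk listing with the live cluster state: every file under /usr/share/elasticsearch/index_templates must also answer 200 from /_template/<name>. A sketch of that pairing, again using the curl_es reconstruction above:

    # Sketch: every shipped index-template file must be registered in ES.
    pod=logging-es-data-master-nij68urm-1-7f3ck
    for tmpl in $(oc exec "${pod}" -- ls -1 /usr/share/elasticsearch/index_templates); do
        curl_es "${pod}" "/_template/${tmpl}" -X HEAD -w '%{response_code}' | grep -q 200
    done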
Running test/cluster/functionality.sh:40: executing 'oc login --username=admin --password=admin' expecting success...
SUCCESS after 0.243s: test/cluster/functionality.sh:40: executing 'oc login --username=admin --password=admin' expecting success
Standard output from the command:
Login successful.

You have access to the following projects and can switch between them with 'oc project <projectname>':

    default
    kube-public
    kube-system
  * logging
    openshift
    openshift-infra

Using project "logging".

There was no error output from the command.
Running test/cluster/functionality.sh:44: executing 'oc login --username=system:admin' expecting success...
SUCCESS after 0.245s: test/cluster/functionality.sh:44: executing 'oc login --username=system:admin' expecting success
Standard output from the command:
Logged into "https://172.18.4.93:8443" as "system:admin" using existing credentials.

You have access to the following projects and can switch between them with 'oc project <projectname>':

    default
    kube-public
    kube-system
  * logging
    openshift
    openshift-infra

Using project "logging".

There was no error output from the command.
Running test/cluster/functionality.sh:45: executing 'oc project logging' expecting success...
SUCCESS after 0.211s: test/cluster/functionality.sh:45: executing 'oc project logging' expecting success
Standard output from the command:
Already on project "logging" on server "https://172.18.4.93:8443".

There was no error output from the command.
[INFO] Testing Kibana pod logging-kibana-ops-1-mq6b8 for a successful start...
Running test/cluster/functionality.sh:52: executing 'oc exec logging-kibana-ops-1-mq6b8 -c kibana -- curl -s --request HEAD --write-out '%{response_code}' http://localhost:5601/' expecting any result and text '200'; re-trying every 0.2s until completion or 600.000s...
SUCCESS after 120.281s: test/cluster/functionality.sh:52: executing 'oc exec logging-kibana-ops-1-mq6b8 -c kibana -- curl -s --request HEAD --write-out '%{response_code}' http://localhost:5601/' expecting any result and text '200'; re-trying every 0.2s until completion or 600.000s
Standard output from the command:
200
There was no error output from the command.
Running test/cluster/functionality.sh:53: executing 'oc get pod logging-kibana-ops-1-mq6b8 -o jsonpath='{ .status.containerStatuses[?(@.name=="kibana")].ready }'' expecting any result and text 'true'; re-trying every 0.2s until completion or 60.000s...
SUCCESS after 0.237s: test/cluster/functionality.sh:53: executing 'oc get pod logging-kibana-ops-1-mq6b8 -o jsonpath='{ .status.containerStatuses[?(@.name=="kibana")].ready }'' expecting any result and text 'true'; re-trying every 0.2s until completion or 60.000s
Standard output from the command:
true
There was no error output from the command.
Running test/cluster/functionality.sh:54: executing 'oc get pod logging-kibana-ops-1-mq6b8 -o jsonpath='{ .status.containerStatuses[?(@.name=="kibana-proxy")].ready }'' expecting any result and text 'true'; re-trying every 0.2s until completion or 60.000s...
SUCCESS after 0.222s: test/cluster/functionality.sh:54: executing 'oc get pod logging-kibana-ops-1-mq6b8 -o jsonpath='{ .status.containerStatuses[?(@.name=="kibana-proxy")].ready }'' expecting any result and text 'true'; re-trying every 0.2s until completion or 60.000s
Standard output from the command:
true
There was no error output from the command.
[INFO] Testing Elasticsearch pod logging-es-ops-data-master-1lbwcltv-1-w0wm7 for a successful start...
Running test/cluster/functionality.sh:59: executing 'curl_es 'logging-es-ops-data-master-1lbwcltv-1-w0wm7' '/' -X HEAD -w '%{response_code}'' expecting any result and text '200'; re-trying every 0.2s until completion or 600.000s...
SUCCESS after 0.366s: test/cluster/functionality.sh:59: executing 'curl_es 'logging-es-ops-data-master-1lbwcltv-1-w0wm7' '/' -X HEAD -w '%{response_code}'' expecting any result and text '200'; re-trying every 0.2s until completion or 600.000s
Standard output from the command:
200
There was no error output from the command.
Running test/cluster/functionality.sh:60: executing 'oc get pod logging-es-ops-data-master-1lbwcltv-1-w0wm7 -o jsonpath='{ .status.containerStatuses[?(@.name=="elasticsearch")].ready }'' expecting any result and text 'true'; re-trying every 0.2s until completion or 60.000s...
SUCCESS after 0.209s: test/cluster/functionality.sh:60: executing 'oc get pod logging-es-ops-data-master-1lbwcltv-1-w0wm7 -o jsonpath='{ .status.containerStatuses[?(@.name=="elasticsearch")].ready }'' expecting any result and text 'true'; re-trying every 0.2s until completion or 60.000s
Standard output from the command:
true
There was no error output from the command.
[INFO] Checking that Elasticsearch pod logging-es-ops-data-master-1lbwcltv-1-w0wm7 recovered its indices after starting...
Running test/cluster/functionality.sh:63: executing 'curl_es 'logging-es-ops-data-master-1lbwcltv-1-w0wm7' '/_cluster/state/master_node' -w '%{response_code}'' expecting any result and text '}200$'; re-trying every 0.2s until completion or 600.000s...
SUCCESS after 0.383s: test/cluster/functionality.sh:63: executing 'curl_es 'logging-es-ops-data-master-1lbwcltv-1-w0wm7' '/_cluster/state/master_node' -w '%{response_code}'' expecting any result and text '}200$'; re-trying every 0.2s until completion or 600.000s
Standard output from the command:
{"cluster_name":"logging-es-ops","master_node":"gCpihUBoSUG8P_xjUHU0TQ"}200
There was no error output from the command.
[INFO] Elasticsearch pod logging-es-ops-data-master-1lbwcltv-1-w0wm7 is the master
[INFO] Checking that Elasticsearch pod logging-es-ops-data-master-1lbwcltv-1-w0wm7 has persisted indices created by Fluentd...
Running test/cluster/functionality.sh:76: executing 'curl_es 'logging-es-ops-data-master-1lbwcltv-1-w0wm7' '/_cat/indices?h=index'' expecting any result and text '^(project|\.operations)\.'; re-trying every 0.2s until completion or 600.000s...
SUCCESS after 0.403s: test/cluster/functionality.sh:76: executing 'curl_es 'logging-es-ops-data-master-1lbwcltv-1-w0wm7' '/_cat/indices?h=index'' expecting any result and text '^(project|\.operations)\.'; re-trying every 0.2s until completion or 600.000s
Standard output from the command:
.kibana.d033e22ae348aeb5660fc2140aec35850c4da997         
.operations.2017.06.09                                   
.kibana                                                  
.searchguard.logging-es-ops-data-master-1lbwcltv-1-w0wm7 

There was no error output from the command.
[INFO] Checking for index .operations with Kibana pod logging-kibana-ops-1-mq6b8...
Running test/cluster/functionality.sh:100: executing 'sudo -E VERBOSE=true go run '/data/src/github.com/openshift/origin-aggregated-logging/hack/testing/check-logs.go' 'logging-kibana-ops-1-mq6b8' 'logging-es-ops:9200' '.operations' '/var/log/messages' '500' 'admin' 'ZX199OEG8d5EnevyRVFTh2sYKPi-ixKNMSKL_HUxpQk' '127.0.0.1'' expecting success...
SUCCESS after 0.726s: test/cluster/functionality.sh:100: executing 'sudo -E VERBOSE=true go run '/data/src/github.com/openshift/origin-aggregated-logging/hack/testing/check-logs.go' 'logging-kibana-ops-1-mq6b8' 'logging-es-ops:9200' '.operations' '/var/log/messages' '500' 'admin' 'ZX199OEG8d5EnevyRVFTh2sYKPi-ixKNMSKL_HUxpQk' '127.0.0.1'' expecting success
Standard output from the command:
Executing command [oc exec logging-kibana-ops-1-mq6b8 -- curl -s --key /etc/kibana/keys/key --cert /etc/kibana/keys/cert --cacert /etc/kibana/keys/ca -H 'X-Proxy-Remote-User: admin' -H 'Authorization: Bearer ZX199OEG8d5EnevyRVFTh2sYKPi-ixKNMSKL_HUxpQk' -H 'X-Forwarded-For: 127.0.0.1' -XGET "https://logging-es-ops:9200/.operations.*/_search?q=hostname:ip-172-18-4-93&fields=message&size=500"]
Failure - no log entries found in Elasticsearch logging-es-ops:9200 for index .operations

There was no error output from the command.
[INFO] Checking that Elasticsearch pod logging-es-ops-data-master-1lbwcltv-1-w0wm7 contains common data model index templates...
Running test/cluster/functionality.sh:105: executing 'oc exec logging-es-ops-data-master-1lbwcltv-1-w0wm7 -- ls -1 /usr/share/elasticsearch/index_templates' expecting success...
SUCCESS after 0.282s: test/cluster/functionality.sh:105: executing 'oc exec logging-es-ops-data-master-1lbwcltv-1-w0wm7 -- ls -1 /usr/share/elasticsearch/index_templates' expecting success
Standard output from the command:
com.redhat.viaq-openshift-operations.template.json
com.redhat.viaq-openshift-project.template.json
org.ovirt.viaq-collectd.template.json

There was no error output from the command.
Running test/cluster/functionality.sh:107: executing 'curl_es 'logging-es-ops-data-master-1lbwcltv-1-w0wm7' '/_template/com.redhat.viaq-openshift-operations.template.json' -X HEAD -w '%{response_code}'' expecting success and text '200'...
SUCCESS after 0.468s: test/cluster/functionality.sh:107: executing 'curl_es 'logging-es-ops-data-master-1lbwcltv-1-w0wm7' '/_template/com.redhat.viaq-openshift-operations.template.json' -X HEAD -w '%{response_code}'' expecting success and text '200'
Standard output from the command:
200
There was no error output from the command.
Running test/cluster/functionality.sh:107: executing 'curl_es 'logging-es-ops-data-master-1lbwcltv-1-w0wm7' '/_template/com.redhat.viaq-openshift-project.template.json' -X HEAD -w '%{response_code}'' expecting success and text '200'...
SUCCESS after 0.383s: test/cluster/functionality.sh:107: executing 'curl_es 'logging-es-ops-data-master-1lbwcltv-1-w0wm7' '/_template/com.redhat.viaq-openshift-project.template.json' -X HEAD -w '%{response_code}'' expecting success and text '200'
Standard output from the command:
200
There was no error output from the command.
Running test/cluster/functionality.sh:107: executing 'curl_es 'logging-es-ops-data-master-1lbwcltv-1-w0wm7' '/_template/org.ovirt.viaq-collectd.template.json' -X HEAD -w '%{response_code}'' expecting success and text '200'...
SUCCESS after 0.380s: test/cluster/functionality.sh:107: executing 'curl_es 'logging-es-ops-data-master-1lbwcltv-1-w0wm7' '/_template/org.ovirt.viaq-collectd.template.json' -X HEAD -w '%{response_code}'' expecting success and text '200'
Standard output from the command:
200
There was no error output from the command.
running test test-curator.sh
configmap "logging-curator" deleted
configmap "logging-curator" created
deploymentconfig "logging-curator" scaled
deploymentconfig "logging-curator" scaled
configmap "logging-curator" deleted
configmap "logging-curator" created
deploymentconfig "logging-curator" scaled
deploymentconfig "logging-curator" scaled
Error: the curator pod should be in the error state
logging-curator-1-84xsd
Error: did not find the correct error message
error: expected 'logs (POD | TYPE/NAME) [CONTAINER_NAME]'.
POD or TYPE/NAME is a required argument for the logs command
See 'oc logs -h' for help and examples.
The project name must match this regex: [^[a-z0-9]([-a-z0-9]*[a-z0-9])?$]. This does not match: [-BOGUS^PROJECT^NAME]
configmap "logging-curator" deleted
configmap "logging-curator" created
deploymentconfig "logging-curator" scaled
deploymentconfig "logging-curator" scaled
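Each curator case in the trace above swaps the 'logging-curator' configmap and then bounces the DeploymentConfig by scaling it to zero and back, which is why every case logs one delete/create pair and two 'scaled' lines. A sketch of one such cycle (the config file name is hypothetical):

    # Sketch of one curator test cycle: replace the config, then restart the
    # pod by scaling the DC down and up. curator-config.yaml is hypothetical.
    oc delete configmap logging-curator
    oc create configmap logging-curator --from-file=config.yaml=curator-config.yaml
    oc scale deploymentconfig logging-curator --replicas=0
    oc scale deploymentconfig logging-curator --replicas=1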
[ERROR] PID 4245: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:303: `echo running test $test` exited with status 1.
[INFO] 		Stack Trace: 
[INFO] 		  1: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:303: `echo running test $test`
[INFO]   Exiting with code 1.
/data/src/github.com/openshift/origin-aggregated-logging/hack/lib/log/system.sh: line 31:  4608 Terminated              sar -A -o "${binary_logfile}" 1 86400 > /dev/null 2> "${stderr_logfile}"  (wd: /data/src/github.com/openshift/origin-aggregated-logging)
[INFO] [CLEANUP] Beginning cleanup routines...
[INFO] [CLEANUP] Dumping cluster events to /tmp/origin-aggregated-logging/artifacts/events.txt
[INFO] [CLEANUP] Dumping etcd contents to /tmp/origin-aggregated-logging/artifacts/etcd
[WARNING] No compiled `etcdhelper` binary was found. Attempting to build one using:
[WARNING]   $ hack/build-go.sh tools/etcdhelper
++ Building go targets for linux/amd64: tools/etcdhelper
/data/src/github.com/openshift/origin-aggregated-logging/../origin/hack/build-go.sh took 272 seconds
2017-06-09 10:55:30.747418 I | warning: ignoring ServerName for user-provided CA for backwards compatibility is deprecated
[INFO] [CLEANUP] Dumping container logs to /tmp/origin-aggregated-logging/logs/containers
[INFO] [CLEANUP] Truncating log files over 200M
[INFO] [CLEANUP] Stopping docker containers
[INFO] [CLEANUP] Removing docker containers
Error: No such image, container or task: 70e18d140838
json: cannot unmarshal array into Go value of type types.ContainerJSON
Error response from daemon: You cannot remove a running container cb23e4d7da528e5cf68ff814d2e4183303d3113c11367c222598cb715a4ab359. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container c43b6c60fd85117dd2b8134d90585574f6eaceba6bbebe285524ae104ecf3044. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 26b0530d2b5bd52b9ce9e36e5fdcac5f2441c53ab702b66207d7d9e3927a7953. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 11bf46766abe0360eead62066ebcbf395ed83a2947c3927a40733d48ee025228. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 243466c6a7c3b21ff40e5c23b7f1fd7fddc58fbd01f9304be359c28680909264. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 82064bd28a9e0f7040d099fd2986e70bf3337c3ad2b8e4169b088c0caad56d5a. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 9f208267cad464755d3efc850440e93d55b3c03f3f7d877d3a052ec85dc48c2f. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 3b7f71aceeda370a4b66ad2283c9bb6ea23fddcc2b3532408a75006d8ce2f54c. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 7170de21d8a6bd7e0a498643a737c7738e18ea5bb701ab72689e598f6d8f5531. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 3476ee807d33ea52dc866fcf0231937571610637e74954198169e9ea3d4c8e3f. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 141a2a42a867e5365b6f1b5649e78710b8179e17d1c3085910dbf0ee648bba1c. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 21302b939b383159026445e72111e466da8c0f1c2bde167435ca98dfefc21b0b. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container b5f52f338703881d353f23e4bf4d5f0e6e9d6b5f6627337fbafbb643da605674. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 034d3576118f082d409da66bb1c8e2472b4c3d89f4864302de7dc45f9af61179. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container e8b7088f975fed198dd6b18103c19d5eef328c6592420c304babf78e77866942. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 6949884721e8542d86bd8cde2ba34840fb80cb9b11546512ec4fc35aa36c262c. Stop the container before attempting removal or use -f
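The cleanup's container removal uses a plain 'docker rm', which the daemon refuses for running containers; they must be stopped first or removed with -f, exactly as the error text suggests. A forced sweep would look like:

    # Sketch: force-remove all containers, running or not (-f stops them first).
    docker ps -aq | xargs -r docker rm -f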
[INFO] [CLEANUP] Killing child processes
[INFO] [CLEANUP] Pruning etcd data directory
[ERROR] /data/src/github.com/openshift/origin-aggregated-logging/logging.sh exited with code 1 after 00h 40m 59s
Error while running ssh/sudo command: 
set -e
pushd /data/src/github.com/openshift//origin-aggregated-logging/hack/testing >/dev/null
export PATH=$GOPATH/bin:$PATH

echo '***************************************************'
echo 'Running GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh...'
time GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh
echo 'Finished GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh'
echo '***************************************************'

popd >/dev/null
        
The SSH command responded with a non-zero exit status. Vagrant
assumes that this means the command failed. The output for this command
should be in the log above. Please read the output to determine what
went wrong.
==> openshiftdev: Downloading logs
==> openshiftdev: Downloading artifacts from '/var/log/yum.log' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin/artifacts/yum.log'
==> openshiftdev: Downloading artifacts from '/var/log/secure' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin/artifacts/secure'
==> openshiftdev: Downloading artifacts from '/var/log/audit/audit.log' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin/artifacts/audit.log'
==> openshiftdev: Downloading artifacts from '/tmp/origin-aggregated-logging/' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin/artifacts'
Build step 'Execute shell' marked build as failure
[description-setter] Could not determine description.
[PostBuildScript] - Executing post-build scripts.
[workspace] $ /bin/sh -xe /tmp/hudson3978827358491594136.sh
+ INSTANCE_NAME=origin_logging-rhel7-1652
+ pushd origin
~/jobs/test-origin-aggregated-logging/workspace/origin ~/jobs/test-origin-aggregated-logging/workspace
+ rc=0
+ '[' -f .vagrant-openshift.json ']'
++ /usr/bin/vagrant ssh -c 'sudo ausearch -m avc'
+ ausearchresult='<no matches>'
+ rc=1
+ '[' '<no matches>' = '<no matches>' ']'
+ rc=0
+ /usr/bin/vagrant destroy -f
==> openshiftdev: Terminating the instance...
==> openshiftdev: Running cleanup tasks for 'shell' provisioner...
+ popd
~/jobs/test-origin-aggregated-logging/workspace
+ exit 0
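The traced post-build step is an SELinux gate: ausearch exits non-zero when it finds nothing, so the script captures its output and treats the literal '<no matches>' as a pass, which is why rc flips from 1 back to 0 in the trace above. An approximate reconstruction of that shell step:

    # Approximate reconstruction of the AVC gate traced above: any recorded
    # SELinux denial fails the step; '<no matches>' from ausearch is a pass.
    rc=0
    ausearchresult=$(/usr/bin/vagrant ssh -c 'sudo ausearch -m avc') || rc=$?
    if [ "${ausearchresult}" = '<no matches>' ]; then
        rc=0
    fi
    /usr/bin/vagrant destroy -f
    exit ${rc}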
[BFA] Scanning build for known causes...
[BFA] Found failure cause(s):
[BFA] Command Failure from category failure
[BFA] Done. 0s
Finished: FAILURE