Console Output
[... first 280 KB of console output skipped by the log viewer ...]
ok: [openshift] => (item={u'name': u'kibana_internal_key', u'file': u'kibana-internal.key'}) => {
"changed": false,
"content": "LS0tLS1CRUdJTiBSU0EgUFJJVkFURSBLRVktLS0tLQpNSUlFcEFJQkFBS0NBUUVBdVgzNHhLQjdHWVJIYUovS0xXQ0ZzRlZKVU42dHc4dmRJSTluWDc2ajRRU1ZOMEJRCk1JWVNjOE5LendyYnNiakw1NG9aUjRQWUNERGxQWDlnSzNkemQySzRNV2FPa3YrNjZscEo5ZVZuQjR0cTdKbWMKQXo5RTZVWW1LeHVOQ0EwQ1pPMEpaOUhqT3dnTlo1OTdBU0hCenlCWkoycUszNlZYRmFydEVDTWsyZDlnSUU3MQp5cnlwYitmcU1jU1BXZ2phT3FOS0I4YWRidS9ENGxFYnhpUlo1RExSazFObCs2Qml6M0hOR0NxVDJZVDJkblRyCndCNE1YbENGRGRRZ2RWK1Y0VXFTK0xKcUtnRTM5c05MQ09Xa3VuaHNFUFRER3RRaTNzVXpmWGwzcktwRjRqck8KWTVmaEFDM2lHM3pRSTFNQVNHd09kQlNhU1h0b0FnbDdKcndjNlFJREFRQUJBb0lCQUhXS2p1NUNaMThuQkgwVwo3RzNPd1VpWThzbU5JZjExbk4wNkloNTQxcTRMVW1iZG1UTDJjcERxeXVmaUVJOVE4OXo0Rk1iTmxFSzBnVDkwCkRNUGVnTXZCUWNrRUhRcW5oSkZtdjRGVGFmTk05R2VUZTFwUGFHS044amZVMEl5dWVicnN2YzROSVhuUnczVEwKaVkyUkQ1Z3AvblRNdmMyMG56ZDZ4Wmc1UE92VjNZYVR4WXYydlkzVmtYSjBxMUwwL1pGdnd6bWFSR29GbkdYWgo3UStTN2MwVGRwcjdBS1hBS0RjVzBCaDBSZHFpcytJZEJjUG8zZUI0L2VTYWJCZDBEL1Z4cHNkZ3FPakVkdjdiCkJaMlFhMi81b0RXTTVYYnF1NjZTNjVGcnkycVdESWY1bEh1Q00vL0NDeUgvQWVMMUtOQTcrS3AyKzBLemFGLzEKd2VTaHVZRUNnWUVBM1NOZS81cHY5R3V4aDBVMHdiUmh3T29ibXdTZFlVT2gxSkFrTGEwUW9QMWNjNDJtL01VNwpkTFl6NnZFRlhBczNYSlRFMlhscTBkMVpvMzEzM2RWcS9PWUZqcmJxZGIyNDN4U3NkdG5uUVlyL3lkMk54SHpqCkl1TDkySWZqWXhDZzZYU0Z1aWN3TklXaW1vMEVYaWNhRlVJK2M3UnZXS3NIakE0NmpMZS9SVmtDZ1lFQTFyd0IKbU4vRjBrWnhFRFJMdG91Yk9SdVZETkhHZWNGczAvdy9seFFqc3JBZ2doR1N1RjdpRGZxc0pvVXZ5MDBlaFVnSwpvRzdjL0wycWJCWDlWa2lrZFZGS0liWEtzS01RSXNPakdKcTcrb3lOZ3R3dEJsa2xWbitlVmpSWmY0UUt2MCtlClRoZ2EzbWQxZTZVcXRRZDBHb01SemNaSlp4elMreWQxY0NFaVVoRUNnWUFPaDQ1c1dPZFdOZVU2TUEvaFVrckIKOUIvUU5hTnBpcG9OYjFNUk5UZk1mQmtnOW9Pc1JBRStEK0tsWXlTcEFZdW5wNWF2ZTB6TGNURzhqamZiK1hQSQpIZ3pyYlpWR0d5c1ViVFZQc2MyQi92SURmMjBiSmVGK24wOXlkS3M2RFJPbGd5UnVNTGo0R25ldWsrbGZqazRtCmEyM1RDRjN3ZzN4QmRZWGZUUXRpTVFLQmdRREdTOGVOek5kNVh1TjUrMUVQdWN3VjZJcXowK1JjKzJuc09MemcKWHhDNEtqMWEzNitGNHRrTjM3YXB1OFJ2RVVCbUJDa01EbFMwY05HNERuWnIvNWtmWFhuL3Qwajk1UmM1ZzBCUgpzRkozQk9TSk81bTRpd29YM3JIeEdWYXNrdHI1NENSb0tZcG1aMVB1WXBBVnJGUmpSelpodUFLQTNmVlhTUkVDCkF4Y1dnUUtCZ1FEWWxKWWRYZXFMMHlkL1ZmdFkwYjNvR1R1M251WUk1aDlyWW5xVHpySUc1R0E1M3J5L1ZFbXAKOGlwa1V6dVF3MHZhQk1YODgwRzh0N0JGclhabnFZZTFjSlI3NmMxTEh6NDhrVHllN0p4SDZJUEFlVjdlR1VpZApsTi9YdkRHVlZWVHhaOWFnWU1FcE5YcnNXck0rTExLMDdKQ08yRytlc0krSG56REpZeXg1K0E9PQotLS0tLUVORCBSU0EgUFJJVkFURSBLRVktLS0tLQo=",
"encoding": "base64",
"item": {
"file": "kibana-internal.key",
"name": "kibana_internal_key"
},
"source": "/etc/origin/logging/kibana-internal.key"
}
ok: [openshift] => (item={u'name': u'kibana_internal_cert', u'file': u'kibana-internal.crt'}) => {
"changed": false,
"content": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURUakNDQWphZ0F3SUJBZ0lCQWpBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUwTXpjeU1sb1hEVEU1TURZd09ERTBNemN5TTFvdwpGakVVTUJJR0ExVUVBeE1MSUd0cFltRnVZUzF2Y0hNd2dnRWlNQTBHQ1NxR1NJYjNEUUVCQVFVQUE0SUJEd0F3CmdnRUtBb0lCQVFDNWZmakVvSHNaaEVkb244b3RZSVd3VlVsUTNxM0R5OTBnajJkZnZxUGhCSlUzUUZBd2hoSnoKdzByUEN0dXh1TXZuaWhsSGc5Z0lNT1U5ZjJBcmQzTjNZcmd4Wm82Uy83cnFXa24xNVdjSGkycnNtWndEUDBUcApSaVlyRzQwSURRSms3UWxuMGVNN0NBMW5uM3NCSWNIUElGa25hb3JmcFZjVnF1MFFJeVRaMzJBZ1R2WEt2S2x2CjUrb3h4STlhQ05vNm8wb0h4cDF1NzhQaVVSdkdKRm5rTXRHVFUyWDdvR0xQY2MwWUtwUFpoUFoyZE92QUhneGUKVUlVTjFDQjFYNVhoU3BMNHNtb3FBVGYydzBzSTVhUzZlR3dROU1NYTFDTGV4VE45ZVhlc3FrWGlPczVqbCtFQQpMZUliZk5BalV3QkliQTUwRkpwSmUyZ0NDWHNtdkJ6cEFnTUJBQUdqZ1o0d2dac3dEZ1lEVlIwUEFRSC9CQVFECkFnV2dNQk1HQTFVZEpRUU1NQW9HQ0NzR0FRVUZCd01CTUF3R0ExVWRFd0VCL3dRQ01BQXdaZ1lEVlIwUkJGOHcKWFlJTElHdHBZbUZ1WVMxdmNIT0NMQ0JyYVdKaGJtRXRiM0J6TG5KdmRYUmxjaTVrWldaaGRXeDBMbk4yWXk1agpiSFZ6ZEdWeUxteHZZMkZzZ2hnZ2EybGlZVzVoTGpFeU55NHdMakF1TVM1NGFYQXVhVytDQm10cFltRnVZVEFOCkJna3Foa2lHOXcwQkFRc0ZBQU9DQVFFQVF3M1BITzQwak9KZmRVTU44SnZQcnUrMlN0NmU2cU1jMFRYTUkzcDEKaWdqVHA3TjhQanBhTFEzc3U0aW0xV1Y2RmZPTFNyNlF2eHVTSlhtbXM5TWpUNjFyamtabkJvT2FKUFdkdWVOUAphR09uOWgvbDV5bEYrZmlqUXM1NTRjMHhmSitGSU5KOGgrVlh1MVZyUG9KQ2pwOUN0N2VpMzAycjgwbFJPbmk1Cm45WTFwaVJ4U2wzRng0TlhWbnFQU0ZBZk0zU2ZHcDgyTlVzVHNoeCtVZk1LZ21NQXVFS2NxeUlTTmdrOHBiR2sKbitGeEd5RjkzdDhjaVZDQ3JRK3JWUkltaGRlMklRSmc4cnFDSzJCZmFjbFAzT1FkQk0yUlZUWlVKY3FQZThMRAo3S1JER3NmYmt1OXlHN1MrQU5FZGFLN3dIRmpUdEppWnV0OEU0MVFOK2pzb0tRPT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQotLS0tLUJFR0lOIENFUlRJRklDQVRFLS0tLS0KTUlJQzJqQ0NBY0tnQXdJQkFnSUJBVEFOQmdrcWhraUc5dzBCQVFzRkFEQWVNUnd3R2dZRFZRUURFeE5zYjJkbgphVzVuTFhOcFoyNWxjaTEwWlhOME1CNFhEVEUzTURZd09ERTBNemN5TUZvWERUSXlNRFl3TnpFME16Y3lNVm93CkhqRWNNQm9HQTFVRUF4TVRiRzluWjJsdVp5MXphV2R1WlhJdGRHVnpkRENDQVNJd0RRWUpLb1pJaHZjTkFRRUIKQlFBRGdnRVBBRENDQVFvQ2dnRUJBTWZSWmNCeFkxWVRMSmJscGd4dXZMTnJGcFNwbEVsMzVTTisrcU0xTmZMSAphdXZjNzdZdVVUQ3M5dVc3YUFib21BOEtPR1loRVN1YlNZWkdpcFlleVBUZFZZVkxWWmVYM3h6YnFkL1FuRGdSCm9mM3Uyd0hwT2EyNUw1UEdWUkIrS3JINStEb2VyN0M5OE0raWdVUnhZdnhBaHRwdFZpNTZyRDVidmlXSVNaWUsKZ3BNb2xnYkl0OU5FU2dOVWNRcGFZRjA4UW5ON0JUN2pFdW81WUZERmlhUWhUMndOcmduSWdjUnZ1aEpkUjZ1dAoydVp3NEpMRUJnZ214Z2VyeGNjWEpRUkZCWkxwaVNCMEZlY0s3NnJJMm1uckpxU0l1VDJmR3ZzS2g2Q1M3M1drCk96VzBjSkc3N3pYeG9QUXRWWStkRXd4OW5sTjBxczJ0UzVuLzd4V0U0M1VDQXdFQUFhTWpNQ0V3RGdZRFZSMFAKQVFIL0JBUURBZ0trTUE4R0ExVWRFd0VCL3dRRk1BTUJBZjh3RFFZSktvWklodmNOQVFFTEJRQURnZ0VCQUxvYwpKYnlZcGZRMGZidFF1N1dreTRqL2lQZVM0cHlralJsNmRCVnlkRGpic0tjWW5odXJaZi9QaWRKSkJETDRWZ29ZCmV3SDZSNXVaT3pjWlFaSXdvbk14cjBrMnIwZE9ndytJWkcxa0FXYzYwY0JIUHc0emJuYUdZL2tpREd6TFNUdm8KaUlFZjdxUW9UWTVCMTNQcDFHME1YR2JYWUd5a00vTjg0eXFtWEdNUDQrdGlXUGV6aUpkeFY1ZW92dHJWRFVrWgpvbnN4dTlMTVRIVjNWU3dmaVg5Skp6NC95Vk01VlJJdUNrZjZMalVtcU1RWk5IaGdVRms1T0tqYlNyTnRmZC9ICkZ2NFFkN04yT3FLa05ObFMzeDhSQVlnUXl4Q2VIMEVzZEpTVkcxNC9QTUI1UHZsc1lKZkRvN3crMzB3dEdFZi8KT210WHllZXhUbDdILzZRMC9Xcz0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=",
"encoding": "base64",
"item": {
"file": "kibana-internal.crt",
"name": "kibana_internal_cert"
},
"source": "/etc/origin/logging/kibana-internal.crt"
}
ok: [openshift] => (item={u'name': u'server_tls', u'file': u'server-tls.json'}) => {
"changed": false,
"content": "Ly8gU2VlIGZvciBhdmFpbGFibGUgb3B0aW9uczogaHR0cHM6Ly9ub2RlanMub3JnL2FwaS90bHMuaHRtbCN0bHNfdGxzX2NyZWF0ZXNlcnZlcl9vcHRpb25zX3NlY3VyZWNvbm5lY3Rpb25saXN0ZW5lcgp0bHNfb3B0aW9ucyA9IHsKCWNpcGhlcnM6ICdrRUVDREg6K2tFRUNESCtTSEE6a0VESDora0VESCtTSEE6K2tFREgrQ0FNRUxMSUE6a0VDREg6K2tFQ0RIK1NIQTprUlNBOitrUlNBK1NIQTora1JTQStDQU1FTExJQTohYU5VTEw6IWVOVUxMOiFTU0x2MjohUkM0OiFERVM6IUVYUDohU0VFRDohSURFQTorM0RFUycsCglob25vckNpcGhlck9yZGVyOiB0cnVlCn0K",
"encoding": "base64",
"item": {
"file": "server-tls.json",
"name": "server_tls"
},
"source": "/etc/origin/logging/server-tls.json"
}
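[NOTE]: The "content" field above is base64-encoded, like every slurp result in this log. Piping it through "base64 -d" recovers the small Node.js TLS options file that the Kibana auth proxy consumes; decoded, it reads:

    // See for available options: https://nodejs.org/api/tls.html#tls_tls_createserver_options_secureconnectionlistener
    tls_options = {
        ciphers: 'kEECDH:+kEECDH+SHA:kEDH:+kEDH+SHA:+kEDH+CAMELLIA:kECDH:+kECDH+SHA:kRSA:+kRSA+SHA:+kRSA+CAMELLIA:!aNULL:!eNULL:!SSLv2:!RC4:!DES:!EXP:!SEED:!IDEA:+3DES',
        honorCipherOrder: true
    }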
TASK [openshift_logging_kibana : Set logging-kibana-ops service] ***************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:57
changed: [openshift] => {
"changed": true,
"results": {
"clusterip": "172.30.78.252",
"cmd": "/bin/oc get service logging-kibana-ops -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "Service",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:12Z",
"name": "logging-kibana-ops",
"namespace": "logging",
"resourceVersion": "1460",
"selfLink": "/api/v1/namespaces/logging/services/logging-kibana-ops",
"uid": "198b7f6f-4c58-11e7-8cf8-0ef95424e992"
},
"spec": {
"clusterIP": "172.30.78.252",
"ports": [
{
"port": 443,
"protocol": "TCP",
"targetPort": "oaproxy"
}
],
"selector": {
"component": "kibana-ops",
"provider": "openshift"
},
"sessionAffinity": "None",
"type": "ClusterIP"
},
"status": {
"loadBalancer": {}
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:74
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_key | trim | length
> 0 }}
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
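[NOTE]: This warning (it recurs for the cert and ca tasks at main.yaml:79 and :84, and again later in the fluentd role) is Ansible flagging a fully-templated string inside a when: clause. A minimal sketch of the fix, assuming the task's conditional is exactly the expression quoted in the warning — when: already evaluates Jinja2, so the delimiters are simply dropped:

    # before: triggers the warning
    when: "{{ openshift_logging_kibana_key | trim | length > 0 }}"
    # after: bare expression, same result, no warning
    when: openshift_logging_kibana_key | trim | length > 0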
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:79
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_cert | trim | length
> 0 }}
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:84
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_kibana_ca | trim | length >
0 }}
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : set_fact] *************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:89
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_kibana : Generating Kibana route template] *************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:94
ok: [openshift] => {
"changed": false,
"checksum": "79804e2c424b05848802a3e3f5b9e69bd4e1c5d8",
"dest": "/tmp/openshift-logging-ansible-eQPaO6/templates/kibana-route.yaml",
"gid": 0,
"group": "root",
"md5sum": "b217071d1466fb89d402428d82345b0b",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 2726,
"src": "/root/.ansible/tmp/ansible-tmp-1496932693.3-76805340376234/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_kibana : Setting Kibana route] *************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:114
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get route logging-kibana-ops -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "Route",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:14Z",
"labels": {
"component": "support",
"logging-infra": "support",
"provider": "openshift"
},
"name": "logging-kibana-ops",
"namespace": "logging",
"resourceVersion": "1463",
"selfLink": "/oapi/v1/namespaces/logging/routes/logging-kibana-ops",
"uid": "1a716b60-4c58-11e7-8cf8-0ef95424e992"
},
"spec": {
"host": "kibana-ops.router.default.svc.cluster.local",
"tls": {
"caCertificate": "-----BEGIN CERTIFICATE-----\nMIIC2jCCAcKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAeMRwwGgYDVQQDExNsb2dn\naW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwODE0MzcyMFoXDTIyMDYwNzE0MzcyMVow\nHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDCCASIwDQYJKoZIhvcNAQEB\nBQADggEPADCCAQoCggEBAMfRZcBxY1YTLJblpgxuvLNrFpSplEl35SN++qM1NfLH\nauvc77YuUTCs9uW7aAbomA8KOGYhESubSYZGipYeyPTdVYVLVZeX3xzbqd/QnDgR\nof3u2wHpOa25L5PGVRB+KrH5+Doer7C98M+igURxYvxAhtptVi56rD5bviWISZYK\ngpMolgbIt9NESgNUcQpaYF08QnN7BT7jEuo5YFDFiaQhT2wNrgnIgcRvuhJdR6ut\n2uZw4JLEBggmxgerxccXJQRFBZLpiSB0FecK76rI2mnrJqSIuT2fGvsKh6CS73Wk\nOzW0cJG77zXxoPQtVY+dEwx9nlN0qs2tS5n/7xWE43UCAwEAAaMjMCEwDgYDVR0P\nAQH/BAQDAgKkMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBALoc\nJbyYpfQ0fbtQu7Wky4j/iPeS4pykjRl6dBVydDjbsKcYnhurZf/PidJJBDL4VgoY\newH6R5uZOzcZQZIwonMxr0k2r0dOgw+IZG1kAWc60cBHPw4zbnaGY/kiDGzLSTvo\niIEf7qQoTY5B13Pp1G0MXGbXYGykM/N84yqmXGMP4+tiWPeziJdxV5eovtrVDUkZ\nonsxu9LMTHV3VSwfiX9JJz4/yVM5VRIuCkf6LjUmqMQZNHhgUFk5OKjbSrNtfd/H\nFv4Qd7N2OqKkNNlS3x8RAYgQyxCeH0EsdJSVG14/PMB5PvlsYJfDo7w+30wtGEf/\nOmtXyeexTl7H/6Q0/Ws=\n-----END CERTIFICATE-----\n",
"destinationCACertificate": "-----BEGIN CERTIFICATE-----\nMIIC2jCCAcKgAwIBAgIBATANBgkqhkiG9w0BAQsFADAeMRwwGgYDVQQDExNsb2dn\naW5nLXNpZ25lci10ZXN0MB4XDTE3MDYwODE0MzcyMFoXDTIyMDYwNzE0MzcyMVow\nHjEcMBoGA1UEAxMTbG9nZ2luZy1zaWduZXItdGVzdDCCASIwDQYJKoZIhvcNAQEB\nBQADggEPADCCAQoCggEBAMfRZcBxY1YTLJblpgxuvLNrFpSplEl35SN++qM1NfLH\nauvc77YuUTCs9uW7aAbomA8KOGYhESubSYZGipYeyPTdVYVLVZeX3xzbqd/QnDgR\nof3u2wHpOa25L5PGVRB+KrH5+Doer7C98M+igURxYvxAhtptVi56rD5bviWISZYK\ngpMolgbIt9NESgNUcQpaYF08QnN7BT7jEuo5YFDFiaQhT2wNrgnIgcRvuhJdR6ut\n2uZw4JLEBggmxgerxccXJQRFBZLpiSB0FecK76rI2mnrJqSIuT2fGvsKh6CS73Wk\nOzW0cJG77zXxoPQtVY+dEwx9nlN0qs2tS5n/7xWE43UCAwEAAaMjMCEwDgYDVR0P\nAQH/BAQDAgKkMA8GA1UdEwEB/wQFMAMBAf8wDQYJKoZIhvcNAQELBQADggEBALoc\nJbyYpfQ0fbtQu7Wky4j/iPeS4pykjRl6dBVydDjbsKcYnhurZf/PidJJBDL4VgoY\newH6R5uZOzcZQZIwonMxr0k2r0dOgw+IZG1kAWc60cBHPw4zbnaGY/kiDGzLSTvo\niIEf7qQoTY5B13Pp1G0MXGbXYGykM/N84yqmXGMP4+tiWPeziJdxV5eovtrVDUkZ\nonsxu9LMTHV3VSwfiX9JJz4/yVM5VRIuCkf6LjUmqMQZNHhgUFk5OKjbSrNtfd/H\nFv4Qd7N2OqKkNNlS3x8RAYgQyxCeH0EsdJSVG14/PMB5PvlsYJfDo7w+30wtGEf/\nOmtXyeexTl7H/6Q0/Ws=\n-----END CERTIFICATE-----\n",
"insecureEdgeTerminationPolicy": "Redirect",
"termination": "reencrypt"
},
"to": {
"kind": "Service",
"name": "logging-kibana-ops",
"weight": 100
},
"wildcardPolicy": "None"
},
"status": {
"ingress": [
{
"conditions": [
{
"lastTransitionTime": "2017-06-08T14:38:14Z",
"status": "True",
"type": "Admitted"
}
],
"host": "kibana-ops.router.default.svc.cluster.local",
"routerName": "router",
"wildcardPolicy": "None"
}
]
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_kibana : Generate proxy session] ***********************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:125
ok: [openshift] => {
"ansible_facts": {
"session_secret": "5UtjXtmji1q5ORQ3LqoIMgc58O3GVImVtCnt88Nr8y4w821Avj0vXVo64GPcmDlK7PvI7iGBxByXE7KuOiIsKXrZ21d8tCNPe8qlrPxhIEazhkqN3lYYkGTf0xexfxwYXOQ27haWhY4bMyIwDdmyxPI7JevnB4jJgqxtJv17MNXS0bTLwMTWrq1aUBcfSZz0fDELQIV1"
},
"changed": false
}
TASK [openshift_logging_kibana : Generate oauth client secret] *****************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:132
ok: [openshift] => {
"ansible_facts": {
"oauth_secret": "QzrWYqRnoG6Utj3WhKOgsio1smVquO6YH8k4o80ltA8yw0y91z3OG9pcjClAWg6j"
},
"changed": false
}
TASK [openshift_logging_kibana : Create oauth-client template] *****************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:138
changed: [openshift] => {
"changed": true,
"checksum": "662c48b86474e7d8754def1f5ebf8619562dc75d",
"dest": "/tmp/openshift-logging-ansible-eQPaO6/templates/oauth-client.yml",
"gid": 0,
"group": "root",
"md5sum": "22fcc2ed07392270dda208f327855c13",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 332,
"src": "/root/.ansible/tmp/ansible-tmp-1496932694.67-169309840738289/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_kibana : Set kibana-proxy oauth-client] ****************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:146
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get oauthclient kibana-proxy -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "OAuthClient",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:06Z",
"labels": {
"logging-infra": "support"
},
"name": "kibana-proxy",
"resourceVersion": "1470",
"selfLink": "/oapi/v1/oauthclients/kibana-proxy",
"uid": "15f6ed6e-4c58-11e7-8cf8-0ef95424e992"
},
"redirectURIs": [
"https://kibana-ops.router.default.svc.cluster.local"
],
"scopeRestrictions": [
{
"literals": [
"user:info",
"user:check-access",
"user:list-projects"
]
}
],
"secret": "QzrWYqRnoG6Utj3WhKOgsio1smVquO6YH8k4o80ltA8yw0y91z3OG9pcjClAWg6j"
}
],
"returncode": 0
},
"state": "present"
}
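[NOTE]: The oauthclient's secret has to stay in sync with the oauth-secret file the kibana-proxy container reads (OAP_OAUTH_SECRET_FILE=/secret/oauth-secret, mounted from the logging-kibana-proxy secret in the DC further below). A quick consistency check from the master, assuming the secret key is named after the mounted file:

    $ oc get oauthclient kibana-proxy -o jsonpath='{.secret}'; echo
    $ oc get secret logging-kibana-proxy -n logging \
        -o jsonpath='{.data.oauth-secret}' | base64 -d; echo

Both commands should print the same 64-character value.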
TASK [openshift_logging_kibana : Set Kibana secret] ****************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:157
ok: [openshift] => {
"changed": false,
"results": {
"apiVersion": "v1",
"data": {
"ca": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUwTXpjeU1Gb1hEVEl5TURZd056RTBNemN5TVZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU1mUlpjQnhZMVlUTEpibHBneHV2TE5yRnBTcGxFbDM1U04rK3FNMU5mTEgKYXV2Yzc3WXVVVENzOXVXN2FBYm9tQThLT0dZaEVTdWJTWVpHaXBZZXlQVGRWWVZMVlplWDN4emJxZC9RbkRnUgpvZjN1MndIcE9hMjVMNVBHVlJCK0tySDUrRG9lcjdDOThNK2lnVVJ4WXZ4QWh0cHRWaTU2ckQ1YnZpV0lTWllLCmdwTW9sZ2JJdDlORVNnTlVjUXBhWUYwOFFuTjdCVDdqRXVvNVlGREZpYVFoVDJ3TnJnbklnY1J2dWhKZFI2dXQKMnVadzRKTEVCZ2dteGdlcnhjY1hKUVJGQlpMcGlTQjBGZWNLNzZySTJtbnJKcVNJdVQyZkd2c0toNkNTNzNXawpPelcwY0pHNzd6WHhvUFF0VlkrZEV3eDlubE4wcXMydFM1bi83eFdFNDNVQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFMb2MKSmJ5WXBmUTBmYnRRdTdXa3k0ai9pUGVTNHB5a2pSbDZkQlZ5ZERqYnNLY1luaHVyWmYvUGlkSkpCREw0VmdvWQpld0g2UjV1Wk96Y1pRWkl3b25NeHIwazJyMGRPZ3crSVpHMWtBV2M2MGNCSFB3NHpibmFHWS9raURHekxTVHZvCmlJRWY3cVFvVFk1QjEzUHAxRzBNWEdiWFlHeWtNL044NHlxbVhHTVA0K3RpV1BlemlKZHhWNWVvdnRyVkRVa1oKb25zeHU5TE1USFYzVlN3ZmlYOUpKejQveVZNNVZSSXVDa2Y2TGpVbXFNUVpOSGhnVUZrNU9LamJTck50ZmQvSApGdjRRZDdOMk9xS2tOTmxTM3g4UkFZZ1F5eENlSDBFc2RKU1ZHMTQvUE1CNVB2bHNZSmZEbzd3KzMwd3RHRWYvCk9tdFh5ZWV4VGw3SC82UTAvV3M9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K",
"cert": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURSVENDQWkyZ0F3SUJBZ0lCQXpBTkJna3Foa2lHOXcwQkFRVUZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUwTXpjeU5sb1hEVEU1TURZd09ERTBNemN5TmxvdwpSakVRTUE0R0ExVUVDZ3dIVEc5bloybHVaekVTTUJBR0ExVUVDd3dKVDNCbGJsTm9hV1owTVI0d0hBWURWUVFECkRCVnplWE4wWlcwdWJHOW5aMmx1Wnk1cmFXSmhibUV3Z2dFaU1BMEdDU3FHU0liM0RRRUJBUVVBQTRJQkR3QXcKZ2dFS0FvSUJBUUNvV1VSK2NGUmR0bi9IY1owMllySlZOMzdRVi9GQmJlTzJJc3E0OVhEVXpXYjdJWktTNGsxWgorand1akExL0NhbzZjN0NuRDlPU1QvanU0YTdnb2xBSDRQenUrTU5JSWpvK3h6ZmV1UnZITzVpVE8wd3VDenplCmdrdjRkUWc5QytaalpXR2laSVM2cjNOalZxVDhGSUVUNkF1Zm9WSUZRa1Iya3NYRXRRcURKY0N0eTJkY0M3YVQKa3lha1BhVHo5VWxPNElsUE92Q1lmUGc0WVI1MnpKeDh2aU9ocWFPMDBWOUpYNDQvSHdBdUlSbXlJQWQ3VHRtZQpBOUVyS3NkMTF6a0RTMnVQdjAxaEwwYUkvdTQ0Zkg0akZBcVdqQXAyZ2xxT1d4VEczR1ZPbzBoTWhBQ3RoTkduCjRXZVZBYkdpdjRoZUR4Rml5TFlVZ0NqVHRiQ2JUVzQvQWdNQkFBR2paakJrTUE0R0ExVWREd0VCL3dRRUF3SUYKb0RBSkJnTlZIUk1FQWpBQU1CMEdBMVVkSlFRV01CUUdDQ3NHQVFVRkJ3TUJCZ2dyQmdFRkJRY0RBakFkQmdOVgpIUTRFRmdRVXViY3NQTUE5QjZUZ2hxN2JaK2Nzc0o5bE9Yc3dDUVlEVlIwakJBSXdBREFOQmdrcWhraUc5dzBCCkFRVUZBQU9DQVFFQXBLbTA1UWhHWlZEckkxNUc3LzE1MUF5UjZOSm5UN1dOSzd6WVgxRkxMQXhCNXdCU1ovM3IKYXFLTllTV0syMlFBSVVMdkMweldEQ0JPRlpudHVvbnpvU3lUaUlJWEhFNTQ0c2FuR0xQQmVPTW93K0ZjaVRyOQpuSjJJUUVLUlBWRENlQ1ZXem9zTXF6OXV4dm5LbEVGVThqdHMwTmJYYmwrQ0RvMC9penpCWTE5bTlCNzB0bmZhCmtBVnpvVkV3bDV6cWdUZDhDRXNabXlNVklBejdHYWQzQWkzTGxBWlI3RzBHU2ZxY0hUOS9pQ3dXN1BRT0pUZUgKeG5xbDN5VkRKTnQzZzluMHJ1ck9PUUFnSDQ3MnhRaWk0VE5vSE5xNXBLaHNwVTRtMm4vRUpYTXNCTXRUVDA0dwpNd2YrUGxCVUtJNWJMb2hiUDBrYzlUUmJsYlgvK0UzK3VRPT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=",
"key": "LS0tLS1CRUdJTiBQUklWQVRFIEtFWS0tLS0tCk1JSUV1d0lCQURBTkJna3Foa2lHOXcwQkFRRUZBQVNDQktVd2dnU2hBZ0VBQW9JQkFRQ29XVVIrY0ZSZHRuL0gKY1owMllySlZOMzdRVi9GQmJlTzJJc3E0OVhEVXpXYjdJWktTNGsxWitqd3VqQTEvQ2FvNmM3Q25EOU9TVC9qdQo0YTdnb2xBSDRQenUrTU5JSWpvK3h6ZmV1UnZITzVpVE8wd3VDenplZ2t2NGRRZzlDK1pqWldHaVpJUzZyM05qClZxVDhGSUVUNkF1Zm9WSUZRa1Iya3NYRXRRcURKY0N0eTJkY0M3YVRreWFrUGFUejlVbE80SWxQT3ZDWWZQZzQKWVI1MnpKeDh2aU9ocWFPMDBWOUpYNDQvSHdBdUlSbXlJQWQ3VHRtZUE5RXJLc2QxMXprRFMydVB2MDFoTDBhSQovdTQ0Zkg0akZBcVdqQXAyZ2xxT1d4VEczR1ZPbzBoTWhBQ3RoTkduNFdlVkFiR2l2NGhlRHhGaXlMWVVnQ2pUCnRiQ2JUVzQvQWdNQkFBRUNnZjhxRmljSmRRdWlZZjJQM3RkdXdUM1VFQXVrQ2xoR0F6THFWY2hZZFFQQUw1bVAKTHRWMTR4MXpZUnVxaDFqWWFaSWxDc2NlL05YbnZoWjlvZTRXT24zSGVuZkJSbStMbVFMMnJ0ZEkxdjRhME10VQpBMzhJQThjejJWOGt2aDFSSmFoek1PVTNzK2w2d21JV2M1TExjbFRTcFRCQ3VCSEpQc2ZlM05LVVdxNUdHc0pECnRaWExLZXBwQkplWGVweWRMK0VzRmhhOFI5WURqeFQxMHBoUG8rZjhpQnF4VmFBN09wT0piRmdRMTljNHhmUUgKQ25hMG02K2h4T3UvczlFSlpnZHE4eWZtbVFWUVhUV0hHM1V0QmNyQitZYjlncDgwNlVUQU1BN1N5bk50ckFUTwphU1VxcXNXRXRpRGFRcEtBcXJNOTUxMUl3QmJnTTVWZHpGR0srZ0VDZ1lFQTBLTzlmMnprTHRVQUpoZkFrZmd3ClRCWlVBRVRYUWJveXNVbUc1ZTMwRWp1ZnowdFRnQ3JHYlEyWEJUL1dCVlhUNkpyZW1RWWpTdVEvTlhFN1c5RGoKMDJCbkNpUlhkVTRDMzVHQWZGNnYzUm9hbk9vRjlpSnQ5UXhkTmt3YWx6eXpkR2hCTlJFNE93TDlZL0xDM3M5VgpBaUpiVFFBQ3VoQXI5bExyd2puUnU3VUNnWUVBenBBdWdld29zWExXL2h3MFFweVRBOGl2ZmNEcFpiQzE1TnM3Ci8vL25xNzlkOTNDVTNVYWZ0eXZKQXdxSldwWFlmZU5IZDNqaC9jUHlaYWR4S0xBWElsNzZGZ1JrS0cyaVQyWTYKZ0xmZ3dlbm9TZEJmb1hYeWZ0d3FvbTljdFRGeUNmcUsvVTBNMWViMWJXQ3hIOFoxdHBsSlY2SlFmWU1xWER0RApNUVJOZ3FNQ2dZQUhCRHF2T2w1Q0pPK29XRXdsbkk4alArejdSVlZuNUhjbjl5ME9ObjVxem82RlRpYzB2RHVJCkYxam8yRWkrVFRDZk5mWWVkMUpnaG1TSnk1RVBlV3J3Qk9IeU9WNm5sMFFKZUw4MWI0bkNpY296Vkx0Mmw1blQKRCtOaW5CU1kyWFcvaUhJSTh0ZE5STUI0eUFVOXNRTk8yeER1K2YwZGZNVzl0dlF1eC9zQnNRS0JnR2NZRk80cApOTUlqdytPOXBlT2RDODcycVlmRThYZ2NjeHdPdzJwb2lYTHhwdTlwVkJNQVBaU0pHT0VZc0NieTVNTW4zVEptCnRid1d3UE0xVjhmcjR3LzNnUy9kN3pNeVRMRCtIN0xBa3orVkE2ZGJoVzhyVStVMjgxeHc0ajlZdDBiOXNjTHMKWXJ5YmNlQ3VRcGpPVDAvY1AzdXFlaXU2ZkRqZUp4SGV5T1NyQW9HQkFNZFNIczR2T3ppRVRzRk9ENXh0Q042WAp0dUhwYjA1aDVDYytNL3hSVkVKVzBxNGh0S2k4eHlTQVFFbEZLVEs4TFkyeDFPMGpXcjFvS3h0MTJXWFVPU2NTCkRkYXIyMi9LNmxNT0RTNUhhcTlBbGE0T1d1WFpSN1Q1cHZIeW1RZUQvdVpSWW1hV0JTWi9SZ1Jsd0ZReWZKNysKZElaZzNzcG5ZakJqYTdsNFZsQmwKLS0tLS1FTkQgUFJJVkFURSBLRVktLS0tLQo="
},
"kind": "Secret",
"metadata": {
"creationTimestamp": null,
"name": "logging-kibana"
},
"type": "Opaque"
},
"state": "present"
}
TASK [openshift_logging_kibana : Set Kibana Proxy secret] **********************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:171
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc replace -f /tmp/logging-kibana-proxy -n logging",
"results": "",
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_kibana : Generate Kibana DC template] ******************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:197
changed: [openshift] => {
"changed": true,
"checksum": "4d26f2e20b9c4a692e3af55a7e3717145ce9db3c",
"dest": "/tmp/openshift-logging-ansible-eQPaO6/templates/kibana-dc.yaml",
"gid": 0,
"group": "root",
"md5sum": "84346684584df7e2361e3668e140e336",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 3761,
"src": "/root/.ansible/tmp/ansible-tmp-1496932698.3-97599400263107/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_kibana : Set Kibana DC] ********************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:216
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get dc logging-kibana-ops -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "DeploymentConfig",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:19Z",
"generation": 2,
"labels": {
"component": "kibana-ops",
"logging-infra": "kibana",
"provider": "openshift"
},
"name": "logging-kibana-ops",
"namespace": "logging",
"resourceVersion": "1480",
"selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-kibana-ops",
"uid": "1d7e21dc-4c58-11e7-8cf8-0ef95424e992"
},
"spec": {
"replicas": 1,
"selector": {
"component": "kibana-ops",
"logging-infra": "kibana",
"provider": "openshift"
},
"strategy": {
"activeDeadlineSeconds": 21600,
"resources": {},
"rollingParams": {
"intervalSeconds": 1,
"maxSurge": "25%",
"maxUnavailable": "25%",
"timeoutSeconds": 600,
"updatePeriodSeconds": 1
},
"type": "Rolling"
},
"template": {
"metadata": {
"creationTimestamp": null,
"labels": {
"component": "kibana-ops",
"logging-infra": "kibana",
"provider": "openshift"
},
"name": "logging-kibana-ops"
},
"spec": {
"containers": [
{
"env": [
{
"name": "ES_HOST",
"value": "logging-es-ops"
},
{
"name": "ES_PORT",
"value": "9200"
},
{
"name": "KIBANA_MEMORY_LIMIT",
"valueFrom": {
"resourceFieldRef": {
"containerName": "kibana",
"divisor": "0",
"resource": "limits.memory"
}
}
}
],
"image": "172.30.168.204:5000/logging/logging-kibana:latest",
"imagePullPolicy": "Always",
"name": "kibana",
"readinessProbe": {
"exec": {
"command": [
"/usr/share/kibana/probe/readiness.sh"
]
},
"failureThreshold": 3,
"initialDelaySeconds": 5,
"periodSeconds": 5,
"successThreshold": 1,
"timeoutSeconds": 4
},
"resources": {
"limits": {
"memory": "736Mi"
}
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/etc/kibana/keys",
"name": "kibana",
"readOnly": true
}
]
},
{
"env": [
{
"name": "OAP_BACKEND_URL",
"value": "http://localhost:5601"
},
{
"name": "OAP_AUTH_MODE",
"value": "oauth2"
},
{
"name": "OAP_TRANSFORM",
"value": "user_header,token_header"
},
{
"name": "OAP_OAUTH_ID",
"value": "kibana-proxy"
},
{
"name": "OAP_MASTER_URL",
"value": "https://kubernetes.default.svc.cluster.local"
},
{
"name": "OAP_PUBLIC_MASTER_URL",
"value": "https://172.18.2.72:8443"
},
{
"name": "OAP_LOGOUT_REDIRECT",
"value": "https://172.18.2.72:8443/console/logout"
},
{
"name": "OAP_MASTER_CA_FILE",
"value": "/var/run/secrets/kubernetes.io/serviceaccount/ca.crt"
},
{
"name": "OAP_DEBUG",
"value": "False"
},
{
"name": "OAP_OAUTH_SECRET_FILE",
"value": "/secret/oauth-secret"
},
{
"name": "OAP_SERVER_CERT_FILE",
"value": "/secret/server-cert"
},
{
"name": "OAP_SERVER_KEY_FILE",
"value": "/secret/server-key"
},
{
"name": "OAP_SERVER_TLS_FILE",
"value": "/secret/server-tls.json"
},
{
"name": "OAP_SESSION_SECRET_FILE",
"value": "/secret/session-secret"
},
{
"name": "OCP_AUTH_PROXY_MEMORY_LIMIT",
"valueFrom": {
"resourceFieldRef": {
"containerName": "kibana-proxy",
"divisor": "0",
"resource": "limits.memory"
}
}
}
],
"image": "172.30.168.204:5000/logging/logging-auth-proxy:latest",
"imagePullPolicy": "Always",
"name": "kibana-proxy",
"ports": [
{
"containerPort": 3000,
"name": "oaproxy",
"protocol": "TCP"
}
],
"resources": {
"limits": {
"memory": "96Mi"
}
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/secret",
"name": "kibana-proxy",
"readOnly": true
}
]
}
],
"dnsPolicy": "ClusterFirst",
"restartPolicy": "Always",
"schedulerName": "default-scheduler",
"securityContext": {},
"serviceAccount": "aggregated-logging-kibana",
"serviceAccountName": "aggregated-logging-kibana",
"terminationGracePeriodSeconds": 30,
"volumes": [
{
"name": "kibana",
"secret": {
"defaultMode": 420,
"secretName": "logging-kibana"
}
},
{
"name": "kibana-proxy",
"secret": {
"defaultMode": 420,
"secretName": "logging-kibana-proxy"
}
}
]
}
},
"test": false,
"triggers": [
{
"type": "ConfigChange"
}
]
},
"status": {
"availableReplicas": 0,
"conditions": [
{
"lastTransitionTime": "2017-06-08T14:38:19Z",
"lastUpdateTime": "2017-06-08T14:38:19Z",
"message": "Deployment config does not have minimum availability.",
"status": "False",
"type": "Available"
},
{
"lastTransitionTime": "2017-06-08T14:38:19Z",
"lastUpdateTime": "2017-06-08T14:38:19Z",
"message": "Created new replication controller \"logging-kibana-ops-1\" for version 1",
"reason": "NewReplicationControllerCreated",
"status": "True",
"type": "Progressing"
}
],
"details": {
"causes": [
{
"type": "ConfigChange"
}
],
"message": "config change"
},
"latestVersion": 1,
"observedGeneration": 2,
"replicas": 0,
"unavailableReplicas": 0,
"updatedReplicas": 0
}
}
],
"returncode": 0
},
"state": "present"
}
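[NOTE]: availableReplicas is 0 and Available=False here only because the first deployment is still in flight — the Progressing condition shows replication controller "logging-kibana-ops-1" was just created. A sketch of how to wait for it from a shell, assuming the 3.x oc client on the master:

    $ oc rollout status dc/logging-kibana-ops -n logging
    $ oc get pods -n logging -l component=kibana-ops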
TASK [openshift_logging_kibana : Delete temp directory] ************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_kibana/tasks/main.yaml:228
ok: [openshift] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-eQPaO6",
"state": "absent"
}
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:195
statically included: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml
TASK [openshift_logging_curator : fail] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:3
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:7
ok: [openshift] => {
"ansible_facts": {
"curator_version": "3_5"
},
"changed": false
}
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : fail] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:15
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : Create temp directory for doing work in] *****
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:5
ok: [openshift] => {
"changed": false,
"cmd": [
"mktemp",
"-d",
"/tmp/openshift-logging-ansible-XXXXXX"
],
"delta": "0:00:00.002005",
"end": "2017-06-08 10:38:21.722866",
"rc": 0,
"start": "2017-06-08 10:38:21.720861"
}
STDOUT:
/tmp/openshift-logging-ansible-9sl3xI
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:10
ok: [openshift] => {
"ansible_facts": {
"tempdir": "/tmp/openshift-logging-ansible-9sl3xI"
},
"changed": false
}
TASK [openshift_logging_curator : Create templates subdirectory] ***************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:14
ok: [openshift] => {
"changed": false,
"gid": 0,
"group": "root",
"mode": "0755",
"owner": "root",
"path": "/tmp/openshift-logging-ansible-9sl3xI/templates",
"secontext": "unconfined_u:object_r:user_tmp_t:s0",
"size": 6,
"state": "directory",
"uid": 0
}
TASK [openshift_logging_curator : Create Curator service account] **************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:24
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : Create Curator service account] **************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:32
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get sa aggregated-logging-curator -o json -n logging",
"results": [
{
"apiVersion": "v1",
"imagePullSecrets": [
{
"name": "aggregated-logging-curator-dockercfg-bcmwl"
}
],
"kind": "ServiceAccount",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:22Z",
"name": "aggregated-logging-curator",
"namespace": "logging",
"resourceVersion": "1509",
"selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-curator",
"uid": "1f590b8f-4c58-11e7-8cf8-0ef95424e992"
},
"secrets": [
{
"name": "aggregated-logging-curator-dockercfg-bcmwl"
},
{
"name": "aggregated-logging-curator-token-qshch"
}
]
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_curator : copy] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:41
ok: [openshift] => {
"changed": false,
"checksum": "9008efd9a8892dcc42c28c6dfb6708527880a6d8",
"dest": "/tmp/openshift-logging-ansible-9sl3xI/curator.yml",
"gid": 0,
"group": "root",
"md5sum": "5498c5fd98f3dd06e34b20eb1f55dc12",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 320,
"src": "/root/.ansible/tmp/ansible-tmp-1496932702.82-280670136634720/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_curator : copy] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:47
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : Set Curator configmap] ***********************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:53
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get configmap logging-curator -o json -n logging",
"results": [
{
"apiVersion": "v1",
"data": {
"config.yaml": "# Logging example curator config file\n\n# uncomment and use this to override the defaults from env vars\n#.defaults:\n# delete:\n# days: 30\n# runhour: 0\n# runminute: 0\n\n# to keep ops logs for a different duration:\n#.operations:\n# delete:\n# weeks: 8\n\n# example for a normal project\n#myapp:\n# delete:\n# weeks: 1\n"
},
"kind": "ConfigMap",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:23Z",
"name": "logging-curator",
"namespace": "logging",
"resourceVersion": "1514",
"selfLink": "/api/v1/namespaces/logging/configmaps/logging-curator",
"uid": "20126432-4c58-11e7-8cf8-0ef95424e992"
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_curator : Set Curator secret] **************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:62
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc secrets new logging-curator ca=/etc/origin/logging/ca.crt key=/etc/origin/logging/system.logging.curator.key cert=/etc/origin/logging/system.logging.curator.crt -n logging",
"results": "",
"returncode": 0
},
"state": "present"
}
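[NOTE]: The role shells out to the old "oc secrets new NAME key=path ..." form. On clients where that subcommand has since been deprecated or removed, the equivalent invocation (an assumption about your client version, not what this run executed) would be:

    $ oc create secret generic logging-curator -n logging \
        --from-file=ca=/etc/origin/logging/ca.crt \
        --from-file=key=/etc/origin/logging/system.logging.curator.key \
        --from-file=cert=/etc/origin/logging/system.logging.curator.crt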
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:75
ok: [openshift] => {
"ansible_facts": {
"curator_component": "curator",
"curator_name": "logging-curator"
},
"changed": false
}
TASK [openshift_logging_curator : Generate Curator deploymentconfig] ***********
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:81
ok: [openshift] => {
"changed": false,
"checksum": "74d5c19f34dbbaa3081e41f2e3332fdd8904a3ca",
"dest": "/tmp/openshift-logging-ansible-9sl3xI/templates/curator-dc.yaml",
"gid": 0,
"group": "root",
"md5sum": "79ee55e2d2db0b32e3cb743e2b89ea63",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 2341,
"src": "/root/.ansible/tmp/ansible-tmp-1496932704.63-255499305935265/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_curator : Set Curator DC] ******************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:99
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get dc logging-curator -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "DeploymentConfig",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:25Z",
"generation": 2,
"labels": {
"component": "curator",
"logging-infra": "curator",
"provider": "openshift"
},
"name": "logging-curator",
"namespace": "logging",
"resourceVersion": "1527",
"selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-curator",
"uid": "21373121-4c58-11e7-8cf8-0ef95424e992"
},
"spec": {
"replicas": 1,
"selector": {
"component": "curator",
"logging-infra": "curator",
"provider": "openshift"
},
"strategy": {
"activeDeadlineSeconds": 21600,
"recreateParams": {
"timeoutSeconds": 600
},
"resources": {},
"rollingParams": {
"intervalSeconds": 1,
"maxSurge": "25%",
"maxUnavailable": "25%",
"timeoutSeconds": 600,
"updatePeriodSeconds": 1
},
"type": "Recreate"
},
"template": {
"metadata": {
"creationTimestamp": null,
"labels": {
"component": "curator",
"logging-infra": "curator",
"provider": "openshift"
},
"name": "logging-curator"
},
"spec": {
"containers": [
{
"env": [
{
"name": "K8S_HOST_URL",
"value": "https://kubernetes.default.svc.cluster.local"
},
{
"name": "ES_HOST",
"value": "logging-es"
},
{
"name": "ES_PORT",
"value": "9200"
},
{
"name": "ES_CLIENT_CERT",
"value": "/etc/curator/keys/cert"
},
{
"name": "ES_CLIENT_KEY",
"value": "/etc/curator/keys/key"
},
{
"name": "ES_CA",
"value": "/etc/curator/keys/ca"
},
{
"name": "CURATOR_DEFAULT_DAYS",
"value": "30"
},
{
"name": "CURATOR_RUN_HOUR",
"value": "0"
},
{
"name": "CURATOR_RUN_MINUTE",
"value": "0"
},
{
"name": "CURATOR_RUN_TIMEZONE",
"value": "UTC"
},
{
"name": "CURATOR_SCRIPT_LOG_LEVEL",
"value": "INFO"
},
{
"name": "CURATOR_LOG_LEVEL",
"value": "ERROR"
}
],
"image": "172.30.168.204:5000/logging/logging-curator:latest",
"imagePullPolicy": "Always",
"name": "curator",
"resources": {
"limits": {
"cpu": "100m"
}
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/etc/curator/keys",
"name": "certs",
"readOnly": true
},
{
"mountPath": "/etc/curator/settings",
"name": "config",
"readOnly": true
}
]
}
],
"dnsPolicy": "ClusterFirst",
"restartPolicy": "Always",
"schedulerName": "default-scheduler",
"securityContext": {},
"serviceAccount": "aggregated-logging-curator",
"serviceAccountName": "aggregated-logging-curator",
"terminationGracePeriodSeconds": 30,
"volumes": [
{
"name": "certs",
"secret": {
"defaultMode": 420,
"secretName": "logging-curator"
}
},
{
"configMap": {
"defaultMode": 420,
"name": "logging-curator"
},
"name": "config"
}
]
}
},
"test": false,
"triggers": [
{
"type": "ConfigChange"
}
]
},
"status": {
"availableReplicas": 0,
"conditions": [
{
"lastTransitionTime": "2017-06-08T14:38:25Z",
"lastUpdateTime": "2017-06-08T14:38:25Z",
"message": "Deployment config does not have minimum availability.",
"status": "False",
"type": "Available"
},
{
"lastTransitionTime": "2017-06-08T14:38:25Z",
"lastUpdateTime": "2017-06-08T14:38:25Z",
"message": "replication controller \"logging-curator-1\" is waiting for pod \"logging-curator-1-deploy\" to run",
"status": "Unknown",
"type": "Progressing"
}
],
"details": {
"causes": [
{
"type": "ConfigChange"
}
],
"message": "config change"
},
"latestVersion": 1,
"observedGeneration": 2,
"replicas": 0,
"unavailableReplicas": 0,
"updatedReplicas": 0
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_curator : Delete temp directory] ***********************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:109
ok: [openshift] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-9sl3xI",
"state": "absent"
}
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:207
statically included: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml
TASK [openshift_logging_curator : fail] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:3
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:7
ok: [openshift] => {
"ansible_facts": {
"curator_version": "3_5"
},
"changed": false
}
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : fail] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/determine_version.yaml:15
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : Create temp directory for doing work in] *****
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:5
ok: [openshift] => {
"changed": false,
"cmd": [
"mktemp",
"-d",
"/tmp/openshift-logging-ansible-XXXXXX"
],
"delta": "0:00:00.009927",
"end": "2017-06-08 10:38:29.275123",
"rc": 0,
"start": "2017-06-08 10:38:29.265196"
}
STDOUT:
/tmp/openshift-logging-ansible-EQRnBW
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:10
ok: [openshift] => {
"ansible_facts": {
"tempdir": "/tmp/openshift-logging-ansible-EQRnBW"
},
"changed": false
}
TASK [openshift_logging_curator : Create templates subdirectory] ***************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:14
ok: [openshift] => {
"changed": false,
"gid": 0,
"group": "root",
"mode": "0755",
"owner": "root",
"path": "/tmp/openshift-logging-ansible-EQRnBW/templates",
"secontext": "unconfined_u:object_r:user_tmp_t:s0",
"size": 6,
"state": "directory",
"uid": 0
}
TASK [openshift_logging_curator : Create Curator service account] **************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:24
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : Create Curator service account] **************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:32
ok: [openshift] => {
"changed": false,
"results": {
"cmd": "/bin/oc get sa aggregated-logging-curator -o json -n logging",
"results": [
{
"apiVersion": "v1",
"imagePullSecrets": [
{
"name": "aggregated-logging-curator-dockercfg-bcmwl"
}
],
"kind": "ServiceAccount",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:22Z",
"name": "aggregated-logging-curator",
"namespace": "logging",
"resourceVersion": "1509",
"selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-curator",
"uid": "1f590b8f-4c58-11e7-8cf8-0ef95424e992"
},
"secrets": [
{
"name": "aggregated-logging-curator-dockercfg-bcmwl"
},
{
"name": "aggregated-logging-curator-token-qshch"
}
]
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_curator : copy] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:41
ok: [openshift] => {
"changed": false,
"checksum": "9008efd9a8892dcc42c28c6dfb6708527880a6d8",
"dest": "/tmp/openshift-logging-ansible-EQRnBW/curator.yml",
"gid": 0,
"group": "root",
"md5sum": "5498c5fd98f3dd06e34b20eb1f55dc12",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 320,
"src": "/root/.ansible/tmp/ansible-tmp-1496932710.63-26512504150061/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_curator : copy] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:47
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_curator : Set Curator configmap] ***********************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:53
ok: [openshift] => {
"changed": false,
"results": {
"cmd": "/bin/oc get configmap logging-curator -o json -n logging",
"results": [
{
"apiVersion": "v1",
"data": {
"config.yaml": "# Logging example curator config file\n\n# uncomment and use this to override the defaults from env vars\n#.defaults:\n# delete:\n# days: 30\n# runhour: 0\n# runminute: 0\n\n# to keep ops logs for a different duration:\n#.operations:\n# delete:\n# weeks: 8\n\n# example for a normal project\n#myapp:\n# delete:\n# weeks: 1\n"
},
"kind": "ConfigMap",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:23Z",
"name": "logging-curator",
"namespace": "logging",
"resourceVersion": "1514",
"selfLink": "/api/v1/namespaces/logging/configmaps/logging-curator",
"uid": "20126432-4c58-11e7-8cf8-0ef95424e992"
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_curator : Set Curator secret] **************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:62
ok: [openshift] => {
"changed": false,
"results": {
"apiVersion": "v1",
"data": {
"ca": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUMyakNDQWNLZ0F3SUJBZ0lCQVRBTkJna3Foa2lHOXcwQkFRc0ZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUwTXpjeU1Gb1hEVEl5TURZd056RTBNemN5TVZvdwpIakVjTUJvR0ExVUVBeE1UYkc5bloybHVaeTF6YVdkdVpYSXRkR1Z6ZERDQ0FTSXdEUVlKS29aSWh2Y05BUUVCCkJRQURnZ0VQQURDQ0FRb0NnZ0VCQU1mUlpjQnhZMVlUTEpibHBneHV2TE5yRnBTcGxFbDM1U04rK3FNMU5mTEgKYXV2Yzc3WXVVVENzOXVXN2FBYm9tQThLT0dZaEVTdWJTWVpHaXBZZXlQVGRWWVZMVlplWDN4emJxZC9RbkRnUgpvZjN1MndIcE9hMjVMNVBHVlJCK0tySDUrRG9lcjdDOThNK2lnVVJ4WXZ4QWh0cHRWaTU2ckQ1YnZpV0lTWllLCmdwTW9sZ2JJdDlORVNnTlVjUXBhWUYwOFFuTjdCVDdqRXVvNVlGREZpYVFoVDJ3TnJnbklnY1J2dWhKZFI2dXQKMnVadzRKTEVCZ2dteGdlcnhjY1hKUVJGQlpMcGlTQjBGZWNLNzZySTJtbnJKcVNJdVQyZkd2c0toNkNTNzNXawpPelcwY0pHNzd6WHhvUFF0VlkrZEV3eDlubE4wcXMydFM1bi83eFdFNDNVQ0F3RUFBYU1qTUNFd0RnWURWUjBQCkFRSC9CQVFEQWdLa01BOEdBMVVkRXdFQi93UUZNQU1CQWY4d0RRWUpLb1pJaHZjTkFRRUxCUUFEZ2dFQkFMb2MKSmJ5WXBmUTBmYnRRdTdXa3k0ai9pUGVTNHB5a2pSbDZkQlZ5ZERqYnNLY1luaHVyWmYvUGlkSkpCREw0VmdvWQpld0g2UjV1Wk96Y1pRWkl3b25NeHIwazJyMGRPZ3crSVpHMWtBV2M2MGNCSFB3NHpibmFHWS9raURHekxTVHZvCmlJRWY3cVFvVFk1QjEzUHAxRzBNWEdiWFlHeWtNL044NHlxbVhHTVA0K3RpV1BlemlKZHhWNWVvdnRyVkRVa1oKb25zeHU5TE1USFYzVlN3ZmlYOUpKejQveVZNNVZSSXVDa2Y2TGpVbXFNUVpOSGhnVUZrNU9LamJTck50ZmQvSApGdjRRZDdOMk9xS2tOTmxTM3g4UkFZZ1F5eENlSDBFc2RKU1ZHMTQvUE1CNVB2bHNZSmZEbzd3KzMwd3RHRWYvCk9tdFh5ZWV4VGw3SC82UTAvV3M9Ci0tLS0tRU5EIENFUlRJRklDQVRFLS0tLS0K",
"cert": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSURSakNDQWk2Z0F3SUJBZ0lCQkRBTkJna3Foa2lHOXcwQkFRVUZBREFlTVJ3d0dnWURWUVFERXhOc2IyZG4KYVc1bkxYTnBaMjVsY2kxMFpYTjBNQjRYRFRFM01EWXdPREUwTXpjeU4xb1hEVEU1TURZd09ERTBNemN5TjFvdwpSekVRTUE0R0ExVUVDZ3dIVEc5bloybHVaekVTTUJBR0ExVUVDd3dKVDNCbGJsTm9hV1owTVI4d0hRWURWUVFECkRCWnplWE4wWlcwdWJHOW5aMmx1Wnk1amRYSmhkRzl5TUlJQklqQU5CZ2txaGtpRzl3MEJBUUVGQUFPQ0FROEEKTUlJQkNnS0NBUUVBdE9CY0RDMjJhdWpac1ovV04yc3AxODNFM3ZkTXFpNzVUUkkrbExaVVFiNTBYOWZCMzh2QQpOd0FkVGw4em1jNFplOXhIUlpoM0YwdUtCSkE4c01MUDZua25LbUdSYTB4QUN5SnRlamtSMi9uWUFNaDFGZkx5CmVpVWxPN3I1WEVRQkQvc2hnbks5OTNTNHNyRGxwdkliRS9URCtsTldLK0tNNjJ1S2gvcVdTTTJnZjd5amdwejIKZXJtSXdwNmY3cnRqWnhYM2w4T0ZZam1BRzdud1JOVVRnRGtEK0llcDJHeGpVQWdjOHIzaVN4cU5rTC8wZVkwNQozdnNOWno2Y1FiUHFhT2RtWkYvMmlGSnRUT0hmRDNnZmZTMkY2aG1CbXRkT2lHVGR3V09xakxibnZRWWc3NVlpCk1wczNJN0FoSDlNK2RNQndQZnVpOUhvVjFINVR6QU5aWVFJREFRQUJvMll3WkRBT0JnTlZIUThCQWY4RUJBTUMKQmFBd0NRWURWUjBUQkFJd0FEQWRCZ05WSFNVRUZqQVVCZ2dyQmdFRkJRY0RBUVlJS3dZQkJRVUhBd0l3SFFZRApWUjBPQkJZRUZLU0t2VkpyMlJ5ZXFFM2dUdlpwdmhLTWZYUnBNQWtHQTFVZEl3UUNNQUF3RFFZSktvWklodmNOCkFRRUZCUUFEZ2dFQkFLZVQ0OU5DQk5IUzEyREVtcXNNZDRYNDUySElOZW01OUdZSTNLNnkyN3dhOEhLMjJmbVAKZng0WWdITkJheVhnaXhYcTVSTXFVZVphajlHaTR2Sng3TjQ0TGd2VXVTTHZoTmx6cnRSa3Mzb2REZUdUQ0lNVwpnd081dU5pUWVxMVdpTWx3a3RDNHBtdCtRTWNjMC9XRzFTRDc1dWJ0eEhybUgzc1hFVUFIeTY0RVJLcWtlRmgzCjdvK2QvNVRqUG9xUjdUL05oajR2TFBXdDdaVW93akVLRUNNRjNLMk5QMG4zb3VtdllDNFhEb2V6dXlncWs3d3gKNC9CNjcyRWhXNENDSldNUVNnaGI2UEtlNG1yY3NvQmtGRXNUWUNaMWJZM1Q3M2ZtNlhGM1hGUEprV3BURG1hcQpGVkJtdG5rOVJUSEI5aHBXa3VtU2VoRWQ3WlQyaVIrVklpWT0KLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQo=",
"key": "LS0tLS1CRUdJTiBQUklWQVRFIEtFWS0tLS0tCk1JSUV2UUlCQURBTkJna3Foa2lHOXcwQkFRRUZBQVNDQktjd2dnU2pBZ0VBQW9JQkFRQzA0RndNTGJacTZObXgKbjlZM2F5blh6Y1RlOTB5cUx2bE5FajZVdGxSQnZuUmYxOEhmeThBM0FCMU9Yek9aemhsNzNFZEZtSGNYUzRvRQprRHl3d3MvcWVTY3FZWkZyVEVBTEltMTZPUkhiK2RnQXlIVVY4dko2SlNVN3V2bGNSQUVQK3lHQ2NyMzNkTGl5CnNPV204aHNUOU1QNlUxWXI0b3pyYTRxSCtwWkl6YUIvdktPQ25QWjZ1WWpDbnAvdXUyTm5GZmVYdzRWaU9ZQWIKdWZCRTFST0FPUVA0aDZuWWJHTlFDQnp5dmVKTEdvMlF2L1I1alRuZSt3MW5QcHhCcytwbzUyWmtYL2FJVW0xTQo0ZDhQZUI5OUxZWHFHWUdhMTA2SVpOM0JZNnFNdHVlOUJpRHZsaUl5bXpjanNDRWYwejUwd0hBOSs2TDBlaFhVCmZsUE1BMWxoQWdNQkFBRUNnZ0VCQUlubUo2ZE5UcTh4ZVk4WGFpMTBNVUdMYXBZRyt0Y2cyd2FzYXpQYjBsOFIKUWc0MEkyWnBJOUdLNzh6cGZpQzc1ZWZ0cUJUaVNhRmtBUVM3cHRtb25QV0ppcGVnTFVzZHBoSVhIRm83bW1jWQprMXI5ZVc3VGU1UXVMN1hiTmZQbkVOeExQV0FEOU5ydGpVY0c0UCtudzRjeWJCdTBYNVV6c3ZabFZnZVh2KzRlCk5kRGxadzkrMTErNmZZYlhiT3RPQVU4NHZ0djlWalpWbUw5RW5ZUGdpS0d6TTJQVEdqQXE0WEU5cU4xYW9nMGQKWnJCL3RIYjd4SHRpWjFpNXE3N3NSZkl6VVl1ZlQyaFhHL2FUNUJjT2tPRW9ET0tSTitzcC9uT1pkYld4Zmk5VgoxYmxWcENzZzR5eUhpWGVWc3F4MnBxOFk0MkZqNVBMa3h2eEtIamVQTkhFQ2dZRUE2NTNId3JlT0wvSEdPNjdWCnphcGpOVnhZWllVdDhwSzJHMEFMMVZudWVyRmcvLzkrQUZNUlZaSVZIekR4ellLMmpLMnZ5YVF0WGdMNGFiRHIKdFQ0TWdUYlZzUzRFMnZtWDQwTFhHdTAyYXgzT1lEMUswSlduTXZoTWR2cHZsMGJuWTh2QzZGUHpKbkhYWDNwRgpaWjBYY29rdWJKZFhya2dhN3psVWVQUE02dFVDZ1lFQXhJWS9sSWpFN2hETmtJY1F2Z3Vwb0ZrWEVhbFlSTXVoCmJFTXN1R2RqWjZWZWhrcUZUMFIrNzZmWnd4ekU0QmFGNUZHQUFxRmJ6YjQvZThTZnFWdnU2VWNYQXhkVlUvMlIKMTB5azhSSzBCYm50SWJ5b3c2TXQ1WEI3dGRxVFlqS0NqRys4aDVyR2pnOWJjR3B2M0I4WlFWRjNINHdsSFZDUQpSQ0M4SUNNNjRsMENnWUJVT3lxLzFLa1RRTWJTYlZWbjJnTHZmNXptWmk4ZjZnMEtQdUk4R3BOajcyZXkvUjZKCmRTamNRNFlqaVhiWU5tT1dkVDFEdzlxb0lqMjJZeFpReStiaWhyenNRM3hlNEIzSmxBcWNTTE5NcGZJeWU1YjAKYkp2Q1gvdk9DUWU5dUE1ZW9laUM4QWdiOVZTK2dGS3cyZkVZOUN2UmpHVS9HKzN0R2J5MkpNcGNKUUtCZ0grSwo5UmFNRU9yRVl2VUtnMVlqc1luTWFBbGhVMVVLcHcvaEpOUGszWUcxdEh3SlB5MXJzY29Oc0dsTmNZUlJlY0h0CkZ0d2VKcnVIWGlJUVFPS2tOSkNYUDVzVStKN1M4V1MrYkVtOHJyTU1zSloxbnoyZzJMZVFZZWxyR3IzZk5CUzYKcTZ1Q1NweUY0UDA3UnErZ1N6NjJCVTZuSUtzK3p2STRJSC9tL1Y5TkFvR0FURDc4ZUtaeFN1UHRZWXZGbDBCRwoxaGxJT09tRFVDWDhKcEphQmNvQkFnTExZK0p1Vk1JVGNldER1ZS9rK29BbjFFODBPL3lWTFNJKzU2WXFkWkVJCkE4VHdIa283TGpSLzFEQTZ1ejJzd3dUS2dkbVNxR01sdzhVSjUrSHhSOFhscjl6clk1K0xSZU5OZEpsRThTUXcKdFhkWnZGQ2k1THJ4bGo5UHRyTXk3QnM9Ci0tLS0tRU5EIFBSSVZBVEUgS0VZLS0tLS0K"
},
"kind": "Secret",
"metadata": {
"creationTimestamp": null,
"name": "logging-curator"
},
"type": "Opaque"
},
"state": "present"
}
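[NOTE]: The data values above are base64-wrapped PEM. To sanity-check the curator client certificate without copying the blob by hand, something like the following works (a sketch; jsonpath support and openssl are assumed available on the host):

    $ oc get secret logging-curator -n logging -o jsonpath='{.data.cert}' \
        | base64 -d | openssl x509 -noout -subject -dates

The subject should show CN=system.logging.curator, matching the key/cert pair fed to "oc secrets new" earlier.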
TASK [openshift_logging_curator : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:75
ok: [openshift] => {
"ansible_facts": {
"curator_component": "curator-ops",
"curator_name": "logging-curator-ops"
},
"changed": false
}
TASK [openshift_logging_curator : Generate Curator deploymentconfig] ***********
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:81
ok: [openshift] => {
"changed": false,
"checksum": "608b22bd7e5eeed2ac389b790afed4283a6e1579",
"dest": "/tmp/openshift-logging-ansible-EQRnBW/templates/curator-dc.yaml",
"gid": 0,
"group": "root",
"md5sum": "8a8f0a8a81e7d9e8fa175dabf6f8c30d",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 2365,
"src": "/root/.ansible/tmp/ansible-tmp-1496932712.51-34262868609151/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_curator : Set Curator DC] ******************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:99
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get dc logging-curator-ops -o json -n logging",
"results": [
{
"apiVersion": "v1",
"kind": "DeploymentConfig",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:33Z",
"generation": 2,
"labels": {
"component": "curator-ops",
"logging-infra": "curator",
"provider": "openshift"
},
"name": "logging-curator-ops",
"namespace": "logging",
"resourceVersion": "1555",
"selfLink": "/oapi/v1/namespaces/logging/deploymentconfigs/logging-curator-ops",
"uid": "25eeef02-4c58-11e7-8cf8-0ef95424e992"
},
"spec": {
"replicas": 1,
"selector": {
"component": "curator-ops",
"logging-infra": "curator",
"provider": "openshift"
},
"strategy": {
"activeDeadlineSeconds": 21600,
"recreateParams": {
"timeoutSeconds": 600
},
"resources": {},
"rollingParams": {
"intervalSeconds": 1,
"maxSurge": "25%",
"maxUnavailable": "25%",
"timeoutSeconds": 600,
"updatePeriodSeconds": 1
},
"type": "Recreate"
},
"template": {
"metadata": {
"creationTimestamp": null,
"labels": {
"component": "curator-ops",
"logging-infra": "curator",
"provider": "openshift"
},
"name": "logging-curator-ops"
},
"spec": {
"containers": [
{
"env": [
{
"name": "K8S_HOST_URL",
"value": "https://kubernetes.default.svc.cluster.local"
},
{
"name": "ES_HOST",
"value": "logging-es-ops"
},
{
"name": "ES_PORT",
"value": "9200"
},
{
"name": "ES_CLIENT_CERT",
"value": "/etc/curator/keys/cert"
},
{
"name": "ES_CLIENT_KEY",
"value": "/etc/curator/keys/key"
},
{
"name": "ES_CA",
"value": "/etc/curator/keys/ca"
},
{
"name": "CURATOR_DEFAULT_DAYS",
"value": "30"
},
{
"name": "CURATOR_RUN_HOUR",
"value": "0"
},
{
"name": "CURATOR_RUN_MINUTE",
"value": "0"
},
{
"name": "CURATOR_RUN_TIMEZONE",
"value": "UTC"
},
{
"name": "CURATOR_SCRIPT_LOG_LEVEL",
"value": "INFO"
},
{
"name": "CURATOR_LOG_LEVEL",
"value": "ERROR"
}
],
"image": "172.30.168.204:5000/logging/logging-curator:latest",
"imagePullPolicy": "Always",
"name": "curator",
"resources": {
"limits": {
"cpu": "100m"
}
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/etc/curator/keys",
"name": "certs",
"readOnly": true
},
{
"mountPath": "/etc/curator/settings",
"name": "config",
"readOnly": true
}
]
}
],
"dnsPolicy": "ClusterFirst",
"restartPolicy": "Always",
"schedulerName": "default-scheduler",
"securityContext": {},
"serviceAccount": "aggregated-logging-curator",
"serviceAccountName": "aggregated-logging-curator",
"terminationGracePeriodSeconds": 30,
"volumes": [
{
"name": "certs",
"secret": {
"defaultMode": 420,
"secretName": "logging-curator"
}
},
{
"configMap": {
"defaultMode": 420,
"name": "logging-curator"
},
"name": "config"
}
]
}
},
"test": false,
"triggers": [
{
"type": "ConfigChange"
}
]
},
"status": {
"availableReplicas": 0,
"conditions": [
{
"lastTransitionTime": "2017-06-08T14:38:33Z",
"lastUpdateTime": "2017-06-08T14:38:33Z",
"message": "Deployment config does not have minimum availability.",
"status": "False",
"type": "Available"
},
{
"lastTransitionTime": "2017-06-08T14:38:33Z",
"lastUpdateTime": "2017-06-08T14:38:33Z",
"message": "replication controller \"logging-curator-ops-1\" is waiting for pod \"logging-curator-ops-1-deploy\" to run",
"status": "Unknown",
"type": "Progressing"
}
],
"details": {
"causes": [
{
"type": "ConfigChange"
}
],
"message": "config change"
},
"latestVersion": 1,
"observedGeneration": 2,
"replicas": 0,
"unavailableReplicas": 0,
"updatedReplicas": 0
}
}
],
"returncode": 0
},
"state": "present"
}
TASK [openshift_logging_curator : Delete temp directory] ***********************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_curator/tasks/main.yaml:109
ok: [openshift] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-EQRnBW",
"state": "absent"
}
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:226
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : include_role] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:241
statically included: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml
TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:2
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: {{ openshift_logging_fluentd_nodeselector.keys()
| count }} > 1
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:6
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:10
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:14
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml:3
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml:7
ok: [openshift] => {
"ansible_facts": {
"fluentd_version": "3_5"
},
"changed": false
}
TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml:12
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : fail] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/determine_version.yaml:15
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:20
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:26
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : Create temp directory for doing work in] *****
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:33
ok: [openshift] => {
"changed": false,
"cmd": [
"mktemp",
"-d",
"/tmp/openshift-logging-ansible-XXXXXX"
],
"delta": "0:00:00.002399",
"end": "2017-06-08 10:38:37.655418",
"rc": 0,
"start": "2017-06-08 10:38:37.653019"
}
STDOUT:
/tmp/openshift-logging-ansible-mzVmJn
TASK [openshift_logging_fluentd : set_fact] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:38
ok: [openshift] => {
"ansible_facts": {
"tempdir": "/tmp/openshift-logging-ansible-mzVmJn"
},
"changed": false
}
TASK [openshift_logging_fluentd : Create templates subdirectory] ***************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:41
ok: [openshift] => {
"changed": false,
"gid": 0,
"group": "root",
"mode": "0755",
"owner": "root",
"path": "/tmp/openshift-logging-ansible-mzVmJn/templates",
"secontext": "unconfined_u:object_r:user_tmp_t:s0",
"size": 6,
"state": "directory",
"uid": 0
}
TASK [openshift_logging_fluentd : Create Fluentd service account] **************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:51
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : Create Fluentd service account] **************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:59
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get sa aggregated-logging-fluentd -o json -n logging",
"results": [
{
"apiVersion": "v1",
"imagePullSecrets": [
{
"name": "aggregated-logging-fluentd-dockercfg-bkv82"
}
],
"kind": "ServiceAccount",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:38Z",
"name": "aggregated-logging-fluentd",
"namespace": "logging",
"resourceVersion": "1576",
"selfLink": "/api/v1/namespaces/logging/serviceaccounts/aggregated-logging-fluentd",
"uid": "28feb0ca-4c58-11e7-8cf8-0ef95424e992"
},
"secrets": [
{
"name": "aggregated-logging-fluentd-dockercfg-bkv82"
},
{
"name": "aggregated-logging-fluentd-token-tjn2n"
}
]
}
],
"returncode": 0
},
"state": "present"
}
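The service account created above immediately receives a dockercfg pull secret and a token secret from the platform. To inspect what was generated (a sketch reusing the generated names from this output):

    oc get sa aggregated-logging-fluentd -n logging -o yaml
    oc describe secret aggregated-logging-fluentd-token-tjn2n -n logging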
TASK [openshift_logging_fluentd : Set privileged permissions for Fluentd] ******
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:68
changed: [openshift] => {
"changed": true,
"present": "present",
"results": {
"cmd": "/bin/oc adm policy add-scc-to-user privileged system:serviceaccount:logging:aggregated-logging-fluentd -n logging",
"results": "",
"returncode": 0
}
}
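The privileged SCC is required because the Fluentd daemonset created later in this run mounts host paths such as /var/log and /var/lib/docker/containers and sets privileged: true in its security context. A quick audit of the grant (hedged; the users list appears near the top of the SCC object):

    oc get scc privileged -o yaml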
TASK [openshift_logging_fluentd : Set cluster-reader permissions for Fluentd] ***
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:77
changed: [openshift] => {
"changed": true,
"present": "present",
"results": {
"cmd": "/bin/oc adm policy add-cluster-role-to-user cluster-reader system:serviceaccount:logging:aggregated-logging-fluentd -n logging",
"results": "",
"returncode": 0
}
}
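cluster-reader is what allows the Fluentd pods' Kubernetes metadata filter to resolve pod and namespace details for log records from every project, not just logging. A rough sanity check that the binding took effect (sketch):

    oc adm policy who-can list pods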
TASK [openshift_logging_fluentd : template] ************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:86
ok: [openshift] => {
"changed": false,
"checksum": "a8c8596f5fc2c5dd7c8d33d244af17a2555be086",
"dest": "/tmp/openshift-logging-ansible-mzVmJn/fluent.conf",
"gid": 0,
"group": "root",
"md5sum": "579698b48ffce6276ee0e8d5ac71a338",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 1301,
"src": "/root/.ansible/tmp/ansible-tmp-1496932720.21-61819985543892/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:94
ok: [openshift] => {
"changed": false,
"checksum": "b3e75eddc4a0765edc77da092384c0c6f95440e1",
"dest": "/tmp/openshift-logging-ansible-mzVmJn/fluentd-throttle-config.yaml",
"gid": 0,
"group": "root",
"md5sum": "25871b8e0a9bedc166a6029872a6c336",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 133,
"src": "/root/.ansible/tmp/ansible-tmp-1496932720.56-143190714524640/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:100
ok: [openshift] => {
"changed": false,
"checksum": "a3aa36da13f3108aa4ad5b98d4866007b44e9798",
"dest": "/tmp/openshift-logging-ansible-mzVmJn/secure-forward.conf",
"gid": 0,
"group": "root",
"md5sum": "1084b00c427f4fa48dfc66d6ad6555d4",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 563,
"src": "/root/.ansible/tmp/ansible-tmp-1496932720.84-253016203447705/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:107
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:113
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : copy] ****************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:119
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging_fluentd : Set Fluentd configmap] ***********************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:125
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get configmap logging-fluentd -o json -n logging",
"results": [
{
"apiVersion": "v1",
"data": {
"fluent.conf": "# This file is the fluentd configuration entrypoint. Edit with care.\n\n@include configs.d/openshift/system.conf\n\n# In each section below, pre- and post- includes don't include anything initially;\n# they exist to enable future additions to openshift conf as needed.\n\n## sources\n## ordered so that syslog always runs last...\n@include configs.d/openshift/input-pre-*.conf\n@include configs.d/dynamic/input-docker-*.conf\n@include configs.d/dynamic/input-syslog-*.conf\n@include configs.d/openshift/input-post-*.conf\n##\n\n<label @INGRESS>\n## filters\n @include configs.d/openshift/filter-pre-*.conf\n @include configs.d/openshift/filter-retag-journal.conf\n @include configs.d/openshift/filter-k8s-meta.conf\n @include configs.d/openshift/filter-kibana-transform.conf\n @include configs.d/openshift/filter-k8s-flatten-hash.conf\n @include configs.d/openshift/filter-k8s-record-transform.conf\n @include configs.d/openshift/filter-syslog-record-transform.conf\n @include configs.d/openshift/filter-viaq-data-model.conf\n @include configs.d/openshift/filter-post-*.conf\n##\n\n## matches\n @include configs.d/openshift/output-pre-*.conf\n @include configs.d/openshift/output-operations.conf\n @include configs.d/openshift/output-applications.conf\n # no post - applications.conf matches everything left\n##\n</label>\n",
"secure-forward.conf": "# @type secure_forward\n\n# self_hostname ${HOSTNAME}\n# shared_key <SECRET_STRING>\n\n# secure yes\n# enable_strict_verification yes\n\n# ca_cert_path /etc/fluent/keys/your_ca_cert\n# ca_private_key_path /etc/fluent/keys/your_private_key\n # for private CA secret key\n# ca_private_key_passphrase passphrase\n\n# <server>\n # or IP\n# host server.fqdn.example.com\n# port 24284\n# </server>\n# <server>\n # ip address to connect\n# host 203.0.113.8\n # specify hostlabel for FQDN verification if ipaddress is used for host\n# hostlabel server.fqdn.example.com\n# </server>\n",
"throttle-config.yaml": "# Logging example fluentd throttling config file\n\n#example-project:\n# read_lines_limit: 10\n#\n#.operations:\n# read_lines_limit: 100\n"
},
"kind": "ConfigMap",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:41Z",
"name": "logging-fluentd",
"namespace": "logging",
"resourceVersion": "1595",
"selfLink": "/api/v1/namespaces/logging/configmaps/logging-fluentd",
"uid": "2ae16a92-4c58-11e7-8cf8-0ef95424e992"
}
}
],
"returncode": 0
},
"state": "present"
}
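The fluent.conf stored in this configmap is an include skeleton: the journald/docker/syslog inputs all feed the @INGRESS label, which runs the filter chain (journal retagging, Kubernetes metadata, ViaQ data model) before handing records to the operations and applications outputs. Because the daemonset mounts this configmap at /etc/fluent/configs.d/user, post-install tuning can be done in place (hedged sketch):

    oc edit configmap logging-fluentd -n logging
    oc delete pod -l component=fluentd -n logging    # recreate the pods so they pick up the edit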
TASK [openshift_logging_fluentd : Set logging-fluentd secret] ******************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:137
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc secrets new logging-fluentd ca=/etc/origin/logging/ca.crt key=/etc/origin/logging/system.logging.fluentd.key cert=/etc/origin/logging/system.logging.fluentd.crt -n logging",
"results": "",
"returncode": 0
},
"state": "present"
}
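oc secrets new is the legacy client syntax current at the time of this run; on newer clients the equivalent secret would be created with (hedged equivalent, same key names and files):

    oc create secret generic logging-fluentd -n logging \
        --from-file=ca=/etc/origin/logging/ca.crt \
        --from-file=key=/etc/origin/logging/system.logging.fluentd.key \
        --from-file=cert=/etc/origin/logging/system.logging.fluentd.crt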
TASK [openshift_logging_fluentd : Generate logging-fluentd daemonset definition] ***
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:154
ok: [openshift] => {
"changed": false,
"checksum": "234c59f3be8055caff01ef6a4d4f3f211cb35750",
"dest": "/tmp/openshift-logging-ansible-mzVmJn/templates/logging-fluentd.yaml",
"gid": 0,
"group": "root",
"md5sum": "015a9e62c1de9161c21a4b9795f3be05",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:admin_home_t:s0",
"size": 3415,
"src": "/root/.ansible/tmp/ansible-tmp-1496932722.93-46627867147064/source",
"state": "file",
"uid": 0
}
TASK [openshift_logging_fluentd : Set logging-fluentd daemonset] ***************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:172
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc get daemonset logging-fluentd -o json -n logging",
"results": [
{
"apiVersion": "extensions/v1beta1",
"kind": "DaemonSet",
"metadata": {
"creationTimestamp": "2017-06-08T14:38:44Z",
"generation": 1,
"labels": {
"component": "fluentd",
"logging-infra": "fluentd",
"provider": "openshift"
},
"name": "logging-fluentd",
"namespace": "logging",
"resourceVersion": "1603",
"selfLink": "/apis/extensions/v1beta1/namespaces/logging/daemonsets/logging-fluentd",
"uid": "2c3c963d-4c58-11e7-8cf8-0ef95424e992"
},
"spec": {
"selector": {
"matchLabels": {
"component": "fluentd",
"provider": "openshift"
}
},
"template": {
"metadata": {
"creationTimestamp": null,
"labels": {
"component": "fluentd",
"logging-infra": "fluentd",
"provider": "openshift"
},
"name": "fluentd-elasticsearch"
},
"spec": {
"containers": [
{
"env": [
{
"name": "K8S_HOST_URL",
"value": "https://kubernetes.default.svc.cluster.local"
},
{
"name": "ES_HOST",
"value": "logging-es"
},
{
"name": "ES_PORT",
"value": "9200"
},
{
"name": "ES_CLIENT_CERT",
"value": "/etc/fluent/keys/cert"
},
{
"name": "ES_CLIENT_KEY",
"value": "/etc/fluent/keys/key"
},
{
"name": "ES_CA",
"value": "/etc/fluent/keys/ca"
},
{
"name": "OPS_HOST",
"value": "logging-es-ops"
},
{
"name": "OPS_PORT",
"value": "9200"
},
{
"name": "OPS_CLIENT_CERT",
"value": "/etc/fluent/keys/cert"
},
{
"name": "OPS_CLIENT_KEY",
"value": "/etc/fluent/keys/key"
},
{
"name": "OPS_CA",
"value": "/etc/fluent/keys/ca"
},
{
"name": "ES_COPY",
"value": "false"
},
{
"name": "USE_JOURNAL",
"value": "true"
},
{
"name": "JOURNAL_SOURCE"
},
{
"name": "JOURNAL_READ_FROM_HEAD",
"value": "false"
}
],
"image": "172.30.168.204:5000/logging/logging-fluentd:latest",
"imagePullPolicy": "Always",
"name": "fluentd-elasticsearch",
"resources": {
"limits": {
"cpu": "100m",
"memory": "512Mi"
}
},
"securityContext": {
"privileged": true
},
"terminationMessagePath": "/dev/termination-log",
"terminationMessagePolicy": "File",
"volumeMounts": [
{
"mountPath": "/run/log/journal",
"name": "runlogjournal"
},
{
"mountPath": "/var/log",
"name": "varlog"
},
{
"mountPath": "/var/lib/docker/containers",
"name": "varlibdockercontainers",
"readOnly": true
},
{
"mountPath": "/etc/fluent/configs.d/user",
"name": "config",
"readOnly": true
},
{
"mountPath": "/etc/fluent/keys",
"name": "certs",
"readOnly": true
},
{
"mountPath": "/etc/docker-hostname",
"name": "dockerhostname",
"readOnly": true
},
{
"mountPath": "/etc/localtime",
"name": "localtime",
"readOnly": true
},
{
"mountPath": "/etc/sysconfig/docker",
"name": "dockercfg",
"readOnly": true
},
{
"mountPath": "/etc/docker",
"name": "dockerdaemoncfg",
"readOnly": true
}
]
}
],
"dnsPolicy": "ClusterFirst",
"nodeSelector": {
"logging-infra-fluentd": "true"
},
"restartPolicy": "Always",
"schedulerName": "default-scheduler",
"securityContext": {},
"serviceAccount": "aggregated-logging-fluentd",
"serviceAccountName": "aggregated-logging-fluentd",
"terminationGracePeriodSeconds": 30,
"volumes": [
{
"hostPath": {
"path": "/run/log/journal"
},
"name": "runlogjournal"
},
{
"hostPath": {
"path": "/var/log"
},
"name": "varlog"
},
{
"hostPath": {
"path": "/var/lib/docker/containers"
},
"name": "varlibdockercontainers"
},
{
"configMap": {
"defaultMode": 420,
"name": "logging-fluentd"
},
"name": "config"
},
{
"name": "certs",
"secret": {
"defaultMode": 420,
"secretName": "logging-fluentd"
}
},
{
"hostPath": {
"path": "/etc/hostname"
},
"name": "dockerhostname"
},
{
"hostPath": {
"path": "/etc/localtime"
},
"name": "localtime"
},
{
"hostPath": {
"path": "/etc/sysconfig/docker"
},
"name": "dockercfg"
},
{
"hostPath": {
"path": "/etc/docker"
},
"name": "dockerdaemoncfg"
}
]
}
},
"templateGeneration": 1,
"updateStrategy": {
"rollingUpdate": {
"maxUnavailable": 1
},
"type": "RollingUpdate"
}
},
"status": {
"currentNumberScheduled": 0,
"desiredNumberScheduled": 0,
"numberMisscheduled": 0,
"numberReady": 0,
"observedGeneration": 1
}
}
],
"returncode": 0
},
"state": "present"
}
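Note that desiredNumberScheduled is still 0: the pod template carries the node selector logging-infra-fluentd=true and no node has that label yet. The "Label ... for Fluentd deployment" task further down applies it; until then the daemonset schedules nothing. To watch scheduling converge once the label lands (sketch):

    oc get daemonset logging-fluentd -n logging
    oc get nodes -l logging-infra-fluentd=true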
TASK [openshift_logging_fluentd : Retrieve list of Fluentd hosts] **************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:183
ok: [openshift] => {
"changed": false,
"results": {
"cmd": "/bin/oc get node -o json -n default",
"results": [
{
"apiVersion": "v1",
"items": [
{
"apiVersion": "v1",
"kind": "Node",
"metadata": {
"annotations": {
"volumes.kubernetes.io/controller-managed-attach-detach": "true"
},
"creationTimestamp": "2017-06-08T14:21:19Z",
"labels": {
"beta.kubernetes.io/arch": "amd64",
"beta.kubernetes.io/os": "linux",
"kubernetes.io/hostname": "172.18.2.72"
},
"name": "172.18.2.72",
"namespace": "",
"resourceVersion": "1593",
"selfLink": "/api/v1/nodes/172.18.2.72",
"uid": "bdb0b59b-4c55-11e7-8cf8-0ef95424e992"
},
"spec": {
"externalID": "172.18.2.72",
"providerID": "aws:////i-0ef23af19a82b749f"
},
"status": {
"addresses": [
{
"address": "172.18.2.72",
"type": "LegacyHostIP"
},
{
"address": "172.18.2.72",
"type": "InternalIP"
},
{
"address": "172.18.2.72",
"type": "Hostname"
}
],
"allocatable": {
"cpu": "4",
"memory": "7129288Ki",
"pods": "40"
},
"capacity": {
"cpu": "4",
"memory": "7231688Ki",
"pods": "40"
},
"conditions": [
{
"lastHeartbeatTime": "2017-06-08T14:38:41Z",
"lastTransitionTime": "2017-06-08T14:21:19Z",
"message": "kubelet has sufficient disk space available",
"reason": "KubeletHasSufficientDisk",
"status": "False",
"type": "OutOfDisk"
},
{
"lastHeartbeatTime": "2017-06-08T14:38:41Z",
"lastTransitionTime": "2017-06-08T14:21:19Z",
"message": "kubelet has sufficient memory available",
"reason": "KubeletHasSufficientMemory",
"status": "False",
"type": "MemoryPressure"
},
{
"lastHeartbeatTime": "2017-06-08T14:38:41Z",
"lastTransitionTime": "2017-06-08T14:21:19Z",
"message": "kubelet has no disk pressure",
"reason": "KubeletHasNoDiskPressure",
"status": "False",
"type": "DiskPressure"
},
{
"lastHeartbeatTime": "2017-06-08T14:38:41Z",
"lastTransitionTime": "2017-06-08T14:21:19Z",
"message": "kubelet is posting ready status",
"reason": "KubeletReady",
"status": "True",
"type": "Ready"
}
],
"daemonEndpoints": {
"kubeletEndpoint": {
"Port": 10250
}
},
"images": [
{
"names": [
"docker.io/openshift/origin-docker-registry@sha256:c94ae3ae1bbe03b958e267aa6df129ac98fa375ad4b37f548b2945327be97d5f",
"docker.io/openshift/origin-docker-registry:latest"
],
"sizeBytes": 1100551985
},
{
"names": [
"openshift/origin-docker-registry:6acabdc",
"openshift/origin-docker-registry:latest"
],
"sizeBytes": 1100164272
},
{
"names": [
"openshift/origin-gitserver:6acabdc",
"openshift/origin-gitserver:latest"
],
"sizeBytes": 1086520226
},
{
"names": [
"openshift/node:6acabdc",
"openshift/node:latest"
],
"sizeBytes": 1051721928
},
{
"names": [
"openshift/origin-haproxy-router:6acabdc",
"openshift/origin-haproxy-router:latest"
],
"sizeBytes": 1022758742
},
{
"names": [
"openshift/origin-deployer:6acabdc",
"openshift/origin-deployer:latest"
],
"sizeBytes": 1001728427
},
{
"names": [
"openshift/origin-f5-router:6acabdc",
"openshift/origin-f5-router:latest"
],
"sizeBytes": 1001728427
},
{
"names": [
"openshift/origin:6acabdc",
"openshift/origin:latest"
],
"sizeBytes": 1001728427
},
{
"names": [
"openshift/origin-sti-builder:6acabdc",
"openshift/origin-sti-builder:latest"
],
"sizeBytes": 1001728427
},
{
"names": [
"openshift/origin-recycler:6acabdc",
"openshift/origin-recycler:latest"
],
"sizeBytes": 1001728427
},
{
"names": [
"openshift/origin-docker-builder:6acabdc",
"openshift/origin-docker-builder:latest"
],
"sizeBytes": 1001728427
},
{
"names": [
"rhel7.1:latest"
],
"sizeBytes": 765301508
},
{
"names": [
"openshift/dind-master:latest"
],
"sizeBytes": 731456758
},
{
"names": [
"openshift/dind-node:latest"
],
"sizeBytes": 731453034
},
{
"names": [
"172.30.168.204:5000/logging/logging-auth-proxy@sha256:a8c9a617dbf64358bb46b98f8633d119279086d41b2fceaea83f437869a35887",
"172.30.168.204:5000/logging/logging-auth-proxy:latest"
],
"sizeBytes": 715535971
},
{
"names": [
"<none>@<none>",
"<none>:<none>"
],
"sizeBytes": 709532011
},
{
"names": [
"docker.io/node@sha256:46db0dd19955beb87b841c30a6b9812ba626473283e84117d1c016deee5949a9",
"docker.io/node:0.10.36"
],
"sizeBytes": 697128386
},
{
"names": [
"docker.io/openshift/origin-logging-kibana@sha256:70ead525ed596b73301e8df3ac229e33dd7f8431ec1233b37e96544c556530e9",
"docker.io/openshift/origin-logging-kibana:latest"
],
"sizeBytes": 682851528
},
{
"names": [
"172.30.168.204:5000/logging/logging-kibana@sha256:6596a21b8d30d6c25fdb67520d566f4379540c679cd93cd1e6914caadc41ecea",
"172.30.168.204:5000/logging/logging-kibana:latest"
],
"sizeBytes": 682851513
},
{
"names": [
"openshift/dind:latest"
],
"sizeBytes": 640650210
},
{
"names": [
"172.30.168.204:5000/logging/logging-elasticsearch@sha256:75ae4caacd5955f840db27b4aa5ecd511669e14f2b74df7cf4baa1de228802c4",
"172.30.168.204:5000/logging/logging-elasticsearch:latest"
],
"sizeBytes": 623379762
},
{
"names": [
"172.30.168.204:5000/logging/logging-fluentd@sha256:5cb4fa2e9fa12dfb4ff1581ab8bd286fee145dbe18b9e255e2cdc74481cad121",
"172.30.168.204:5000/logging/logging-fluentd:latest"
],
"sizeBytes": 472183174
},
{
"names": [
"docker.io/openshift/origin-logging-elasticsearch@sha256:1e72563ad0551f5c15fc6aa8057a64cc9d0c21b2c40bca7efabdd1b55a4fc2e4",
"docker.io/openshift/origin-logging-elasticsearch:latest"
],
"sizeBytes": 425433997
},
{
"names": [
"172.30.168.204:5000/logging/logging-curator@sha256:fa1cf03b2487a0e4e98da8187e06a9da3414397eac845fcc73b7a57701363979",
"172.30.168.204:5000/logging/logging-curator:latest"
],
"sizeBytes": 418288266
},
{
"names": [
"docker.io/openshift/base-centos7@sha256:aea292a3bddba020cde0ee83e6a45807931eb607c164ec6a3674f67039d8cd7c",
"docker.io/openshift/base-centos7:latest"
],
"sizeBytes": 383049978
},
{
"names": [
"rhel7.2:latest"
],
"sizeBytes": 377493597
},
{
"names": [
"openshift/origin-egress-router:6acabdc",
"openshift/origin-egress-router:latest"
],
"sizeBytes": 364745713
},
{
"names": [
"openshift/origin-base:latest"
],
"sizeBytes": 363070172
},
{
"names": [
"<none>@<none>",
"<none>:<none>"
],
"sizeBytes": 363024702
},
{
"names": [
"docker.io/openshift/origin-logging-fluentd@sha256:bc70848086a50bad58a2f41e166098e8ed351bf4dbe7af83caeb7a29f35b4395",
"docker.io/openshift/origin-logging-fluentd:latest"
],
"sizeBytes": 359217371
},
{
"names": [
"docker.io/fedora@sha256:69281ddd7b2600e5f2b17f1e12d7fba25207f459204fb2d15884f8432c479136",
"docker.io/fedora:25"
],
"sizeBytes": 230864375
},
{
"names": [
"docker.io/openshift/origin-logging-curator@sha256:e820338ca7fb0addfaec25d80d40a49f5ea25b24ff056ab6adbb42dd9eec94b4",
"docker.io/openshift/origin-logging-curator:latest"
],
"sizeBytes": 224977691
},
{
"names": [
"rhel7.3:latest",
"rhel7:latest"
],
"sizeBytes": 219121266
},
{
"names": [
"openshift/origin-pod:latest"
],
"sizeBytes": 213199843
},
{
"names": [
"registry.access.redhat.com/rhel7.2@sha256:98e6ca5d226c26e31a95cd67716afe22833c943e1926a21daf1a030906a02249",
"registry.access.redhat.com/rhel7.2:latest"
],
"sizeBytes": 201376319
},
{
"names": [
"registry.access.redhat.com/rhel7.3@sha256:1e232401d8e0ba53b36b757b4712fbcbd1dab9c21db039c45a84871a74e89e68",
"registry.access.redhat.com/rhel7.3:latest"
],
"sizeBytes": 192693772
},
{
"names": [
"docker.io/centos@sha256:bba1de7c9d900a898e3cadbae040dfe8a633c06bc104a0df76ae24483e03c077"
],
"sizeBytes": 192548999
},
{
"names": [
"openshift/origin-source:latest"
],
"sizeBytes": 192548894
},
{
"names": [
"docker.io/centos@sha256:aebf12af704307dfa0079b3babdca8d7e8ff6564696882bcb5d11f1d461f9ee9",
"docker.io/centos:7",
"docker.io/centos:centos7"
],
"sizeBytes": 192548537
},
{
"names": [
"registry.access.redhat.com/rhel7.1@sha256:1bc5a4c43bbb29a5a96a61896ff696933be3502e2f5fdc4cde02d9e101731fdd",
"registry.access.redhat.com/rhel7.1:latest"
],
"sizeBytes": 158229901
},
{
"names": [
"openshift/hello-openshift:6acabdc",
"openshift/hello-openshift:latest"
],
"sizeBytes": 5643318
}
],
"nodeInfo": {
"architecture": "amd64",
"bootID": "33773cfc-141c-4481-99d2-e188d0d714b7",
"containerRuntimeVersion": "docker://1.12.6",
"kernelVersion": "3.10.0-327.22.2.el7.x86_64",
"kubeProxyVersion": "v1.6.1+5115d708d7",
"kubeletVersion": "v1.6.1+5115d708d7",
"machineID": "f9370ed252a14f73b014c1301a9b6d1b",
"operatingSystem": "linux",
"osImage": "Red Hat Enterprise Linux Server 7.3 (Maipo)",
"systemUUID": "EC2A1098-CB62-E15F-ABE2-8CE899E83DC1"
}
}
}
],
"kind": "List",
"metadata": {},
"resourceVersion": "",
"selfLink": ""
}
],
"returncode": 0
},
"state": "list"
}
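The role reduces this node list to the openshift_logging_fluentd_hosts fact set in the next task. An equivalent hand-run one-liner that extracts just the node names (hedged; jsonpath is available in the 1.6-era client used here):

    oc get nodes -o jsonpath='{.items[*].metadata.name}'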
TASK [openshift_logging_fluentd : Set openshift_logging_fluentd_hosts] *********
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:190
ok: [openshift] => {
"ansible_facts": {
"openshift_logging_fluentd_hosts": [
"172.18.2.72"
]
},
"changed": false
}
TASK [openshift_logging_fluentd : include] *************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:195
included: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/label_and_wait.yaml for openshift
TASK [openshift_logging_fluentd : Label 172.18.2.72 for Fluentd deployment] ****
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/label_and_wait.yaml:2
changed: [openshift] => {
"changed": true,
"results": {
"cmd": "/bin/oc label node 172.18.2.72 logging-infra-fluentd=true --overwrite",
"results": "",
"returncode": 0
},
"state": "add"
}
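This label is the switch that makes the daemonset created earlier schedule a Fluentd pod onto the node. The reverse operation, useful when pulling logging off a node, is the same command with a trailing dash to remove the label (sketch):

    oc label node 172.18.2.72 logging-infra-fluentd-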
TASK [openshift_logging_fluentd : command] *************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/label_and_wait.yaml:10
changed: [openshift -> 127.0.0.1] => {
"changed": true,
"cmd": [
"sleep",
"0.5"
],
"delta": "0:00:00.502565",
"end": "2017-06-08 10:38:47.327802",
"rc": 0,
"start": "2017-06-08 10:38:46.825237"
}
TASK [openshift_logging_fluentd : Delete temp directory] ***********************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging_fluentd/tasks/main.yaml:202
ok: [openshift] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-mzVmJn",
"state": "absent"
}
TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging/tasks/install_logging.yaml:253
included: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging/tasks/update_master_config.yaml for openshift
TASK [openshift_logging : include] *********************************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging/tasks/main.yaml:36
skipping: [openshift] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_logging : Cleaning up local temp dir] **************************
task path: /tmp/tmp.3ESZO9bffQ/openhift-ansible/roles/openshift_logging/tasks/main.yaml:40
ok: [openshift -> 127.0.0.1] => {
"changed": false,
"path": "/tmp/openshift-logging-ansible-JnNE65",
"state": "absent"
}
META: ran handlers
META: ran handlers
PLAY [Update Master configs] ***************************************************
skipping: no hosts matched
PLAY RECAP *********************************************************************
localhost : ok=2 changed=0 unreachable=0 failed=0
openshift : ok=207 changed=70 unreachable=0 failed=0
/data/src/github.com/openshift/origin-aggregated-logging
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:170: executing 'oc get pods -l component=es' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.410s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:170: executing 'oc get pods -l component=es' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME READY STATUS RESTARTS AGE
logging-es-data-master-ir64oytb-1-4m3x4 1/1 Running 0 52s
There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:171: executing 'oc get pods -l component=kibana' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 10.456s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:171: executing 'oc get pods -l component=kibana' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME READY STATUS RESTARTS AGE
logging-kibana-1-p0frk 0/2 ContainerCreating 0 28s
... (identical ContainerCreating polls repeated as the age ticked from 28s to 38s) ...
logging-kibana-1-p0frk 1/2 Running 0 38s
Standard error from the command:
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:172: executing 'oc get pods -l component=curator' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 14.519s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:172: executing 'oc get pods -l component=curator' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME READY STATUS RESTARTS AGE
logging-curator-1-9dgzc 0/1 ContainerCreating 0 15s
... (identical ContainerCreating polls repeated as the age ticked from 15s to 29s) ...
logging-curator-1-9dgzc 1/1 Running 0 30s
Standard error from the command:
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:175: executing 'oc get pods -l component=es-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.220s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:175: executing 'oc get pods -l component=es-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME READY STATUS RESTARTS AGE
logging-es-ops-data-master-vw9gsfsf-1-0qbfc 1/1 Running 0 1m
There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:176: executing 'oc get pods -l component=kibana-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 0.221s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:176: executing 'oc get pods -l component=kibana-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME READY STATUS RESTARTS AGE
logging-kibana-ops-1-9735s 1/2 Running 0 36s
There was no error output from the command.
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:177: executing 'oc get pods -l component=curator-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s...
SUCCESS after 2.122s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:177: executing 'oc get pods -l component=curator-ops' expecting any result and text 'Running'; re-trying every 0.2s until completion or 180.000s
Standard output from the command:
NAME READY STATUS RESTARTS AGE
logging-curator-ops-1-157gj 0/1 ContainerCreating 0 22s
... (identical ContainerCreating polls repeated as the age ticked from 22s to 24s) ...
logging-curator-ops-1-157gj 1/1 Running 0 24s
Standard error from the command:
Running /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:185: executing 'oc project logging > /dev/null' expecting success...
SUCCESS after 0.236s: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:185: executing 'oc project logging > /dev/null' expecting success
There was no output from the command.
There was no error output from the command.
/data/src/github.com/openshift/origin-aggregated-logging/hack/testing /data/src/github.com/openshift/origin-aggregated-logging
--> Deploying template "logging/logging-fluentd-template-maker" for "-" to project logging
logging-fluentd-template-maker
---------
Template to create template for fluentd
* With parameters:
* MASTER_URL=https://kubernetes.default.svc.cluster.local
* ES_HOST=logging-es
* ES_PORT=9200
* ES_CLIENT_CERT=/etc/fluent/keys/cert
* ES_CLIENT_KEY=/etc/fluent/keys/key
* ES_CA=/etc/fluent/keys/ca
* OPS_HOST=logging-es-ops
* OPS_PORT=9200
* OPS_CLIENT_CERT=/etc/fluent/keys/cert
* OPS_CLIENT_KEY=/etc/fluent/keys/key
* OPS_CA=/etc/fluent/keys/ca
* ES_COPY=false
* ES_COPY_HOST=
* ES_COPY_PORT=
* ES_COPY_SCHEME=https
* ES_COPY_CLIENT_CERT=
* ES_COPY_CLIENT_KEY=
* ES_COPY_CA=
* ES_COPY_USERNAME=
* ES_COPY_PASSWORD=
* OPS_COPY_HOST=
* OPS_COPY_PORT=
* OPS_COPY_SCHEME=https
* OPS_COPY_CLIENT_CERT=
* OPS_COPY_CLIENT_KEY=
* OPS_COPY_CA=
* OPS_COPY_USERNAME=
* OPS_COPY_PASSWORD=
* IMAGE_PREFIX_DEFAULT=172.30.168.204:5000/logging/
* IMAGE_VERSION_DEFAULT=latest
* USE_JOURNAL=
* JOURNAL_SOURCE=
* JOURNAL_READ_FROM_HEAD=false
* USE_MUX=false
* USE_MUX_CLIENT=false
* MUX_ALLOW_EXTERNAL=false
* BUFFER_QUEUE_LIMIT=1024
* BUFFER_SIZE_LIMIT=16777216
--> Creating resources ...
template "logging-fluentd-template" created
--> Success
Run 'oc status' to view your app.
WARNING: bridge-nf-call-ip6tables is disabled
Error: timed out waiting for /var/log/journal.pos - check Fluentd pod log
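This is the actual failure of the run: the harness polls for /var/log/journal.pos, the file in which Fluentd records its journald read position, and it never appeared on the node within the timeout. That usually means the Fluentd pod never got as far as reading the journal (crash loop, image pull problem, or mount failure). A hedged triage sequence for this state, using names from this run:

    oc get pods -l component=fluentd -n logging -o wide
    oc logs $(oc get pods -l component=fluentd -n logging -o name | head -1) -n logging
    ls -l /var/log/journal.pos    # on the labeled node itself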
[ERROR] PID 4227: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:615: `return 1` exited with status 1.
[INFO] Stack Trace:
[INFO] 1: /data/src/github.com/openshift/origin-aggregated-logging/logging.sh:615: `return 1`
[INFO] Exiting with code 1.
/data/src/github.com/openshift/origin-aggregated-logging/hack/lib/log/system.sh: line 31: 4591 Terminated sar -A -o "${binary_logfile}" 1 86400 > /dev/null 2> "${stderr_logfile}" (wd: /data/src/github.com/openshift/origin-aggregated-logging)
[INFO] [CLEANUP] Beginning cleanup routines...
[INFO] [CLEANUP] Dumping cluster events to /tmp/origin-aggregated-logging/artifacts/events.txt
[INFO] [CLEANUP] Dumping etcd contents to /tmp/origin-aggregated-logging/artifacts/etcd
[WARNING] No compiled `etcdhelper` binary was found. Attempting to build one using:
[WARNING] $ hack/build-go.sh tools/etcdhelper
++ Building go targets for linux/amd64: tools/etcdhelper
/data/src/github.com/openshift/origin-aggregated-logging/../origin/hack/build-go.sh took 149 seconds
2017-06-08 10:42:51.887416 I | warning: ignoring ServerName for user-provided CA for backwards compatibility is deprecated
[INFO] [CLEANUP] Dumping container logs to /tmp/origin-aggregated-logging/logs/containers
[INFO] [CLEANUP] Truncating log files over 200M
[INFO] [CLEANUP] Stopping docker containers
[INFO] [CLEANUP] Removing docker containers
Error: No such image, container or task: 06596958d646
json: cannot unmarshal array into Go value of type types.ContainerJSON
Error response from daemon: You cannot remove a running container 95fb7a2d01d3c7362e7f19ca1431eb0795cc6a7710ffed441ce205cd6475c310. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 7f5bdb37ecca626968b1144c2815b301e5747525dc52510c704b4e176bf0c478. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 4cbefc916a9b271443aa9a4a0eae291e262343bb6c40e75da785f770bfc68c2a. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container f79f86d6b2bd20517a210816ac8e0a494d720bfd35bcb7608a9b618a80854c5e. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container c868ad36b425b25e2eb86f43bdaf0e5d896d46b40a18ba25f44defc4d1e2b237. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container e8dc2126713dc72bf167643e4365f34a54da9579ce23d8605dd21aba4d519e60. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container cc7804cf802965379c2eb53d59b26715fc6848569affcd7c427401892547bb58. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 6fe9cdccc8ee85625966983e2fb1a87c9f59e01dc4a2340e187d0ecc6b3f2327. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container dfa489301d03c525cc4546365bd0161920c4244849df82358d5fe5ffe1c4531f. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container e08fa1c47ee0f417e91993118b42742065ccc409c8e58266c370fee14c18d154. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container f5dfe187055a523470297545376e9626b1821c74bb4b556d367de8f59057db39. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container af848de7e14ae96c120cec6056f38403e83467eace101430f3062b5102e69c2b. Stop the container before attempting removal or use -f
Error response from daemon: You cannot remove a running container 9c55b3302415eb7a4ff565ce2986215f7a4f8ac69f19245cf0db66723a606e54. Stop the container before attempting removal or use -f
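The cleanup routine attempts docker rm on containers that are still running, hence the wall of errors above. They are cosmetic here, since the VM is destroyed below, but the forceful equivalent for a host you actually want scrubbed would be (hedged, destructive: removes every container on the host):

    docker rm -f $(docker ps -aq)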
[INFO] [CLEANUP] Killing child processes
[INFO] [CLEANUP] Pruning etcd data directory
[ERROR] /data/src/github.com/openshift/origin-aggregated-logging/logging.sh exited with code 1 after 00h 29m 13s
Error while running ssh/sudo command:
set -e
pushd /data/src/github.com/openshift//origin-aggregated-logging/hack/testing >/dev/null
export PATH=$GOPATH/bin:$PATH
echo '***************************************************'
echo 'Running GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh...'
time GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh
echo 'Finished GIT_URL=https://github.com/openshift/origin-aggregated-logging GIT_BRANCH=master O_A_L_DIR=/data/src/github.com/openshift/origin-aggregated-logging OS_ROOT=/data/src/github.com/openshift/origin ENABLE_OPS_CLUSTER=true USE_LOCAL_SOURCE=true TEST_PERF=false VERBOSE=1 OS_ANSIBLE_REPO=https://github.com/openshift/openshift-ansible OS_ANSIBLE_BRANCH=master ./logging.sh'
echo '***************************************************'
popd >/dev/null
The SSH command responded with a non-zero exit status. Vagrant
assumes that this means the command failed. The output for this command
should be in the log above. Please read the output to determine what
went wrong.
==> openshiftdev: Downloading logs
==> openshiftdev: Downloading artifacts from '/var/log/yum.log' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin/artifacts/yum.log'
==> openshiftdev: Downloading artifacts from '/var/log/secure' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin/artifacts/secure'
==> openshiftdev: Downloading artifacts from '/var/log/audit/audit.log' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin/artifacts/audit.log'
==> openshiftdev: Downloading artifacts from '/tmp/origin-aggregated-logging/' to '/var/lib/jenkins/jobs/test-origin-aggregated-logging/workspace/origin/artifacts'
Build step 'Execute shell' marked build as failure
[description-setter] Could not determine description.
[PostBuildScript] - Execution post build scripts.
[workspace] $ /bin/sh -xe /tmp/hudson5354174135280436271.sh
+ INSTANCE_NAME=origin_logging-rhel7-1626
+ pushd origin
~/jobs/test-origin-aggregated-logging/workspace/origin ~/jobs/test-origin-aggregated-logging/workspace
+ rc=0
+ '[' -f .vagrant-openshift.json ']'
++ /usr/bin/vagrant ssh -c 'sudo ausearch -m avc'
+ ausearchresult='<no matches>'
+ rc=1
+ '[' '<no matches>' = '<no matches>' ']'
+ rc=0
+ /usr/bin/vagrant destroy -f
==> openshiftdev: Terminating the instance...
==> openshiftdev: Running cleanup tasks for 'shell' provisioner...
+ popd
~/jobs/test-origin-aggregated-logging/workspace
+ exit 0
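The post-build trace above runs sudo ausearch -m avc inside the VM and treats '<no matches>' as a pass, i.e. no SELinux AVC denials were recorded during the run. The same check run by hand on a node looks like (sketch):

    sudo ausearch -m avc -ts recent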
[BFA] Scanning build for known causes...
[BFA] Found failure cause(s):
[BFA] Command Failure from category failure
[BFA] Done. 0s
Finished: FAILURE