Ansible is totally broken after an OS upgrade


Mark Tovey

Oct 28, 2019, 4:08:40 PM
to Ansible Project

    I just performed an OS upgrade on our Ansible server from RedHat 7.6 to RedHat 7.7, and now Ansible has become unusable.  When I try to run any task that uses an Ansible fact, I get the following error:

[WARNING]: Failure using method (v2_runner_on_ok) in callback plugin (<ansible.plugins.callback.yaml.CallbackModule object at 0x7efe1283de50>):

value must be a string


    The task that generated this error was a simple debug statement: 

- debug: msg="{{ansible_distribution}}"

    Interestingly, if I use the full hostvars path, it works:

   - debug: msg="{{hostvars[inventory_hostname].ansible_distribution}}"

    The error first appeared with Ansible 2.8.1, and I immediately upgraded to Ansible 2.8.5 to see if that would help, but it did not.  I suspect it is an issue in Python, but I have no idea where to look.  Does anyone have any ideas?  This has brought our use of Ansible to a halt until we can find a resolution.
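
    Since the warning comes from the yaml stdout callback rather than from the task itself, one quick check (just a sketch; it assumes the stock "default" callback that ships with Ansible 2.8 and only changes how results are printed, not the results themselves) is to rerun the playbook with a different stdout callback and see whether the task succeeds without the warning:

# Override the configured 'yaml' stdout callback for a single run
ANSIBLE_STDOUT_CALLBACK=default ansible-playbook ~/devansible/playbooks/NewTest -i ~/devansible/inventory -vvvv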

    The simple playbook I wrote to test this is below:

---
- hosts: "localhost"
  gather_facts: yes
  become: no

  tasks:
  - debug: msg="this is a test"
  - debug: msg="{{hostvars[inventory_hostname].ansible_distribution}}"
  - debug: msg="{{ansible_distribution}}"


    The full debug output is below:

ansible-playbook ~/devansible/playbooks/NewTest -i ~/devansible/inventory -vvvv
ansible-playbook 2.8.5
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/home/mtovey/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, Jun 11 2019, 14:33:56) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
Using /etc/ansible/ansible.cfg as config file
setting up inventory plugins
host_list declined parsing /home/mtovey/devansible/inventory/hosts as it did not pass it's verify_file() method
script declined parsing /home/mtovey/devansible/inventory/hosts as it did not pass it's verify_file() method
auto declined parsing /home/mtovey/devansible/inventory/hosts as it did not pass it's verify_file() method
Set default localhost to localhost
Parsed /home/mtovey/devansible/inventory/hosts inventory source with ini plugin
Loading callback plugin yaml of type stdout, v2.0 from /usr/lib/python2.7/site-packages/ansible/plugins/callback/yaml.pyc

PLAYBOOK: NewTest ******************************************************************************************************************************************************************
Positional arguments: /home/mtovey/devansible/playbooks/NewTest
become_method: sudo
inventory: (u'/home/mtovey/devansible/inventory',)
forks: 5
tags: (u'all',)
verbosity: 4
connection: smart
timeout: 10
1 plays in /home/mtovey/devansible/playbooks/NewTest

PLAY [localhost] *******************************************************************************************************************************************************************

TASK [Gathering Facts] *************************************************************************************************************************************************************
task path: /home/mtovey/devansible/playbooks/NewTest:2
<localhost> ESTABLISH SSH CONNECTION FOR USER: None
<localhost> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/mtovey/.ansible/cp/8a5a4c6a60 localhost '/bin/sh -c '"'"'echo ~ && sleep 0'"'"''
<localhost> (0, '/home/mtovey\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips  26 Jan 2017\r\ndebug1: Reading configuration data /home/mtovey/.ssh/config\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 58: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug1: Control socket "/home/mtovey/.ansible/cp/8a5a4c6a60" does not exist\r\ndebug2: resolving "localhost" port 22\r\ndebug2: ssh_connect_direct: needpriv 0\r\ndebug1: Connecting to localhost [::1] port 22.\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug1: fd 3 clearing O_NONBLOCK\r\ndebug1: Connection established.\r\ndebug3: timeout: 10000 ms remain after connect\r\ndebug1: key_load_public: No such file or directory\r\ndebug1: identity file /home/mtovey/.ssh/id_rsa type -1\r\ndebug1: key_load_public: No such file or directory\r\ndebug1: identity file /home/mtovey/.ssh/id_rsa-cert type -1\r\ndebug1: key_load_public: No such file or directory\r\ndebug1: identity file /home/mtovey/.ssh/id_dsa type -1\r\ndebug1: key_load_public: No such file or directory\r\ndebug1: identity file /home/mtovey/.ssh/id_dsa-cert type -1\r\ndebug1: key_load_public: No such file or directory\r\ndebug1: identity file /home/mtovey/.ssh/id_ecdsa type -1\r\ndebug1: key_load_public: No such file or directory\r\ndebug1: identity file /home/mtovey/.ssh/id_ecdsa-cert type -1\r\ndebug1: key_load_public: No such file or directory\r\ndebug1: identity file /home/mtovey/.ssh/id_ed25519 type -1\r\ndebug1: key_load_public: No such file or directory\r\ndebug1: identity file /home/mtovey/.ssh/id_ed25519-cert type -1\r\ndebug1: Enabling compatibility mode for protocol 2.0\r\ndebug1: Local version string SSH-2.0-OpenSSH_7.4\r\ndebug1: Remote protocol version 2.0, remote software version OpenSSH_7.4\r\ndebug1: match: OpenSSH_7.4 pat OpenSSH* compat 0x04000000\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug1: Authenticating to localhost:22 as \'mtovey\'\r\ndebug3: hostkeys_foreach: reading file "/home/mtovey/.ssh/known_hosts"\r\ndebug3: send packet: type 20\r\ndebug1: SSH2_MSG_KEXINIT sent\r\ndebug3: receive packet: type 20\r\ndebug1: SSH2_MSG_KEXINIT received\r\ndebug2: local client KEXINIT proposal\r\ndebug2: KEX algorithms: curve25519-sha256,curve255...@libssh.org,ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha256,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1,ext-info-c\r\ndebug2: host key algorithms: ecdsa-sha2-nis...@openssh.com,ecdsa-sha2-nis...@openssh.com,ecdsa-sha2-nis...@openssh.com,ssh-ed2551...@openssh.com,ssh-rsa-...@openssh.com,ssh-dss-...@openssh.com,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521,ssh-ed25519,rsa-sha2-512,rsa-sha2-256,ssh-rsa,ssh-dss\r\ndebug2: ciphers ctos: chacha20...@openssh.com,aes128-ctr,aes192-ctr,aes256-ctr,aes12...@openssh.com,aes25...@openssh.com,aes128-cbc,aes192-cbc,aes256-cbc\r\ndebug2: ciphers stoc: chacha20...@openssh.com,aes128-ctr,aes192-ctr,aes256-ctr,aes12...@openssh.com,aes25...@openssh.com,aes128-cbc,aes192-cbc,aes256-cbc\r\ndebug2: MACs ctos: umac-...@openssh.com,umac-1...@openssh.com,hmac-sha...@openssh.com,hmac-sha...@openssh.com,hmac-s...@openssh.com,uma...@openssh.com,umac...@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-sha1\r\ndebug2: MACs stoc: 
umac-...@openssh.com,umac-1...@openssh.com,hmac-sha...@openssh.com,hmac-sha...@openssh.com,hmac-s...@openssh.com,uma...@openssh.com,umac...@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-sha1\r\ndebug2: compression ctos: zl...@openssh.com,zlib,none\r\ndebug2: compression stoc: zl...@openssh.com,zlib,none\r\ndebug2: languages ctos: \r\ndebug2: languages stoc: \r\ndebug2: first_kex_follows 0 \r\ndebug2: reserved 0 \r\ndebug2: peer server KEXINIT proposal\r\ndebug2: KEX algorithms: curve25519-sha256,curve255...@libssh.org,ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha256,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1\r\ndebug2: host key algorithms: ssh-rsa,rsa-sha2-512,rsa-sha2-256,ssh-dss\r\ndebug2: ciphers ctos: chacha20...@openssh.com,aes128-ctr,aes192-ctr,aes256-ctr,aes12...@openssh.com,aes25...@openssh.com,aes128-cbc,aes192-cbc,aes256-cbc,blowfish-cbc,cast128-cbc,3des-cbc\r\ndebug2: ciphers stoc: chacha20...@openssh.com,aes128-ctr,aes192-ctr,aes256-ctr,aes12...@openssh.com,aes25...@openssh.com,aes128-cbc,aes192-cbc,aes256-cbc,blowfish-cbc,cast128-cbc,3des-cbc\r\ndebug2: MACs ctos: umac-...@openssh.com,umac-1...@openssh.com,hmac-sha...@openssh.com,hmac-sha...@openssh.com,hmac-s...@openssh.com,uma...@openssh.com,umac...@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-sha1\r\ndebug2: MACs stoc: umac-...@openssh.com,umac-1...@openssh.com,hmac-sha...@openssh.com,hmac-sha...@openssh.com,hmac-s...@openssh.com,uma...@openssh.com,umac...@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-sha1\r\ndebug2: compression ctos: none,zl...@openssh.com\r\ndebug2: compression stoc: none,zl...@openssh.com\r\ndebug2: languages ctos: \r\ndebug2: languages stoc: \r\ndebug2: first_kex_follows 0 \r\ndebug2: reserved 0 \r\ndebug1: kex: algorithm: curve25519-sha256\r\ndebug1: kex: host key algorithm: rsa-sha2-512\r\ndebug1: kex: server->client cipher: chacha20...@openssh.com MAC: <implicit> compression: zl...@openssh.com\r\ndebug1: kex: client->server cipher: chacha20...@openssh.com MAC: <implicit> compression: zl...@openssh.com\r\ndebug1: kex: curve25519-sha256 need=64 dh_need=64\r\ndebug1: kex: curve25519-sha256 need=64 dh_need=64\r\ndebug3: send packet: type 30\r\ndebug1: expecting SSH2_MSG_KEX_ECDH_REPLY\r\ndebug3: receive packet: type 31\r\ndebug1: Server host key: ssh-rsa SHA256:tb0gnjRbXN1gKDR56CNl7xEIrP/kkozmzZDbWHX88B0\r\ndebug3: hostkeys_foreach: reading file "/home/mtovey/.ssh/known_hosts"\r\nWarning: Permanently added \'localhost\' (RSA) to the list of known hosts.\r\ndebug3: send packet: type 21\r\ndebug2: set_newkeys: mode 1\r\ndebug1: rekey after 134217728 blocks\r\ndebug1: SSH2_MSG_NEWKEYS sent\r\ndebug1: expecting SSH2_MSG_NEWKEYS\r\ndebug3: receive packet: type 21\r\ndebug1: SSH2_MSG_NEWKEYS received\r\ndebug2: set_newkeys: mode 0\r\ndebug1: rekey after 134217728 blocks\r\ndebug2: key: rsa-key-20100825 (0x55d8f897a970), agent\r\ndebug2: key: /home/mtovey/.ssh/id_rsa ((nil))\r\ndebug2: key: /home/mtovey/.ssh/id_dsa ((nil))\r\ndebug2: key: /home/mtovey/.ssh/id_ecdsa ((nil))\r\ndebug2: key: /home/mtovey/.ssh/id_ed25519 ((nil))\r\ndebug3: send packet: type 5\r\ndebug3: receive packet: type 7\r\ndebug1: SSH2_MSG_EXT_INFO received\r\ndebug1: kex_input_ext_info: server-sig-algs=<rsa-sha2-256,rsa-sha2-512>\r\ndebug3: receive packet: type 6\r\ndebug2: service_accept: ssh-userauth\r\ndebug1: SSH2_MSG_SERVICE_ACCEPT received\r\ndebug3: send 
packet: type 50\r\ndebug3: receive packet: type 51\r\ndebug1: Authentications that can continue: publickey,gssapi-keyex,gssapi-with-mic,keyboard-interactive\r\ndebug3: start over, passed a different list publickey,gssapi-keyex,gssapi-with-mic,keyboard-interactive\r\ndebug3: preferred gssapi-with-mic,gssapi-keyex,hostbased,publickey\r\ndebug3: authmethod_lookup gssapi-with-mic\r\ndebug3: remaining preferred: gssapi-keyex,hostbased,publickey\r\ndebug3: authmethod_is_enabled gssapi-with-mic\r\ndebug1: Next authentication method: gssapi-with-mic\r\ndebug1: Unspecified GSS failure.  Minor code may provide more information\nNo Kerberos credentials available (default cache: KEYRING:persistent:2027)\n\r\ndebug1: Unspecified GSS failure.  Minor code may provide more information\nNo Kerberos credentials available (default cache: KEYRING:persistent:2027)\n\r\ndebug2: we did not send a packet, disable method\r\ndebug3: authmethod_lookup gssapi-keyex\r\ndebug3: remaining preferred: hostbased,publickey\r\ndebug3: authmethod_is_enabled gssapi-keyex\r\ndebug1: Next authentication method: gssapi-keyex\r\ndebug1: No valid Key exchange context\r\ndebug2: we did not send a packet, disable method\r\ndebug3: authmethod_lookup publickey\r\ndebug3: remaining preferred: ,publickey\r\ndebug3: authmethod_is_enabled publickey\r\ndebug1: Next authentication method: publickey\r\ndebug1: Offering RSA public key: rsa-key-20100825\r\ndebug3: send_pubkey_test\r\ndebug3: send packet: type 50\r\ndebug2: we sent a publickey packet, wait for reply\r\ndebug3: receive packet: type 60\r\ndebug1: Server accepts key: pkalg rsa-sha2-512 blen 276\r\ndebug2: input_userauth_pk_ok: fp SHA256:zX9R42XHkBUTRguTXXsstV6gQvyII1Bso3yu/T3Ur7M\r\ndebug3: sign_and_send_pubkey: RSA SHA256:zX9R42XHkBUTRguTXXsstV6gQvyII1Bso3yu/T3Ur7M\r\ndebug3: send packet: type 50\r\ndebug3: receive packet: type 52\r\ndebug1: Enabling compression at level 6.\r\ndebug1: Authentication succeeded (publickey).\r\nAuthenticated to localhost ([::1]:22).\r\ndebug1: setting up multiplex master socket\r\ndebug3: muxserver_listen: temporary control path /home/mtovey/.ansible/cp/8a5a4c6a60.yhK43RXJXmBMXTTJ\r\ndebug2: fd 5 setting O_NONBLOCK\r\ndebug3: fd 5 is O_NONBLOCK\r\ndebug3: fd 5 is O_NONBLOCK\r\ndebug1: channel 0: new [/home/mtovey/.ansible/cp/8a5a4c6a60]\r\ndebug3: muxserver_listen: mux listener channel 0 fd 5\r\ndebug2: fd 3 setting TCP_NODELAY\r\ndebug3: ssh_packet_set_tos: set IPV6_TCLASS 0x08\r\ndebug1: control_persist_detach: backgrounding master process\r\ndebug2: control_persist_detach: background process is 7257\r\ndebug2: fd 5 setting O_NONBLOCK\r\ndebug1: forking to background\r\ndebug1: Entering interactive session.\r\ndebug1: pledge: id\r\ndebug2: set_control_persist_exit_time: schedule exit in 60 seconds\r\ndebug1: multiplexing control connection\r\ndebug2: fd 6 setting O_NONBLOCK\r\ndebug3: fd 6 is O_NONBLOCK\r\ndebug1: channel 1: new [mux-control]\r\ndebug3: channel_post_mux_listener: new mux channel 1 fd 6\r\ndebug3: mux_master_read_cb: channel 1: hello sent\r\ndebug2: set_control_persist_exit_time: cancel scheduled exit\r\ndebug3: mux_master_read_cb: channel 1 packet type 0x00000001 len 4\r\ndebug2: process_mux_master_hello: channel 1 slave version 4\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_master_read_cb: channel 1 packet type 0x10000004 len 4\r\ndebug2: 
process_mux_alive_check: channel 1: alive check\r\ndebug3: mux_client_request_alive: done pid = 7259\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_master_read_cb: channel 1 packet type 0x10000002 len 91\r\ndebug2: process_mux_new_session: channel 1: request tty 0, X 0, agent 0, subsys 0, term "putty", cmd "/bin/sh -c \'echo ~ && sleep 0\'", env 1\r\ndebug3: process_mux_new_session: got fds stdin 7, stdout 8, stderr 9\r\ndebug2: fd 8 setting O_NONBLOCK\r\ndebug2: fd 9 setting O_NONBLOCK\r\ndebug1: channel 2: new [client-session]\r\ndebug2: process_mux_new_session: channel_new: 2 linked to control channel 1\r\ndebug2: channel 2: send open\r\ndebug3: send packet: type 90\r\ndebug3: receive packet: type 80\r\ndebug1: client_input_global_request: rtype hostk...@openssh.com want_reply 0\r\ndebug3: receive packet: type 91\r\ndebug2: callback start\r\ndebug2: client_session2_setup: id 2\r\ndebug1: Sending environment.\r\ndebug1: Sending env LANG = en_US.UTF-8\r\ndebug2: channel 2: request env confirm 0\r\ndebug3: send packet: type 98\r\ndebug1: Sending command: /bin/sh -c \'echo ~ && sleep 0\'\r\ndebug2: channel 2: request exec confirm 1\r\ndebug3: send packet: type 98\r\ndebug3: mux_session_confirm: sending success reply\r\ndebug2: callback done\r\ndebug2: channel 2: open confirm rwindow 0 rmax 32768\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug2: channel 2: rcvd adjust 2097152\r\ndebug3: receive packet: type 99\r\ndebug2: channel_input_status_confirm: type 99 id 2\r\ndebug2: exec request accepted on channel 2\r\ndebug3: receive packet: type 98\r\ndebug1: client_input_channel_req: channel 2 rtype exit-status reply 0\r\ndebug3: mux_exit_message: channel 2: exit message, exitval 0\r\ndebug3: receive packet: type 98\r\ndebug1: client_input_channel_req: channel 2 rtype e...@openssh.com reply 0\r\ndebug2: channel 2: rcvd eow\r\ndebug2: channel 2: close_read\r\ndebug2: channel 2: input open -> closed\r\ndebug3: receive packet: type 96\r\ndebug2: channel 2: rcvd eof\r\ndebug2: channel 2: output open -> drain\r\ndebug2: channel 2: obuf empty\r\ndebug2: channel 2: close_write\r\ndebug2: channel 2: output drain -> closed\r\ndebug3: receive packet: type 97\r\ndebug2: channel 2: rcvd close\r\ndebug3: channel 2: will not send data after close\r\ndebug2: channel 2: send close\r\ndebug3: send packet: type 97\r\ndebug2: channel 2: is dead\r\ndebug2: channel 2: gc: notify user\r\ndebug3: mux_master_session_cleanup_cb: entering for channel 2\r\ndebug2: channel 1: rcvd close\r\ndebug2: channel 1: output open -> drain\r\ndebug2: channel 1: close_read\r\ndebug2: channel 1: input open -> closed\r\ndebug2: channel 2: gc: user detached\r\ndebug2: channel 2: is dead\r\ndebug2: channel 2: garbage collecting\r\ndebug1: channel 2: free: client-session, nchannels 3\r\ndebug3: channel 2: status: The following connections are open:\r\n  #1 mux-control (t16 r-1 i3/0 o1/16 fd 6/6 cc -1)\r\n  #2 client-session (t4 r0 i3/0 o3/0 fd -1/-1 cc -1)\r\n\r\ndebug2: channel 1: obuf empty\r\ndebug2: channel 1: close_write\r\ndebug2: channel 1: output drain -> closed\r\ndebug2: channel 1: is dead (local)\r\ndebug2: channel 1: gc: notify user\r\ndebug3: mux_master_control_cleanup_cb: entering for channel 1\r\ndebug2: channel 1: gc: user detached\r\ndebug2: channel 1: is dead (local)\r\ndebug2: channel 1: garbage collecting\r\ndebug1: channel 1: free: mux-control, nchannels 2\r\ndebug3: channel 1: status: The following connections are open:\r\n  #1 mux-control (t16 r-1 i3/0 o3/0 fd 6/6 cc 
-1)\r\n\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: set_control_persist_exit_time: schedule exit in 60 seconds\r\ndebug2: Received exit status from master 0\r\n')
<localhost> ESTABLISH SSH CONNECTION FOR USER: None
<localhost> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/mtovey/.ansible/cp/8a5a4c6a60 localhost '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/mtovey/.ansible/tmp/ansible-tmp-1572288869.96-280974017029076 `" && echo ansible-tmp-1572288869.96-280974017029076="` echo /home/mtovey/.ansible/tmp/ansible-tmp-1572288869.96-280974017029076 `" ) && sleep 0'"'"''
<localhost> (0, 'ansible-tmp-1572288869.96-280974017029076=/home/mtovey/.ansible/tmp/ansible-tmp-1572288869.96-280974017029076\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips  26 Jan 2017\r\ndebug1: Reading configuration data /home/mtovey/.ssh/config\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 58: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 7259\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<localhost> Attempting python interpreter discovery
<localhost> ESTABLISH SSH CONNECTION FOR USER: None
<localhost> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/mtovey/.ansible/cp/8a5a4c6a60 localhost '/bin/sh -c '"'"'echo PLATFORM; uname; echo FOUND; command -v '"'"'"'"'"'"'"'"'/usr/bin/python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.5'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/libexec/platform-python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/bin/python3'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python'"'"'"'"'"'"'"'"'; echo ENDFOUND && sleep 0'"'"''
<localhost> (0, 'PLATFORM\nLinux\nFOUND\n/usr/bin/python\n/usr/bin/python2.7\n/usr/libexec/platform-python\n/usr/bin/python\nENDFOUND\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips  26 Jan 2017\r\ndebug1: Reading configuration data /home/mtovey/.ssh/config\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 58: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 7259\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<localhost> ESTABLISH SSH CONNECTION FOR USER: None
<localhost> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/mtovey/.ansible/cp/8a5a4c6a60 localhost '/bin/sh -c '"'"'/usr/bin/python && sleep 0'"'"''
<localhost> (0, '{"osrelease_content": "NAME=\\"Red Hat Enterprise Linux Server\\"\\nVERSION=\\"7.7 (Maipo)\\"\\nID=\\"rhel\\"\\nID_LIKE=\\"fedora\\"\\nVARIANT=\\"Server\\"\\nVARIANT_ID=\\"server\\"\\nVERSION_ID=\\"7.7\\"\\nPRETTY_NAME=\\"Red Hat Enterprise Linux\\"\\nANSI_COLOR=\\"0;31\\"\\nCPE_NAME=\\"cpe:/o:redhat:enterprise_linux:7.7:GA:server\\"\\nHOME_URL=\\"https://www.redhat.com/\\"\\nBUG_REPORT_URL=\\"https://bugzilla.redhat.com/\\"\\n\\nREDHAT_BUGZILLA_PRODUCT=\\"Red Hat Enterprise Linux 7\\"\\nREDHAT_BUGZILLA_PRODUCT_VERSION=7.7\\nREDHAT_SUPPORT_PRODUCT=\\"Red Hat Enterprise Linux\\"\\nREDHAT_SUPPORT_PRODUCT_VERSION=\\"7.7\\"\\n", "platform_dist_result": ["redhat", "7.7", "Maipo"]}\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips  26 Jan 2017\r\ndebug1: Reading configuration data /home/mtovey/.ssh/config\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 58: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 7259\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/system/setup.py
<localhost> PUT /home/mtovey/.ansible/tmp/ansible-local-7237kXtIbY/tmphErpzc TO /home/mtovey/.ansible/tmp/ansible-tmp-1572288869.96-280974017029076/AnsiballZ_setup.py
<localhost> SSH: EXEC scp -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/mtovey/.ansible/cp/8a5a4c6a60 /home/mtovey/.ansible/tmp/ansible-local-7237kXtIbY/tmphErpzc '[localhost]:/home/mtovey/.ansible/tmp/ansible-tmp-1572288869.96-280974017029076/AnsiballZ_setup.py'
<localhost> (0, '', 'Executing: program /usr/bin/ssh host localhost, user (unspecified), command scp -v -t /home/mtovey/.ansible/tmp/ansible-tmp-1572288869.96-280974017029076/AnsiballZ_setup.py\nOpenSSH_7.4p1, OpenSSL 1.0.2k-fips  26 Jan 2017\r\ndebug1: Reading configuration data /home/mtovey/.ssh/config\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 58: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 7259\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\nSending file modes: C0600 252497 tmphErpzc\nSink: C0600 252497 tmphErpzc\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<localhost> ESTABLISH SSH CONNECTION FOR USER: None
<localhost> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/mtovey/.ansible/cp/8a5a4c6a60 localhost '/bin/sh -c '"'"'chmod u+x /home/mtovey/.ansible/tmp/ansible-tmp-1572288869.96-280974017029076/ /home/mtovey/.ansible/tmp/ansible-tmp-1572288869.96-280974017029076/AnsiballZ_setup.py && sleep 0'"'"''
<localhost> (0, '', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips  26 Jan 2017\r\ndebug1: Reading configuration data /home/mtovey/.ssh/config\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 58: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 7259\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<localhost> ESTABLISH SSH CONNECTION FOR USER: None
<localhost> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/mtovey/.ansible/cp/8a5a4c6a60 -tt localhost '/bin/sh -c '"'"'/usr/bin/python /home/mtovey/.ansible/tmp/ansible-tmp-1572288869.96-280974017029076/AnsiballZ_setup.py && sleep 0'"'"''
<localhost> (0, '\r\n{"invocation": {"module_args": {"filter": "*", "gather_subset": ["all"], "fact_path": "/etc/ansible/facts.d", "gather_timeout": 10}}, "ansible_facts": {"ansible_fibre_channel_wwn": [], "module_setup": true, "ansible_distribution_version": "7.7", "ansible_distribution_file_variety": "RedHat", "ansible_env": {"HISTTIMEFORMAT": "%d/%m/%y %T ", "LC_NUMERIC": "C", "HISTFILE": "/dsv/audit/.sh_history.mtovey:mtovey:pts-5", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "SSH_CLIENT": "::1 40918 22", "LOGNAME": "mtovey", "USER": "mtovey", "PATH": "/usr/local/bin:/usr/bin:/usr/local/bin:/usr/local/sbin", "HOME": "/home/mtovey", "LANG": "C", "TERM": "putty", "SHELL": "/bin/bash", "SHLVL": "2", "EXTENDED_HISTORY": "ON", "HISTSIZE": "5000", "_": "/usr/bin/python", "XDG_RUNTIME_DIR": "/run/user/2027", "LC_ALL": "C", "XDG_SESSION_ID": "10", "DIRHIST": "/home/mtovey", "DIRCOUNT": "1", "SSH_CONNECTION": "::1 40918 ::1 22", "SSH_TTY": "/dev/pts/5", "PWD": "/home/mtovey", "MAIL": "/var/mail/mtovey", "LS_COLORS": ""}, "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_default_ipv4": {"macaddress": "00:50:56:8d:68:e3", "network": "145.218.238.0", "mtu": 1500, "broadcast": "145.218.238.255", "alias": "ens33", "netmask": "255.255.255.0", "address": "145.218.238.132", "interface": "ens33", "type": "ether", "gateway": "145.218.238.1"}, "ansible_swapfree_mb": 3946, "ansible_default_ipv6": {}, "ansible_cmdline": {"LANG": "en_US.UTF-8", "BOOT_IMAGE": "/vmlinuz-3.10.0-1062.4.1.el7.x86_64", "quiet": true, "rhgb": true, "rd.lvm.lv": "rootvg/usrlv", "crashkernel": "auto", "ro": true, "root": "/dev/mapper/rootvg-rootlv"}, "ansible_machine_id": "1ad34a4b5708db820d709c3b5b6309a1", "ansible_userspace_architecture": "x86_64", "ansible_product_uuid": "NA", "ansible_pkg_mgr": "yum", "ansible_distribution": "RedHat", "ansible_iscsi_iqn": "", "ansible_all_ipv6_addresses": ["fe80::250:56ff:fe8d:68e3", "fe80::250:56ff:fe8d:bedc"], "ansible_uptime_seconds": 13303, "ansible_kernel": "3.10.0-1062.4.1.el7.x86_64", "ansible_system_capabilities_enforced": "True", "ansible_python": {"executable": "/usr/bin/python", "version": {"micro": 5, "major": 2, "releaselevel": "final", "serial": 0, "minor": 7}, "type": "CPython", "has_sslcontext": true, "version_info": [2, 7, 5, "final", 0]}, "ansible_is_chroot": true, "ansible_hostnqn": "", "ansible_user_shell": "/bin/bash", "ansible_product_serial": "NA", "ansible_form_factor": "Other", "ansible_distribution_file_parsed": true, "ansible_fips": false, "ansible_user_id": "mtovey", "ansible_selinux_python_present": true, "ansible_local": {}, "ansible_processor_vcpus": 4, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E7-8890 v4 @ 2.20GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E7-8890 v4 @ 2.20GHz", "2", "GenuineIntel", "Intel(R) Xeon(R) CPU E7-8890 v4 @ 2.20GHz", "3", "GenuineIntel", "Intel(R) Xeon(R) CPU E7-8890 v4 @ 2.20GHz"], "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCsW3jt+perdsGpIeMMy8UKybd/hC8mAV3QzVNvbbdsNFzb9embsecuJH+LeWTqHZNWDaEIWzgRYG9pTgC6HzEE=", "ansible_user_gid": 900, "ansible_system_vendor": "VMware, Inc.", "ansible_swaptotal_mb": 4095, "ansible_distribution_major_version": "7", "ansible_real_group_id": 900, "ansible_lsb": {}, "ansible_ens33": {"macaddress": "00:50:56:8d:68:e3", "features": {"tx_checksum_ipv4": "off [fixed]", "generic_receive_offload": "on", "tx_checksum_ipv6": "off [fixed]", "tx_scatter_gather_fraglist": "off [fixed]", "rx_all": "off", "highdma": "off 
[fixed]", "rx_fcs": "off", "tx_lockless": "off [fixed]", "tx_tcp_ecn_segmentation": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tx_tcp6_segmentation": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_ipip_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_checksumming": "on", "vlan_challenged": "off [fixed]", "loopback": "off [fixed]", "fcoe_mtu": "off [fixed]", "scatter_gather": "on", "tx_checksum_sctp": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "tx_gso_partial": "off [fixed]", "rx_gro_hw": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "large_receive_offload": "off [fixed]", "tx_scatter_gather": "on", "rx_checksumming": "off", "tx_tcp_segmentation": "on", "netns_local": "off [fixed]", "busy_poll": "off [fixed]", "generic_segmentation_offload": "on", "tx_udp_tnl_segmentation": "off [fixed]", "tcp_segmentation_offload": "on", "l2_fwd_offload": "off [fixed]", "rx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_nocache_copy": "off", "tx_udp_tnl_csum_segmentation": "off [fixed]", "udp_fragmentation_offload": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_sit_segmentation": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "hw_tc_offload": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_fcoe_segmentation": "off [fixed]", "rx_vlan_filter": "on [fixed]", "tx_vlan_offload": "on [fixed]", "receive_hashing": "off [fixed]", "tx_gre_segmentation": "off [fixed]"}, "type": "ether", "pciid": "0000:02:01.0", "module": "e1000", "mtu": 1500, "device": "ens33", "promisc": false, "timestamping": ["tx_software", "rx_software", "software"], "ipv4": {"broadcast": "145.218.238.255", "netmask": "255.255.255.0", "network": "145.218.238.0", "address": "145.218.238.132"}, "ipv6": [{"scope": "link", "prefix": "64", "address": "fe80::250:56ff:fe8d:68e3"}], "active": true, "speed": 1000, "hw_timestamp_filters": []}, "ansible_machine": "x86_64", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABAQDBVcgLHcyEnuOSobpOgMcfGOVDMsAIfkHbFJxhbNBB0ZB8oYke/a/UMmOnpUHlIDHWXEBA5i50viHn1vv4TBz1/ljiWC6PSP9ncz0h5RXTVI+/9ADRP/FEwceeyPjrPADuAwV4A0RtzGzlHc75qmjuSDeR2Khgy2c/Gfh2OW2e5oC4n8/GDuIsdF1lXGsNUkzc0/WBKb1YvvzbsfSSS31S/j4VqqcWf7y/iNdh2FgC4cV6oC3yDUK2tCSjfugRZ5L2RfJC2dnkxtYVDiQ9/gnFPdhUcpmRze4XCP7k2s/oDp71r7ZBM6Y1ao8InVFbGpwVU/uylBBa9jVOFONoz8A5", "ansible_user_gecos": "/N/Mark Tovey - DSV", "ansible_processor_threads_per_core": 1, "ansible_product_name": "VMware Virtual Platform", "ansible_all_ipv4_addresses": ["145.218.238.132", "192.168.163.28"], "ansible_python_version": "2.7.5", "ansible_product_version": "None", "ansible_service_mgr": "systemd", "ansible_memory_mb": {"real": {"total": 3789, "used": 540, "free": 3249}, "swap": {"cached": 11, "total": 4095, "free": 3946, "used": 149}, "nocache": {"used": 317, "free": 3472}}, "ansible_user_dir": "/home/mtovey", "gather_subset": ["all"], "ansible_real_user_id": 2027, "ansible_virtualization_role": "guest", "ansible_dns": {"nameservers": ["145.218.226.65"], "search": ["dsv.com", "dsv.com"]}, "ansible_effective_group_id": 900, "ansible_lo": {"features": {"tx_checksum_ipv4": "off [fixed]", "generic_receive_offload": "on", "tx_checksum_ipv6": "off [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "rx_all": "off [fixed]", "highdma": "on [fixed]", "rx_fcs": "off [fixed]", "tx_lockless": "on [fixed]", "tx_tcp_ecn_segmentation": "on", "rx_udp_tunnel_port_offload": "off [fixed]", "tx_tcp6_segmentation": 
"on", "tx_gso_robust": "off [fixed]", "tx_ipip_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "on", "tx_checksumming": "on", "vlan_challenged": "on [fixed]", "loopback": "on [fixed]", "fcoe_mtu": "off [fixed]", "scatter_gather": "on", "tx_checksum_sctp": "on [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "tx_gso_partial": "off [fixed]", "rx_gro_hw": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "large_receive_offload": "off [fixed]", "tx_scatter_gather": "on [fixed]", "rx_checksumming": "on [fixed]", "tx_tcp_segmentation": "on", "netns_local": "on [fixed]", "busy_poll": "off [fixed]", "generic_segmentation_offload": "on", "tx_udp_tnl_segmentation": "off [fixed]", "tcp_segmentation_offload": "on", "l2_fwd_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_nocache_copy": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "udp_fragmentation_offload": "on", "tx_sctp_segmentation": "on", "tx_sit_segmentation": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "hw_tc_offload": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "rx_vlan_filter": "off [fixed]", "tx_vlan_offload": "off [fixed]", "receive_hashing": "off [fixed]", "tx_gre_segmentation": "off [fixed]"}, "hw_timestamp_filters": [], "mtu": 65536, "device": "lo", "promisc": false, "timestamping": ["rx_software", "software"], "ipv4": {"broadcast": "host", "netmask": "255.0.0.0", "network": "127.0.0.0", "address": "127.0.0.1"}, "ipv6": [{"scope": "host", "prefix": "128", "address": "::1"}], "active": true, "type": "loopback"}, "ansible_memtotal_mb": 3789, "ansible_device_links": {"masters": {"sdb1": ["dm-8", "dm-9"], "sda5": ["dm-0", "dm-1", "dm-2", "dm-3", "dm-4", "dm-5", "dm-6", "dm-7"]}, "labels": {}, "ids": {"sdb1": ["lvm-pv-uuid-UTu3ka-3Tmg-ax5g-txqE-nAH7-a74Y-mCwLDc"], "sr0": ["ata-VMware_Virtual_IDE_CDROM_Drive_10000000000000000001"], "sda5": ["lvm-pv-uuid-x5pT22-YE7x-2K3T-yPGr-jenp-oV2t-74iCTl"], "dm-8": ["dm-name-datavg-ansiblelv", "dm-uuid-LVM-IVOmQ2Iuf5YyvMHMYUTAfz7wDfmxBvMH1NixEFVo3UXk1Qm4uoVK2KOncoR0zrZx"], "dm-9": ["dm-name-datavg-repolv", "dm-uuid-LVM-IVOmQ2Iuf5YyvMHMYUTAfz7wDfmxBvMHQlwty3KEO5ECkXFw4nOmddyAXItvpVTN"], "dm-6": ["dm-name-rootvg-homelv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm5DWn9xTJfvaOqkM7V0eIWAGcQGBc89sW3"], "dm-7": ["dm-name-rootvg-dsvlv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm51qI36Lt7JWoRzWU2iyM3flTEdKkaWb0S"], "dm-4": ["dm-name-rootvg-tmplv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm59C7m8JBYYmBX3O59vhMik37iDG19CtRH"], "dm-5": ["dm-name-rootvg-optlv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm5i7dqnlaYKjlgELGr73C2qq4StLtkXVBz"], "dm-2": ["dm-name-rootvg-usrlv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm5L0nUVltrdsSIJKdRfHHRHoiQb9V3fDi8"], "dm-3": ["dm-name-rootvg-varlv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm59nxBZQSGBiKJVioY3H9rQCoBqo4hrRqT"], "dm-0": ["dm-name-rootvg-rootlv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm51Ye92TAr3IxBJm95z31ivNn0xU7a0xCI"], "dm-1": ["dm-name-rootvg-swap", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm5lZ1Tb4kvjH5RnuVWNmxZVJ59rHPi8Gd6"]}, "uuids": {"sda1": ["4109ba2c-e479-490f-9a46-6ec2835424cf"], "dm-8": ["7769745e-f3d7-4f2f-a162-45a2db3a6cc6"], "dm-9": ["3f2e9cd8-1466-483b-90ca-1817c4ad437e"], "dm-6": ["db316022-ea9e-4fc4-a8eb-3398bf6bea9a"], "dm-7": ["9712a956-7e93-4569-927b-47d78228f462"], "dm-4": 
["be177c1a-c50c-4b07-bef7-47fec939f148"], "dm-5": ["28534b43-f941-4e65-bff1-451d8e81ed7d"], "dm-2": ["e9009758-43f1-4949-9972-18320b2c69f5"], "dm-3": ["bbf186d8-ec95-434e-afce-45447dc84d62"], "dm-0": ["685781e9-8242-4ac6-9850-45e72cb36f8b"], "dm-1": ["4571d54e-57d6-4e9e-bfdb-9a8bb90a23be"]}}, "ansible_apparmor": {"status": "disabled"}, "ansible_proc_cmdline": {"LANG": "en_US.UTF-8", "BOOT_IMAGE": "/vmlinuz-3.10.0-1062.4.1.el7.x86_64", "quiet": true, "rhgb": true, "rd.lvm.lv": ["rootvg/rootlv", "rootvg/swap", "rootvg/usrlv"], "crashkernel": "auto", "ro": true, "root": "/dev/mapper/rootvg-rootlv"}, "ansible_memfree_mb": 3249, "ansible_processor_count": 4, "ansible_hostname": "ansprdapp3", "ansible_interfaces": ["lo", "ens33", "ens160"], "ansible_selinux": {"status": "disabled"}, "ansible_fqdn": "ansprdapp3.dsv.com", "ansible_mounts": [{"block_used": 10209, "uuid": "7769745e-f3d7-4f2f-a162-45a2db3a6cc6", "size_total": 53660876800, "block_total": 13100800, "mount": "/dsv/ansible", "block_available": 13090591, "size_available": 53619060736, "fstype": "xfs", "inode_total": 26214400, "options": "rw,relatime,attr2,inode64,noquota", "device": "/dev/mapper/datavg-ansiblelv", "inode_used": 1450, "block_size": 4096, "inode_available": 26212950}, {"block_used": 538368, "uuid": "bbf186d8-ec95-434e-afce-45447dc84d62", "size_total": 5354029056, "block_total": 1307136, "mount": "/var", "block_available": 768768, "size_available": 3148873728, "fstype": "xfs", "inode_total": 2619392, "options": "rw,relatime,attr2,inode64,noquota", "device": "/dev/mapper/rootvg-varlv", "inode_used": 4253, "block_size": 4096, "inode_available": 2615139}, {"block_used": 13622, "uuid": "be177c1a-c50c-4b07-bef7-47fec939f148", "size_total": 5354029056, "block_total": 1307136, "mount": "/tmp", "block_available": 1293514, "size_available": 5298233344, "fstype": "xfs", "inode_total": 2619392, "options": "rw,relatime,attr2,inode64,noquota", "device": "/dev/mapper/rootvg-tmplv", "inode_used": 1171, "block_size": 4096, "inode_available": 2618221}, {"block_used": 467080, "uuid": "e9009758-43f1-4949-9972-18320b2c69f5", "size_total": 5354029056, "block_total": 1307136, "mount": "/usr", "block_available": 840056, "size_available": 3440869376, "fstype": "xfs", "inode_total": 2619392, "options": "rw,relatime,attr2,inode64,noquota", "device": "/dev/mapper/rootvg-usrlv", "inode_used": 61702, "block_size": 4096, "inode_available": 2557690}, {"block_used": 87883, "uuid": "4109ba2c-e479-490f-9a46-6ec2835424cf", "size_total": 1063256064, "block_total": 259584, "mount": "/boot", "block_available": 171701, "size_available": 703287296, "fstype": "xfs", "inode_total": 524288, "options": "rw,relatime,attr2,inode64,noquota", "device": "/dev/sda1", "inode_used": 346, "block_size": 4096, "inode_available": 523942}, {"block_used": 518816, "uuid": "db316022-ea9e-4fc4-a8eb-3398bf6bea9a", "size_total": 5354029056, "block_total": 1307136, "mount": "/home", "block_available": 788320, "size_available": 3228958720, "fstype": "xfs", "inode_total": 2619392, "options": "rw,relatime,attr2,inode64,noquota", "device": "/dev/mapper/rootvg-homelv", "inode_used": 90395, "block_size": 4096, "inode_available": 2528997}, {"block_used": 18265, "uuid": "685781e9-8242-4ac6-9850-45e72cb36f8b", "size_total": 5354029056, "block_total": 1307136, "mount": "/", "block_available": 1288871, "size_available": 5279215616, "fstype": "xfs", "inode_total": 2619392, "options": "rw,relatime,attr2,inode64,noquota", "device": "/dev/mapper/rootvg-rootlv", "inode_used": 2838, "block_size": 4096, 
"inode_available": 2616554}, {"block_used": 219086, "uuid": "28534b43-f941-4e65-bff1-451d8e81ed7d", "size_total": 5354029056, "block_total": 1307136, "mount": "/opt", "block_available": 1088050, "size_available": 4456652800, "fstype": "xfs", "inode_total": 2619392, "options": "rw,relatime,attr2,inode64,noquota", "device": "/dev/mapper/rootvg-optlv", "inode_used": 4337, "block_size": 4096, "inode_available": 2615055}, {"block_used": 1400885, "uuid": "3f2e9cd8-1466-483b-90ca-1817c4ad437e", "size_total": 10726932480, "block_total": 2618880, "mount": "/dsv/repos", "block_available": 1217995, "size_available": 4988907520, "fstype": "xfs", "inode_total": 5242880, "options": "rw,relatime,attr2,inode64,noquota", "device": "/dev/mapper/datavg-repolv", "inode_used": 446, "block_size": 4096, "inode_available": 5242434}, {"block_used": 13415, "uuid": "9712a956-7e93-4569-927b-47d78228f462", "size_total": 5354029056, "block_total": 1307136, "mount": "/dsv", "block_available": 1293721, "size_available": 5299081216, "fstype": "xfs", "inode_total": 2619392, "options": "rw,relatime,attr2,inode64,noquota", "device": "/dev/mapper/rootvg-dsvlv", "inode_used": 536, "block_size": 4096, "inode_available": 2618856}], "ansible_nodename": "ansprdapp3.dsv.com", "ansible_distribution_file_search_string": "Red Hat", "ansible_domain": "dsv.com", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_virtualization_type": "VMware", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIEpVvaJirjkcb0o/1+8jb1V9Y1FvxDtTPs9uwTt2vmhQ", "ansible_processor_cores": 1, "ansible_bios_version": "6.00", "ansible_date_time": {"weekday_number": "1", "iso8601_basic_short": "20191028T195433", "tz": "CET", "weeknumber": "43", "hour": "19", "year": "2019", "minute": "54", "tz_offset": "+0100", "month": "10", "epoch": "1572288873", "iso8601_micro": "2019-10-28T18:54:33.475361Z", "weekday": "Monday", "time": "19:54:33", "date": "2019-10-28", "iso8601": "2019-10-28T18:54:33Z", "day": "28", "iso8601_basic": "20191028T195433475254", "second": "33"}, "ansible_distribution_release": "Maipo", "ansible_os_family": "RedHat", "ansible_effective_user_id": 2027, "ansible_system": "Linux", "ansible_devices": {"dm-8": {"scheduler_mode": "", "rotational": "1", "vendor": null, "sectors": "104857600", "links": {"masters": [], "labels": [], "ids": ["dm-name-datavg-ansiblelv", "dm-uuid-LVM-IVOmQ2Iuf5YyvMHMYUTAfz7wDfmxBvMH1NixEFVo3UXk1Qm4uoVK2KOncoR0zrZx"], "uuids": ["7769745e-f3d7-4f2f-a162-45a2db3a6cc6"]}, "sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "", "sectorsize": "512", "removable": "0", "support_discard": "0", "model": null, "partitions": {}, "holders": [], "size": "50.00 GB"}, "sr0": {"scheduler_mode": "deadline", "rotational": "1", "vendor": "NECVMWar", "sectors": "2097151", "links": {"masters": [], "labels": [], "ids": ["ata-VMware_Virtual_IDE_CDROM_Drive_10000000000000000001"], "uuids": []}, "sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "IDE interface: Intel Corporation 82371AB/EB/MB PIIX4 IDE (rev 01)", "sectorsize": "512", "removable": "1", "support_discard": "0", "model": "VMware IDE CDR10", "partitions": {}, "holders": [], "size": "1024.00 MB"}, "sda": {"scheduler_mode": "deadline", "rotational": "1", "vendor": "VMware", "sectors": "104857600", "links": {"masters": [], "labels": [], "ids": [], "uuids": []}, "sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "SCSI storage controller: Broadcom / LSI 53c1030 PCI-X Fusion-MPT Dual Ultra320 SCSI (rev 01)", 
"sectorsize": "512", "removable": "0", "support_discard": "0", "model": "Virtual disk", "partitions": {"sda5": {"sectorsize": 512, "uuid": null, "links": {"masters": ["dm-0", "dm-1", "dm-2", "dm-3", "dm-4", "dm-5", "dm-6", "dm-7"], "labels": [], "ids": ["lvm-pv-uuid-x5pT22-YE7x-2K3T-yPGr-jenp-oV2t-74iCTl"], "uuids": []}, "sectors": "81754112", "start": "23070720", "holders": ["rootvg-rootlv", "rootvg-swap", "rootvg-usrlv", "rootvg-varlv", "rootvg-tmplv", "rootvg-optlv", "rootvg-homelv", "rootvg-dsvlv"], "size": "38.98 GB"}, "sda2": {"sectorsize": 512, "uuid": null, "links": {"masters": [], "labels": [], "ids": [], "uuids": []}, "sectors": "2", "start": "23068672", "holders": [], "size": "1.00 KB"}, "sda1": {"sectorsize": 512, "uuid": "4109ba2c-e479-490f-9a46-6ec2835424cf", "links": {"masters": [], "labels": [], "ids": [], "uuids": ["4109ba2c-e479-490f-9a46-6ec2835424cf"]}, "sectors": "2097152", "start": "20971520", "holders": [], "size": "1.00 GB"}}, "holders": [], "size": "50.00 GB"}, "sdb": {"scheduler_mode": "deadline", "rotational": "1", "vendor": "VMware", "sectors": "209715200", "links": {"masters": [], "labels": [], "ids": [], "uuids": []}, "sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "SCSI storage controller: Broadcom / LSI 53c1030 PCI-X Fusion-MPT Dual Ultra320 SCSI (rev 01)", "sectorsize": "512", "removable": "0", "support_discard": "0", "model": "Virtual disk", "partitions": {"sdb1": {"sectorsize": 512, "uuid": null, "links": {"masters": ["dm-8", "dm-9"], "labels": [], "ids": ["lvm-pv-uuid-UTu3ka-3Tmg-ax5g-txqE-nAH7-a74Y-mCwLDc"], "uuids": []}, "sectors": "209713152", "start": "2048", "holders": ["datavg-ansiblelv", "datavg-repolv"], "size": "100.00 GB"}}, "holders": [], "size": "100.00 GB"}, "dm-9": {"scheduler_mode": "", "rotational": "1", "vendor": null, "sectors": "20971520", "links": {"masters": [], "labels": [], "ids": ["dm-name-datavg-repolv", "dm-uuid-LVM-IVOmQ2Iuf5YyvMHMYUTAfz7wDfmxBvMHQlwty3KEO5ECkXFw4nOmddyAXItvpVTN"], "uuids": ["3f2e9cd8-1466-483b-90ca-1817c4ad437e"]}, "sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "", "sectorsize": "512", "removable": "0", "support_discard": "0", "model": null, "partitions": {}, "holders": [], "size": "10.00 GB"}, "dm-6": {"scheduler_mode": "", "rotational": "1", "vendor": null, "sectors": "10477568", "links": {"masters": [], "labels": [], "ids": ["dm-name-rootvg-homelv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm5DWn9xTJfvaOqkM7V0eIWAGcQGBc89sW3"], "uuids": ["db316022-ea9e-4fc4-a8eb-3398bf6bea9a"]}, "sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "", "sectorsize": "512", "removable": "0", "support_discard": "0", "model": null, "partitions": {}, "holders": [], "size": "5.00 GB"}, "dm-7": {"scheduler_mode": "", "rotational": "1", "vendor": null, "sectors": "10477568", "links": {"masters": [], "labels": [], "ids": ["dm-name-rootvg-dsvlv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm51qI36Lt7JWoRzWU2iyM3flTEdKkaWb0S"], "uuids": ["9712a956-7e93-4569-927b-47d78228f462"]}, "sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "", "sectorsize": "512", "removable": "0", "support_discard": "0", "model": null, "partitions": {}, "holders": [], "size": "5.00 GB"}, "dm-4": {"scheduler_mode": "", "rotational": "1", "vendor": null, "sectors": "10477568", "links": {"masters": [], "labels": [], "ids": ["dm-name-rootvg-tmplv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm59C7m8JBYYmBX3O59vhMik37iDG19CtRH"], "uuids": ["be177c1a-c50c-4b07-bef7-47fec939f148"]}, 
"sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "", "sectorsize": "512", "removable": "0", "support_discard": "0", "model": null, "partitions": {}, "holders": [], "size": "5.00 GB"}, "dm-5": {"scheduler_mode": "", "rotational": "1", "vendor": null, "sectors": "10477568", "links": {"masters": [], "labels": [], "ids": ["dm-name-rootvg-optlv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm5i7dqnlaYKjlgELGr73C2qq4StLtkXVBz"], "uuids": ["28534b43-f941-4e65-bff1-451d8e81ed7d"]}, "sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "", "sectorsize": "512", "removable": "0", "support_discard": "0", "model": null, "partitions": {}, "holders": [], "size": "5.00 GB"}, "dm-2": {"scheduler_mode": "", "rotational": "1", "vendor": null, "sectors": "10477568", "links": {"masters": [], "labels": [], "ids": ["dm-name-rootvg-usrlv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm5L0nUVltrdsSIJKdRfHHRHoiQb9V3fDi8"], "uuids": ["e9009758-43f1-4949-9972-18320b2c69f5"]}, "sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "", "sectorsize": "512", "removable": "0", "support_discard": "0", "model": null, "partitions": {}, "holders": [], "size": "5.00 GB"}, "dm-3": {"scheduler_mode": "", "rotational": "1", "vendor": null, "sectors": "10477568", "links": {"masters": [], "labels": [], "ids": ["dm-name-rootvg-varlv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm59nxBZQSGBiKJVioY3H9rQCoBqo4hrRqT"], "uuids": ["bbf186d8-ec95-434e-afce-45447dc84d62"]}, "sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "", "sectorsize": "512", "removable": "0", "support_discard": "0", "model": null, "partitions": {}, "holders": [], "size": "5.00 GB"}, "dm-0": {"scheduler_mode": "", "rotational": "1", "vendor": null, "sectors": "10477568", "links": {"masters": [], "labels": [], "ids": ["dm-name-rootvg-rootlv", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm51Ye92TAr3IxBJm95z31ivNn0xU7a0xCI"], "uuids": ["685781e9-8242-4ac6-9850-45e72cb36f8b"]}, "sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "", "sectorsize": "512", "removable": "0", "support_discard": "0", "model": null, "partitions": {}, "holders": [], "size": "5.00 GB"}, "dm-1": {"scheduler_mode": "", "rotational": "1", "vendor": null, "sectors": "8388608", "links": {"masters": [], "labels": [], "ids": ["dm-name-rootvg-swap", "dm-uuid-LVM-5tFihrJ9GoVzLuOvBi8iXS7ObQKGJKm5lZ1Tb4kvjH5RnuVWNmxZVJ59rHPi8Gd6"], "uuids": ["4571d54e-57d6-4e9e-bfdb-9a8bb90a23be"]}, "sas_device_handle": null, "sas_address": null, "virtual": 1, "host": "", "sectorsize": "512", "removable": "0", "support_discard": "0", "model": null, "partitions": {}, "holders": [], "size": "4.00 GB"}}, "ansible_user_uid": 2027, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKw9CwjuEPhaQHI9ZvfBlHnydMkxkV+W/a5kqRE3fgG55fs9YSKjUV1vzDwdJV0/9TgcVgA8fPA+LHk5mUzCed8sLfnRj7U2fne8YkEB5ChbuLaA7CBu2ce8smMCFbnO91wATQc7NxqB6wL53qrIv0TFpO6GLyC7fz9A1DJhTaOXAAAAFQDkB9MOq/NMiyQdflMv/Za+8XAjXwAAAIBfW2HrRI8jyKjMi4fAGFiwbPzV3KavRUpPaXLPDkxBtO7OLKXhJMKJH/y0DUpbvbGnDGnk55/e9y+6nFrQXwphleNE9IMd5w3GGSnYPOwJOgy4jnauAZTz8G/bTp8A6MsdAwqz7HSedYdxWyuPRGCduA6FztKILoxK6498ApETKQAAAIBb8cRHPq6T4VCcPhLeVyhdDnkxRtAdL170I/T0O0qdaKvwQ9mqUFsu5AWG2ZciFA23ylGBJUkzZDeyt9gxC5m1hrDRXo/Lqfii3EeljSn23q2PthqsFxYX6yh7AwkZ4nX0BOu9iEVF8SmhG93hsRXepUY40M2yPlxooyfCztmDQA==", "ansible_bios_date": "07/03/2018", "ansible_system_capabilities": [""], "ansible_ens160": {"macaddress": "00:50:56:8d:be:dc", "features": {"tx_checksum_ipv4": "off [fixed]", "generic_receive_offload": "on", 
"tx_checksum_ipv6": "off [fixed]", "tx_scatter_gather_fraglist": "off [fixed]", "rx_all": "off [fixed]", "highdma": "on", "rx_fcs": "off [fixed]", "tx_lockless": "off [fixed]", "tx_tcp_ecn_segmentation": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tx_tcp6_segmentation": "on", "tx_gso_robust": "off [fixed]", "tx_ipip_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_checksumming": "on", "vlan_challenged": "off [fixed]", "loopback": "off [fixed]", "fcoe_mtu": "off [fixed]", "scatter_gather": "on", "tx_checksum_sctp": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "tx_gso_partial": "off [fixed]", "rx_gro_hw": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "large_receive_offload": "on", "tx_scatter_gather": "on", "rx_checksumming": "on", "tx_tcp_segmentation": "on", "netns_local": "off [fixed]", "busy_poll": "off [fixed]", "generic_segmentation_offload": "on", "tx_udp_tnl_segmentation": "off [fixed]", "tcp_segmentation_offload": "on", "l2_fwd_offload": "off [fixed]", "rx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_nocache_copy": "off", "tx_udp_tnl_csum_segmentation": "off [fixed]", "udp_fragmentation_offload": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_sit_segmentation": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "hw_tc_offload": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_fcoe_segmentation": "off [fixed]", "rx_vlan_filter": "on [fixed]", "tx_vlan_offload": "on", "receive_hashing": "on", "tx_gre_segmentation": "off [fixed]"}, "type": "ether", "pciid": "0000:03:00.0", "module": "vmxnet3", "mtu": 1500, "device": "ens160", "promisc": false, "timestamping": ["rx_software", "software"], "ipv4": {"broadcast": "192.168.191.255", "netmask": "255.255.224.0", "network": "192.168.160.0", "address": "192.168.163.28"}, "ipv6": [{"scope": "link", "prefix": "64", "address": "fe80::250:56ff:fe8d:bedc"}], "active": true, "speed": 10000, "hw_timestamp_filters": []}}}\r\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips  26 Jan 2017\r\ndebug1: Reading configuration data /home/mtovey/.ssh/config\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 58: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 7259\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to localhost closed.\r\n')
<localhost> ESTABLISH SSH CONNECTION FOR USER: None
<localhost> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/mtovey/.ansible/cp/8a5a4c6a60 localhost '/bin/sh -c '"'"'rm -f -r /home/mtovey/.ansible/tmp/ansible-tmp-1572288869.96-280974017029076/ > /dev/null 2>&1 && sleep 0'"'"''
<localhost> (0, '', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips  26 Jan 2017\r\ndebug1: Reading configuration data /home/mtovey/.ssh/config\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 58: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 7259\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
ok: [localhost]
META: ran handlers

TASK [debug] ***********************************************************************************************************************************************************************
task path: /home/mtovey/devansible/playbooks/NewTest:7
ok: [localhost] =>
  msg: this is a test

TASK [debug] ***********************************************************************************************************************************************************************
task path: /home/mtovey/devansible/playbooks/NewTest:8
ok: [localhost] =>
  msg: RedHat

TASK [debug] ***********************************************************************************************************************************************************************
task path: /home/mtovey/devansible/playbooks/NewTest:9
 [WARNING]: Failure using method (v2_runner_on_ok) in callback plugin (<ansible.plugins.callback.yaml.CallbackModule object at 0x7f17e54ecad0>): value must be a string

Callback Exception:
  File "/usr/lib/python2.7/site-packages/ansible/executor/task_queue_manager.py", line 333, in send_callback
    method(*new_args, **kwargs)
   File "/usr/lib/python2.7/site-packages/ansible/plugins/callback/default.py", line 135, in v2_runner_on_ok
    msg += " => %s" % (self._dump_results(result._result),)
   File "/usr/lib/python2.7/site-packages/ansible/plugins/callback/yaml.py", line 123, in _dump_results
    dumped += to_text(yaml.dump(abridged_result, allow_unicode=True, width=1000, Dumper=AnsibleDumper, default_flow_style=False))
   File "/usr/lib64/python2.7/site-packages/yaml/__init__.py", line 202, in dump
    return dump_all([data], stream, Dumper=Dumper, **kwds)
   File "/usr/lib64/python2.7/site-packages/yaml/__init__.py", line 190, in dump_all
    dumper.represent(data)
   File "/usr/lib64/python2.7/site-packages/yaml/representer.py", line 29, in represent
    self.serialize(node)
   File "_yaml.pyx", line 1348, in _yaml.CEmitter.serialize (ext/_yaml.c:15672)
   File "_yaml.pyx", line 1510, in _yaml.CEmitter._serialize_node (ext/_yaml.c:17814)
   File "_yaml.pyx", line 1431, in _yaml.CEmitter._serialize_node (ext/_yaml.c:16752)

META: ran handlers
META: ran handlers

PLAY RECAP *************************************************************************************************************************************************************************
localhost                  : ok=4    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0

Mark Tovey - DSV
Oct 28, 2019, 5:44:25 PM
to ansible...@googlegroups.com

    I used "yum history undo" to roll back the upgrade to RedHat 7.6, and now Ansible is functioning again.  Clearly something in the OS upgrade is impacting Ansible.  Beware!
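
A rough sketch of that rollback, assuming the 7.6-to-7.7 update went in as a single yum transaction (the transaction ID below is only a placeholder):

# Find the transaction that applied the update
yum history list all | head

# Undo it; replace 42 with the real transaction ID from the listing
yum history undo 42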

    -Mark


Chris Bidwell - NOAA Federal
Oct 28, 2019, 5:56:27 PM
to ansible...@googlegroups.com
Yeah, when I upgraded to RHEL 7.7 I got some pretty weird behavior, and not just Ansible-related.

Stephen John Smoogen
Oct 29, 2019, 11:14:32 AM
to ansible...@googlegroups.com
On Mon, 28 Oct 2019 at 16:08, 'Mark Tovey' via Ansible Project
<ansible...@googlegroups.com> wrote:
>
>
> I just performed a OS upgrade on our Ansible server from RedHat 7.6 to RedHat 7.7, and now Ansible has become unusable. When I try to run any task that uses any Ansible fact, I get the following error:
>
> [WARNING]: Failure using method (v2_runner_on_ok) in callback plugin (<ansible.plugins.callback.yaml.CallbackModule object at 0x7efe1283de50>):
>
> value must be a string
>

Hmmm, our systems have been on RHEL 7.7 for a while and I have not seen this, but we are using:

[smooge@batcave01 ansible (master)]$ ansible --version
ansible 2.8.5
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/srv/web/infra/ansible/library', u'/usr/share/ansible']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible
  python version = 2.7.5 (default, Jun 11 2019, 14:33:56) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]

Do you have a list of python and ansible packages installed before and after the upgrade? I am not sure it will help, but it might.
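
A sketch of how to capture that before and after the upgrade (file names are just examples):

rpm -qa | sort > /root/packages-7.6.txt      # before the upgrade
rpm -qa | sort > /root/packages-7.7.txt      # after the upgrade
diff /root/packages-7.6.txt /root/packages-7.7.txt   # see what changed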

--
Stephen J Smoogen.

Mark Tovey - DSV
Oct 29, 2019, 11:53:39 AM
to ansible...@googlegroups.com

Unfortunately, I have already rolled back to RedHat 7.6, so I do not know what was installed for 7.7. The Python packages that are installed now are listed below. We previously had Ansible 2.8.1 installed; I upgraded to 2.8.5 to see if that would help (it did not), and that is still installed.
-Mark

rpm -qa | grep python
python2-pyasn1-0.1.9-7.el7.noarch
python2-rhnlib-2.8.11.1-28.1.noarch
libselinux-python-2.5-14.1.el7.x86_64
python-simplejson-3.3.1-2.1.x86_64
python-yubico-1.2.3-1.el7.noarch
python-dateutil-1.5-7.el7.noarch
python-lxml-3.2.1-4.el7.x86_64
python-pycurl-7.19.0-19.el7.x86_64
python-enum34-1.0.4-1.el7.noarch
python27-python-pip-8.1.2-3.el7.noarch
python-passlib-1.6.5-2.el7.noarch
python-msgpack-python-0.4.6-2.1.x86_64
python-certifi-2015.9.6.2-2.1.noarch
python-ldap-2.4.15-2.el7.x86_64
python-ethtool-0.8-7.el7.x86_64
python-httplib2-0.9.2-1.el7.noarch
python27-python-setuptools-0.9.8-7.el7.noarch
python2-jmespath-0.9.0-4.el7ae.noarch
python-babel-0.9.6-8.el7.noarch
python2-hwdata-2.3.5-12.1.noarch
python-cffi-1.6.0-5.el7.x86_64
python-libipa_hbac-1.16.2-13.el7_6.8.x86_64
python-chardet-2.2.1-1.el7_1.noarch
python-pycrypto-2.6.1-5.1.x86_64
python-tornado-4.2.1-5.3.x86_64
python-virtualenv-15.1.0-2.el7.noarch
python-slip-0.4.0-4.el7.noarch
python-inotify-0.9.4-4.el7.noarch
python2-pyasn1-modules-0.1.9-7.el7.noarch
python-nss-0.16.0-3.el7.x86_64
python-decorator-3.4.0-3.el7.noarch
python-libs-2.7.5-77.el7_6.x86_64
dbus-python-1.1.1-9.el7.x86_64
python-pyudev-0.15-9.el7.noarch
python-configobj-4.7.2-7.el7.noarch
python-sssdconfig-1.16.2-13.el7_6.8.noarch
python-gobject-base-3.22.0-1.el7_4.1.x86_64
python-dmidecode-3.12.2-3.el7.x86_64
python-six-1.9.0-2.el7.noarch
python27-python-libs-2.7.13-5.el7.x86_64
python-setuptools-0.9.8-7.el7.noarch
python-pycparser-2.14-1.el7.noarch
python-idna-2.4-1.el7.noarch
python-devel-2.7.5-77.el7_6.x86_64
python-sss-murmur-1.16.2-13.el7_6.8.x86_64
python-magic-5.11-35.el7.noarch
python-paramiko-2.1.1-9.el7.noarch
python-psutil-1.2.1-0.2.1.x86_64
python-dns-1.12.0-4.20150617git465785f.el7.noarch
python-kitchen-1.1.1-5.el7.noarch
python-qrcode-core-5.0.1-1.el7.noarch
python-netifaces-0.10.4-3.el7.x86_64
newt-python-0.52.15-4.el7.x86_64
python-2.7.5-77.el7_6.x86_64
python-gudev-147.2-7.el7.x86_64
python-linux-procfs-0.4.9-4.el7.noarch
python-perf-3.10.0-957.12.2.el7.x86_64
rpm-python-4.11.3-35.el7.x86_64
python2-futures-3.1.1-5.el7.noarch
redhat-support-lib-python-0.9.7-6.el7.noarch
python2-cryptography-1.7.2-2.el7.x86_64
python-ipaddress-1.0.16-2.el7.noarch
python27-python-2.7.13-5.el7.x86_64
python-ply-3.4-11.el7.noarch
python-jinja2-2.7.2-3.el7_6.noarch
python-zmq-14.5.0-2.1.x86_64
python-jwcrypto-0.4.2-1.el7.noarch
python-schedutils-0.4-6.el7.x86_64
libxml2-python-2.9.1-6.el7_2.3.x86_64
python-firewall-0.5.3-5.el7.noarch
python-urlgrabber-3.10-9.el7.noarch
python27-runtime-1.1-26.1.el7.x86_64
python-backports-1.0-8.el7.x86_64
python-markupsafe-0.11-10.el7.x86_64
python-gssapi-1.2.0-3.el7.x86_64
python-slip-dbus-0.4.0-4.el7.noarch
python-netaddr-0.7.5-9.el7.noarch
python-iniparse-0.4-9.el7.noarch
python-backports-ssl_match_hostname-3.5.0.1-1.el7.noarch

--------------------------------------------------------------------------------------------------------------------------
Mark Tovey - UNIX Engineer
DSV | 110 N Marine Dr. | Bldg 1 | Portland | Oregon | 97217 | USA
Mark...@dsv.com | +1 503 222-5546

Mark Tovey - DSV
Oct 29, 2019, 5:17:08 PM
to ansible...@googlegroups.com

So today I ran a playbook on our Ansible server that is still running RedHat 7.6, and I saw the issue again. Through experimenting, I finally traced it to setting "ANSIBLE_STDOUT_CALLBACK=yaml" before running the playbook, or to having "stdout_callback = yaml" in the configuration file. That really surprised me, as I have been doing that for a long time with no problem, but something has apparently changed. The puzzling part is that everything was fine until I upgraded the OS and the issues started after that; I thought the problem had cleared up after I rolled back, but I see now that something is still wrong. It occurs less often since the rollback, but it still happens.

I can continue to use Ansible by not setting ANSIBLE_STDOUT_CALLBACK, but it seems very odd that this issue has come out of nowhere. I will keep poking at it to see if I can narrow it down.
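
For anyone hitting the same thing, a minimal sketch of the two places that select the callback, plus a per-run override (using the stock "default" callback) to confirm whether the yaml callback, and not the play itself, is the problem:

# Either of these selects the yaml stdout callback:
export ANSIBLE_STDOUT_CALLBACK=yaml        # environment variable
# or in /etc/ansible/ansible.cfg:
#   [defaults]
#   stdout_callback = yaml

# Re-run the same playbook with the default callback; if the warning
# goes away, the yaml callback is the culprit.
ANSIBLE_STDOUT_CALLBACK=default ansible-playbook ~/devansible/playbooks/NewTest -i ~/devansible/inventory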
-Mark

Stephen John Smoogen
Oct 29, 2019, 5:25:35 PM
to ansible...@googlegroups.com
On Tue, 29 Oct 2019 at 17:17, 'Mark Tovey - DSV' via Ansible Project
<ansible...@googlegroups.com> wrote:
>
>
> So today I ran a playbook on our Ansible server that is still running RedHat 7.6 and I saw the issue again. Through experimenting, I finally traced it to setting "ANSIBLE_STDOUT_CALLBACK=yaml" before running the playbook, or having "stdout_callback = yaml" in the configuration file. That really surprised me as I have been doing that for a long time with no problem, but something apparently has changed. The puzzling thing is everything was fine until I upgraded the OS, and I started having issues after that. I thought it had cleared up after I rolled back, but I see now that there is still something wrong here. It does not seem to occur as much after the rollback, but it still does happen.
> I can continue to use Ansible by not setting ANSIBLE_STDOUT_CALLBACK, but it seems very odd that this is now an issue that seems to have come out of nowhere. I will keep poking at it to see if I can narrow it down some.
> -Mark
>

I am glad you 'found' what the issue looks like, but my condolences on it becoming an issue now. I have not used those callbacks, so I would not have even thought of looking there.


--
Stephen J Smoogen.

Dick Visser
Oct 29, 2019, 5:35:58 PM
to ansible...@googlegroups.com
You can shield yourself from such inadvertent OS changes by installing Ansible with pip, preferably inside a virtualenv.
This will also let you run any version of Ansible you want.
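
A minimal sketch of that approach, using the python-virtualenv package already shown in the RPM list above (paths and the pinned version are only examples):

# Create an isolated environment and install a pinned Ansible into it,
# independent of the OS RPMs
virtualenv ~/ansible-venv
source ~/ansible-venv/bin/activate
pip install 'ansible==2.8.5'

ansible --version   # should now report the venv copy, not /usr/bin/ansible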


Dick

--
Sent from a mobile device - please excuse the brevity, spelling and punctuation.