How to force creating a new ssh connection


jan.w...@codilime.com

Mar 18, 2016, 1:39:28 PM
to Ansible Project
Hello,

by default, Ansible uses SSH's ControlPersist feature to reuse one ssh connection for running multiple tasks.  This is very nice and helpful.  However, there is one situation where this is a problem: when I change the sshd configuration, I want Ansible to start a new connection so that it picks up the new config.
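
For reference, the multiplexing behaviour comes from the [ssh_connection] section of ansible.cfg; the stock defaults look roughly like this (exact values vary between releases):

[ssh_connection]
ssh_args = -o ControlMaster=auto -o ControlPersist=60s
control_path = %(directory)s/ansible-ssh-%%h-%%p-%%r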

My playbook basically looks like this:

- hosts: all
  tasks:
    - name: change ssh config from X to Y
      notify: reload ssh

- hosts: all
  tasks:
    - [do more stuff that requires ssh to have config Y]

Right now, Ansible reuses the connection established during the first play (when the ssh configuration was X) for the second play, but the second play fails because the connection still uses configuration X.  Can I force Ansible to create a new connection between these two plays?

Note that I don't want to disable ControlPersist completely because it's quite useful.

best,
Jan

Arthur Reyes

Mar 18, 2016, 4:00:55 PM
to Ansible Project
Take a look at asynchronous actions:
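
i.e. fire the restart off without holding the connection open while the daemon bounces, something like this (the service name may differ on your distro):

- name: restart sshd without waiting on the current connection
  service: name=sshd state=restarted
  async: 20
  poll: 0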

Brian Coca

Mar 18, 2016, 4:26:08 PM
to ansible...@googlegroups.com
you can run a command to kill the connection locally:

ssh -O stop <hostname> -o ControlPath=~/.ansible/cp/ansible-ssh-<hostname>-22-<user>
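
the same socket answers other -O subcommands too, e.g. (hostname/user are placeholders as above):

# is the master still running?
ssh -o ControlPath=~/.ansible/cp/ansible-ssh-<hostname>-22-<user> -O check <hostname>
# tear the master down immediately instead of just refusing new sessions
ssh -o ControlPath=~/.ansible/cp/ansible-ssh-<hostname>-22-<user> -O exit <hostname>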


----------
Brian Coca

John Favorite

Mar 20, 2016, 12:28:12 PM
to ansible...@googlegroups.com

Here is an article I came across that solves an issue I had:

https://dmsimard.com/2016/03/15/changing-the-ssh-port-with-ansible
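
If I remember it right, the approach there boils down to probing which port answers before the real tasks run; a rough sketch (the custom port 2222 is just an example):

- hosts: all
  gather_facts: false
  tasks:
    - name: check whether sshd already answers on the custom port
      local_action: wait_for port=2222 host={{ ansible_ssh_host|default(inventory_hostname) }} timeout=5
      ignore_errors: yes
      register: custom_port

    - name: fall back to the default port until the change has been applied
      set_fact: ansible_ssh_port=22
      when: custom_port|failed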



jan.w...@codilime.com

Mar 29, 2016, 9:35:51 AM
to Ansible Project
Hello,

thanks a lot!  For the sake of people having the same problem as me, here's a complete task that kills connections to all hosts from the current play:

# this will force Ansible to create new connection(s) so that changes in ssh
# settings will have effect (normally Ansible uses the ControlPersist feature to
# reuse one connection for all tasks). Note that the path to the socket must
# be the same as what is configured in ansible.cfg.
- name: kill cached ssh connection
  local_action: >
    shell ssh -O stop {{ hostvars[item].ansible_ssh_host }}
    -o ControlPath=/tmp/ansible-ssh-{{ hostvars[item].ansible_ssh_user }}-{{ hostvars[item].ansible_ssh_host }}-{{ hostvars[item].ansible_ssh_port }}
  run_once: yes
  register: socket_removal
  failed_when: >
    socket_removal|failed
    and "No such file or directory" not in socket_removal.stderr
  with_items: "{{ play_hosts }}"

If you have any further suggestions, let me know!

best,
Jan

jan.w...@codilime.com

Mar 29, 2016, 10:08:59 AM
to Ansible Project
I forgot that ansible_ssh_port and ansible_ssh_host may be unset.  This version should work:

# this will force Ansible to create new connection(s) so that changes in ssh
# settings will have effect (normally Ansible uses the ControlPersist feature to
# reuse one connection for all tasks). Note that the path to the socket must
# be the same as what is configured in ansible.cfg.
- name: kill cached ssh connection
  local_action: >
    shell ssh -O stop {{ hostvars[item].ansible_ssh_host|default(item) }}
    -o ControlPath=~/.ansible/cp/ansible-ssh-{{ hostvars[item].ansible_ssh_host|default(item) }}-{{ hostvars[item].ansible_ssh_port|default('22') }}-{{ hostvars[item].ansible_ssh_user }}
  run_once: yes
  register: socket_removal
  failed_when: >
    socket_removal|failed
    and "No such file or directory" not in socket_removal.stderr
  with_items: "{{ play_hosts }}"

rich...@gmail.com

Oct 23, 2017, 12:04:06 PM
to Ansible Project
Sorry to bring up an old thread. I'm trying to use the code above, and whilst it doesn't error and the command appears to run, when I then move on to the next Ansible task I get an unreachable error, which suggests that Ansible doesn't realise it needs to create a new session. Output below. I tried adding a pause after killing the SSH connection to see if that would make any difference (it doesn't). If I break my playbook down into two components, i.e.
1) Playbook 1 --> make the change to the remote shell, then run the SSH session kill code
2) Playbook 2 --> add interfaces using command, based on the shell change made in Playbook 1
it works fine as two executions of ansible-playbook. Only if I concatenate them into a single playbook run do I get an issue.
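
In case it's relevant: as far as I know the control socket can be queried directly to confirm whether the master really exited (the user below is a placeholder for my environment):

ssh -o ControlPath=~/.ansible/cp/ansible-ssh-172.27.254.170-22-<user> -O check 172.27.254.170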

Any thoughts?

thanks, Keith

TASK [kill cached ssh connection] *******************************************************************************************************************************************************
changed: [172.27.254.170 -> localhost] => (item=172.27.254.170)


TASK [wait a few seconds] ***************************************************************************************************************************************************************
Pausing for 10 seconds
(ctrl+C then 'C' = continue early, ctrl+C then 'A' = abort)
ok: [172.27.254.170]


PLAY [172.27.254.170] *******************************************************************************************************************************************************************


TASK [Configure Interface eth1 to 1.2.3.4/24 and change state to on] ********************************************************************************************************************
failed: [172.27.254.170] (item=clish -c 'set interface eth1 ipv4-address 1.2.3.4 mask-length 24' -s) => {"item": "clish -c 'set interface eth1 ipv4-address 1.2.3.4 mask-length 24' -s", "msg": "Authentication or permission failure. In some cases, you may have been able to authenticate and did not have permissions on the target directory. Consider changing the remote temp path in ansible.cfg to a path rooted in \"/tmp\". Failed command was: ( umask 77 && mkdir -p \"` echo CLINFR0329  Invalid command:'/bin/sh -c 'echo ~ && sleep 0''./.ansible/tmp/ansible-tmp-1508773836.62-247443091731403 `\" && echo ansible-tmp-1508773836.62-247443091731403=\"` echo CLINFR0329  Invalid command:'/bin/sh -c 'echo ~ && sleep 0''./.ansible/tmp/ansible-tmp-1508773836.62-247443091731403 `\" ), exited with result 1, stdout output: CLINFR0329  Invalid command:'/bin/sh -c '( umask 77 && mkdir -p \"` echo CLINFR0329  Invalid command:'\"'\"'/bin/sh -c '\"'\"'echo ~ && sleep 0'\"'\"''\"'\"'./.ansible/tmp/ansible-tmp-1508773836.62-247443091731403 `\" && echo ansible-tmp-1508773836.62-247443091731403=\"` echo CLINFR0329  Invalid command:'\"'\"'/bin/sh -c '\"'\"'echo ~ && sleep 0'\"'\"''\"'\"'./.ansible/tmp/ansible-tmp-1508773836.62-247443091731403 `\" ) && sleep 0''.\n", "unreachable": true}
failed: [172.27.254.170] (item=clish -c 'set interface eth1 state on' -s) => {"item": "clish -c 'set interface eth1 state on' -s", "msg": "Authentication or permission failure. In some cases, you may have been able to authenticate and did not have permissions on the target directory. Consider changing the remote temp path in ansible.cfg to a path rooted in \"/tmp\". Failed command was: ( umask 77 && mkdir -p \"` echo CLINFR0329  Invalid command:'/bin/sh -c 'echo ~ && sleep 0''./.ansible/tmp/ansible-tmp-1508773836.89-36032531829548 `\" && echo ansible-tmp-1508773836.89-36032531829548=\"` echo CLINFR0329  Invalid command:'/bin/sh -c 'echo ~ && sleep 0''./.ansible/tmp/ansible-tmp-1508773836.89-36032531829548 `\" ), exited with result 1, stdout output: CLINFR0329  Invalid command:'/bin/sh -c '( umask 77 && mkdir -p \"` echo CLINFR0329  Invalid command:'\"'\"'/bin/sh -c '\"'\"'echo ~ && sleep 0'\"'\"''\"'\"'./.ansible/tmp/ansible-tmp-1508773836.89-36032531829548 `\" && echo ansible-tmp-1508773836.89-36032531829548=\"` echo CLINFR0329  Invalid command:'\"'\"'/bin/sh -c '\"'\"'echo ~ && sleep 0'\"'\"''\"'\"'./.ansible/tmp/ansible-tmp-1508773836.89-36032531829548 `\" ) && sleep 0''.\n", "unreachable": true}
fatal: [172.27.254.170]: UNREACHABLE! => {"changed": false, "msg": "All items completed", "results": [{"_ansible_item_result": true, "item": "clish -c 'set interface eth1 ipv4-address 1.2.3.4 mask-length 24' -s", "msg": "Authentication or permission failure. In some cases, you may have been able to authenticate and did not have permissions on the target directory. Consider changing the remote temp path in ansible.cfg to a path rooted in \"/tmp\". Failed command was: ( umask 77 && mkdir -p \"` echo CLINFR0329  Invalid command:'/bin/sh -c 'echo ~ && sleep 0''./.ansible/tmp/ansible-tmp-1508773836.62-247443091731403 `\" && echo ansible-tmp-1508773836.62-247443091731403=\"` echo CLINFR0329  Invalid command:'/bin/sh -c 'echo ~ && sleep 0''./.ansible/tmp/ansible-tmp-1508773836.62-247443091731403 `\" ), exited with result 1, stdout output: CLINFR0329  Invalid command:'/bin/sh -c '( umask 77 && mkdir -p \"` echo CLINFR0329  Invalid command:'\"'\"'/bin/sh -c '\"'\"'echo ~ && sleep 0'\"'\"''\"'\"'./.ansible/tmp/ansible-tmp-1508773836.62-247443091731403 `\" && echo ansible-tmp-1508773836.62-247443091731403=\"` echo CLINFR0329  Invalid command:'\"'\"'/bin/sh -c '\"'\"'echo ~ && sleep 0'\"'\"''\"'\"'./.ansible/tmp/ansible-tmp-1508773836.62-247443091731403 `\" ) && sleep 0''.\n", "unreachable": true}, {"_ansible_item_result": true, "item": "clish -c 'set interface eth1 state on' -s", "msg": "Authentication or permission failure. In some cases, you may have been able to authenticate and did not have permissions on the target directory. Consider changing the remote temp path in ansible.cfg to a path rooted in \"/tmp\". Failed command was: ( umask 77 && mkdir -p \"` echo CLINFR0329  Invalid command:'/bin/sh -c 'echo ~ && sleep 0''./.ansible/tmp/ansible-tmp-1508773836.89-36032531829548 `\" && echo ansible-tmp-1508773836.89-36032531829548=\"` echo CLINFR0329  Invalid command:'/bin/sh -c 'echo ~ && sleep 0''./.ansible/tmp/ansible-tmp-1508773836.89-36032531829548 `\" ), exited with result 1, stdout output: CLINFR0329  Invalid command:'/bin/sh -c '( umask 77 && mkdir -p \"` echo CLINFR0329  Invalid command:'\"'\"'/bin/sh -c '\"'\"'echo ~ && sleep 0'\"'\"''\"'\"'./.ansible/tmp/ansible-tmp-1508773836.89-36032531829548 `\" && echo ansible-tmp-1508773836.89-36032531829548=\"` echo CLINFR0329  Invalid command:'\"'\"'/bin/sh -c '\"'\"'echo ~ && sleep 0'\"'\"''\"'\"'./.ansible/tmp/ansible-tmp-1508773836.89-36032531829548 `\" ) && sleep 0''.\n", "unreachable": true}]}
        to retry, use: --limit @/home/krichards/Ansible-Local-Deployment_and_Setup/kr-test.retry


PLAY RECAP ******************************************************************************************************************************************************************************
172.27.254.170             : ok=4    changed=3    unreachable=1    failed=0


