Ansible and SSH agent forwarding


Mark

Oct 1, 2014, 6:48:52 AM
to ansible...@googlegroups.com
So I'm running three CentOS 6.5 machines and came across something I don't understand.

server 1: client machine
server 2: ansible machine
server 3: any target machine controlled by ansible.

Servers 2 and 3 both have my public key, so when I ssh to them from server 1, it all goes well.

I ssh from server 1 to server 2 with the -a flag (disable agent forwarding). When I run any playbook against server 3, it fails with permission denied (it needs my key), so this is expected.

When I ssh with the -A flag (enable agent forwarding) it should work, and it does. Also as expected.

But now the tricky part:

Immediately after running the playbook over ssh -A (agent forwarding enabled), I disconnect from server 2 and reconnect with -a (agent forwarding disabled).
I run the playbook and it DOESN'T fail?
When I then try to ssh from server 2 to 3 it says: permission denied (as expected, since it doesn't have my key).

So the question remains: what captures my key and leaves it there on server 2? Is it Paramiko, or is it Ansible? And moreover, why? Is this by design?
I reproduced this on Ubuntu 14.04 LTS, which suggests that Paramiko is not causing this behaviour but Ansible itself is.




Michael DeHaan

Oct 1, 2014, 8:22:55 AM
to ansible...@googlegroups.com
I'm having a bit of difficulty following the above, but I did want to point out that Ansible is not doing anything to move or store your key.



--
You received this message because you are subscribed to the Google Groups "Ansible Project" group.
To unsubscribe from this group and stop receiving emails from it, send an email to ansible-proje...@googlegroups.com.
To post to this group, send email to ansible...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/ansible-project/9113391c-341f-4af3-8b2f-91af1f744533%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Mark

Oct 1, 2014, 8:34:36 AM
to ansible...@googlegroups.com
I'm not sure if I can make it any clearer than this.

Server 1 has the private key. Servers 2 and 3 have server 1's public key.
If I connect to server 3 from server 1 through server 2, I need to use ssh -A, since the agent connection has to be forwarded from server 1 through 2 to 3.
This is done through SSH_AUTH_SOCK.

If I connect with ssh -a it gets me to server 2, but it won't patch me through from 2 to 3, since the key needed to answer the challenge is missing.

Ansible, however, still runs the playbook when I connect with -A first, run the playbook, disconnect, reconnect with -a, and run the playbook again.
It shouldn't be able to, since no private key was sent to server 2 to answer the challenge. So the question remains: why does Ansible's SSH key check not fail? Especially since, while logged in with ssh -a (after the ssh -A session in which I ran the playbook and logged out), a plain ssh to server 3 is refused.
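A quick way to see whether agent forwarding is actually active in a given session on server 2 is a generic OpenSSH check like the following (not specific to this setup):

```shell
# Run on server 2, inside the SSH session:
echo "${SSH_AUTH_SOCK:-not set}"   # socket path when connected with -A; "not set" with -a
ssh-add -l                         # lists the keys the forwarded agent offers,
                                   # or errors out when no agent is reachable
```

If `ssh-add -l` still lists keys in a session opened with -a, something other than the current login is providing the agent socket.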



On Wednesday, October 1, 2014 at 2:22:55 PM UTC+2, Michael DeHaan wrote:

Karl E. Jorgensen

Oct 1, 2014, 8:41:01 AM
to ansible...@googlegroups.com
Hi

On Wed, Oct 01, 2014 at 03:48:52AM -0700, Mark wrote:
> So I'm running 3 CentOS 6.5 machines and came upon this thing which I don't
> understand.
>
> server 1: client machine
> server 2: ansible machine
> server 3: any target machine controlled by ansible.
>
> server 2/3 both have my public key. so when I ssh to them from server 1, it all
> goes well.
>
> I ssh from server 1 to server 2 with the -a flag (disable agent forwarding).
> When I run any playbook against server 3 it will fail because of permission
> denied (it needs my key). So this is expected.
>
> When I ssh with the -A flag (enable agent forwarding) it should work. and so it
> does. Also as expected.
>
> But now the tricky part:
>
> Immediately after I ran the playbook with ssh -A (enable agent forwarding) I
> disconnect from server 2 and reconnect with -a (disable agent forwarding).
> I run the playbook and it DOESN'T fail?

Because Ansible runs its SSH commands with the ControlMaster, ControlPath and
ControlPersist options, the SSH connection is kept open for a little while
afterwards. This allows Ansible to re-use previous SSH connections, which
speeds things up.

This is a handy thing to use even for interactive use...
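For interactive use, the same multiplexing options can be passed on the command line (or put in ~/.ssh/config); the host name and timeout below are just illustrative values:

```shell
# The first connection becomes the master and stays open for 60s after logout;
# later connections to the same host reuse its socket and skip authentication.
ssh -o ControlMaster=auto \
    -o ControlPath=~/.ssh/cm-%r@%h:%p \
    -o ControlPersist=60s \
    user@server3
```

Note that a connection reused this way keeps whatever agent-forwarding state the master was opened with.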

--
Karl E. Jorgensen

Brian Coca

Oct 1, 2014, 8:41:11 AM
to ansible...@googlegroups.com
Check to see if you still have the ControlPersist sockets open; Ansible will
reuse those, and the forwarding settings will be the ones used to create the
socket. So if the socket was created with -A, forwarding will continue to work.
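Such a socket can be inspected and torn down with ssh's -O control commands (the ControlPath and host below are illustrative):

```shell
# Ask the master process behind the socket whether it is still alive:
ssh -O check -o ControlPath=~/.ssh/cm-%r@%h:%p user@server3

# Tell it to exit, closing the persisted connection (and its forwarding):
ssh -O exit -o ControlPath=~/.ssh/cm-%r@%h:%p user@server3
```

Running `ssh -O exit` against each lingering socket on server 2 would reproduce the "permission denied" behaviour the -a session is expected to show.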

--
Brian Coca
Stultorum infinitus est numerus
0110000101110010011001010110111000100111011101000010000001111001011011110111010100100000011100110110110101100001011100100111010000100001
Pedo mellon a minno

Mark

Oct 1, 2014, 8:55:40 AM
to ansible...@googlegroups.com
So, although I have not specified ControlPersist in the playbook or in .ssh/config, if I reconnect to the machine quickly enough and run the playbook against server 3, the SSH connection is still open and will be used by Ansible?

So what I want, then, is a way to close the connection when I log out, provided no playbook is running.
Am I correct in assuming that disabling ControlPersist while Ansible is busy running a playbook would break it?

On Wednesday, October 1, 2014 at 2:41:11 PM UTC+2, Brian Coca wrote:

Brian Coca

Oct 1, 2014, 9:09:37 AM
to ansible...@googlegroups.com
Ansible by default tries to use ControlPersist (you can turn this off in
ansible.cfg) if you are using a new enough version of OpenSSH, as it speeds
things up considerably.
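In ansible.cfg the relevant knob lives in the [ssh_connection] section via ssh_args; a sketch, assuming that section is present (the 60-second timeout is just an example value):

```ini
[ssh_connection]
# Multiplexing on: reuse each SSH connection for 60 seconds after last use.
ssh_args = -o ControlMaster=auto -o ControlPersist=60s

# To disable multiplexing entirely, use instead:
# ssh_args = -o ControlMaster=no
```

With ControlMaster=no, every task opens a fresh SSH connection, so a session opened with -a would fail against server 3 as expected, at the cost of noticeably slower playbook runs.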