127.0.0.1 and localhost in synchronize module

Michael Dur

Jul 21, 2014, 2:38:47 PM
to ansible...@googlegroups.com
Why is localhost and 127.0.0.1 a special case in the synchronize module?

Timothy Appnel

Jul 21, 2014, 6:43:27 PM
to ansible...@googlegroups.com
Given the nature of rsync and what it does, your question can be read a few ways, but let me guess at what you are getting at.

127.0.0.1 and localhost are treated specially because you're already on that box, so you don't need to provide any SSH credentials. You can do a sync that doesn't go out to the network at all.
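For example (a minimal sketch; the paths are illustrative), when the target host is localhost or 127.0.0.1, a task like

    # rsync runs locally between two directories; no SSH credentials involved
    - synchronize: src=/srv/source/ dest=/srv/backup/

copies the files without ever opening a network connection.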

<tim/>

Jeppe Toustrup

Aug 13, 2014, 10:59:52 AM
to ansible...@googlegroups.com
Sorry for bringing up a (somewhat) old topic, but what about the use case where Ansible is used to configure a Vagrant VM? In that case the VM is reachable at 127.0.0.1:2222 and thus gets hit by the special handling in the synchronize module. This means that if you want to write to a path where the unprivileged 'vagrant' user doesn't have write permission, you have to add the non-obvious 'rsync_path="sudo rsync"' to your task configuration, instead of just adding 'sudo: True' like everywhere else to get the same result.
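To illustrate the two variants (a rough sketch; the paths are illustrative, and the behaviour is the one described above):

    # what actually works against the Vagrant VM on 127.0.0.1:2222
    - synchronize: src=files/ dest=/opt/app/ rsync_path="sudo rsync"

    # what you would expect to work, but which has no effect here,
    # because the localhost special case sidesteps the usual sudo handling
    - synchronize: src=files/ dest=/opt/app/
      sudo: True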

I don't know if there's a good way to detect situations like this so 'sudo: True' can have the proper effect. If not, then I think it might be worth mentioning explicitly in the documentation for the module.

Michael DeHaan

Aug 14, 2014, 7:41:35 AM
to ansible...@googlegroups.com
Ansible views a Vagrant VM as no different from any other computer.

I believe ansible_ssh_port should be set in the inventory.
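Something along these lines in the inventory, for example (a sketch; the host alias and key path are illustrative):

    vagrant-vm ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222 ansible_ssh_user=vagrant ansible_ssh_private_key_file=~/.vagrant.d/insecure_private_key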

If you can paste the errors you are seeing and your Ansible version, it might be clearer what you're talking about, but I don't understand what "get hit by the special handling" means in this particular case exactly.

More information would be useful.

Dan Vaida

Jan 3, 2015, 1:13:09 PM
to ansible...@googlegroups.com
Here's what I do, which brings some other benefits as well:
- I let Vagrant dynamically generate the inventory file that Ansible later uses.
- The Vagrant boxes use a different subnet, so they don't run into the corner case described above (see the sketch after this list).
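A rough sketch of the subnet part (the IP and host alias are illustrative, and how the generated inventory picks up the address depends on the Vagrant version in use):

    # Vagrantfile excerpt: put the VM on its own private subnet
    config.vm.network "private_network", ip: "192.168.33.10"

    # inventory entry pointing at that address instead of 127.0.0.1:2222
    web1 ansible_ssh_host=192.168.33.10 ansible_ssh_user=vagrant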

Leonel Galan Recinos

Feb 4, 2015, 4:35:02 PM
to ansible...@googlegroups.com
Dan, this is far from a corner case. Vagrant's default behavior is to set up the SSH address as 127.0.0.1:2222.

Michael, thanks for your hard work on Ansible! It's not that Ansible treats Vagrant boxes differently; synchronize treats localhost and 127.0.0.1 differently, and maybe it shouldn't unless `ansible_connection=local` is set. I posted extended logs and information on my inventory file and task at https://github.com/ansible/ansible/issues/5240#issuecomment-72944174
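To make the distinction concrete (a sketch; the host aliases are illustrative), the two cases look the same to an address check but are very different connections:

    # genuinely local: the synchronize shortcut makes sense here
    localhost ansible_connection=local

    # a Vagrant VM reached over SSH on a forwarded port: "localhost" by
    # address only, but really a remote machine
    vagrant-vm ansible_ssh_host=127.0.0.1 ansible_ssh_port=2222 ansible_ssh_user=vagrant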