Vagrant supporting Ansible -u command


Ian Smith

Oct 8, 2016, 6:00:08 AM
to Vagrant
Hi,

Probably a daft question, but I'm looking at using Vagrant to automate bringing up machines created using Ansible scripts. I've hit an issue: the existing setup uses two playbook runs. The first runs as an account that already exists, in this case vagrant, and that all works; it creates a user called "ansible_user". The second run then uses -u ansible_user to indicate that the playbook should be run as that new user.

What I've found is that in the second case, under Vagrant, the playbook is still run as the vagrant user. Even though the added -u ansible_user is visible in the output, it seems to have no effect.

So what I have for the playbook provisioning in my Vagrantfile is the following:

config.vm.provision "demotest1", type: "ansible" do |ansible|                                                                                      
    ansible.verbose = "vvv"                                                                                                                       
    ansible.playbook = "demo.yml"                                                                                                           
    ansible.raw_arguments = ["-e '@overridden_variables.json'", "-u ansible_user"]                                                                                                    
    ansible.raw_ssh_args = ['-o ForwardAgent=yes','-o ControlMaster=auto','-o ControlPersist=5m']
end

But with verbose on I can see an attempt to create a file in /home/ansible_user not as ansible_user, which I indicated, but as vagrant, which fails due to permission issues.

What am I missing in the config to make this all work?

Kind regards, Ian

Alvaro Miranda Aguilera

Oct 9, 2016, 2:52:01 PM
to vagra...@googlegroups.com
Hello

I think the issue is the user being used.

If you log in using vagrant, that user won't have permissions in that folder.

So if you want to use ansible_user, you could update the Vagrantfile to set

config.ssh.username
config.ssh.password
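
For example (a sketch; it assumes the first playbook has already created the account and that password authentication is enabled on the guest, and the password value is hypothetical):

config.ssh.username = "ansible_user"
config.ssh.password = "changeme"   # hypothetical; use whatever password the playbook assigned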

Thanks
Alvaro.


Gilles Cornu

Oct 10, 2016, 2:30:06 AM
to Vagrant
Hi Ian,

As of Vagrant 1.8+, the Ansible remote user is forced by default, so your use case requires setting the "force_remote_user" option to false.
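
For example, a minimal sketch of the relevant change (the option name is per the Vagrant Ansible provisioner docs):

config.vm.provision "demotest1", type: "ansible" do |ansible|
  ansible.playbook = "demo.yml"
  ansible.force_remote_user = false   # stop Vagrant from forcing --user=<SSH username>, so your own -u/--user takes effect
end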

See the force_remote_user option in the Vagrant Ansible provisioner documentation.
Gilles

Ian Smith

Oct 10, 2016, 9:25:03 AM
to Vagrant
Hi Gilles,

I *think* I see where you are going with this, but it ends up with the following error message: "UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh.", "unreachable": true}"

Running Ansible directly, the following commands work:

ansible-playbook -v -e "@overridden_variables.json" -i inventories/demo.hosts -u ubuntu --ask-become-pass add_deployment_user.yml

ansible-playbook -v -e "@overridden_variables.json" -i inventories/demo.hosts -u ansible_user demo.yml

Therefore, if I assume that the vagrant user plays the same role as my initial ubuntu user, and that the first playbook creates the user "ansible_user", then I believe the following should work the same way on Vagrant 1.8.5:

Vagrant.require_version ">= 1.7.0"

Vagrant.configure(2) do |config|

  config.vm.box = "ubuntu/trusty64"

  # Disable the new default behavior introduced in Vagrant 1.7, to
  # ensure that all Vagrant machines will use the same SSH key pair.
  # See https://github.com/mitchellh/vagrant/issues/5005
  config.ssh.insert_key = false

  # Setup the user first
  config.vm.provision "deployuser", type: "ansible" do |ansible|
    ansible.verbose = "v"
    ansible.playbook = "add_deployment_user.yml"
    ansible.sudo = true
    ansible.raw_arguments = ["-e '@overridden_variables.json'"]
  end

  config.vm.provision "demo", type: "ansible" do |ansible|
    ansible.verbose = "v"
    ansible.playbook = "demo.yml"
    ansible.raw_arguments = ["-e '@overridden_variables.json'", "-u ansible_user"]
    remote_user = "ansible_user"
    ansible.raw_ssh_args = ['-o ForwardAgent=yes','-o ControlMaster=auto','-o ControlPersist=5m']
    ansible.force_remote_user = false
  end
end

However, it doesn't. Now, one of the things in overridden_variables.json is a list of keys, which includes my public key, and that is different from the insecure_private_key used by Vagrant. So in the case of the second SSH connection, which key is used?

Ian Smith

Oct 11, 2016, 12:05:04 PM
to Vagrant
Having spent time on and off today looking at this, and comparing the output of the Ansible log as well as the console, I think the SSH method described is correct. The issue appears to be that the required key is not being installed. The Ansible script in question uses an override JSON file to allow per-user override of the keys to be installed with the user. It appears that running Ansible via Vagrant results in the -e option not being passed through.

In my Vagrantfile I now have

ansible.raw_arguments = ["-e '@overridden_variables.json'"]



In the console output when vagrant provision is called I see:

PYTHONUNBUFFERED=1 ANSIBLE_FORCE_COLOR=true ANSIBLE_HOST_KEY_CHECKING=false ANSIBLE_SSH_ARGS='-o UserKnownHostsFile=/dev/null -o IdentitiesOnly=yes -o ForwardAgent=yes -o ControlMaster=auto -o ControlPersist=60s' ansible-playbook --connection=ssh --timeout=30 --limit="demo_server" --inventory-file=/home/ismith/work/vagrant/demo/.vagrant/provisioners/ansible/inventory -v -e '@overridden_variables.json' add_deployment_user.yml


But if I look at the output in the ansible.log file, I don't see the keys defined in overridden_variables.json, implying that the parameter has been ignored.

Gilles Cornu

Oct 11, 2016, 4:33:08 PM
to Vagrant
Hey Ian!

Thank you for providing more details, which helped me figure out where the "second" problem (very probably) lies.

I think that either of the two variants below should fix your problem:

1) [recommended] use the extra_vars option to pass your variable file to Ansible Playbook:

ansible.extra_vars = 'overridden_variables.json'

2) use the raw_arguments option in a way that works:

ansible.raw_arguments = ["--extra-vars=@overridden_variables.json"]
or
ansible.raw_arguments = ["-e'@overridden_variables.json'"]
or
ansible.raw_arguments = ["-e", "'@overridden_variables.json'"]

You will also hit the same problem if you specify the remote user via raw_arguments: "-u ansible_user" should be changed to "--user=ansible_user" or "-uansible_user".
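
For example, in your Vagrantfile both fixes together would look like:

ansible.raw_arguments = ["--extra-vars=@overridden_variables.json", "--user=ansible_user"]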

I've just realised that this confusing "no space" issue was already reported once on the mailing list, but unfortunately didn't attract attention. I've opened GH-7890 to tackle this bug.

Please give the above a try, and let us know how it works for you.

Best,
Gilles

Ian Smith

Oct 12, 2016, 3:42:29 AM
to Vagrant
Hi Gilles,

Thank you for that tip; I can see that the override file is now being used as expected. However, now that the overridden server name is correctly being used, the initial Vagrant SSH connection that creates the user attempts to connect to the server's IP address rather than 127.0.0.1, and SSH is not listening on port 2222 on that interface. So the first playbook now fails with an SSH error.

Is there a way I can get the SSH port to work on both interfaces with Vagrant?

Best regards, Ian

Ian Smith

Oct 12, 2016, 4:13:16 AM
to Vagrant
Hi Gilles,

Sorry, error between chair and keyboard this time. Now that the override file is being read, it turns out there was an ansible_ssh_host line in it which overrode the Vagrant-generated entry in the inventory. Removing that line means the deployment now completes.

Many thanks for all of your help, Ian

Gilles Cornu

Oct 12, 2016, 8:09:21 AM
to Vagrant
Hi Ian,

Excellent news!

Best regards,
Gilles