Ansible Not Using `ansible_become_pass` Variable


Chad Sheets

Apr 25, 2016, 8:11:12 AM
to Ansible Project
Hi Everyone,

I recently installed a new Ansible host (Ubuntu Xenial, upgrading ansible from 1.9.4 to 2.0.0.2 in the process) and I can't get the new host to use the `ansible_become_pass` variable.

My plays always start by importing an ansible-vault encrypted file in which `ansible_become_pass` is defined. I can see the variable is imported through `- debug:`.

However, unless I use the -K flag Ansible fails shortly after with:

fatal: [Remote-Computer]: FAILED! => {"changed": false, "failed": true, "module_stderr": "", "module_stdout": "sudo: a password is required\r\n", "msg": "MODULE FAILURE", "parsed": false}

Passing the sudo password using -K works fine. The plays also still work fine on my 1.9.4 install.
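To illustrate, the relevant pieces boil down to something like this (a simplified sketch, not my exact playbook; filenames are illustrative):

```yaml
# host_vars/{{inventory_hostname}}-encrypted.yml (ansible-vault encrypted;
# contents shown decrypted here):
#
#   ansible_become_pass: <the sudo password>

- hosts: all
  vars_files:
    - host_vars/{{inventory_hostname}}-encrypted.yml
  tasks:
    - debug:
        var: ansible_become_pass   # prints the value, so the import itself works
```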

Running the command as -vvvv:

<Remote-Computer> ESTABLISH SSH CONNECTION FOR USER: ansible

<Remote-Computer> SSH: EXEC sshpass -d14 ssh -C -vvv -o ControlMaster=auto -o ControlPersist=60s -o Port=22 -o User=ansible -o ConnectTimeout=10 -o ControlPath=/home/ansible/.ansible/cp/ansible-ssh-%h-%p-%r -tt Remote-Computer '( umask 22 && mkdir -p "$( echo $HOME/.ansible/tmp/ansible-tmp-1461548589.83-187900167106488 )" && echo "$( echo $HOME/.ansible/tmp/ansible-tmp-1461548589.83-187900167106488 )" )'

<Remote-Computer> PUT /tmp/tmpQuB4zJ TO /home/ansible/.ansible/tmp/ansible-tmp-1461548589.83-187900167106488/setup

<Remote-Computer> SSH: EXEC sshpass -d14 sftp -b - -C -vvv -o ControlMaster=auto -o ControlPersist=60s -o Port=22 -o User=ansible -o ConnectTimeout=10 -o ControlPath=/home/ansible/.ansible/cp/ansible-ssh-%h-%p-%r '[Remote-Computer]'

<Remote-Computer> ESTABLISH SSH CONNECTION FOR USER: ansible

<Remote-Computer> SSH: EXEC sshpass -d14 ssh -C -vvv -o ControlMaster=auto -o ControlPersist=60s -o Port=22 -o User=ansible -o ConnectTimeout=10 -o ControlPath=/home/ansible/.ansible/cp/ansible-ssh-%h-%p-%r -tt Remote-Computer '/bin/sh -c '"'"'sudo -H -S -n -u root /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-ykcwgdxjxeyqbnmatwohoyxwtdpyljqj; LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8 LC_MESSAGES=en_US.UTF-8 /usr/bin/python /home/ansible/.ansible/tmp/ansible-tmp-1461548589.83-187900167106488/setup; rm -rf "/home/ansible/.ansible/tmp/ansible-tmp-1461548589.83-187900167106488/" > /dev/null 2>&1'"'"'"'"'"'"'"'"''"'"''

fatal: [Remote-Computer]: FAILED! => {"changed": false, "failed": true, "invocation": {"module_name": "setup"}, "module_stderr": "OpenSSH_7.2p2 Ubuntu-4, OpenSSL 1.0.2g-fips  1 Mar 2016\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 24837\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\nShared connection to Remote-Computer closed.\r\n", "module_stdout": "sudo: a password is required\r\n", "msg": "MODULE FAILURE", "parsed": false}

I can confirm ssh'ing into Remote-Computer and running only `sudo -H -S -n -u root /bin/sh -c` returns the "sudo: a password is required" message (the `-n` flag makes sudo non-interactive, so it errors instead of prompting).

I've tried adding `ansible_become_method` and `ansible_become_user` as well.

Any assistance would be much appreciated.

Johannes Kastl

Apr 25, 2016, 11:21:38 AM
to ansible...@googlegroups.com
On 25.04.16 04:12 Chad Sheets wrote:

> I can confirm ssh'ing into Remote-Computer and running only `sudo -H -S -n
> -u root /bin/sh -c` returns the "sudo: a password is required" message.

What happens when you enter your password at the prompt you get when
manually calling the command? Does it work?

Johannes


Chad Sheets

Apr 25, 2016, 1:22:17 PM
to Ansible Project
Using Ansible's exact command on the remote machine doesn't prompt for a password at the terminal because of the flags passed to sudo. I am able to escalate using sudo on the remote machine when ssh'ed in as the ansible user.

I've actually made some progress since I submitted the post. I guess there was an issue with my playbook structure / variable scope, or maybe a change to how scope is handled in 2.0.0.2 (I don't understand why a global variable like `ansible_become_pass` should be restricted to a scope).

My playbook was structured as follows:

#|  Import encrypted host variables
- hosts: all
  vars_files:
    - host_vars/{{inventory_hostname}}.yml
    - host_vars/{{inventory_hostname}}-encrypted.yml # <-- ansible_become_pass stored here
  tags: [ 'always' ]

#|  Initialization, run before further plays
- name: Initialize with Core Plays
  hosts: all
  become: yes
  roles:
    - role: users
    - role: hosts
  tags: [ 'initialize' ]



My mistake was inserting the debugging statement in the same play the variables are imported in.

My playbooks now work when I re-import the variables in every play that uses `become`:

#|  Initialization, run before further plays
- name: Initialize with Core Plays
  hosts: all
  become: yes
  vars_files:
    - host_vars/{{inventory_hostname}}.yml
    - host_vars/{{inventory_hostname}}-encrypted.yml
  roles:
    - role: users
    - role: hosts
  tags: [ 'initialize' ]



This is somewhat annoying, because I now have dozens of identical `vars_files:` statements across many different playbooks, where prior to 2.x I could declare them only once.

I've looked but been unable to find the change in Ansible that causes this behavior. I assume I was importing variables incorrectly.

Thanks,
Chad

Johannes Kastl

Apr 25, 2016, 3:40:28 PM
to ansible...@googlegroups.com
On 25.04.16 19:21 Chad Sheets wrote:
>
> My playbook was structured as follows:
>

I'm a little bit confused by your playbook. To me, as an inexperienced
user, it looks like two playbooks put into the same file. Also, you are
mixing tasks with roles; I have no idea if this is expected to work.

What happens when you change it to something like this:

#################################
---
- hosts: all
  become: yes
  #| Import encrypted host variables
  vars_files:
    - host_vars/{{inventory_hostname}}.yml
    - host_vars/{{inventory_hostname}}-encrypted.yml

  #| Initialization, run before further plays
  roles:
    - users
    - hosts
#################################

Johannes


Brian Coca

Apr 25, 2016, 3:45:13 PM
to ansible...@googlegroups.com
Also, you should not need to import these files; the first should already be loaded automatically (Ansible always matches the inventory hostname against files inside host_vars).

A way to have both autoloaded is:

host_vars/{{inventory_hostname}}/clear.yml
host_vars/{{inventory_hostname}}/encrypted.yml

as it will match the directory and import all files for that host.
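On disk that looks like this (a sketch; the host and file names are just examples):

```
inventory/
├── hosts
└── host_vars/
    └── Remote-Computer/      # directory matches the inventory hostname
        ├── clear.yml         # plain variables, autoloaded
        └── encrypted.yml     # ansible-vault encrypted, also autoloaded
```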


As for scoping, I'm not sure what you mean; Ansible does not distinguish on variable names, only on when/where they are defined.

----------
Brian Coca

Chad Sheets

Apr 25, 2016, 4:16:12 PM
to Ansible Project
Thank you very much Brian; moving encrypted.yml into `host_vars/{{inventory_hostname}}/` fixed my issue. `ansible_become_pass` is now imported and available to all my plays.

I'm sure what I was doing before was bad form, but it's strange that it works fine in Ansible 1.9.4 and not in 2.0.0.2.

What I meant by scope: previously, importing variables in the first play of a playbook (as below) made the variables in `{{inventory_hostname}}-encrypted.yml` available to all subsequent plays.

- hosts: all
  vars_files:
    - host_vars/{{inventory_hostname}}.yml
    - host_vars/{{inventory_hostname}}-encrypted.yml
  tags: [ 'always' ]

To me, it seems that (in recent Ansible versions) variables imported this way are only available in the play where the `vars_files:` statement appears.
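A minimal sketch of the difference I'm describing (this is the behavior as I observed it, not something I've verified in the changelog):

```yaml
- hosts: all
  vars_files:
    - host_vars/{{inventory_hostname}}-encrypted.yml  # defines ansible_become_pass

- hosts: all
  become: yes   # 1.9.4: this play still saw ansible_become_pass
  tasks:        # 2.0.0.2: it does not, and sudo fails without -K
    - ping:
```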

Maybe it's related to this bug https://github.com/ansible/ansible/issues/9723? (found after looking into this dupe https://github.com/ansible/ansible/issues/10000)

Either way, thanks for taking the time to correct me.