Using include_vars and with_items with optional files == bug?

Dan Linder

Jul 10, 2017, 3:55:18 PM
to Ansible Project
I'm trying to set up a hierarchical set of variable files so that different teams can set values, with more specific instances overriding generic ones.  The key is that the more specific instances may not be defined in all cases.  I thought I had solved it with a combination of "include_vars" and "with_items", using "ignore_errors" to skip a missing file.

But when any of the include files is missing, the entire set of included values is lost.  It appears that any ignored failure in the include_vars task discards the whole data structure before the play continues.

Here's a sample playbook and the variable files.  (For what it's worth, this is Ansible 2.2.0.0 on RHEL 6.)
teststackvars.yml
#!/usr/bin/env ansible-playbook
# Run like this:
# ANSIBLE_HASH_BEHAVIOUR=merge ./teststackvars.yml -i localhost, -l localhost --check
# or set hash behavior in ansible.cfg and run without variable.
---
- hosts: all
  gather_facts: False

  tasks:
  - name: Include the variables in precedence
    include_vars:
      file: "{{ item }}"
      name: myvars
    with_items:
      - info_a.yml
      - info_b.yml
      - info_{{ inventory_hostname }}.yml
    ignore_errors: True

  - debug:
      msg: "{{ myvars }}"

And these three vars files:
vars/info_a.yml
---
var_from_info_a: from info_a.yml
some_var: value from info_a
 
vars/info_b.yml
---
var_from_info_b: from info_b.yml
some_var: value from info_b

vars/info_localhost.yml
---
var_from_info_localhost: from info_localhost.yml
some_var: value from info_localhost

When I run it with all files defined it works just fine:
$ ./teststackvars.yml -i localhost, -l localhost --check

PLAY [all] *********************************************************************

TASK [Include the variables in precedence] *************************************
ok: [localhost] => (item=info_a.yml)
ok: [localhost] => (item=info_b.yml)
ok: [localhost] => (item=info_localhost.yml)

TASK [debug] *******************************************************************
ok: [localhost] => {
    "msg": {
        "some_var": "value from info_localhost",
        "var_from_info_a": "from info_a.yml",
        "var_from_info_b": "from info_b.yml",
        "var_from_info_localhost": "from info_localhost.yml"
    }
}

PLAY RECAP *********************************************************************
localhost                  : ok=2    changed=0    unreachable=0    failed=0

That looks good - all the unique variables from info_a/b/localhost are defined, and the common "some_var" is overwritten by the last file.

When I rename one of the YML files, the entire "myvars" variable structure goes away:
$ mv vars/info_localhost.yml vars/info_localhost.yml.disabled
$ ./teststackvars.yml -i localhost, -l localhost --check

PLAY [all] *********************************************************************

TASK [Include the variables in precedence] *************************************
ok: [localhost] => (item=info_a.yml)
ok: [localhost] => (item=info_b.yml)
fatal: [localhost]: FAILED! => {"failed": true, "msg": "Unable to find 'info_localhost.yml' in expected paths."}
...ignoring

TASK [debug] *******************************************************************
fatal: [localhost]: FAILED! => {"failed": true, "msg": "the field 'args' has an invalid value, which appears to include a variable that is undefined. The error was: 'myvars' is undefined\n\nThe error appears to have been in '/home/dan/teststackvars/teststackvars.yml': line 20, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n  - debug:\n    ^ here\n"}
        to retry, use: --limit @/home/dan/teststackvars/teststackvars.retry

PLAY RECAP *********************************************************************
localhost                  : ok=1    changed=0    unreachable=0    failed=1

$

I searched this group and Google in general for "with_items" and "include_vars" but didn't find anything that seems to pertain to this.  Shouldn't this work, or at least let the variables that were already in "myvars" remain in spite of the missing variable file?

Thanks,
Dan

Brian Coca

Jul 10, 2017, 3:59:22 PM
to Ansible Project
Well, include_vars normally overwrites existing vars; this happens both
inside a with_ loop and outside of it. That is normal behaviour.

As for the error wiping out myvars: the task failed, so none of its
work was kept. The data accumulated from the two files that did load
is lost, because a task that fails as a whole ends up importing no
vars at all, even if parts of it succeeded.

----------
Brian Coca

Dan Linder

Jul 10, 2017, 4:10:13 PM
to Ansible Project
I forgot to add that I've set hash_behaviour to "merge" for the express purpose of keeping and stacking/appending variables, so the normal behaviour is supposed to be modified here.

Brian Coca

Jul 10, 2017, 4:39:29 PM
to Ansible Project
The merge setting will affect the result of the task, but not the
internal iterator of the task.

You might want to make this 3 tasks or use vars_files.
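
In sketch form (untested; my reading of the suggestion, adjust names to
taste), one include_vars task per file means only the missing file's own
task fails and is ignored, while the earlier includes survive:

  - name: Include base vars
    include_vars:
      file: info_a.yml
      name: myvars

  - name: Include team vars
    include_vars:
      file: info_b.yml
      name: myvars

  - name: Include host-specific vars (file may not exist)
    include_vars:
      file: "info_{{ inventory_hostname }}.yml"
      name: myvars
    ignore_errors: True

With hash_behaviour=merge, each successful task's result is merged into
"myvars" before the next one runs. (vars_files at the play level is the
other option, but a missing file there is a hard error, so it only fits
when every file is guaranteed to exist.)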


--
----------
Brian Coca

Dan Linder

Jul 10, 2017, 5:35:51 PM
to Ansible Project
I'm falling back to the six tasks for now (three stat tasks, plus three include_vars tasks that run only when the file exists).
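
For one of the three files, the pair looks roughly like this (repeated
per file; the register name is just illustrative):

  - name: Check whether the host-specific vars file exists
    stat:
      path: "vars/info_{{ inventory_hostname }}.yml"
    delegate_to: localhost
    register: host_vars_file

  - name: Include it only when present
    include_vars:
      file: "info_{{ inventory_hostname }}.yml"
      name: myvars
    when: host_vars_file.stat.exists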

In my mind this is a bug; the "ignore_errors" setting should let it keep the partial result.  A parallel would be the file module partially succeeding: it could set a file's owner while SELinux or some other mechanism denies setting the file's group.

But I'll agree to disagree with you on that point. :-)

I could see "hash_behaviour" being added as an argument to "include_vars", with a "keep_partial" option as an extension of merge (replace, merge, merge_partial).

Brian Coca

Jul 10, 2017, 8:05:02 PM
to Ansible Project
Well, this is not include_vars behaviour but "merge loop results"
behaviour, which affects all modules since it is result processing.

The other part, saving intermediate results, would likewise affect all tasks.

I doubt either will change in the future.


----------
Brian Coca