Retaining information about hosts after add_hosts


Kurt Yoder

Sep 9, 2014, 3:36:55 PM
to ansible...@googlegroups.com
Hi list,

I posted a while back about a way to parallelize Openstack node creation. To recap, I have a role with the following task:

- name: Set up API connections for all Openstack nodes
  add_host:
    name: "os_api_{{ item }}"
    ansible_ssh_host: 127.0.0.1
    groups: os_api
    ansible_connection: local
    oshost: "{{ item }}"
  with_items: cluster


This gives me a bunch of API connections which I run in parallel in another role and task:

- name: Launch cluster VM on Openstack
  nova_compute:
    name: "{{ os_username }}_{{ oshost }}"
    state: present
    login_username: "{{ os_username }}"
    login_tenant_name: "{{ os_tenant }}"
    login_password: "{{ os_password }}"
    image_id: "{{ os_image_id }}"
    key_name: "{{ os_username }}_controller_key"
    wait_for: 200
    flavor_id: "{{ os_flavor_id }}"
    auth_url: "{{ os_url }}"
    user_data: "{{ lookup('template', '../templates/cloud-config.j2') }}"

- name: Assign IP address to cluster VM
  quantum_floating_ip:
    state: present
    login_username: "{{ os_username }}"
    login_password: "{{ os_password }}"
    login_tenant_name: "{{ os_tenant }}"
    network_name: "{{ os_network_name }}"
    instance_name: "{{ os_username }}_{{ oshost }}"
    internal_network_name: "{{ os_internal_network_name }}"
    auth_url: "{{ os_url }}"
  register: quantum_info

- name: Wait for cluster SSH to become available
  wait_for:
    port: 22
    host: "{{ quantum_info.public_ip }}"
    timeout: 180
    state: started

- name: Retrieve cluster public SSH host key
  shell: "ssh-keyscan {{ quantum_info.public_ip }}"
  register: scanned_key


Now I have a list of IPs for the configured hosts. I want to record each host's SSH key, as captured by "scanned_key". If I add the following, will access to the local "known_hosts" file be safely serialized?

- name: Set SSH known_hosts entry
  lineinfile:
    dest: ~/.ssh/known_hosts
    line: "{{ scanned_key.stdout }}"
    state: present
    regexp: "^{{ quantum_info.public_ip }} "
  delegate_to: localhost

Is this the recommended way to do it?


Thanks,

-Kurt

Kurt Yoder

Sep 9, 2014, 4:17:43 PM
to ansible...@googlegroups.com
I just did some more testing with this. The behavior is:

Ansible sometimes writes two entries to the file, and sometimes three.

To me, this indicates that file access is not exclusive: the parallel Ansible processes each open and rewrite the file, and the last one in wins. I also tried both

delegate_to: localhost

and

connection: local

Neither of these fixed the file consistency problem outlined above.
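For comparison, a write like this can also be serialized at the shell level with flock(1). This is only a sketch, not something settled in this thread, and the lock-file path is illustrative:

```yaml
# Hedged alternative: take an exclusive lock before touching known_hosts,
# so concurrent forks queue up instead of clobbering each other.
# The lock-file path is an assumption for illustration.
- name: Append SSH host key under an exclusive lock
  shell: >
    flock ~/.ssh/known_hosts.lock
    sh -c 'grep -q "^{{ quantum_info.public_ip }} " ~/.ssh/known_hosts
           || echo "{{ scanned_key.stdout }}" >> ~/.ssh/known_hosts'
  delegate_to: localhost
```

The lock makes the read-check-append sequence atomic across the parallel forks, which is exactly what lineinfile's independent read-modify-write cycles lack here.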


Other approaches:
  • I can't use "add_host", because it does not work in parallel (see the GitHub issue).
  • I could retain the "register" results in variables and reuse them in a later role, but I can't see a way to do it (the group thread about this is unanswered at the moment).
  • I guess I'll have to write each host's variables to a local YAML file, then read those files back in as variables in a later role. It seems clunky, but I see no other way.
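A sketch of that last option, assuming the registered variables from the earlier tasks are still in scope (the paths and key names below are illustrative, not from the thread):

```yaml
# Illustrative sketch: /tmp/oshost_vars and the key names are assumptions.
- name: Ensure local directory for per-host facts exists
  file:
    path: /tmp/oshost_vars
    state: directory
  delegate_to: localhost

- name: Save per-host Openstack facts to a local YAML file
  copy:
    dest: "/tmp/oshost_vars/{{ oshost }}.yml"
    content: |
      public_ip: {{ quantum_info.public_ip }}
      host_key: "{{ scanned_key.stdout }}"
  delegate_to: localhost

# In a later role, read the saved facts back in as variables:
- name: Load saved Openstack facts
  include_vars: "/tmp/oshost_vars/{{ oshost }}.yml"
```

Since each fork writes its own file, there is no shared-file race to worry about.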

Any other suggestions?

Kurt Yoder

Sep 10, 2014, 9:08:55 AM
to ansible...@googlegroups.com
I found a workaround which I am now using:

  • Break out the ssh_known_hosts code into a new role
  • Execute the new role with "serial: 1"
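Since "serial" is a play-level keyword, this amounts to running the new role in its own play, one host at a time. A minimal sketch (the role name is assumed):

```yaml
# Illustrative play: "ssh_known_hosts" is an assumed role name that would
# contain the lineinfile task from earlier in the thread.
- name: Record SSH host keys one host at a time
  hosts: os_api
  serial: 1
  roles:
    - ssh_known_hosts
```

With serial: 1 only one host runs the role at any moment, so the known_hosts writes no longer overlap.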

-Kurt