wait_for through bastion stalls


David Pires

Jun 16, 2015, 1:16:03 PM
to ansible...@googlegroups.com
I am having a weird issue when launching ec2 instances. I have a playbook where I launch an instance into a private VPC, referencing each instance by item.private_ip.

My ssh config is set up to proxy requests to any host in this subnet. I can ssh successfully into any host using my bastion and ssh config, so I know this is working correctly.
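For reference, a proxying setup like the one described is typically done with ProxyCommand in ~/.ssh/config; a minimal sketch (the host names and subnet pattern here are placeholders, not details from the thread):

Host bastion
    HostName <bastion public ip>
    User ec2-user

# Reach any host in the private subnet by tunnelling through the bastion
Host 10.0.*.*
    ProxyCommand ssh -W %h:%p bastion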

When I launch an ec2 instance, my wait_for local action always times out: it can't connect ("msg: Timeout when waiting for <private ip>"). However, after the playbook fails I can ssh into the host, and adding the host to my inventory file also works.

- hosts: localhost
  connection: local
  gather_facts: False
  vars:
    instance_count: 1
  tasks:
    - name: Create instances
      local_action:
        module: ec2
        region: us-west-2
        group_id: ****
        keypair: ***
        instance_type: t2.micro
        image: ****
        vpc_subnet_id: ****
        count: "{{ instance_count }}"
        wait: yes
      register: ec2


    - name: Add instances to inventory
      add_host: name={{ item.private_ip }} groups=new_instances
      with_items: ec2.instances


    - name: Wait for SSH
      local_action: wait_for
                    host={{ item.private_ip }}
                    port=22
                    state=started
      with_items: ec2.instances


- name: Configure
  hosts: new_instances
  gather_facts: True
  remote_user: ***
  sudo: True
  roles:
    - ***



 

Brian Coca

Jun 16, 2015, 8:34:02 PM
to ansible...@googlegroups.com
That makes sense, since wait_for does not have direct access (it just
uses a plain socket call); you might need to delegate the wait_for to
the bastion machine.

--
Brian Coca

David Pires

Jun 16, 2015, 10:47:08 PM
to ansible...@googlegroups.com

    - name: Wait for SSH
      delegate_to: <bastion ip>
      wait_for:
        host={{ item.private_ip }}
        port=22
        state=started
      with_items: ec2.instances


I have tried delegating the wait_for, and with debug on, I can see that the task is being run on the bastion host. I am still having the same issue.

While it is stuck on the wait_for task, I can ssh from the bastion to the new instance, so I know it's not a timing issue, and as soon as the task fails, I can ssh directly from my ansible host.

I can get around the issue by removing the wait_for and using a pause task, but I'd like to know what I'm doing wrong.
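For the record, the pause workaround is just a fixed sleep rather than an actual connectivity check; a minimal sketch (the 30s value is arbitrary):

    - name: Give the new instances time to boot
      pause: seconds=30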

Brian Coca

Jun 16, 2015, 11:18:28 PM
to ansible...@googlegroups.com
You might want to add a small delay to wait_for (10s is normally good).



--
Brian Coca

David Pires

unread,
Jun 17, 2015, 12:06:04 PM6/17/15
to ansible...@googlegroups.com
I've added a 10s delay, no luck. 

I must have something else configured incorrectly; I'll just replace my wait_for with a pause for now, since that works.

Khoa Nguyen

Sep 28, 2015, 6:27:18 AM
to Ansible Project
Hi David,

You can use delegate_to with wait_for.

Remove connection: local, and add remote_user set to the user of the delegate instance (for example: ubuntu).


- hosts: localhost
  #connection: local
  gather_facts: False


Regards,
Khoa Nguyen
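
Putting the thread's suggestions together, the wait_for play might end up looking roughly like this (a sketch only; the bastion address and remote_user are placeholders, and the delay is the 10s Brian suggested):

- hosts: localhost
  gather_facts: False
  tasks:
    - name: Wait for SSH
      delegate_to: <bastion ip>
      remote_user: ubuntu
      wait_for: host={{ item.private_ip }} port=22 state=started delay=10
      with_items: ec2.instances

With connection: local removed, the delegated task actually runs over ssh on the bastion, so the socket test is made from a machine that can reach the private subnet directly.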