Help with ansible (not local) provisioner -- can't connect


Peter Loron

Mar 16, 2016, 4:59:29 AM
to Packer
I'm running Packer 0.9 on OS X 10.11. I'm trying to create and provision an AMI. The EC2 instance gets created fine, but things fail when Ansible tries to connect to the target. The playbook has the line "remote_host: default" in it. Can anybody help me get this resolved? The docs don't go into these details...

Thanks!

==> amazon-ebs: Prevalidating AMI Name...
==> amazon-ebs: Inspecting the source AMI...
==> amazon-ebs: Creating temporary keypair: packer 56e91e5c-da6f-4c99-0173-656544882a41
==> amazon-ebs: Creating temporary security group for this instance...
==> amazon-ebs: Authorizing access to port 22 the temporary security group...
==> amazon-ebs: Launching a source AWS instance...
    amazon-ebs: Instance ID: i-154114d2
==> amazon-ebs: Waiting for instance (i-154114d2) to become ready...
==> amazon-ebs: Waiting for SSH to become available...
==> amazon-ebs: Connected to SSH!
==> amazon-ebs: Provisioning with shell script: /var/folders/ks/czq4wcg92bd9t85d_sxhyw440000gn/T/packer-shell853451492
==> amazon-ebs: Provisioning with Ansible...
==> amazon-ebs: SSH proxy: serving on 127.0.0.1:62912
==> amazon-ebs: Executing Ansible: ansible-playbook /Users/peterl/Projects/btc_aws/install_btc_classic.yaml -i /var/folders/ks/czq4wcg92bd9t85d_sxhyw440000gn/T/packer-provisioner-ansible892941253 --private-key /var/folders/ks/czq4wcg92bd9t85d_sxhyw440000gn/T/ansible-key773948030
    amazon-ebs:
    amazon-ebs: PLAY ***************************************************************************
    amazon-ebs:
    amazon-ebs: TASK [setup] *******************************************************************
    amazon-ebs: SSH proxy: accepted connection
==> amazon-ebs: authentication attempt from 127.0.0.1:62913 to 127.0.0.1:62912 as packer-ansible using none
==> amazon-ebs: authentication attempt from 127.0.0.1:62913 to 127.0.0.1:62912 as packer-ansible using publickey
    amazon-ebs: fatal: [default]: UNREACHABLE! => {"changed": false, "msg": "SSH Error: data could not be sent to the remote host. Make sure this host can be reached over ssh", "unreachable": true}
    amazon-ebs: to retry, use: --limit @/Users/peterl/Projects/btc_aws/install_btc_classic.retry
    amazon-ebs:
    amazon-ebs: PLAY RECAP *********************************************************************
    amazon-ebs: default                    : ok=0    changed=0    unreachable=1    failed=0
    amazon-ebs:
==> amazon-ebs: shutting down the SSH proxy
==> amazon-ebs: Terminating the source AWS instance...
==> amazon-ebs: No AMIs to cleanup
==> amazon-ebs: Deleting temporary security group...
==> amazon-ebs: Deleting temporary keypair...
Build 'amazon-ebs' errored: Error executing Ansible: Non-zero exit status: exit status 3

Jeff Fairchild

Mar 16, 2016, 3:19:22 PM
to Packer
Have you tried setting the parameter "groups: group_name" in the ansible provisioner? https://www.packer.io/docs/provisioners/ansible.html
Then in the playbook you'd need "hosts: group_name".

I'm not sure you need "remote_host" in your playbook at all.
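As a sketch (the playbook path and group name are placeholders — adjust them to your project), the provisioner block and matching playbook header would look something like:

```json
"provisioners": [{
  "type": "ansible",
  "playbook_file": "playbook.yml",
  "groups": ["my_group"]
}]
```

with the playbook targeting that group:

```yaml
- hosts: my_group
  tasks: []
```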

Peter Loron

Mar 16, 2016, 5:13:06 PM
to Packer
Same behavior, unfortunately. I set "groups: target" in the provisioner section of the Packer template, and in the playbook I have "hosts: target". The output of the Ansible run shows that the host name "default" is properly getting set in the temporary Ansible inventory file:

    amazon-ebs: SSH proxy: accepted connection
==> amazon-ebs: authentication attempt from 127.0.0.1:52002 to 127.0.0.1:52001 as packer-ansible using none
==> amazon-ebs: authentication attempt from 127.0.0.1:52002 to 127.0.0.1:52001 as packer-ansible using publickey
    amazon-ebs: fatal: [default]: UNREACHABLE! => {"changed": false, "msg": "SSH Error: data could not be sent to the remote host. Make sure this host can be reached over ssh", "unreachable": true}
    amazon-ebs: to retry, use: --limit @/Users/peterl/Projects/btc_aws/install_btc_classic.retry

Is there any way to turn low-level tracing on to see more of what is going on here?

Jeff Fairchild

Mar 16, 2016, 5:17:31 PM
to Packer
I used "extra_arguments": ["-vvvv"]

I haven't gotten much further than you, however.

Peter Loron

Mar 17, 2016, 5:44:46 PM
to Packer
That didn't shed any extra light on why this is crapping out. Looking at the output, I'm not even sure whether the problem is Ansible connecting from my host to the proxy, or the proxy connecting out to the machine being provisioned...

jonathan...@sendachi.com

Apr 1, 2016, 12:03:35 PM
to Packer
What guest OS are you using? I was getting the same error using Atomic CentOS 7... it appears the problem was that "/usr/bin/sftp-server" (which Packer uses by default on the guest) does not exist in that distro.

I added this to the provisioner section in my packer.json:

"sftp_command":"/usr/libexec/openssh/sftp-server -e"

which resolved the issue.
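In context, a provisioner block along those lines might look like this (the playbook path is a placeholder; the sftp-server path shown is the RHEL/CentOS-family location — Debian/Ubuntu typically use /usr/lib/openssh/sftp-server instead):

```json
{
  "type": "ansible",
  "playbook_file": "playbook.yml",
  "sftp_command": "/usr/libexec/openssh/sftp-server -e"
}
```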

Meysam Zarezadeh

May 2, 2016, 9:52:32 AM
to Packer
Try setting the following environment variables to save the Packer logs to a file:
export PACKER_LOG=10
export PACKER_LOG_PATH=./packer.log

In my case, I see these lines in the Packer log:
2016/05/02 17:55:24 ui: ==> vmware-iso: starting sftp subsystem
2016/05/02 17:55:24 packer-v0.10.0: 2016/05/02 17:55:24 opening new ssh session
2016/05/02 17:55:24 packer-v0.10.0: 2016/05/02 17:55:24 starting remote command: /usr/lib/sftp-server -e
2016/05/02 17:55:24 packer-v0.10.0: 2016/05/02 17:55:24 remote command exited with '130': /usr/lib/sftp-server -e
2016/05/02 17:55:24 packer-v0.10.0: 2016/05/02 17:55:24 [INFO] RPC endpoint: Communicator ended with: 130
After investigating, I found that the root cause was "ssh_pty": "true" in the builders section of my Packer JSON config. Setting ssh_pty to false solved the problem.
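In other words, in the builders section (a minimal fragment; other builder settings omitted):

```json
"builders": [{
  "type": "vmware-iso",
  "ssh_pty": false
}]
```

Since false is the default, simply removing the "ssh_pty" key should have the same effect.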

Peter Mooshammer

May 22, 2016, 1:30:11 PM
to Packer
Here are my settings that work with AWS:
  "provisioners": [{
    "type": "ansible",
    "user": "centos",
    "sftp_command": "/usr/libexec/openssh/sftp-server",
    "playbook_file": "....."
  }]

In builders I set:
    "ssh_username": "centos",



Peter