I've been having issues with the '{{ var }}' format and YAML compatibility, so I've had to mix and match the ${ var } syntax. Perhaps I just always need to quote the curlies, like "{{ var }}"?
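For example (assuming I'm reading the YAML spec right), a value that starts with an unquoted brace gets parsed as an inline mapping instead of a string, while quoting keeps it a plain string that Ansible then templates:

# broken: YAML reads the leading braces as a flow mapping
#   image: {{ aws.image }}
# works: the quotes make it a string for the templater
image: "{{ aws.image }}"
vars_files:
  - "vars/{{ env }}.yml"   # presumably the quoted form works here too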
I might be misunderstanding something about setting $hosts, so here's my complete playbook (launch-ec2.yml):
# Launches an EC2 instance and provisions it as the specified role
#
# A particular role can be provisioned with, e.g.:
# ansible-playbook launch-ec2.yml -i localhost --extra-vars "role=webserver env=dev01"
- name: Launching an EC2 instance then provisioning it as a {{ role }}
  vars_files:
    - vars/all.yml
    - vars/$env.yml   # include the env-specific variables (e.g. dev01.yml)
  hosts: localhost
  gather_facts: False
  connection: local
  tasks:
    - name: Launching instance
      local_action:
        module: ec2
        keypair: ${aws.keypair}
        group: ${aws.security_group}
        instance_type: ${aws.instance_type}
        image: ${aws.image}
        wait: true
        region: ${aws.region}
        instance_tags: '{"Name": "{{ role }}", "env": "{{ env }}"}'
        user_data: "{{ lookup('file', 'files/cloudinit') }}"
        ec2_access_key: ${aws.access_key_id}
        ec2_secret_key: ${aws.secret_access_key}
      register: ec2
    - name: Adding new instance to host group
      local_action: add_host hostname=${item.public_dns_name} groupname=webservers
      with_items: ec2.instances
    - name: Waiting for SSH to come up
      local_action: wait_for host=${item.public_dns_name} port=22 delay=10 timeout=120 state=started
      with_items: ec2.instances

- include: provision.yml hosts=webservers env=env
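One thing I'm unsure about is the env=env on that last include; maybe the value needs templating through explicitly, something like this (just a guess on my part):

# hypothetical variant: pass the current value of env rather than the literal string 'env'
- include: provision.yml hosts=webservers env="{{ env }}"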
And provision.yml is:
# Provisions a running instance as the role supplied by extra-vars.
#
# A running EC2 instance can be provisioned directly with, e.g.:
# ansible-playbook provision.yml -i ./ec2.py --extra-vars "role=webserver env=dev01 user=ubuntu hosts=tag_Name_webserver"
- name: Configuring instance(s) as a {{ role }}
  vars_prompt:
    role: "Enter role to configure"
    env: "Enter environment name"
  vars_files:   # need this here too in case we're using vagrant
    - vars/all.yml
    - vars/$env.yml   # include the env-specific variables (e.g. dev01.yml)
  hosts: $hosts
  sudo: True
  gather_facts: True
  roles:   # provision the instance as the appropriate role, passed as a parameter to this play
    - base
    - $role

# this is where I want to dynamically add the $hosts to 'webservers', ready for the deploy-site.yml playbook
- include: deploy-site.yml
And deploy-site.yml is:
- name: Deploy the web app to the web servers
  vars_files:   # need this here too in case we're using vagrant
    - vars/all.yml
    - vars/$env.yml   # include the env-specific variables (e.g. dev01.yml)
    - roles/webserver/vars/main.yml
  hosts: webservers   # only target webservers
  user: $user
  sudo: True
  sudo_user: web
  gather_facts: True
  tasks:
    - include: roles/webserver/tasks/deploy.yml
I've set these up so that I can either call launch-ec2.yml directly, or use Vagrant with just provision.yml. Here's my Vagrantfile:
Vagrant.configure("2") do |config|
  config.vm.provision :ansible do |ansible|
    ansible.playbook = "ansible/provision.yml"
    # uncomment the following line to only run the deployment
    # ansible.playbook = "ansible/deploy-site.yml"
    ansible.inventory_file = "ansible/vagrant-hosts"
    ansible.verbose = true
    ansible.extra_vars = {
      hosts: "vagrant",
      role: "webserver",
      user: "vagrant",
      env: "vagrant"
    }
  end
  ...
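If I've understood the Vagrant Ansible provisioner correctly, that ends up invoking something along these lines (my approximation of the command, not captured output):

ansible-playbook ansible/provision.yml -i ansible/vagrant-hosts \
  --extra-vars "hosts=vagrant role=webserver user=vagrant env=vagrant" -v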
Also, if I want, I can reprovision either a Vagrant or an EC2 instance just by calling provision.yml or deploy-site.yml.
Perhaps the issue is that I'm only targeting webservers in deploy-site.yml (because that's all I want to deploy to with this playbook)? I tried using 'all' in combination with a --limit parameter to ansible-playbook, but then Ansible started trying to configure 'localhost'.
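(If --limit patterns support exclusions the way I think they do, something like this might dodge localhost, though I haven't got it working:)

# untested: exclude localhost via the limit pattern
ansible-playbook deploy-site.yml -i ./ec2.py --extra-vars "env=dev01 user=ubuntu" --limit 'all:!localhost'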
That's why I thought that if I could pass a host group from the ec2.py inventory script and then add all of the hosts in it to 'webservers', it might work, but I can't seem to manage it...
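Here's roughly what I pictured, as a sketch; it assumes add_host can iterate over the groups magic variable, which I haven't verified:

# sketch: copy every host of the group passed in $hosts (e.g. tag_Name_webserver
# from ec2.py) into the 'webservers' group before deploy-site.yml runs
- name: Add $hosts members to the webservers group
  hosts: localhost
  connection: local
  gather_facts: False
  tasks:
    - local_action: add_host hostname={{ item }} groupname=webservers
      with_items: groups[hosts]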