How do I tell Ansible to include localhost on the task?

I have this task:
- name: Install OpenJDK
  become: true
  apt:
    name: openjdk-8-jre-headless
    cache_valid_time: 60
    state: latest
I want to run it in all hosts, including localhost. How can I tell Ansible to include localhost in the hosts for just one play?

You just add localhost to the pattern of targeted hosts in your play. Note that, unless you re-define it in your inventory, localhost is implicit and does not match the all special group.
The general idea:
---
- name: This play will target all hosts in inventory
  hosts: all
  tasks:
    - debug:
        msg: I'm a dummy task

- name: This play will target all inventory hosts AND implicit localhost
  hosts: all:localhost
  tasks:
    - debug:
        msg: Yet another dummy task
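Applied to the task in the question, that would look something like this (a sketch reusing the apt task above; running apt against localhost still requires become privileges on the control machine):
- name: Install OpenJDK on all hosts plus the control machine
  hosts: all:localhost
  become: true
  tasks:
    - name: Install OpenJDK
      apt:
        name: openjdk-8-jre-headless
        cache_valid_time: 60
        state: latest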


Running tasks on dynamically collected EC2 instances

I'm trying to write a role with tasks that filter out several EC2 instances, add them to the inventory, and then stop a PHP service on them.
This is how far I've gotten; I'm copying the add_host idea from here: http://docs.catalystcloud.io/tutorials/ansible-create-x-servers-using-in-memory-inventory.html
My service task does not appear to run on the target instances, but instead on the host specified in the playbook which runs this role.
---
- name: Collect ec2 data
  connection: local
  ec2_remote_facts:
    region: "us-east-1"
    filters:
      "tag:Name": MY_TAG
  register: ec2_info

- name: "Add the ec2 hosts to a group"
  add_host:
    name: "{{ item.id }}"
    groups: foobar
    ansible_user: root
  with_items: "{{ ec2_info.instances }}"

- name: Stop the service
  hosts: foobar
  become: yes
  gather_facts: false
  service: name=yii-queue#1 state=stopped enabled=yes
UPDATE: when I try baptistemm's suggestion, I get this:
PLAY [MAIN_HOST_NAME] ***************************
TASK [ec2-manage : Collect ec2 data]
*******************************************
ok: [MAIN_HOST_NAME]
TASK [ec2-manage : Add the hosts to a new group]
*******************************************************
PLAY RECAP **********************************************************
MAIN_HOST_NAME : ok=1 changed=0 unreachable=0 failed=0
UPDATE #2 - yes, the ec2_remote_facts task does return instances (using the real tag value, not the fake one I put in this post). Also, I have seen ec2_instance_facts, but I ran into some issues with it (boto3 is required; I have a workaround for that, but I'm trying to fix the current issue first).
If you want to run tasks on a group of targets, you need to define a new play after you have created the in-memory list of targets with add_host (as mentioned in your link). So you need something like this:
… # omit the code before
- name: "Add the ec2 hosts to a group"
  add_host:
    name: "{{ item.id }}"
    groups: foobar
    ansible_user: root
  with_items: "{{ ec2_info.instances }}"

- hosts: foobar
  gather_facts: false
  become: yes
  tasks:
    - name: Stop the service
      service: name=yii-queue#1 state=stopped enabled=yes

How to add host to group in Ansible Tower inventory?

How can I add a host to a group using tower_group or tower_host modules?
The following code creates a host and a group, but they are unrelated to each other:
---
- hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - tower_inventory:
        name: My Inventory
        organization: Default
        state: present
        tower_config_file: "~/tower_cli.cfg"
    - tower_host:
        name: myhost
        inventory: My Inventory
        state: present
        tower_config_file: "~/tower_cli.cfg"
    - tower_group:
        name: mygroup
        inventory: My Inventory
        state: present
        tower_config_file: "~/tower_cli.cfg"
The docs mention the instance_filters parameter ("Comma-separated list of filter expressions for matching hosts."), but they do not provide any usage example.
Adding instance_filters: myhost to the tower_group task has no effect.
I solved it using the Ansible shell module and tower-cli. I know that writing an Ansible module would be better, but as a quick solution...
- hosts: awx
  vars:
  tasks:
    - name: Create Inventory
      tower_inventory:
        name: "Foo Inventory"
        description: "Our Foo Cloud Servers"
        organization: "Default"
        state: present

    - name: Create Group
      tower_group:
        inventory: "Foo Inventory"
        name: Testes
      register: fs_group

    - name: Create Host
      tower_host:
        inventory: "Foo Inventory"
        name: "host"
      register: fs_host

    - name: Associate host group
      shell: tower-cli host associate --host "{{ fs_host.id }}" --group "{{ fs_group.id }}"
This isn't natively available in the modules included with Tower, which are older and use the deprecated tower-cli package.
But it is available in the newer AWX collection, which uses the awx CLI, as long as you have a recent enough Ansible (2.9 should be fine).
In essence, install the awx collection through a requirements file, or directly like
ansible-galaxy collection install awx.awx -p ./collections
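If you go the requirements-file route instead, a minimal sketch (the path collections/requirements.yml is only a common convention, not something the answer specifies):
# collections/requirements.yml
collections:
  - name: awx.awx
# then: ansible-galaxy collection install -r collections/requirements.yml -p ./collections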
Add the awx.awx collection to your playbook:
collections:
  - awx.awx
and then use the hosts: option of tower_group.
- tower_group:
    name: mygroup
    inventory: My Inventory
    hosts:
      - myhost
    state: present
You can see a demo playbook here.
Be aware though that you may need preserve_existing_hosts: True if your group already contains other hosts. Unfortunately there does not seem to be an easy way to remove a single host from a group.
In terms of your example this would probably work:
---
- hosts: localhost
  connection: local
  gather_facts: false
  collections:
    - awx.awx
  tasks:
    - tower_inventory:
        name: My Inventory
        organization: Default
        state: present
        tower_config_file: "~/tower_cli.cfg"
    - tower_host:
        name: myhost
        inventory: My Inventory
        state: present
        tower_config_file: "~/tower_cli.cfg"
    - tower_group:
        name: mygroup
        inventory: My Inventory
        state: present
        tower_config_file: "~/tower_cli.cfg"
        hosts:
          - myhost
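If mygroup may already contain hosts you want to keep, the preserve_existing_hosts option mentioned above can be added to that last task. A sketch (verify the parameter against your awx.awx collection version):
- tower_group:
    name: mygroup
    inventory: My Inventory
    state: present
    tower_config_file: "~/tower_cli.cfg"
    preserve_existing_hosts: True
    hosts:
      - myhost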

Ansible Playbook skip some play on multiple play playbook file

I have an ansible playbook YAML file which contains 3 plays.
The first play and the third play run on localhost, but the second play runs on a remote machine, as in the example below:
- name: Play1
  hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - ... task here

- name: Play2
  hosts: remote_host
  tasks:
    - ... task here

- name: Play3
  hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - ... task here
I found that, on the first run, Ansible executes Play1 and Play3 but skips Play2. When I run it again, it executes all of them correctly.
What is wrong here?
The problem is that, in Play2, I use an EC2 dynamic inventory group like tag_Name_my_machine, but that instance does not exist yet, because it is only created by Play1's task.
When Play1 finishes, Ansible moves on to Play2, but no host matches, so it silently skips that play.
The solution is to create an in-memory group and register the new host manually in Play1's tasks.
The playbook may look like this:
- name: Play1
  hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: Launch new ec2 instance
      register: ec2
      ec2: ...

    - name: create dynamic group
      add_host:
        name: "{{ ec2.instances[0].private_ip }}"
        group: host_dynamic_lastec2_created

- name: Play2
  user: ...
  hosts: host_dynamic_lastec2_created
  become: yes
  become_method: sudo
  become_user: root
  tasks:
    - name: do something
      shell: ...

- name: Play3
  hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - ... task here

Ansible - Get Facts from Remote Windows Hosts

I am using Ansible / Ansible Tower and would like to determine what facts are available on my Windows host. The documentation states that I can run the following:
ansible hostname -m setup
How would I incorporate this into a playbook I run from Tower so I can gather the information from hosts?
Here is the current Playbook per the assistance given:
# This play outputs the facts of the Windows targets in scope
- name: Gather Windows Facts
  hosts: "{{ target }}"
  gather_facts: yes
  tasks:
    - setup:
      register: ansible_facts
    - debug: item
      with_dict: ansible_facts
However, running this produces the following error:
ERROR! this task 'debug' has extra params, which is only allowed in
the following modules: command, shell, script, include, include_vars,
add_host, group_by, set_fact, raw, meta
Use gather_facts, which is true by default. It is equivalent to running the setup module.
- hosts: ....
  gather_facts: yes
The facts are saved in ansible variables to be used in playbooks. See System Facts
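For example, once facts are gathered they can be referenced directly in tasks. A minimal sketch (assuming ansible_distribution and ansible_distribution_version are populated, which is the case for both Linux and Windows targets):
- hosts: "{{ target }}"
  gather_facts: yes
  tasks:
    - debug:
        msg: "{{ inventory_hostname }} runs {{ ansible_distribution }} {{ ansible_distribution_version }}"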
There are many ways to display the Ansible facts. To understand how it works, try the following:
- hosts: 127.0.0.1
  gather_facts: true
  tasks:
    - setup:
      register: ansible_facts
    - debug: var=item
      with_dict: ansible_facts
Testing and working through it, this is working for me:
- name: Gather Windows Facts
  hosts: "{{ target }}"
  tasks:
    - debug: var=vars
    - debug: var=hostvars[inventory_hostname]

How to run a particular task on a specific host in Ansible

My inventory file's contents:
[webservers]
x.x.x.x ansible_ssh_user=ubuntu
[dbservers]
x.x.x.x ansible_ssh_user=ubuntu
In my tasks file, which is in a common role (i.e. it runs on both host groups), I want to run the following task on the webservers hosts but not on the dbservers defined in the inventory file:
- name: Install required packages
  apt: name={{ item }} state=present
  with_items:
    - '{{ programs }}'
  become: yes
  tags: programs
Is the when keyword helpful here, or is there another way? How can I do this?
If you want to run your role on all hosts but only a single task limited to the webservers group, then - like you already suggested - when is your friend.
You could define a condition like:
when: inventory_hostname in groups['webservers']
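Applied to the task from the question, that could look like this (a sketch reusing the question's variables):
- name: Install required packages
  apt: name={{ item }} state=present
  with_items:
    - '{{ programs }}'
  become: yes
  tags: programs
  when: inventory_hostname in groups['webservers']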
Thank you, this helps me too.
hosts file:
[production]
host1.dns.name
[internal]
host2.dns.name
requirements.yml file:
- name: install the sphinx-search rpm from a remote repo on x86_64 - internal host
  when: inventory_hostname in groups['internal']
  yum:
    name: http://sphinxsearch.com/files/sphinx-2.2.11-1.rhel7.x86_64.rpm
    state: present

- name: install the sphinx-search rpm from a remote repo on i386 - Production
  when: inventory_hostname in groups['production']
  yum:
    name: http://sphinxsearch.com/files/sphinx-2.2.11-2.rhel6.i386.rpm
    state: present
An alternative to consider in some scenarios is -
delegate_to: hostname
There is also this example from the ansible docs, looping over a group. https://docs.ansible.com/ansible/latest/user_guide/playbooks_delegation.html -
- hosts: app_servers
  tasks:
    - name: gather facts from db servers
      setup:
      delegate_to: "{{item}}"
      delegate_facts: True
      loop: "{{groups['dbservers']}}"
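Applied to the original question, a single task can likewise be pushed to one specific machine while the play targets everything (a sketch; web1 is a hypothetical inventory alias from the [webservers] group):
- name: Run this one task only on a specific webserver
  command: /usr/bin/uptime
  delegate_to: web1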
