Ansible: Using Inventory file in shell command

Playbook below. I'm trying to replace test@ip: with a way to pull the IP for a group I've created from my inventory file.
- hosts: firewall
  gather_facts: no
  tasks:
    - name: run shell script
      raw: 'sh /home/test/firewall.sh'

- hosts: localhost
  gather_facts: no
  tasks:
    - name: Copy File to Local Machine
      shell: 'scp test@ip:/home/test/test.test /Users/dest/Desktop'

You need to change your task like this:
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Copy File to Local Machine
      shell: 'scp test@{{ item }}:/home/test/test.test /Users/dest/Desktop'
      with_items: "{{ groups['your_group_name'] }}"
If you want to run it on all the hosts in the inventory, you can use:
with_items: "{{ groups['all'] }}"
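For illustration, a minimal sketch of an inventory that defines such a group (the group name and IPs below are placeholders, not from the question):

[your_group_name]
192.168.10.11
192.168.10.12

Each item the loop yields is the inventory hostname of a host in that group, which here is the IP itself.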
Hope that will help you.

Related

Ansible with items loop inside my inventory

I have a task based on a shell command that needs to run on the local computer. As part of the command, I need to insert the IP addresses from some of the groups in my inventory file.
Inventory file:
[server1]
130.1.1.1
130.1.1.2
[server2]
130.1.1.3
130.1.1.4
[server3]
130.1.1.5
130.1.1.6
I need to run the following command from the local computer against the IPs that are part of the server2 and server3 groups:
ssh-copy-id user@<IP>
# <IP> should be 130.1.1.3, 130.1.1.4, 130.1.1.5, 130.1.1.6
Playbook, where I'm missing the part that supplies the IP:
- hosts: localhost
  gather_facts: no
  become: yes
  tasks:
    - name: Generate ssh key for root user
      shell: "ssh-copy-id user@{{ item }}"
      run_once: True
      with_items:
        - server2 group
        - server3 group
In a nutshell:
- hosts: server2:server3
  gather_facts: no
  become: true
  tasks:
    - name: Push local root pub key for remote user
      shell: "ssh-copy-id user@{{ inventory_hostname }}"
      delegate_to: localhost
Note that I kept your exact shell command, which is actually bad practice since there is a dedicated Ansible module to manage authorized keys. So this could be translated to something like:
- hosts: server2:server3
  gather_facts: no
  become: true
  tasks:
    - name: Push local root pub key for remote user
      authorized_key:
        user: user
        state: present
        key: "{{ lookup('file', '/root/.ssh/id_rsa.pub') }}"

How to set a variable such that it can be accessed anywhere in my ansible-playbook

Here is my playbook main.yml. I want to use the playbook_name variable in the import_playbook module, but I am not able to access it. Is there any way I can access this variable?
---
- hosts: localhost
  connection: local
  tasks:
    - name: Get playbook_name
      shell: <command>
      register: playbook_name
      ignore_errors: yes
      become: yes
    - debug: var=hostvars['localhost']['playbook_name']
      ignore_errors: True
- name: Include playbook based on playbook_name
  import_playbook: "{{ hostvars['localhost']['playbook_name']['response'] }}"
I would recommend starting a new file under your inventories, like inventory/internal_vars, and then putting your variable in there. Then, you can import that file as variables using vars_files in your main.yml playbook, like so:
- hosts: localhost
  connection: local
  vars_files:
    - inventory/internal_vars
  tasks:
    - name: Get playbook_name
      shell: <command>
      register: playbook_name
      ignore_errors: yes
      become: yes
Note: The path to the file has to be a relative path from your playbook to the inventory/internal_vars file. The code above assumes the playbook and inventory folder are in the base level folder.
Reference: https://docs.ansible.com/ansible/latest/user_guide/playbooks_variables.html#defining-variables-in-included-files-and-roles
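For illustration, inventory/internal_vars could then be a plain YAML file of key/value pairs (the value below is just an example, not from the question):

playbook_name: deploy_app.yml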

Can ansible variables be used to declare hosts in a playbook?

I have a playbook in the format below:
---
- hosts: myIP
  tasks:
    - name: Install a yum package in Ansible example
      yum:
        name: ThePackageIWantToInstall
        state: present
where myIP and ThePackageIWantToInstall are variables.
When the job template runs, I would like the user to be able to supply these in the extra variables popup:
myIP = 192.168.1.1
ThePackageIWantToInstall = nano
As the documentation doesn't provide an example of supplying a variable via a job template, is this possible?
Yes.
- name: Do The Thing
  hosts: "{{ foo }}"
  roles:
    - "{{ role }}"
Need mustaches and quotes.
To run from the popup (I don't use this, but it was suggested as an edit, thanks...):
foo: value
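For comparison, a minimal sketch of the same thing from the command line, passing both variables as extra vars (the playbook filename is just an example):

ansible-playbook do_the_thing.yml -e 'foo=192.168.1.1 role=my_role'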
I have achieved a similar thing with add_host. Here I am not installing a package but creating a file with a name passed from the command line. Any number of hosts (separated by commas) can be passed from the command line.
# cat addhost2.yml
- hosts: localhost
  gather_facts: no
  tasks:
    - add_host:
        name: "{{ item }}"
        groups: hosts_from_commandline
      with_items: "{{ new_hosts_passed.split(',') }}"

- hosts: hosts_from_commandline
  tasks:
    - name: Ansible create file with name passed from commandline
      file:
        path: "/tmp/{{ filename_from_commandline }}"
        state: touch
# ansible-playbook -i hosts addhost2.yml --extra-vars='new_hosts_passed=192.168.3.104,192.168.3.113 filename_from_commandline=lathamdkv'
Hope this helps

ansible generate *.retry file error when there are multiple plays in playbook

For example, deploy.yml is an Ansible playbook. There are two plays in deploy.yml, play1 and play2.
$ cat deploy.yml
- hosts: nodes
  remote_user: cloud
  become: yes
  tasks:
    - name: play1
      copy: src=test1 dest=/root

- hosts: nodes
  remote_user: cloud
  become: yes
  tasks:
    - name: play2
      copy: src=test2 dest=/root
$ cat hosts
[nodes]
192.168.1.12
192.168.1.13
Running
ansible-playbook -i hosts deploy.yml
When play1 fails on 192.168.1.12 but succeeds on 192.168.1.13, deploy.retry lists only 192.168.1.12 and not 192.168.1.13.
$ cat deploy.retry
192.168.1.12
Then I run
ansible-playbook -i hosts deploy.yml --limit @deploy.retry
and I get the wrong result: play2 has not been run on 192.168.1.13! Does anyone know how to solve this problem?
The problem is in the playbook file: in fact, you have two independent plays in one file. I tested your setup with Ansible 2.2.1.0 and the second play ran correctly for the host that had no error in play1, but there can be differences in configuration.
The correct playbook format for the expected behaviour is:
- hosts: nodes
  remote_user: cloud
  become: yes
  tasks:
    - name: play1
      copy: src=test1 dest=/root
    - name: play2
      copy: src=test2 dest=/root
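With a single play, a host that fails the first task is skipped for the remaining tasks and lands in the retry file, so re-running only the failed hosts behaves as expected; a sketch of that re-run (inventory and retry filenames taken from the question):

ansible-playbook -i hosts deploy.yml --limit @deploy.retry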

How to execute command on localhost depending on a remote host

In my playbook I have this:
- name: compile
  hosts: localhost
  gather_facts: false
  tasks:
    - name: compile binary
      local_action: command make build FOO=foo1
I want to execute make build FOO=bar1 on localhost once if the host is either bar-1 or bar-2 (they're both in the bars group, so distinguishing by group is fine too). I tried using when:
- name: compile binary
  local_action: command make build FOO=foo1
  when: (inventory_hostname != "bar-1") and (inventory_hostname != "bar-2")
- name: compile binary
  local_action: command make build FOO=bar1
  when: (inventory_hostname == "bar-1") or (inventory_hostname == "bar-2")
But inventory_hostname is always localhost.
In my hosts file I have
[foos]
foo-1 ...
foo-2 ...
[bars]
bar-1 ...
bar-2 ...
And I run it as
ansible-playbook -i provision/hosts -l localhost,bars provision/deploy.yml
This works fine for me:
---
- hosts: localhost,test-server
  gather_facts: no
  tasks:
    - shell: echo {{ inventory_hostname }}
      delegate_to: localhost
Commands are executed on localhost, but print localhost and test-server.
This task will run a command on localhost once if the current host being operated on is part of the bars group.
- shell: echo {{ inventory_hostname }}
  run_once: true
  delegate_to: localhost
  when: "'bars' in group_names"
Note: If you plan on using serial mode it affects run_once behaviour.
http://docs.ansible.com/ansible/playbooks_delegation.html#run-once
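Put together, a minimal sketch of a play using that task (the bars group and the make command come from the question; the play and task names are just examples):

- name: compile
  hosts: bars
  gather_facts: false
  tasks:
    - name: compile binary once on localhost for the bars group
      command: make build FOO=bar1
      run_once: true
      delegate_to: localhost
      when: "'bars' in group_names"

The when guard is redundant when the play already targets bars, but it shows the pattern described above.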
A way to run actions on localhost on the basis of hostnames in the inventory.
Command:
ansible-playbook -i ~/inventory_file provision/deploy.yml -e 'host_group=bars'
Add the hosts to the inventory file
~/inventory_file
[foos]
foo-1 ...
foo-2 ...
[bars]
bar-1 ...
bar-2 ...
deploy.yml
- name: compile
  hosts: localhost
  gather_facts: false
  tasks:
    - name: compile binary
      local_action: command make build FOO=foo1
      when: item not in ['bar-1', 'bar-2']
      with_items: "{{ groups[host_group] }}"
    - name: compile binary
      local_action: command make build FOO=bar1
      when: item in ['bar-1', 'bar-2']
      with_items: "{{ groups[host_group] }}"
