I have to copy target-specific files. I have stored these files on my machine, named after their target hostnames.
Example:
/tmp/Server1.cfg
/tmp/Server2.cfg
My hosts file has:
Server1
Server2
When my playbook runs for Server1, it should copy Server1.cfg.
When my playbook runs for Server2, it should copy Server2.cfg.
How can I achieve this?
Thanks.
PS: Please be explicit, as I am still a toddler in Ansible.
You may want to read some chapters at docs.ansible.com:
Additionally, inventory_hostname is the name of the hostname as configured in Ansible’s inventory host file. This can be useful for when you don’t want to rely on the discovered hostname ansible_hostname or for other mysterious reasons.
So, in your case:
- copy:
    src: "/tmp/{{ inventory_hostname }}.cfg"
    dest: "/tmp/{{ inventory_hostname }}.cfg"
Running Ansible 2.9.3
Working in a large environment with hosts coming and going on a daily basis, I need to use wildcard hostnames in a host group, i.e.:
[excluded_hosts]
host01
host02
host03
[everyone]
host*
In my playbook I have:
name: "Test working with host groups"
hosts: everyone,!excluded_hosts
connection: local
tasks:
The problem is, the task is running on hosts in the excluded group.
If I specifically list one of the excluded hosts in the everyone group, that host then gets properly excluded.
So Ansible isn't working as one might assume it would.
What's the best way to get this to work?
I tried:
hosts: "{{ ansible_hostname }}",!excluded_hosts
but it errored as invalid YAML syntax.
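(A side note on that error: YAML rejects a value that opens with a quote and then continues after the closing quote, so quoting the whole pattern is what makes the syntax valid:

hosts: "{{ ansible_hostname }},!excluded_hosts"

That said, ansible_hostname is a gathered fact and isn't defined at play level before facts are gathered, so even the syntactically valid form wouldn't solve the problem.)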
Requirements: I cannot list each host specifically; they come and go too frequently.
The playbooks are automatically copied down to each host and then executed there, so I need to use the same ansible command line on all hosts.
I was able to come up with a solution to my problem:
---
- name: "Add host name to thishost group"
hosts: localhost
connection: local
tasks:
- name: "add host"
ini_file:
path: /opt/ansible/etc/hosts
section: thishost
option: "{{ ansible_hostname }}"
allow_no_value: yes
- meta: refresh_inventory
- name: "Do tasks on all except excluded_hosts"
hosts: thishost,!excluded_hosts
connection: local
tasks:
What this does is add the host's name to a group called "thishost" when the playbook runs. Then it refreshes the inventory and runs the next play.
This avoids having to constantly update the inventory with thousands of hosts, and avoids the use of wildcards and ranges.
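To illustrate, after the first play runs on a machine whose hostname is, say, host42 (a hypothetical name), the inventory file contains an entry like:

[thishost]
host42

so the second play's pattern thishost,!excluded_hosts matches exactly that one host, minus any exclusions.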
Blaster,
Have you tried assigning hosts by IP address yet?
You can use wildcard patterns ... IP addresses, as long as the hosts are named in your inventory by ... IP address:
192.0.*
*.example.com
*.com
https://docs.ansible.com/ansible/latest/user_guide/intro_patterns.html
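For example, if the hosts are named in the inventory by IP, a play could target them with a wildcard pattern directly (a sketch using the documentation's example range):

- hosts: 192.0.*,!excluded_hosts
  tasks:
    - ping: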
I am limited to using the raw module in this task, which is what is causing the complications.
I simply want to grab a file from the local host and scp it to the list of hosts in my inventory file. I don't want to have to set up SSH keys from all destination hosts back to the source host, so I want the scp to go outward, from the source to the list of destination hosts. I could do something along the lines of:
raw: "scp {{ file }} {{ user_id }}#{{ item }}:."
with_items: "{{ list_of_hosts }}"
but then I'd have to define the list of hosts in two places, which, with a dynamic list, you don't want to be doing.
Is there any way to have an inventory file such as:
[src-host]
hostA
[dest-hosts]
host1
host2
host3
[dest-hosts:vars]
user_id="foo"
file="bar"
and a playbook such as:
- hosts: src-host
  tasks:
    - raw: "scp {{ file }} {{ user_id }}@{{ item }}"
      with_items: "{{ dest-hosts }}"
EDIT: To clarify based on comment 1, I may ONLY use the 'raw' module due to the limitations of the target (destination) host.
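A working version of this idea would reference the group through the built-in groups variable instead of a bare group name (a sketch, untested; it assumes the inventory above and that hostA can already scp to the destination hosts):

- hosts: src-host
  tasks:
    # groups['dest-hosts'] expands to the hostnames in that inventory group,
    # so the destination list is defined in one place only
    - raw: "scp {{ file }} {{ user_id }}@{{ item }}:."
      with_items: "{{ groups['dest-hosts'] }}"

One caveat: variables set under [dest-hosts:vars] belong to the destination hosts, not to hostA, so user_id and file would have to be defined somewhere the play can see them (for example under [all:vars]).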
Ansible version 2.7.9
I'm writing an Ansible playbook to deploy a piece of software to a Linux environment. SSH access to these systems is protected by a CPM (CyberArk), used as an SSH key manager.
I've got most of the logic figured out, save for one piece. The playbook needs to loop through the hosts in an inventory group, look up the SSH private key in CyberArk for each host, and then use each key to SSH into each host in the inventory group to install the software. I'm struggling with how to make that work in Ansible.
I've read through the add_host and cyberarkpassword documentation, as well as about four hours' worth of searching Stack Overflow and blogs, and couldn't find a single example even close to what I'm trying to do.
Here's how I think it should work:
Using the cyberarkpassword lookup, loop through the hosts in the inventory group specified by {{ env }}. The value for this will be passed in via --extra-vars.
Retrieve the SSH private key for each host.
Register the output from the lookup and copy it to disk, again looping through each host and naming each file {{ inventory_hostname }}.pem.
Finally, to consume it in the next play, set a variable ansible_ssh_common_args: "-o StrictHostKeyChecking=no -i {{ deploy_temp_dir}}/keys/{{ inventory_hostname }}.pem"
But I can't figure out how to put the loop/lookup/write-to-disk piece together.
Sample inventory file
[local]
localhost
[local:vars]
ansible_connection=local
[corp:children]
corp-onprem-dev
corp-onprem-stage
corp-onprem-prod
corp-cloud-dev
corp-cloud-stage
corp-cloud-prod
[corp-onprem-dev]
host1
host2
host3
[corp-onprem-stage]
host1
host2
host3
[corp-onprem-prod]
host1
host2
host3
[corp-cloud-dev]
[corp-cloud-stage]
[corp-cloud-prod]
deploy.yml -- this code does not work, just my attempt at figuring it out.
- name: retrieve ssh keys for hosts in the specified group, and write them to disk
  hosts: local
  gather_facts: no
  tasks:
    - name: lookup ssh private key for each host
      debug: msg={{ lookup("cyberarkpassword", cyquery) }}
      vars:
        cyquery:
          appid: 'myapp'
          query: 'Safe=mysafe;Folder=Root;Object={{ env[0] }}'
          output: 'Password'
      loop: groups['{{ env }}']
      register: sshkeys

    - name: Copy ssh key to disk
      copy:
        content: "{{ sshkeys }}"
        dest: "{{ deploy_temp_dir }}/keys/{{ env[0] }}.pem"
        mode: 0600
      loop: groups['{{ env }}']
It is not clear how to "use each (private) key to ssh into each host".
To loop through hosts in an inventory group, lookup the ssh private key in Cyberark for each host and then use each key to ssh into each host in the inventory group.
Let's assume localhost (the controller) is able to connect to the hosts.
Take a look at the content of the variable sshkeys:
- debug:
    var: sshkeys
Among the listed data, you'll probably see the two items that you're looking for. (Fit the code to what you get.)
sshkeys.results[].item ...... inventory_hostname
sshkeys.results[].password ... ssh private key
Use template to store the keys in files. Because the play is running at localhost, delegate_to shall be used to store the files at the hosts.
- template:
    src: hostname.pem.j2
    dest: "{{ deploy_temp_dir }}/keys/{{ item.item }}.pem"
  loop: "{{ sshkeys.results }}"
  delegate_to: "{{ item.item }}"
$ cat hostname.pem.j2
{{ item.password }}
(Not tested. I don't have CyberArk. Storing passwords in disk files may violate security standards.)
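A follow-up play could then consume the per-host key roughly as the question sketched (hypothetical and untested; it assumes the keys ended up on the controller under {{ deploy_temp_dir }}/keys):

- hosts: "{{ env }}"
  vars:
    ansible_ssh_common_args: "-o StrictHostKeyChecking=no -i {{ deploy_temp_dir }}/keys/{{ inventory_hostname }}.pem"
  tasks:
    - ping:  # any task here would connect using the key written above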
I'm running an Ansible playbook for host_a. Some tasks I delegate to host_b.
Now I would like to use the synchronize module to copy a directory from localhost to host_b. But delegate_to is the wrong option here, since this results in copying from host_b to host_a.
Is there a possibility to do that?
- hosts: host_a
  tasks:
    - name: rsync directory from localhost to host_b
      synchronize:
        # files on localhost
        src: files/directory
        dest: /directory/on/host_b
      # delegate_to does not work here
      # delegate_to: host_b
The only solution I can think of is deleting the target directory and then using a recursive copy with the copy module.
I couldn't find anything in the module documentation.
(Using ansible 2.4.2.0)
Doing this task in its own play for host_b is also not really an option because the variables I need for this task depend on host_a.
The easiest solution in this case is to use the rsync command with local_action, i.e.:
- hosts: cache1
  tasks:
    - name: rsync directory from localhost to host_b
      local_action: command rsync -az "{{ playbook_dir }}/files/directory" "{{ hostvars['host_b']['ansible_host'] }}:/directory/on/host_b"
{{ playbook_dir }} helps by not hardcoding paths on the local system.
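If ansible_host isn't set for host_b in the inventory, a default filter keeps the same one-liner working by falling back to the inventory name (a hypothetical variant):

local_action: command rsync -az "{{ playbook_dir }}/files/directory" "{{ hostvars['host_b']['ansible_host'] | default('host_b') }}:/directory/on/host_b"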
All I could find was this from the docs:
Additionally, inventory_hostname is the name of the hostname as configured in Ansible’s inventory host file. This can be useful for when you don’t want to rely on the discovered hostname ansible_hostname or for other mysterious reasons. If you have a long FQDN, inventory_hostname_short also contains the part up to the first period, without the rest of the domain.
Is there any actual difference between inventory_hostname and ansible_hostname variables in Ansible? If so, then which one should I use and when?
inventory_hostname - the name as configured in the Ansible inventory file (e.g. /etc/ansible/hosts). It can be an IP address or a name that can be resolved by DNS.
ansible_hostname - the name as discovered by Ansible. Ansible logs into the host via SSH and gathers some facts; as part of those facts, it discovers the host's hostname, which is stored in ansible_hostname.
Which one should you use?
hostvars is a dictionary which has an entry for each inventory host. If you want to access host information, you need to use the inventory_hostname as the key. If you want to use or print the name of the host as configured on the host itself, you should use ansible_hostname, since the inventory will most likely contain an IP address instead.
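For example, with an inventory entry like this (hypothetical names):

[web]
web01 ansible_host=192.0.2.10

inventory_hostname is web01 and hostvars['web01'] is the key for lookups, while ansible_hostname is whatever the machine reports as its own hostname once facts are gathered, which may be something entirely different, such as webserver-prod-01.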
Important: To use ansible_hostname, you need to gather facts:
gather_facts: true
Otherwise, you will get a message that ansible_hostname is not defined:
"ansible_hostname": "VARIABLE IS NOT DEFINED!"
Try this with one host to understand the differences:
tasks:
  - debug: var=inventory_hostname
  - debug: var=ansible_hostname
  - debug: var=hostvars