Ansible: how to reference item in a Jinja2 template

I have a play as follows
- name: create the unison preference file
  template:
    src: default.prf.j2
    dest: /root/.unison/{{ item }}.prf
  with_items: groups['ndeployslaves']
The contents of the default.prf.j2 file are as follows:
root = /home
root = ssh://root#{{ item }}//home
ignore = Path virtfs
ignore = Path */mail
The item variable is not working in the template, and I am getting this error:
TASK [unison_master : create the unison preference file] ************************
fatal: [127.0.0.1]: FAILED! => {"failed": true, "msg": "'item' is undefined"}
How do I reference an item inside a template used in a play?

Since it's not letting you use {{item}} in the template, you could do this:
- name: create the unison preference file
  copy:
    src: default.prf
    dest: "/root/.unison/{{ item }}.prf"
    force: no
  with_items: "{{ groups['ndeployslaves'] }}"

- name: edit preference file
  lineinfile:
    dest: "/root/.unison/{{ item }}.prf"
    line: "root = ssh://root#{{ item }}//home"
    regexp: '^root = ssh://'
  with_items: "{{ groups['ndeployslaves'] }}"
The contents of default.prf on your local host should be:
root = /home
root = ssh://
ignore = Path virtfs
ignore = Path */mail
However, I have {{ item }} working in a template. Are you sure your whitespace is correct? src and dest need to be indented one level deeper than template, but with_items needs to be at the same level as template.
- name: create the unison preference file
  template:
    src: default.prf.j2
    dest: "/root/.unison/{{ item }}.prf"
  with_items: "{{ groups['ndeployslaves'] }}"

The error was caused by an indentation mistake: in the actual playbook, the with_items: groups['ndeployslaves'] line was indented a level deeper than it should have been.
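For reference, the failing layout would have looked like the sketch below. At that depth, with_items is parsed as an argument to the template module rather than as a task-level loop keyword, so the task never loops and {{ item }} in dest is undefined.

- name: create the unison preference file
  template:
    src: default.prf.j2
    dest: /root/.unison/{{ item }}.prf
    with_items: groups['ndeployslaves']   # one level too deep: no loop runs, so 'item' is undefined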

Related

Ansible win_file is not deleting dirs when looping over item

I am running into a silly issue when I try to delete some folders with win_file.
First I copy some folders on the remote itself, from one directory to another:
- name: copy folders first
  win_copy:
    src: '{{ item }}'
    dest: 'C:\folders\to\copy'
    remote_src: yes
  loop: '{{ paths_to_copy }}'
  register: copied_folders
Then I filter out just the paths of those folders, to be deleted later in the play after executing some other tasks:
- name: filter paths to be deleted after some tasks
  set_fact:
    paths_to_delete: "{{ copied_folders | json_query('results[*].dest') }}"
I get these results:
ok: [<computer>] => {
    "ansible_facts": {
        "paths_to_delete": [
            "C:\\folders\\to\\copy\\1",
            "C:\\folders\\to\\copy\\2",
            "C:\\folders\\to\\copy\\3",
            "C:\\folders\\to\\copy\\4"
        ]
    },
    "changed": false
}
All seems good, but the playbook fails when I loop over paths_to_delete because it passes all four paths as ONE path:
- name: clean up temporary copied directories
  win_file:
    path: '{{ item }}'
    state: absent
  loop:
    - '{{ paths_to_delete }}'
"msg": "Get-AnsibleParam: Parameter 'path' has an invalid path '['C:\\\\folders\\\\to\\\\copy\\\\1','C:\\\\folders\\\\to\\\\copy\\\\2','C:\\\\folders\\\\to\\\\copy\\\\3','C:\\\\folders\\\\to\\\\copy\\\\4'] specified."
Why is it not looping over this list and deleting them one by one?
I am using the same mechanism in the first copy task, looping over a list, and it DOES copy the folders one by one without any issue.
Any help would be much appreciated.
Your loop syntax is incorrect.
loop:
  - '{{ paths_to_delete }}'
This nests the list inside another list with a single element. What you want to do is loop over the original list:
loop: '{{ paths_to_delete }}'
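To see the difference, a minimal debug sketch (hypothetical task names, using the paths_to_delete fact from above) shows what item becomes in each form:

- name: item is the whole list, single iteration
  debug:
    msg: "{{ item }}"          # ['C:\\folders\\to\\copy\\1', ..., 'C:\\folders\\to\\copy\\4']
  loop:
    - '{{ paths_to_delete }}'

- name: item is one path per iteration
  debug:
    msg: "{{ item }}"          # C:\folders\to\copy\1, then \2, and so on
  loop: '{{ paths_to_delete }}'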

Ansible conditionally loop through with_items?

Is it possible to loop through a list of items only if a string is defined in a variable I specify?
Essentially, I want to have a list of variables defined and use the aws_s3 module to download the files only if they are specified when running the playbook.
For example, say I have the list "var1,var2" and I have the following variables defined:
apps_location:
  - { name: 'vars1', src: 'vars1.tgz', dest: '/tmp/vars1_file.tgz' }
  - { name: 'vars2', src: 'vars2.tgz', dest: '/tmp/vars2_file.tgz' }
  - { name: 'vars3', src: 'vars3.tgz', dest: '/tmp/vars3_file.tgz' }
Task:
- name: "Splunk Search Head | Download Splunk Apps from S3"
  aws_s3:
    bucket: "{{ resource_bucket_name }}"
    object: "{{ item.src }}"
    dest: "{{ item.dest }}"
    mode: get
  with_items: "{{ apps_location }}"
I want to run the command:
ansible-playbook -i inventory -e "var1,var2"
and download only var1 and var2 on that specific run.
I tried using lookups but couldn't get the syntax right. I'm not entirely sure this is the best way of doing this, but I want to have a predefined list of file locations and only download the ones I pass at runtime.
Note that the only reason "name" exists in apps_location is to see if I could do a lookup and only install that one, but I couldn't get the syntax right.
Define a variable containing a list of defined apps. I'm trying:
- name: "Set Fact"
  set_fact:
    dict: "{{ apps_location[item].dest }}"
  with_items: "{{ my_vars|default([]) }}"
However, whenever I output dict, I only get the last value.
Any help would be appreciated :)
The extra-vars must be an assignment of a variable and a value. For example:
shell> ansible-playbook -i inventory -e "my_vars=['vars1','vars2']"
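If quoting the list inline is awkward, the same variable can also be passed as a JSON string (same variable name assumed):
shell> ansible-playbook -i inventory -e '{"my_vars": ["vars1", "vars2"]}'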
A more convenient data structure for this purpose would be a dictionary. For example:
apps_location:
  vars1:
    src: 'vars1.tgz'
    dest: '/tmp/vars1_file.tgz'
  vars2:
    src: 'vars2.tgz'
    dest: '/tmp/vars2_file.tgz'
  vars3:
    src: 'vars3.tgz'
    dest: '/tmp/vars3_file.tgz'
Then the loop might look like
- aws_s3:
    bucket: "{{ resource_bucket_name }}"
    object: "{{ apps_location[item].src }}"
    dest: "{{ apps_location[item].dest }}"
    mode: get
  loop: "{{ my_vars|default([]) }}"
Q: "Define a variable containing a list of defined apps."
A: Try this
- set_fact:
    my_list: "{{ my_list | default([]) +
                 [apps_location[item].dest] }}"
  loop: "{{ my_vars|default([]) }}"
(not tested)
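An equivalent without the loop, building the whole list in one expression (a sketch, assuming the dictionary layout above), would be:

- set_fact:
    my_list: "{{ my_vars | default([])
                 | map('extract', apps_location, 'dest')
                 | list }}"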

Ansible How to loop inside the files for templates

I need to loop through the source files one by one for all hosts.
- hosts: epson*
  become: yes
  tasks:
    - name: replace id
      vars:
        id: abc
      template:
        src: epson1.j2
        dest: /home/epson.config
HOSTS FILE
[epson1]
1.1.1.1
[epson2]
1.1.1.1
[epson3]
1.1.1.1
and many more
epson1.j2
create element edge0 {
state="ENABLED"
id="{{ id }}"}
epson2.j2
create element edge1 {
state="ENABLED"
id="{{ id }}"}
I have many template files like epson1.j2, epson2.j2 and so on.
Right now I am able to do the template variable replacement for one host and one file. How can I do it for all files and all hosts? For example:
host: epson1, src: epson1.j2, dest: /home/epson.config
host: epson2, src: epson2.j2, dest: /home/epson.config
host: epson3, src: epson3.j2, dest: /home/epson.config
I need the src to vary per host.
You should be able to accomplish this simply by using the inventory_hostname magic variable.
- hosts: epson*
  become: yes
  tasks:
    - name: replace id
      vars:
        id: abc
      template:
        src: "{{ inventory_hostname }}.j2"
        dest: /home/epson.config
The play will run the task once for each host, and the matching .j2 will be used.
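One caveat, given the inventory above: the hosts are listed by IP address, so inventory_hostname would render as 1.1.1.1 rather than epson1. If the template name is meant to follow the group name instead, the group_names magic variable could be used (a sketch, assuming each host belongs to exactly one epson group):

    - name: replace id
      vars:
        id: abc
      template:
        src: "{{ group_names | first }}.j2"
        dest: /home/epson.config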

Can't copy files using loop

I'm trying to copy many files using Ansible.
This is my playbook:
- name: Copy the scenario test
  copy:
    src: files/{{ scenario_name }}
    dest: /home/{{ user }}/scenario_creation
    mode: '0644'
  run_once: true
  loop: "{{ scenario_name }}"
  tags:
    - user
    - scenario
And this is my roles/scenario_test/defaults/main.yml:
scenario_name: ['topup-scenario.json', 'test.json']
When I execute my playbook, it says:
"msg": "Could not find or access 'files/[u'topup-scenario.json', u'test.json']'\nSearched in:\n\t/home/path/ansible/plays/files/[u'topup-scenario.json', u'test.json']\n\t/home/path/ansible/plays/files/[u'topup-scenario.json', u'test.json'] on the Ansible Controller.\nIf you are using a module and expect the file to exist on the remote, see the remote_src option"
}
Any help?
Change:
src: files/
to
src: ./files/
You need to change your code to this:
- name: Copy the scenario test
  copy:
    src: files/{{ item }}
    dest: /home/{{ user }}/scenario_creation
    mode: '0644'
  run_once: true
  loop: "{{ scenario_name }}"
  tags:
    - user
    - scenario
The loop assigns each element of the list to the variable item, unless you rename it with the loop_var option of loop_control. So when you reference scenario_name in your src line, you are passing the entire list, not a single element of it.
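For illustration, renaming the loop variable with loop_control would look like this (a sketch of the same task; the name scenario is arbitrary):

- name: Copy the scenario test
  copy:
    src: files/{{ scenario }}
    dest: /home/{{ user }}/scenario_creation
    mode: '0644'
  loop: "{{ scenario_name }}"
  loop_control:
    loop_var: scenario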

How to create multiple symlinks for the folders/files under the same source

I want to create a folder temp2 that stores symlinks to all the subfolders/files of another folder temp1. with_items can accomplish this, but it requires listing every folder/file name, as the script below shows:
- name: "create folder: temp2 to store symlinks"
  file:
    path: "/etc/temp2"
    state: directory

- name: "create symlinks & store in temp2"
  file:
    src: "/etc/temp1/{{ item.src }}"
    dest: "/etc/temp2/{{ item.dest }}"
    state: link
    force: yes
  with_items:
    - { src: 'BEAM', dest: 'BEAM' }
    - { src: 'cfg', dest: 'cfg' }
    - { src: 'Core', dest: 'Core' }
    - { src: 'Data', dest: 'Data' }
This is not flexible: subfolders/files under temp1 will be added or removed, and I would need to update the script frequently to keep the symlinks current.
Is there any way to detect all the files/folders under temp1 automatically instead of maintaining the with_items list?
The following code works under Ansible 2.8:
- name: Find all files in ~/commands
  find:
    paths: ~/commands
  register: find

- name: Create symlinks to /usr/local/bin
  become: True
  file:
    src: "{{ item.path }}"
    path: "/usr/local/bin/{{ item.path | basename }}"
    state: link
  with_items: "{{ find.files }}"
You can create a list of files using the find module:
Return a list of files based on specific criteria. Multiple criteria are AND’d together.
You'll likely need to leave recurse set to false (default) since you assume subfolders might exist.
You need to register the result of the module with a register declaration:
register: find
In the next step you need to iterate over the files list from the registered result:
with_items: "{{ find.files }}"
and refer to the value of the path key. You already know how to do it.
You will also need to extract the filename from the path, so that you can append it to the destination path. Use basename filter for that.
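Putting those steps together for the temp1/temp2 case from the question might look like the sketch below; file_type: any is an assumption here so that directories are returned as well as plain files:

- name: Find everything directly under /etc/temp1
  find:
    paths: /etc/temp1
    file_type: any
  register: find

- name: Create symlinks in /etc/temp2
  file:
    src: "{{ item.path }}"
    dest: "/etc/temp2/{{ item.path | basename }}"
    state: link
    force: yes
  with_items: "{{ find.files }}"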
