How do I loop over each line inside a file with Ansible?

I am looking for something similar to with_items:, but that gets the list of items from a file instead of having to include it in the playbook file.
How can I do this in Ansible?

I managed to find an easy alternative:
- debug: msg="{{ item }}"
  with_lines: cat files/branches.txt

Latest Ansible recommends loop instead of the with_* keywords. It can be used in combination with the file lookup and splitlines(), as Ikar Pohorský pointed out:
- debug: msg="{{ item }}"
  loop: "{{ lookup('file', 'files/branches.txt').splitlines() }}"
Note that files/branches.txt should be relative to the playbook.

Let's say you have a file like
item 1
item 2
item 3
And you want to install these items. Simply get the file contents into a variable using register, and use that variable with with_items. Make sure your file has one item per line.
---
- hosts: your-host
  remote_user: your-remote_user
  tasks:
    - name: get the file contents
      command: cat /path/to/your/file
      register: my_items
    - name: install these items
      pip: name={{ item }}
      with_items: "{{ my_items.stdout_lines }}"

I am surprised that nobody has mentioned Ansible lookups; I think that is exactly what you want.
A lookup reads content that you want to use in your playbook but do not want to embed in it, from files, pipes, CSV, Redis, etc. on your local control machine (not the remote machine; that is important, since in most cases this content lives alongside your playbook on your local machine), and it works with Ansible loops.
---
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Loop over lines in a file
      debug:
        var: item
      with_lines: cat "./files/lines"
with_lines here is actually loop with the lines lookup. To see how the lines lookup works, see its code here: it simply runs whatever command you give it (echo, cat, anything), splits the output into lines, and returns them.
There are many powerful lookups; for the comprehensive list, check out the lookup plugins folder.
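On Ansible 2.5 and later, the same loop can also be written with loop plus query(), which runs a lookup and always returns a list; a minimal sketch, assuming the same ./files/lines file:
- name: Loop over lines in a file (loop + query)
  debug:
    var: item
  # query('lines', ...) runs the command and returns its output as a list of lines
  loop: "{{ query('lines', 'cat ./files/lines') }}"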

Related

How can I put the discovered values into loop variables so that they are on one line

How can I put the discovered values into loop variables so that they are on one line, using an Ansible task? I currently have a task like this:
- name: Updating test.conf
  lineinfile:
    path: "/root/test.conf"
    regexp: "test="
    line: "test={{ hostvars[item]['ansible_env'].SSH_CONNECTION.split(' ')[2] }}"
    state: present
  with_nested:
    - "{{ groups['app'] }}"
The idea is that, when the job runs, it takes the IP addresses of the servers in the app group and puts them on a single line. Currently the substitution runs once per host, each run replacing the previous one, so in the end only one address is left in the test parameter.
I need format after task like this:
test=1.1.1.1, 2.2.2.2
While Jinja2 is heavily inspired by Python, it does not allow all of the same operations. To join a list you would have to do something like:
- debug:
    msg: "{{ myvar | join(',') }}"
  vars:
    myvar:
      - foo
      - bar
When in doubt, always use a simple playbook with a debug task to validate your Jinja code.
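Applied to the question above, here is a hedged sketch of one way to build that single line (it assumes facts have been gathered for every host in the app group so that ansible_env is populated; the regex keeps the third field of SSH_CONNECTION, just like split(' ')[2]):
- name: Updating test.conf with all app-group IPs on one line
  lineinfile:
    path: /root/test.conf
    regexp: '^test='
    # extract each app host's SSH_CONNECTION, keep its third field, join with ", "
    line: 'test={{ groups["app"] | map("extract", hostvars, ["ansible_env", "SSH_CONNECTION"]) | map("regex_replace", "^(\\S+) (\\S+) (\\S+).*$", "\\3") | join(", ") }}'
    state: present
  run_once: true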

How to substitute values to a conf file using Ansible

I have an ansible playbook that has created a file (/tmp/values.txt) in each server with the following content
1.1.1.1
2.2.2.2
3.3.3.3
4.4.4.4
5.5.5.5
And I have a conf file named etcd.conf at /tmp/etcd.conf, into which I need to substitute one value per line from /tmp/values.txt, so /tmp/etcd.conf will look like this:
SELF_IP=1.1.1.1
NODE_1_IP=2.2.2.2
NODE_2_IP=3.3.3.3
NODE_3_IP=4.4.4.4
NODE_4_IP=5.5.5.5
Is there a way I can do this? I used the following lookup method, but it only works on the control server:
- name: Create etcd.conf
  template:
    src: etcd.conf.j2
    dest: /tmp/etcd.conf
  vars:
    my_values: "{{ lookup('file', '/tmp/values.txt').split('\n') }}"
    my_vars: [SELF_IP, NODE_1_IP, NODE_2_IP, NODE_3_IP, NODE_4_IP]
    my_dict: "{{ dict(my_vars|zip(my_values)) }}"
You can use the slurp task to grab the content from each machine, or use command: cat /tmp/values.txt and then examine the stdout_lines of its register:-ed variable to get that same .split('\n') behavior (I believe you'll still need that .split call after | b64decode if you use the slurp approach).
What will likely require some massaging is that you'll need to identify the group (which may very well be all) of inventory hostnames for which that command produces content, and then grab the hostvar out of each of them to collect all the values.
Conceptually it is similar to [ hostvars[hv].tmp_values.stdout_lines for hv in groups[the_group] ] | join("\n"), but it'll be tedious to write out, so I'd only want to do that if you aren't able to get it working on your own.
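To make the first suggestion concrete, a minimal sketch of the slurp variant (the registered variable name values_file is an assumption, and splitlines() stands in for the .split('\n') call on the decoded content):
- name: Read the values file from each remote machine
  slurp:
    src: /tmp/values.txt
  register: values_file

- name: Create etcd.conf
  template:
    src: etcd.conf.j2
    dest: /tmp/etcd.conf
  vars:
    # slurp returns base64-encoded content, so decode before splitting into lines
    my_values: "{{ (values_file.content | b64decode).splitlines() }}"
    my_vars: [SELF_IP, NODE_1_IP, NODE_2_IP, NODE_3_IP, NODE_4_IP]
    my_dict: "{{ dict(my_vars | zip(my_values)) }}"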

Ansible - Log stdout lines from each remote host to a single file on the local server

Is there an easy way to log output from multiple remote hosts to a single file on the server running ansible-playbook?
I have a variable called validate which stores the output of a command executed on each server. I want to take validate.stdout_lines and drop the lines from each host into one file locally.
Here is one of the snippets I wrote, which did not work:
- name: Write results to logfile
  blockinfile:
    create: yes
    path: "/var/log/ansible/log"
    insertafter: BOF
    block: "{{ validate.stdout }}"
  delegate_to: localhost
When I executed my playbook with the above, it only captured the output from one of the remote hosts. I want to capture the lines from all hosts in that single /var/log/ansible/log file.
One thing you should do is add a marker to the blockinfile task to wrap the result from each single host in a unique block.
The second problem is that the tasks run in parallel (even with delegate_to: localhost, because the loop here is realised by the Ansible engine), with one task effectively overwriting another's /var/log/ansible/log file.
As a quick workaround you can serialise the whole play:
- hosts: ...
  serial: 1
  tasks:
    - name: Write results to logfile
      blockinfile:
        create: yes
        path: "/var/log/ansible/log"
        insertafter: BOF
        block: "{{ validate.stdout }}"
        marker: "# {{ inventory_hostname }} {mark}"
      delegate_to: localhost
The above produces the intended result, but if serial execution is a problem, you might consider writing your own loop for this single task (for ideas refer to support for "serial" on an individual task #12170).
Speaking of other methods: in two tasks, you can concatenate the results into a single list (no issue with parallel execution then, but pay attention to delegated facts) and then write it to a file using the copy module (see Write variable to a file in Ansible), as sketched below.
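A hedged sketch of that two-task approach (it assumes validate was registered on every host in the play):
- name: Collect every host's output into one string
  # ansible_play_hosts lists the hosts still active in the play
  set_fact:
    all_results: >-
      {{ ansible_play_hosts |
      map('extract', hostvars, ['validate', 'stdout']) |
      join('\n') }}
  run_once: true

- name: Write results to logfile
  copy:
    content: "{{ all_results }}"
    dest: /var/log/ansible/log
  delegate_to: localhost
  run_once: true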

Log variables to a logfile on ansible host

I have a playbook that registers three variables. I want to produce a CSV report of those three variables on all hosts in my inventory.
This SO answer suggests to use:
- local_action: copy content={{ foo_result }} dest=/path/to/destination/file
But that does not append to the CSV file.
Also, I have to manually compose the comma separators in this case.
Any ideas on how to log (append) variables to a local file?
If you want to append a line to a file rather than replace its contents, then this is probably best suited to the lineinfile module, utilising the module's ability to insert a line at the end of the file.
The equivalent task to the copy one that you used would be something like:
- name: log foo_result to file
  lineinfile:
    line: "{{ foo_result }}"
    insertafter: EOF
    dest: /path/to/destination/file
  delegate_to: 127.0.0.1
Note that I've used the long-hand form for delegating tasks locally rather than local_action. I personally feel that this syntax reads a lot more clearly, but you could easily use the following instead if you prefer the more compact local_action syntax:
- local_action: lineinfile line={{ foo_result }} insertafter=EOF dest=/path/to/destination/file
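For the CSV case specifically, a hedged sketch; the three registered variable names here (result_a, result_b, result_c) are placeholders for your own:
- name: append one CSV row per host
  lineinfile:
    # compose the comma separators in the line itself
    line: "{{ inventory_hostname }},{{ result_a.stdout }},{{ result_b.stdout }},{{ result_c.stdout }}"
    insertafter: EOF
    dest: /path/to/destination/file
    create: yes
  delegate_to: 127.0.0.1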

Finding file name in files section of current Ansible role

I'm fairly new to Ansible and I'm trying to create a role that copies a file to a remote server. The local file can have a different name every time I run the playbook, but it needs to be copied to the same name remotely, something like this:
- name: copy file
  copy:
    src: "*.txt"
    dest: /path/to/fixedname.txt
Ansible doesn't allow wildcards there, so when I had the tasks in a simple main playbook I could do:
- name: find the filename
  connection: local
  shell: "ls -1 files/*.txt"
  register: myfile

- name: copy file
  copy:
    src: "files/{{ item }}"
    dest: /path/to/fixedname.txt
  with_items: "{{ myfile.stdout_lines }}"
However, when I moved the tasks into a role, the first action no longer worked, because the relative path is resolved relative to the role, while the playbook executes in the directory that contains the 'roles' directory. I could add the path to the role's files dir, but is there a more elegant way?
It looks like you need a task that looks up information locally, and then uses that information as input to the copy module.
There are two ways to get local information:
1. use local_action:. That's shorthand for running the task against 127.0.0.1; more info can be found here. (This is what you've been using.)
2. use a lookup. This is a plugin system specifically designed for getting information locally. More info here.
In your case, I would go for the second method, using lookup. You could set it up like this example:
vars:
  local_file_name: "{{ lookup('pipe', 'ls -1 files/*.txt') }}"
tasks:
  - name: copy file
    copy: src="{{ local_file_name }}" dest=/path/to/fixedname.txt
Or, more directly:
tasks:
  - name: copy file
    copy: src="{{ lookup('pipe', 'ls -1 files/*.txt') }}" dest=/path/to/fixedname.txt
With regards to paths:
The lookup plugin runs in the context of the task (playbook vs role). This means it behaves differently depending on where it's used.
In the setup above, the tasks are run directly from a playbook, so the working dir will be:
/path/to/project -- this is the folder where your playbook is.
If you were to add the task to a role, the working dir would be:
/path/to/project/roles/role_name/tasks
In addition, the file and pipe plugins run from within the role/files folder if it exists:
/path/to/project/roles/role_name/files -- this means your command would be ls -1 *.txt
Caveat:
The plugin is called every time you access the variable. This means you cannot debug the variable in your playbook and then rely on it having the same value when it is used later in a role!
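If that re-evaluation is a problem, one workaround (a sketch, not part of the original answer) is to resolve the lookup once with set_fact, so every later task sees the same value:
- name: resolve the local file name once
  set_fact:
    # set_fact evaluates the pipe lookup a single time and stores the result
    local_file_name: "{{ lookup('pipe', 'ls -1 files/*.txt') }}"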
I do wonder, though, about the use case for a file that resides inside a project's Ansible folders but whose name is not known in advance. Where does such a file come from? Isn't it possible to add a layer between the generation of the file and its use in Ansible, or to keep a fixed local path in a variable? Just curious ;)
Just wanted to throw in an additional answer... I have the same problem as you: I build an Ansible bundle on the fly and copy artifacts (RPMs) into a role's files folder, and my RPMs have versions in the filenames.
When I run the Ansible play, I want it to install all the RPMs, regardless of their filenames.
I solved this by using the with_fileglob mechanism in ansible:
- name: Copy RPMs
  copy: src="{{ item }}" dest="{{ rpm_cache }}"
  with_fileglob: "*.rpm"
  register: rpm_files

- name: Install RPMs
  yum: name={{ item }} state=present
  with_items: "{{ rpm_files.results | map(attribute='dest') | list }}"
I find it a little bit cleaner than the lookup mechanism.
