A file has the following contents:
com.dkr.container.id=a43019cc-d4a4-4acb-83dd-defd76443c6a
com.dkr.container.account=12HJB
I need to fetch a43019cc-d4a4-4acb-83dd-defd76443c6a and write it to a variable using an Ansible task. This value needs to be passed to other tasks in the same Ansible file.
Can someone show me the task required to achieve this?
If your file is on the controller, you can use the file lookup to get its content.
If the file is on the node, you will have to use something like the slurp module.
Then, when you have the file content, you can use the regex_search filter to extract your required text.
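For reference, the extraction the regex_search filter performs amounts to an ordinary regular-expression group capture. A minimal Python sketch of the same logic, with the file content from the question inlined as a stand-in:

```python
import re

# Inlined stand-in for the file content shown in the question
content = """com.dkr.container.id=a43019cc-d4a4-4acb-83dd-defd76443c6a
com.dkr.container.account=12HJB
"""

# Rough equivalent of: regex_search('com\.dkr\.container\.id=(.*)', '\\1')
match = re.search(r"com\.dkr\.container\.id=(.*)", content)
container_id = match.group(1) if match else None
print(container_id)  # a43019cc-d4a4-4acb-83dd-defd76443c6a
```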
With the file on the controller:
- set_fact:
    com_dkr_container_id: >-
      {{
        lookup('file', '/path/to/file')
        | regex_search('com\.dkr\.container\.id=(.*)', '\\1')
        | first
      }}
With the file on the node(s):
- slurp:
    src: /path/to/file
  register: file_content

- set_fact:
    com_dkr_container_id: >-
      {{
        file_content.content
        | b64decode
        | regex_search('com\.dkr\.container\.id=(.*)', '\\1')
        | first
      }}
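The reason for the b64decode step: slurp registers the remote file content base64-encoded in its content field. A small Python sketch of that round trip, with the file content inlined as an assumption:

```python
import base64

# slurp returns the remote file content base64-encoded in its 'content' field
raw = b"com.dkr.container.id=a43019cc-d4a4-4acb-83dd-defd76443c6a\n"
slurped = base64.b64encode(raw).decode("ascii")       # what gets registered
restored = base64.b64decode(slurped).decode("utf-8")  # what b64decode yields
print(restored.split("=", 1)[1].strip())  # a43019cc-d4a4-4acb-83dd-defd76443c6a
```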
This is the job for the ini lookup plugin. See
shell> ansible-doc -t lookup ini
For example, given the file
shell> cat container.properties
com.dkr.container.id=a43019cc-d4a4-4acb-83dd-defd76443c6a
com.dkr.container.account=12HJB
The playbook
- hosts: localhost
  tasks:
    - set_fact:
        id: "{{ lookup('ini', 'com.dkr.container.id
                 type=properties
                 file=container.properties') }}"
    - debug:
        var: id
gives
id: a43019cc-d4a4-4acb-83dd-defd76443c6a
The lookup plugins work on the controller only. If the file is at the remote host fetch it, e.g. given the file
shell> ssh admin@test_11 cat container.properties
com.dkr.container.id=a43019cc-d4a4-4acb-83dd-defd76443c6a
com.dkr.container.account=12HJB
The playbook
- hosts: test_11
  tasks:
    - fetch:
        src: container.properties
        dest: /tmp/fetched
    - set_fact:
        id: "{{ lookup('ini', 'com.dkr.container.id
                 type=properties
                 file=/tmp/fetched/{{ inventory_hostname }}/container.properties') }}"
    - debug:
        var: id
gives the same result
id: a43019cc-d4a4-4acb-83dd-defd76443c6a
The playbook above is idempotent. The file will be stored on the controller:
shell> tree /tmp/fetched/
/tmp/fetched/
└── test_11
└── container.properties
I've got an Ansible playbook that pulls interface descriptions from two routers and writes the results to a CSV file. As it iterates through the interfaces, it writes one interface per router to the file:
---
- name: Cisco get ints
  hosts: test
  gather_facts: false
  connection: local
  become: false

  vars:
    csv_path: /tmp
    csv_filename: int_audit.csv
    headers: Hostname,Interface,Description

  tasks:
    - name: Save CSV headers
      ansible.builtin.lineinfile:
        dest: "{{ csv_path }}/{{ csv_filename }}"
        line: "{{ headers }}"
        create: true
        state: present
      delegate_to: localhost
      run_once: true

    - name: run show inventory on remote device
      iosxr_facts:
        gather_subset: interfaces
      register: output

    - name: Write int desc to csv file
      loop: "{{ output.ansible_facts.ansible_net_interfaces | dict2items }}"
      lineinfile:
        dest: "{{ csv_path }}/{{ csv_filename }}"
        line: "{{ output.ansible_facts.ansible_net_hostname }},{{ item.key }},{{ item.value.description }}"
        create: true
        state: present
      delegate_to: localhost
so I end up with a list that has no order.
$ cat /tmp/int_audit.csv
Hostname,Interface,Description
RTR1.LAB1,BVI13,LOCAL:RTR2.LAB1:[L3]
RTR1.LAB1,Bundle-Ether1100.128,LOCAL:RTR2.LAB1:BUNDLE1100:20GE[UTIL]
RTR2.LAB1,Bundle-Ether1100.128,LOCAL:RTR1.LAB1:BUNDLE1100:20GE[UTIL]
RTR1.LAB1,Loopback0,LOOP:LOOP0-RTR1.LAB1:[N/A]
RTR2.LAB1,Loopback0,LOOP:LOOP0-RTR2.LAB1:[N\A]
I'd like to have it sort the list by router name.
Any help is appreciated.
You could, for example, achieve your goal by simply post-processing the file on the control node.
For the test file
cat test.csv
Hostname,Interface,Description
RTR1.LAB1,BVI13,LOCAL:RTR2.LAB1:[L3]
RTR1.LAB1,Bundle-Ether1100.128,LOCAL:RTR2.LAB1:BUNDLE1100:20GE[UTIL]
RTR2.LAB1,Bundle-Ether1100.128,LOCAL:RTR1.LAB1:BUNDLE1100:20GE[UTIL]
RTR1.LAB1,Loopback0,LOOP:LOOP0-RTR1.LAB1:[N/A]
RTR2.LAB1,Loopback0,LOOP:LOOP0-RTR2.LAB1:[N\A]
the sort command produces:
sort -k1 -n -t, test.csv
Hostname,Interface,Description
RTR1.LAB1,Bundle-Ether1100.128,LOCAL:RTR2.LAB1:BUNDLE1100:20GE[UTIL]
RTR1.LAB1,BVI13,LOCAL:RTR2.LAB1:[L3]
RTR1.LAB1,Loopback0,LOOP:LOOP0-RTR1.LAB1:[N/A]
RTR2.LAB1,Bundle-Ether1100.128,LOCAL:RTR1.LAB1:BUNDLE1100:20GE[UTIL]
RTR2.LAB1,Loopback0,LOOP:LOOP0-RTR2.LAB1:[N\A]
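If you'd rather not shell out, the same header-preserving sort can be sketched in Python. The CSV content is inlined here as a stand-in for /tmp/int_audit.csv:

```python
import csv
import io

# Inlined stand-in for /tmp/int_audit.csv
data = """Hostname,Interface,Description
RTR1.LAB1,BVI13,LOCAL:RTR2.LAB1:[L3]
RTR2.LAB1,Bundle-Ether1100.128,LOCAL:RTR1.LAB1:BUNDLE1100:20GE[UTIL]
RTR1.LAB1,Loopback0,LOOP:LOOP0-RTR1.LAB1:[N/A]
"""

rows = list(csv.reader(io.StringIO(data)))
header, body = rows[0], rows[1:]
body.sort(key=lambda r: r[0])  # sort data rows by Hostname, keep header on top
for row in [header] + body:
    print(",".join(row))
```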
Similar Q&A
Sort CSV file based on first column
How to sort CSV by specific column
and more
Sort CSV file by multiple columns using the sort command
Thanks all, I've written a Perl script (which I call from Ansible) to do the sort after the data is stored in the CSV file.
I know the Ansible fetch-module can copy a file from remote to local, but what if I only need the contents (in my case a tmp file holding the ip address) appended into a local file?
Fetch module does this:
- name: Store file into /tmp/fetched/
  ansible.builtin.fetch:
    src: /tmp/somefile
    dest: /tmp/fetched
I need it to do something like this:
- name: Store file into /tmp/fetched/
  ansible.builtin.fetch:
    src: /tmp/somefile.txt
    dest: cat src >> /tmp/fetched.txt
In a nutshell:
- name: Get remote file content
  ansible.builtin.slurp:
    src: /tmp/somefile.txt
  register: somefile

- name: Append remote file content to a local file
  vars:
    target_file: /tmp/fetched.txt
  ansible.builtin.copy:
    content: |-
      {{ lookup('file', target_file) }}
      {{ somefile.content | b64decode }}
    dest: "{{ target_file }}"
  # Fix write concurrency when running on multiple targets
  throttle: 1
  delegate_to: localhost
Notes:
the second task isn't idempotent (it will modify the file on each run, even when appending the same content)
this will work for small target files. If the file becomes huge and you experience high execution times or memory consumption, you might want to switch to shell for the second task:
- name: Append remote file content to a local file
  ansible.builtin.shell:
    cmd: echo "{{ somefile.content | b64decode }}" >> /tmp/fetched
  # You might still want to avoid concurrency with multiple targets
  throttle: 1
  delegate_to: localhost
Alternatively, you could write all contents from all fetched files from all your targets in one go to avoid the concurrency problem and gain some time.
# Copy solution
- name: Append remote files contents to a local file
  vars:
    target_file: /tmp/fetched.txt
    fetched_content: "{{ ansible_play_hosts
      | map('extract', hostvars, ['somefile', 'content'])
      | map('b64decode')
      | join('\n') }}"
  ansible.builtin.copy:
    content: |-
      {{ lookup('file', target_file) }}
      {{ fetched_content }}
    dest: "{{ target_file }}"
  delegate_to: localhost
  run_once: true
# Shell solution
- name: Append remote files contents to a local file
  vars:
    fetched_content: "{{ ansible_play_hosts
      | map('extract', hostvars, ['somefile', 'content'])
      | map('b64decode')
      | join('\n') }}"
  ansible.builtin.shell:
    cmd: echo "{{ fetched_content }}" >> /tmp/fetched
  delegate_to: localhost
  run_once: true
I have multiple .json files on the local host, next to my playbook:
json-file-path/{{ testName }}.json
where {{ testName }}.json is testA.json, testB.json, testC.json, etc.
All the .json files have the same keys with different values, like this:
json-file-path/testA.json:
{
  "a_key": "a_value1",
  "b_key": "b_value1"
}
json-file-path/testB.json:
{
  "a_key": "a_value2",
  "b_key": "b_value2"
}
json-file-path/testC.json:
{
  "a_key": "a_value3",
  "b_key": "b_value3"
}
.....
I need to access the key-value variables from all the .json files, and if the values meet some condition, I will perform some task on the target host. For example, I have:
a_value1=3
a_value2=4
a_value3=1
I go through my .json files one by one; if the value of a_key is greater than 3, I copy that .json file to the target host, otherwise I skip the task. In this case, I will only copy testC.json to the target host.
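The selection step reduces to loading each JSON file and comparing a_key against a threshold. A hypothetical Python sketch, with file names and values as stand-ins (note that with the numeric values above, it is the file whose a_key is 4 that passes a strict > 3 check):

```python
import json

# Hypothetical stand-ins for the testA/testB/testC.json contents
files = {
    "testA.json": '{"a_key": 3, "b_key": "b_value1"}',
    "testB.json": '{"a_key": 4, "b_key": "b_value2"}',
    "testC.json": '{"a_key": 1, "b_key": "b_value3"}',
}

threshold = 3
to_copy = [name for name, text in files.items()
           if json.loads(text)["a_key"] > threshold]
print(to_copy)  # ['testB.json']
```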
How would I achieve this? I was thinking of re-constructing my .json files using {{ testName }} as dynamic key of dict like this:
{
  "testName": "testA"
    {
      "a_key": "a_value1",
      "b_key": "b_value1"
    }
}
So I can access my variable as {{ testName }}.a_key. So far I haven't been able to achieve this.
I have tried the following in my playbook:
---
- hosts: localhost
  tasks:
    - name: construct json files
      vars:
        my_vars:
          a_key: "{{ a_value }}"
          b_key: "{{ b_value }}"
      with_dict: "{{ testName }}"
      copy:
        content: "{{ my_vars | to_nice_json }}"
        dest: /json-file-path/{{ testName }}.json
My updated playbooks are:
/mypath/tmp/include.yaml:
---
- hosts: remote_hostName
  tasks:
    - name: load json files
      set_fact:
        json_data: "{{ lookup('file', item) | from_json }}"

    - name: copy json file if condition meets
      copy:
        src: "{{ item }}"
        dest: "/remote_host_path/tmp/{{ item | basename }}"
      delegate_to: "{{ remote_hostName }}"
      when: json_data.a_key|int > 5
/mypath/test.yml:
---
- hosts: localhost
  vars:
    local_src_dir: /mypath/tmp
    remote_host: remote_hostName
    remote_dest_dir: /remote_host_path/tmp
  tasks:
    - name: looping
      include: include.yaml
      with_fileglob:
        - "{{ local_src_dir }}/*json"
All json files are on localhost under /mypath/tmp/.
Latest version of playbook. It is working now:
/mypath/tmp/include.yaml:
---
- name: loading json files
  include_vars:
    file: "{{ item }}"
    name: json_data

- name: copy json file to remote if condition meets
  copy:
    src: "{{ item }}"
    dest: "/remote_host_path/tmp/{{ item | basename }}"
  delegate_to: "{{ remote_host }}"
  when: json_data.a_key > 5
/mypath/test.yml:
---
- hosts: localhost
  vars:
    local_src_dir: /mypath/tmp
    remote_host: remote_hostName
    remote_dest_dir: /remote_host_path/tmp
  tasks:
    - name: looping json files
      include: include.yaml
      with_fileglob:
        - "{{ local_src_dir }}/*json"
I am hoping that I have understood your requirements correctly, and that this helps move you forward.
Fundamentally, you can load each of the JSON files so you can query the values as native Ansible variables. You can therefore loop through all the files, read each one, compare the value you are interested in, and then conditionally copy to your remote host via a delegated task. Give this a try:
Create an include file include.yaml:
---
# 'item' contains a path to a local JSON file on each pass of the loop
- name: Load the json file
  set_fact:
    json_data: "{{ lookup('file', item) | from_json }}"

- name: Delegate a copy task to the remote host conditionally
  copy:
    src: "{{ item }}"
    dest: "{{ remote_dest_dir }}/{{ item | basename }}"
  delegate_to: "{{ remote_host }}"
  when: json_data.a_key > value_threshold
then in your playbook:
---
- hosts: localhost
  connection: local
  # Set some example vars, though these could be placed in a variety of places
  vars:
    local_src_dir: /some/local/path
    remote_host: <some_inventory_hostname>
    remote_dest_dir: /some/remote/path
    value_threshold: 3
  tasks:
    - name: Loop through all *json files, passing matches to include.yaml
      include: include.yaml
      loop: "{{ lookup('fileglob', local_src_dir + '/*json').split(',') }}"
Note: As you are running an old version of Ansible, you may need older alternate syntax for all of this to work:
In your include file:
- name: Load the json file
  include_vars: "{{ item }}"

- name: Delegate a copy task to the remote host conditionally
  copy:
    src: "{{ item }}"
    dest: "{{ remote_dest_dir }}/{{ item | basename }}"
  delegate_to: "{{ remote_host }}"
  when: a_key > value_threshold
and in your playbook:
- name: Loop through all *json files, passing matches to include.yaml
  include: include.yaml
  with_fileglob:
    - "{{ local_src_dir }}/*json"
I have seen how to register variables within tasks in an ansible playbook and then use those variables elsewhere in the same playbook, but can you register a variable in an included playbook and then access those variables back in the original playbook?
Here is what I am trying to accomplish:
This is my main playbook:
- include: sub-playbook.yml job_url="http://some-jenkins-job"

- hosts: localhost
  roles:
    - some_role
sub-playbook.yml:
---
- hosts: localhost
  tasks:
    - name: Collect info from Jenkins Job
      script: whatever.py --url "{{ job_url }}"
      register: jenkins_artifacts
I'd like to be able to access the jenkins_artifacts results back in main_playbook if possible. I know you can access it from other hosts in the same playbook like this: "{{ hostvars['localhost']['jenkins_artifacts'].stdout_lines }}"
Is it the same idea for sharing across playbooks?
I'm confused about what this question is asking. Just use the variable name jenkins_artifacts:
- include: sub-playbook.yml job_url="http://some-jenkins-job"

- hosts: localhost
  tasks:
    - debug:
        var: jenkins_artifacts
This might seem complicated, but I love doing this in my playbooks:
rc defines the name of the variable which will contain the return value
ar gives the arguments to the included tasks
master.yml:
- name: verify_os
  include_tasks: "verify_os/main.yml"
  vars:
    verify_os:
      rc: "isos_present"
      ar:
        image: "{{ os.ar.to_os }}"
verify_os/main.yml:
---
- name: check image on device
  ios_command:
    commands:
      - "sh bootflash: | inc {{ verify_os.ar.image }}"
  register: image_check

- name: check if available
  shell: "printf '{{ image_check.stdout_lines[0][0] }}\n' | grep {{ verify_os.ar.image }} | wc -l"
  register: image_available
  delegate_to: localhost

- set_fact: { "{{ verify_os.rc }}": "{{ true if image_available.stdout == '1' else false }}" }
...
I can now use the isos_present variable anywhere in the master.yml to access the returned value.
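The rc indirection (the caller chooses the variable name, the included tasks store their result under it) can be sketched in plain Python. Names here are illustrative only:

```python
# Sketch of the rc indirection: the caller picks the variable name,
# the included tasks store their result under that key
facts = {}

def verify_os(rc, image_available_stdout):
    # equivalent of the set_fact with a templated key
    facts[rc] = image_available_stdout == "1"

verify_os("isos_present", "1")
print(facts["isos_present"])  # True
```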
Using ansible, I need to put a list of hosts in line in a file like so:
["127.0.0.1", "127.0.0.2", "127.0.0.3"]
But whenever I achieve this format, ansible interprets it as a list and the content of the file is this pythonic version:
['127.0.0.1', '127.0.0.2', '127.0.0.3']
Here are my attempts to get it out thus far:
---
- hosts: all
  gather_facts: False
  tasks:
    - set_fact:
        myhosts:
          - 127.0.0.1
          - 127.0.0.2
          - 127.0.0.3

    # This comes out as a list, I need a string
    - set_fact:
        var: "[ \"{{ myhosts | join('\", \"') }}\" ]"
    - debug: var=var

    # This comes out as a string, but I need no underscore on it
    - set_fact:
        var: "_[ \"{{ myhosts | join('\", \"') }}\" ]"
    - debug: var=var

    # This also comes out as a list
    - set_fact:
        var: >
          [ "{{ myhosts | join('", "') }}" ]
    - debug: var=var

    # Also parsed as a list
    - set_fact:
        var: "{{ myhosts | to_json }}"
    - debug: var=var
# ansible-playbook -i "localhost," this_file.yml
There are some filters that prevent the Ansible template engine from doing string evaluation.
This list of filters is stored in STRING_TYPE_FILTERS setting.
In Ansible 2.1 it contains: string, to_json, to_nice_json, to_yaml, ppretty, json.
So, you can do this:
- lineinfile: line="{{ myhosts | to_json }}" dest=output.txt
This will add ["127.0.0.1", "127.0.0.2", "127.0.0.3"] line to the file.
And don't believe debug's output when dealing with exact string formatting.
Always use copy: content="{{ string_output_to_test | string }}" dest=test.txt and check file contents to be sure.
debug: var=myvar will always template with evaluation, so your string will always be printed as a list.
debug: msg="{{ myvar | string }}" will print myvar as JSON encoded string.
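The underlying difference is just Python's repr of a list versus proper JSON encoding, which a quick sketch makes concrete:

```python
import json

myhosts = ["127.0.0.1", "127.0.0.2", "127.0.0.3"]

# str() gives the single-quoted "pythonic" form Ansible templating produces
print(str(myhosts))         # ['127.0.0.1', '127.0.0.2', '127.0.0.3']

# json.dumps mirrors the to_json filter: double quotes, valid JSON
print(json.dumps(myhosts))  # ["127.0.0.1", "127.0.0.2", "127.0.0.3"]
```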