I have multiple .json files on the local host where I place my playbook:
json-file-path/{{ testName }}.json
The {{ testName }}.json files are: testA.json, testB.json, testC.json, etc.
All the .json files have the same keys with different values, like this:
json-file-path/testA.json:
{
  "a_key": "a_value1",
  "b_key": "b_value1"
}
json-file-path/testB.json:
{
  "a_key": "a_value2",
  "b_key": "b_value2"
}
json-file-path/testC.json:
{
  "a_key": "a_value3",
  "b_key": "b_value3"
}
.....
I need to access the key-value variables from all the .json files, and if the values meet some condition, I will perform a task on the target host. For example, I have:
a_value1=3
a_value2=4
a_value3=1
I go through my .json files one by one; if the value of a_key > 3, I copy that .json file to the target host, otherwise I skip it. In this case, I would only copy testB.json to the target host.
How would I achieve this? I was thinking of re-constructing my .json files using {{ testName }} as a dynamic key of a dict, like this:
{
  "testA": {
    "a_key": "a_value1",
    "b_key": "b_value1"
  }
}
So I can access my variables as {{ testName }}.a_key. So far I haven't been able to achieve this.
I have tried the following in my playbook:
---
- hosts: localhost
  tasks:
    - name: construct json files
      vars:
        my_vars:
          a_key: "{{ a_value }}"
          b_key: "{{ b_value }}"
      with_dict: "{{ testName }}"
      copy:
        content: "{{ my_vars | to_nice_json }}"
        dest: /json-file-path/{{ testName }}.json
My updated playbook is:
/mypath/tmp/include.yaml:
---
- hosts: remote_hostName
  tasks:
    - name: load json files
      set_fact:
        json_data: "{{ lookup('file', item) | from_json }}"
    - name: copy json file if condition meets
      copy:
        src: "{{ item }}"
        dest: "{{ remote_dest_dir }}/{{ item | basename }}"
      delegate_to: "{{ remote_hostName }}"
      when: json_data.a_key|int > 5
/mypath/test.yml:
---
- hosts: localhost
  vars:
    local_src_dir: /mypath/tmp
    remote_host: remote_hostName
    remote_dest_dir: /remote_host_path/tmp
  tasks:
    - name: looping
      include: include.yaml
      with_fileglob:
        - "{{ local_src_dir }}/*json"
All the JSON files on localhost are under /mypath/tmp/.
Latest version of the playbook. It is working now:
/mypath/tmp/include.yaml:
---
- name: loading json files
  include_vars:
    file: "{{ item }}"
    name: json_data
- name: copy json file to remote if condition meets
  copy:
    src: "{{ item }}"
    dest: '/remote_host_path/tmp/{{ item | basename }}'
  delegate_to: "{{ remote_host }}"
  when: json_data.a_key > 5
/mypath/test.yml:
---
- hosts: localhost
  vars:
    local_src_dir: /mypath/tmp
    remote_host: remote_hostName
    remote_dest_dir: /remote_host_path/tmp
  tasks:
    - name: looping json files
      include: include.yaml
      with_fileglob:
        - "{{ local_src_dir }}/*json"
I am hoping that I have understood your requirements correctly, and that this helps move you forward.
Fundamentally, you can load each of the JSON files so you can query the values as native Ansible variables. You can therefore loop through all the files, read each one, compare the value you are interested in, and conditionally copy it to your remote host via a delegated task. Give this a try:
Create an include file include.yaml:
---
# 'item' contains a path to a local JSON file on each pass of the loop
- name: Load the json file
  set_fact:
    json_data: "{{ lookup('file', item) | from_json }}"

- name: Delegate a copy task to the remote host conditionally
  copy:
    src: "{{ item }}"
    dest: "{{ remote_dest_dir }}/{{ item | basename }}"
  delegate_to: "{{ remote_host }}"
  when: json_data.a_key > value_threshold
then in your playbook:
---
- hosts: localhost
  connection: local
  # Set some example vars, though these could be placed in a variety of places
  vars:
    local_src_dir: /some/local/path
    remote_host: <some_inventory_hostname>
    remote_dest_dir: /some/remote/path
    value_threshold: 3
  tasks:
    - name: Loop through all *json files, passing matches to include.yaml
      include: include.yaml
      loop: "{{ lookup('fileglob', local_src_dir + '/*json').split(',') }}"
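The fileglob lookup returns a single comma-separated string, which is why the split(',') is needed; that split breaks if a path ever contains a comma. On Ansible 2.5+, query() returns a proper list instead, so a hedged variant would be:

```yaml
- name: Loop through all *json files (list-returning variant, Ansible 2.5+)
  include_tasks: include.yaml
  loop: "{{ query('fileglob', local_src_dir + '/*json') }}"
```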
Note: As you are running an old version of Ansible, you may need older alternate syntax for all of this to work:
In your include file:
- name: Load the json file
  include_vars: "{{ item }}"

- name: Delegate a copy task to the remote host conditionally
  copy:
    src: "{{ item }}"
    dest: "{{ remote_dest_dir }}/{{ item | basename }}"
  delegate_to: "{{ remote_host }}"
  when: a_key > value_threshold
and in your playbook:
- name: Loop through all *json files, passing matches to include.yaml
  include: include.yaml
  with_fileglob:
    - "{{ local_src_dir }}/*json"
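As an aside, the dynamic-key dict sketched in the question ({{ testName }}.a_key) can also be built at loop time with combine. A minimal sketch, assuming the files live under /json-file-path and using a hypothetical my_tests variable:

```yaml
- name: Build a dict keyed by test name (hypothetical my_tests var)
  set_fact:
    my_tests: "{{ my_tests | default({})
      | combine({(item | basename | splitext | first): lookup('file', item) | from_json}) }}"
  with_fileglob:
    - /json-file-path/*.json
```

my_tests.testA.a_key would then resolve to a_value1.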
I know the Ansible fetch module can copy a file from remote to local, but what if I only need the contents (in my case a tmp file holding the IP address) appended to a local file?
Fetch module does this:
- name: Store file into /tmp/fetched/
ansible.builtin.fetch:
src: /tmp/somefile
dest: /tmp/fetched
I need it to do something like this:
- name: Store file into /tmp/fetched/
ansible.builtin.fetch:
src: /tmp/somefile.txt
dest: cat src >> /tmp/fetched.txt
In a nutshell:
- name: Get remote file content
ansible.builtin.slurp:
src: /tmp/somefile.txt
register: somefile
- name: Append remote file content to a local file
vars:
target_file: /tmp/fetched.txt
ansible.builtin.copy:
content: |-
{{ lookup('file', target_file) }}
{{ somefile.content | b64decode }}
dest: "{{ target_file }}"
# Fix write concurrency when running on multiple targets
throttle: 1
delegate_to: localhost
Notes:
the second task isn't idempotent (it will modify the file on each run, even with the same content to append)
this will work for small target files. If the file becomes huge and you experience high execution times / memory consumption, you might want to switch to shell for the second task:
- name: Append remote file content to a local file
ansible.builtin.shell:
cmd: echo "{{ somefile.content | b64decode }}" >> /tmp/fetched
# You might still want to avoid concurrency with multiple targets
throttle: 1
delegate_to: localhost
Alternatively, you could write all contents from all fetched files from all your targets in one go to avoid the concurrency problem and gain some time.
# Copy solution
- name: Append remote files contents to a local file
vars:
target_file: /tmp/fetched.txt
fetched_content: "{{ ansible_play_hosts
| map('extract', hostvars, ['somefile', 'content'])
| map('b64decode')
| join('\n') }}"
ansible.builtin.copy:
content: |-
{{ lookup('file', target_file) }}
{{ fetched_content }}
dest: "{{ target_file }}"
delegate_to: localhost
run_once: true
# Shell solution
- name: Append remote files contents to a local file
vars:
fetched_content: "{{ ansible_play_hosts
| map('extract', hostvars, ['somefile', 'content'])
| map('b64decode')
| join('\n') }}"
ansible.builtin.shell:
cmd: echo "{{ fetched_content }}" >> /tmp/fetched
delegate_to: localhost
run_once: true
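If the non-idempotent append bothers you, a marker-based alternative with blockinfile would rewrite each host's section in place instead of appending forever. A sketch under the same assumed paths:

```yaml
- name: Maintain each host's content idempotently (blockinfile sketch)
  ansible.builtin.blockinfile:
    path: /tmp/fetched.txt
    create: true
    marker: "# {mark} content from {{ inventory_hostname }}"
    block: "{{ somefile.content | b64decode }}"
  throttle: 1
  delegate_to: localhost
```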
I'm looking for a way to sync files between two hosts in Ansible. The scenario is as follows: I have a CSV file with three columns. The first two columns indicate the source and target servers, and the third column indicates the directory which needs to be synced between them.
source, target, directory
src01, tgt02, dir0003
src02, tgt05, dir0004
src10, tgt68, dir1022
I found this answer for syncing files between 2 hosts - How to copy files between two nodes using ansible
Is there any way to parameterize this using a csv config file?
Yes, it's possible. In the first play, read the CSV file and create a group of targets. Use the new group in the second play and loop over the synchronize module. For example, the playbook
- hosts: localhost
tasks:
- read_csv:
path: db.csv
register: my_db
- add_host:
hostname: "{{ item.target }}"
groups: my_targets
my_list: "{{ my_db.list }}"
loop: "{{ my_db.list }}"
- hosts: my_targets
tasks:
- debug:
msg: "Copy {{ item.directory }} from {{ item.source }} to {{ inventory_hostname }}"
loop: "{{ my_list|json_query(query) }}"
vars:
query: "[?target == '{{ inventory_hostname }}']"
- name: Copy
synchronize:
src: "{{ item.directory }}"
dest: "{{ item.directory }}"
delegate_to: "{{ item.source }}"
loop: "{{ my_list|json_query(query) }}"
vars:
query: "[?target == '{{ inventory_hostname }}']"
(not tested)
gives
"msg": "Copy dir0004 from src02 to tgt05"
"msg": "Copy dir0003 from src01 to tgt02"
"msg": "Copy dir1022 from src10 to tgt68"
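The json_query filter above is JMESPath; the per-target selection it performs can be sketched in plain Python (hypothetical data mirroring the CSV):

```python
# Rows as read_csv would return them in my_db.list
rows = [
    {"source": "src01", "target": "tgt02", "directory": "dir0003"},
    {"source": "src02", "target": "tgt05", "directory": "dir0004"},
    {"source": "src10", "target": "tgt68", "directory": "dir1022"},
]

def rows_for(host):
    # Equivalent of json_query("[?target == '" + host + "']")
    return [r for r in rows if r["target"] == host]

print(rows_for("tgt05"))  # [{'source': 'src02', 'target': 'tgt05', 'directory': 'dir0004'}]
```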
I have several files that I need to back up in different directories. I have tried the code below, but it is not working for me.
vars:
file_vars:
- {name: /file1}
- {name: /etc/file2}
- {name: /etc/file/file3}
tasks:
- name: "Checking if config files exists"
stat:
path: "{{ item.name }}"
with_items: "{{ file_vars }}"
register: stat_result
- name: Backup Files
copy: src={{ item.name }} dest={{ item.name }}{{ ansible_date_time.date }}.bak
with_items: "{{ file_vars }}"
remote_src: yes
when: stat_result.stat.exists == True
The problem is the condition
when: stat_result.stat.exists == True
There is no attribute stat_result.stat. Instead, the attribute stat_result.results is a list of the results from the loop. It's possible to create a dictionary of files and their statuses. For example
- set_fact:
files_stats: "{{ dict(my_files|zip(my_stats)) }}"
vars:
my_files: "{{ stat_result.results|json_query('[].item.name') }}"
my_stats: "{{ stat_result.results|json_query('[].stat.exists') }}"
Then simply use this dictionary in the condition
when: files_stats[item.name]
Below is a shorter version which creates the dictionary more efficiently
- set_fact:
files_stats: "{{ dict(stat_result.results|
json_query('[].[item.name, stat.exists]')) }}"
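In plain Python terms, the dict that set_fact builds is just a zip of the two lists (illustrative paths and statuses):

```python
# What json_query('[].item.name') and json_query('[].stat.exists') would
# yield from the registered stat results (illustrative values)
my_files = ["/file1", "/etc/file2", "/etc/file/file3"]
my_stats = [True, False, True]

# Equivalent of dict(my_files | zip(my_stats)) in the set_fact above
files_stats = dict(zip(my_files, my_stats))
print(files_stats["/etc/file2"])  # False
```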
Please try the below; it worked for me:
---
- name: Copy files
hosts: localhost
become: yes
become_user: root
vars_files:
- files.yml
tasks:
- name: "Checking if config files exists"
stat:
path: "{{ item }}"
with_items: "{{ files }}"
register: stat_result
- name: Ansible
debug:
msg: "{{ stat_result }}"
- name: Backup Files
  copy:
    src: "{{ item.item }}"
    dest: "{{ item.item }}.bak"
  with_items: "{{ stat_result.results }}"
  when: item.stat.exists
and files.yml will look like:
---
files:
- /tmp/file1
- /tmp/file2
You can check your playbook syntax using the below command:
ansible-playbook copy.yml --syntax-check
You can also do a dry run of your playbook before actual execution:
ansible-playbook -i localhost copy.yml --check
I have created a playbook which runs on a remote host and checks whether files exist. I want to extract only the files which are not present on the remote host, but my playbook returns all paths, whether they are present or not.
Playbook:
- name: Playbook for files not present on remote hosts
hosts: source
gather_facts: false
vars:
Filepath: /opt/webapps/obiee/oracle_common/inventory/ContentsXML/comps.xml
tasks:
- name: Getting files location path
shell: grep -i "COMP NAME" {{ Filepath }} | sed 's/^.*INST_LOC="//' | cut -f1 -d'"' | sed '/^$/d;s/[[:blank:]]//g'  # extract files from comps.xml
register: get_element_attribute
- name: check path present or not
stat:
path: "{{ item }}"
with_items:
- "{{ get_element_attribute.stdout_lines }}"
register: path_output
- name: path exists or not
set_fact:
path_item: "{{ item }}" # here I get the expected output: the files not present on the remote host
with_items: "{{ path_output.results }}"
register: final_output
when: item.stat.exists == False
- debug:
    var: final_output # gives both outputs, i.e. files present and absent
- name: Create a fact list
set_fact:
paths: "{{ final_output.results | map(attribute='item.item') | list }}" # I have added the condition 'item.stat.exists == False' inside this statement
- name: Print Fact
debug:
var: paths
The issue was resolved by using the below:
- name: Create a fact list
set_fact:
paths: "{{ final_output.results | selectattr('item.stat.exists', 'equalto', false) | map(attribute='item.item') | list }}"
register: config_facts
The following query should get all the file names which don't exist on the remote host and store them in the fact 'paths':
- name: Create a fact list
set_fact:
paths: "{{ final_output | json_query(query)}}"
vars:
query: "results[?(@._ansible_item_label.stat.exists==`false`)]._ansible_item_label.item"
I am storing all Ansible variables to a YAML file (filtering out those that start with 'ansible_') with this playbook:
- hosts: localhost
tasks:
- set_fact:
all_vars: "{{all_vars | default({}) |combine({item.key: item.value})}}"
when: "{{not item.key.startswith('ansible_')}}"
with_dict: "{{vars}}"
- copy:
content: "{{ all_vars }}"
dest: "/tmp/tmp.yml"
This is group_vars/all/defaults.yml
SOME_FACT1: "some-fact"
SOME_FACT2: "{{ SOME_FACT1 }}"
SOME_FACT3: "{{ SOME_FACT2 }}"
This works perfectly with ansible 2.2. But with ansible 2.3 (2.3.1.0) the variables are not rendered.
I get results like this:
... "SOME_FACT1": "some-fact", "SOME_FACT3": "{{ SOME_FACT2 }}", "SOME_FACT2": "{{ SOME_FACT1 }}" ...
How can I force Ansible 2.3 to render the variables?
The problem seems to be that Ansible will not render vars and (I do not know why) all_vars. But any variable inside vars/all_vars is rendered properly when used directly.
So this works:
- hosts: localhost
tasks:
- set_fact:
all_vars: "{{ all_vars | default([]) | union([item.key + ': {{ ' + item.key + ' | to_json }}']) }}"
when: "{{not item.key.startswith('ansible_')}}"
with_dict: "{{vars}}"
- copy:
content: "{{ all_vars | join('\n') }}"
dest: "/tmp/tmp1.yml"
- template:
src: "/tmp/tmp1.yml"
dest: "/tmp/tmp.yml"
The idea is:
Create a file that lists all variables in the format
SOME_VAR: {{ SOME_VAR | to_json }}
...
Render that file using template.
Not very nice, but it works.