Use of the diff command with Ansible

I am trying a simple task: find the difference between two files and store it in a file. I am not able to do it with either command or shell. Please suggest where I am going wrong:
---
- hosts: myserver
  tasks:
    - name: get the difference
      command: diff hosts.new hosts.mod
      register: diff
    - debug: var=diff.cmd
Error -
fatal: [zlp12037]: FAILED! => {"changed": true, "cmd": ["diff", "hosts.new", "hosts.mod"], "delta": "0:00:00.003102", "end": "2017-03-29 10:17:34.448063", "failed": true, "rc": 1, "start": "2017-03-29 10:17:34.444961", "stderr": "", "stdout":

I'm not quite sure what your input play looks like with your formatting. But the following should be a solution:
- name: "Get difference from two files"
  command: diff filea fileb
  args:
    chdir: "/home/user/"
  failed_when: "diff.rc > 1"
  register: diff

- name: debug output
  debug: msg="{{ diff.stdout }}"
Some explanation:
diff exits with 0 when the files match, 1 when they differ, and > 1 when something actually goes wrong, so "failed_when: diff.rc > 1" only fails the task on a real error.
To get the output of the command, we print the ".stdout" element.
To make sure we're in the folder where the files are, we use "chdir".
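The rc threshold can be checked straight from a shell. This is a quick sketch using throwaway files under /tmp (the filenames are made up for the demonstration):

```shell
# diff exit codes: 0 = files identical, 1 = files differ, 2 = trouble
# (e.g. a missing file). failed_when: diff.rc > 1 therefore treats
# "files differ" as success and only real errors as failures.
printf 'a\n' > /tmp/filea
printf 'a\n' > /tmp/fileb
rc=0; diff /tmp/filea /tmp/fileb > /dev/null || rc=$?
echo "identical: rc=$rc"
printf 'b\n' > /tmp/fileb
rc=0; diff /tmp/filea /tmp/fileb > /dev/null || rc=$?
echo "differ: rc=$rc"
rc=0; diff /tmp/filea /tmp/nosuchfile 2> /dev/null || rc=$?
echo "missing file: rc=$rc"
```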

I would move the hosts.new or hosts.mod to the ansible control machine.
Run the copy module with the src as hosts.new and the dest as hosts.mod with --check and --diff. I find this method most useful to spot differences in files across a large enterprise.
Run:
ansible all -m copy -a "src=hosts.new dest=/tmp/hosts.mod" --check --diff -i hosts
Output:
--- before: /tmp/hosts.mod
+++ after: /home/ansible/hosts.new
@@ -1,5 +1,5 @@
host1
+host2
host3
host4
-host6
-host99
+host5
test10 | SUCCESS => {
"changed": true,
"failed": false
}

Related

Not able to gather facts of ansible host machine

The setup module in Ansible gives an error when I try to set custom facts on the host machine from the control machine.
---
- hosts: test-servers
  gather_facts: false
  tasks:
    - name: deleting Facts directory
      file:
        path: /etc/ansible/facts.d/
        state: absent
    - name: Creates a directory
      file:
        path: /etc/ansible/facts.d/
        recurse: yes
        state: directory
    - name: Copy custom date facts to host machine
      copy:
        src: /app/ansible_poc/roles/custom_facts/templates/facts.d/getdate.fact
        dest: /etc/ansible/facts.d/getdate.fact
        mode: 0755
    - name: Copy custom role facts to host machine
      copy:
        src: /app/ansible_poc/roles/custom_facts/templates/facts.d/getrole.fact
        dest: /etc/ansible/facts.d/getrole.fact
        mode: 0755
    - name: Reloading facts
      setup:
    - name: Display message
      debug:
        msg: "{{ ansible_local.getdate.date.date }}"
    - name: Display message
      debug:
        msg: "{{ ansible_local.getrole.role.role }}"
I get the following error when I try to collect facts from the host machines. I have set up the files getdate.fact and getrole.fact, which contain respectively:
#############getdate.fact###############
echo [date]
echo date= `date`
########################################
#############getrole.fact###############
echo [role]
echo role= `whoami`
########################################
and when I run the playbook main.yml it gives the following error:
[root@ansibletower tasks]# ansible -m setup test-servers
192.168.111.28 | FAILED! => {
    "changed": false,
    "cmd": "/etc/ansible/facts.d/getdate.fact",
    "msg": "[Errno 8] Exec format error",
    "rc": 8
}
192.168.111.27 | FAILED! => {
    "changed": false,
    "cmd": "/etc/ansible/facts.d/getdate.fact",
    "msg": "[Errno 8] Exec format error",
    "rc": 8
}
If I recall correctly, executables are expected to return JSON:
#!/bin/bash
echo '{ "date" : "'$( date )'" }'
You probably need to add a "shebang" line to your fact scripts, i.e. getdate.fact should look like:
#!/bin/sh
echo [date]
echo date=`date`
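Putting the two answers together: the script needs a shebang, and because the .fact file is installed executable (mode 0755), Ansible runs it and parses its stdout as JSON rather than as INI. A minimal sketch, written to /tmp purely for illustration (on the managed host it would live in /etc/ansible/facts.d/):

```shell
# Hypothetical getdate.fact: shebang present, output is JSON.
cat > /tmp/getdate.fact <<'EOF'
#!/bin/sh
printf '{ "date": "%s" }\n' "$(date +%F)"
EOF
chmod +x /tmp/getdate.fact
/tmp/getdate.fact
```

A non-executable .fact file, by contrast, is read directly and may use the INI format shown in the question.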

when condition to evaluate list of values inside with_items

My intention with the playbook below is to run a shell command only when it finds any of the values from disk_list. I need help framing the when condition for that.
---
- hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - set_fact:
        disk_list:
          - sda
          - sdb
          - sdc
    - name: Get df -hT output to know XFS file system
      shell: df -hT |grep xfs|awk '{print $1}'
      register: df_result
    - name: Run shell command on each XFS file system
      shell: ls -l {{ item }} | awk '{print $1}'
      with_items: "{{ df_result.stdout_lines }}"
      when: "{{ disk_list.[] }} in {{ item }}"
BTW, in my system, "df_result" variable looks as below:
TASK [debug] ***************************************************************************************************************************************
ok: [localhost] => {
    "df_result": {
        "changed": true,
        "cmd": "df -hT |grep xfs|awk '{print $1}'",
        "delta": "0:00:00.017588",
        "end": "2019-03-01 23:55:21.318871",
        "failed": false,
        "rc": 0,
        "start": "2019-03-01 23:55:21.301283",
        "stderr": "",
        "stderr_lines": [],
        "stdout": "/dev/sda3\n/dev/sda1",
        "stdout_lines": [
            "/dev/sda3",
            "/dev/sda1"
        ]
    }
}
Please help!
Some notes on the playbook:
1. For localhost, connection: local is implicit, so there is no technical need to specify it.
2. set_fact works, but here it is more appropriate to use vars at the play level instead of the set_fact module in a task. This also lets you use vars_files for easier handling of disk device lists.
3. I would recommend keeping your task names simple, as you may want to use the --start-at-task option at some point.
4. with_items is deprecated in favor of loop.
5. The when conditional works at the task level, so it will execute the task (for all its items) if the condition is met.
Here is a working version of what you need, which also reflects the five recommendations above:
---
- hosts: localhost
  gather_facts: false
  vars:
    disk_list:
      - sda
      - sdb
      - sdc
  tasks:
    - name: cleans previous runs
      shell: cat /dev/null > /tmp/df_result
    - name: creates temporary file
      shell: df -hT | grep ext4 | grep -i {{ item }} | awk '{print $1}' >> /tmp/df_result
      loop: "{{ disk_list }}"
    - name: creates variable
      shell: cat /tmp/df_result
      register: df_result
    - name: shows info
      shell: ls -l {{ item }} | awk '{print $1}'
      loop: "{{ df_result.stdout_lines }}"
I basically just use a temporary file (/tmp/df_result) with the results you need already filtered by the "grep -i {{ item }}" used in the loop on task "creates temporary file". Then, the loop in "shows info" just iterates over an already clean list of items.
To see the result on screen, you could use "-v" option when running the playbook or if you want to save the result in a file, you could add " >> /tmp/df_final" at the end of the shell line in "shows info" task.
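Outside of Ansible, the filtering done by the two middle tasks can be sketched in plain shell; the device names below are invented stand-ins for real df output:

```shell
# Stand-in for the `df -hT | grep ext4 | awk '{print $1}'` output:
printf '/dev/sda3\n/dev/sda1\n/dev/vdb1\n' > /tmp/df_all
: > /tmp/df_result
# One grep per entry of disk_list, as in the "creates temporary file" task:
for disk in sda sdb sdc; do
    grep -i "$disk" /tmp/df_all >> /tmp/df_result || true
done
# Only the sda devices survive the filter:
cat /tmp/df_result
```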
I accidentally stumbled on this post, which is kind of old. I'm sure you already fixed this in your environment, or maybe you found a better way to do it; hopefully you did.
Regards,

Passing External Vars with Ansible to a Playbook

Ansible variables passed via command line are not getting defined in the playbook.
I'm looking to pass external variables via the command line to an Ansible playbook. It is not working as expected with -e, which according to the Ansible documentation is the flag for passing extra variables.
ansible-playbook /opt/playbooks/shutdown.yaml -f 10 -i /opt/inventory/hosts -e 'logPath=/my/log/path logName=shutdown.log logDir=shutdown'
---
- name: Transfer and execute a script.
  hosts: all
  remote_user: ansible
  sudo: yes
  tasks:
    - name: Transfer the script
      copy: src=/opt/files/shutdown.sh dest=/tmp/ mode=0777
    - name: Execute the script
      command: sh /tmp/shutdown.sh logPath logName logDir
    - name: cat log output
      command: cat logDir
      register: myoutput
    - name: get stdout of execution of script
      debug: msg={{ myoutput.stdout_lines }}
Here is my output; I'm expecting logPath to be resolved from the key=value pairs passed on the command line:
: FAILED! => {"changed": true, "cmd": ["cat", "logPath"], "delta": "0:00:00.005258", "end": "2019-02-06 13:30:03.551631", "failed": true, "rc": 1, "start": "2019-02-06 13:30:03.546373", "stderr": "cat: logPath: No such file or directory", "stderr_lines": ["cat: logPath: No such file or directory"], "stdout": "", "stdout_lines": []}
to retry, use: --limit @/opt/playbooks/shutdown.retry
Your command task seems wrong; you need curly brackets for Ansible to treat the enclosed string as a variable (and replace it with its value). Try this syntax:
- name: Execute the script
  command: sh /tmp/shutdown.sh {{ logPath }} {{ logName }} {{ logDir }}
hope it helps
These can also be passed in JSON notation, which supports data types other than strings:
-e '{"log_path":"/my/log/path","log_name":"shutdown.log","log_dir":"shutdown"}'
and then substitute accordingly:
- name: Execute the script
  command: sh /tmp/shutdown.sh {{ log_path }} {{ log_name }} {{ log_dir }}
Snake case, rather than camel case, is the usual convention for variable names.
see the documentation.
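For reference, the full invocation with JSON extra-vars would look roughly like the commented line below (paths taken from the question); the echo pipeline is just a quick local check that the JSON payload is well-formed before handing it to -e:

```shell
# ansible-playbook /opt/playbooks/shutdown.yaml -i /opt/inventory/hosts \
#   -e '{"log_path":"/my/log/path","log_name":"shutdown.log","log_dir":"shutdown"}'
# Sanity-check the payload locally:
echo '{"log_path":"/my/log/path","log_name":"shutdown.log","log_dir":"shutdown"}' \
  | python3 -m json.tool
```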

group_names variable in ansible

I am running some issues when I execute this playbook:
- hosts: all
  connection: local
  tasks:
    - name: create common config snippets
      template: src=/etc/ansible/{{group_names}}/common.j2 dest=/etc/ansible/configs/{{inventory_hostname}}.txt
the error that I am getting is:
fatal: [R1]: FAILED! => {"changed": false, "failed": true, "msg": "Unable to find '/etc/ansible/[u'ios']/common.j2' in expected paths."}
fatal: [R2]: FAILED! => {"changed": false, "failed": true, "msg": "Unable to find '/etc/ansible/[u'ios1']/common.j2' in expected paths."}
and here are my groups:
/etc/ansible# cat hosts | grep ios
[ios]
[ios1]
and here are my common.j2 files:
/etc/ansible# ls ios1/
common.j2
/etc/ansible# ls ios/
common.j2
Could someone elaborate why group_names renders as [u'ios'] please?
Because group_names is a list (that's why it is surrounded by [ ]) -- a host can belong to multiple groups.
You need to decide, what is your objective:
If you wanted to include files for all groups, you have to add a loop:
- hosts: all
  connection: local
  tasks:
    - name: create common config snippets
      template:
        src: /etc/ansible/{{ item }}/common.j2
        dest: /etc/ansible/configs/{{ inventory_hostname }}.txt
      with_items: "{{ group_names }}"
If you wanted to add a single group, you could refer to a single element (group_names[0]), but that doesn't seem practical...
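The mangled path from the question can be reproduced outside Ansible: interpolating a whole list into a string produces its Python repr (the u prefix in the error comes from Python 2, which older Ansible versions ran on), while indexing yields a plain string:

```shell
# Broken: the list itself lands in the path.
python3 -c "g = ['ios']; print('/etc/ansible/{}/common.j2'.format(g))"
# Fixed: a single element, equivalent to group_names[0].
python3 -c "g = ['ios']; print('/etc/ansible/{}/common.j2'.format(g[0]))"
```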

Ansible copy from the remote server to ansible host fails

I need to copy the latest log file from remote linux server to the ansible host. This is what I have tried so far.
- hosts: [host]
  remote_user: root
  tasks:
    - name: Copy the file
      command: bash -c "ls -rt | grep install | tail -n1"
      register: result
      args:
        chdir: /root
    - name: Copying the file
      copy:
        src: "/root/{{ result.stdout }}"
        dest: /home
But I am getting the following error:
TASK [Gathering Facts] ********************************************************************************************************************************************************************************************
ok
TASK [Copy the file] **********************************************************************************************************************************************************************************************
changed: => {"changed": true, "cmd": ["bash", "-c", "ls -rt | grep install | tail -n1"], "delta": "0:00:00.011388", "end": "2017-06-14 07:53:26.475344", "rc": 0, "start": "2017-06-14 07:53:26.463956", "stderr": "", "stdout": "install.20170614-051027.log", "stdout_lines": ["install.20170614-051027.log"], "warnings": []}
TASK [Copying the file] *******************************************************************************************************************************************************************************************
fatal: FAILED! => {"changed": false, "failed": true, "msg": "Unable to find 'install.20170614-051027.log' in expected paths."}
PLAY RECAP ********************************************************************************************************************************************************************************************************
: ok=2 changed=1 unreachable=0 failed=1
But that file is right there. Please help me resolve this issue.
The Ansible copy module copies files from the Ansible host to the remote host. Use the fetch module instead.
http://docs.ansible.com/ansible/fetch_module.html
This one works; I had to use fetch instead of copy to get the file from the remote host:
- name: Copy the file
  command: bash -c "ls -rt | grep install | tail -n1"
  register: result
  args:
    chdir: /root

- name: Copying the file
  fetch:
    src: "/root/{{ result.stdout }}"
    dest: /home
    flat: yes
