Shared connection to host closed - ansible

I am trying to change the user for a certain task but am unable to make it work. I am getting the error Shared connection to host closed.
Here is my playbook:
cat test1.yml
- hosts: hcmbox
  tasks:
    - name: Find out my identity Bourne
      command: "whoami"
      register: idoutput
      become: true
      become_user: ebs1
    - debug: msg="{{ idoutput.stdout }}"
I get this error:
ansible-playbook -i inventory test1.yml
PLAY [hcmbox] *************************************************************************************************************************************************************************
TASK [Gathering Facts] ****************************************************************************************************************************************************************
ok: [srvdb1.localdomain]
TASK [Find out my identity Bourne] ****************************************************************************************************************************************************
fatal: [srvdb1.localdomain]: FAILED! => {"changed": false, "module_stderr": "Shared connection to srvdb1.localdomain closed.\r\n", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
PLAY RECAP ****************************************************************************************************************************************************************************
srvdb1.localdomain : ok=1 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
I removed the following two lines from the playbook:
become: true
become_user: ebs1
and executed it again; here is the output:
PLAY [hcmbox] *************************************************************************************************************************************************************************
TASK [Gathering Facts] ****************************************************************************************************************************************************************
ok: [srvdb1.localdomain]
TASK [Find out my identity Bourne] ****************************************************************************************************************************************************
changed: [srvdb1.localdomain]
TASK [debug] **************************************************************************************************************************************************************************
ok: [srvdb1.localdomain] => {
"msg": "oracle"
}
PLAY RECAP ****************************************************************************************************************************************************************************
srvdb1.localdomain : ok=3 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
The SSH trust and everything else looks OK to me:
[oracle@ansctrlsrv epd3]$ ansible all -i 'srvdb1,' -m command -a 'whoami' -u ebs1
srvdb1 | CHANGED | rc=0 >>
ebs1
Even plain ssh works fine; I checked it:
[oracle@ansctrlsrv epd3]$ ssh ebs1@srvdb1
Last login: Tue Feb 4 10:35:51 2020 from ansctrlsrv.localdomain
[ebs1@srvdb1 ~]$
I am at a loss as to why I am getting this error: Shared connection to srvdb1.localdomain closed
When I run with -vvv, this is what I see:
ansible-playbook -i inventory -vvv test3.yml
ansible-playbook 2.8.4
config file = /stage/ap/ansible/epd3/ansible.cfg
configured module search path = [u'/oracle/app/oracle/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /bin/ansible-playbook
python version = 2.7.5 (default, Aug 7 2019, 08:19:52) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39.0.1)]
Using /stage/ap/ansible/epd3/ansible.cfg as config file
host_list declined parsing /stage/ap/ansible/epd3/inventory as it did not pass it's verify_file() method
script declined parsing /stage/ap/ansible/epd3/inventory as it did not pass it's verify_file() method
auto declined parsing /stage/ap/ansible/epd3/inventory as it did not pass it's verify_file() method
Parsed /stage/ap/ansible/epd3/inventory inventory source with ini plugin
PLAYBOOK: test3.yml *******************************************************************************************************************************************************************
1 plays in test3.yml
PLAY [hcmbox] *************************************************************************************************************************************************************************
META: ran handlers
TASK [command] ************************************************************************************************************************************************************************
task path: /stage/ap/ansible/epd3/test3.yml:10
<dbsrv1.localdomain> ESTABLISH SSH CONNECTION FOR USER: None
<dbsrv1.localdomain> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/oracle/app/oracle/.ansible/cp/9c1b54644d dbsrv1.localdomain '/bin/sh -c '"'"'echo ~ && sleep 0'"'"''
<dbsrv1.localdomain> (0, '/oracle/app/oracle\n', '')
<dbsrv1.localdomain> ESTABLISH SSH CONNECTION FOR USER: None
<dbsrv1.localdomain> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/oracle/app/oracle/.ansible/cp/9c1b54644d dbsrv1.localdomain '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /var/tmp/ansible-tmp-1580841289.87-193361522034615 `" && echo ansible-tmp-1580841289.87-193361522034615="` echo /var/tmp/ansible-tmp-1580841289.87-193361522034615 `" ) && sleep 0'"'"''
<dbsrv1.localdomain> (0, 'ansible-tmp-1580841289.87-193361522034615=/var/tmp/ansible-tmp-1580841289.87-193361522034615\n', '')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/commands/command.py
<dbsrv1.localdomain> PUT /oracle/app/oracle/.ansible/tmp/ansible-local-23567JjjVQn/tmpwYZDSF TO /var/tmp/ansible-tmp-1580841289.87-193361522034615/AnsiballZ_command.py
<dbsrv1.localdomain> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/oracle/app/oracle/.ansible/cp/9c1b54644d '[dbsrv1.localdomain]'
<dbsrv1.localdomain> (0, 'sftp> put /oracle/app/oracle/.ansible/tmp/ansible-local-23567JjjVQn/tmpwYZDSF /var/tmp/ansible-tmp-1580841289.87-193361522034615/AnsiballZ_command.py\n', '')
<dbsrv1.localdomain> ESTABLISH SSH CONNECTION FOR USER: None
<dbsrv1.localdomain> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/oracle/app/oracle/.ansible/cp/9c1b54644d dbsrv1.localdomain '/bin/sh -c '"'"'setfacl -m u:ebs1:r-x /var/tmp/ansible-tmp-1580841289.87-193361522034615/ /var/tmp/ansible-tmp-1580841289.87-193361522034615/AnsiballZ_command.py && sleep 0'"'"''
<dbsrv1.localdomain> (0, '', '')
<dbsrv1.localdomain> ESTABLISH SSH CONNECTION FOR USER: None
<dbsrv1.localdomain> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/oracle/app/oracle/.ansible/cp/9c1b54644d -tt dbsrv1.localdomain '/bin/sh -c '"'"'sudo -H -S -n -u ebs1 /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-qtuzljmaohdcqewvlstmcepjcdwdxsiy ; /usr/bin/python /var/tmp/ansible-tmp-1580841289.87-193361522034615/AnsiballZ_command.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<dbsrv1.localdomain> (1, '', 'Shared connection to dbsrv1.localdomain closed.\r\n')
<dbsrv1.localdomain> Failed to connect to the host via ssh: Shared connection to dbsrv1.localdomain closed.
<dbsrv1.localdomain> ESTABLISH SSH CONNECTION FOR USER: None
<dbsrv1.localdomain> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/oracle/app/oracle/.ansible/cp/9c1b54644d dbsrv1.localdomain '/bin/sh -c '"'"'rm -f -r /var/tmp/ansible-tmp-1580841289.87-193361522034615/ > /dev/null 2>&1 && sleep 0'"'"''
<dbsrv1.localdomain> (0, '', '')
failed: [dbsrv1.localdomain] (item=ebs1) => {
"ansible_loop_var": "item",
"changed": false,
"item": "ebs1",
"module_stderr": "Shared connection to dbsrv1.localdomain closed.\r\n",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"rc": 1
}
PLAY RECAP ****************************************************************************************************************************************************************************
dbsrv1.localdomain : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0

Which Python version is installed on your server?
Can you try installing Python 3 and adding the path to Python 3 in your inventory file?
It would look something like this:
srvdb1.localdomain ansible_python_interpreter=/usr/bin/python3
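If several hosts need the same interpreter, a minimal sketch of setting it per group instead of per host (the group name hcmbox is taken from the playbook above; adjust the path if Python 3 lives elsewhere):
[hcmbox]
srvdb1.localdomain

[hcmbox:vars]
ansible_python_interpreter=/usr/bin/python3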

Related

Unable to SSH to remote host via Ansible playbook

I have two playbooks.
Playbook1.yaml, which installs dependencies as the root user, works as expected, but Playbook2 gives errors. Can someone help look at it and explain why Playbook2 fails to run when the majority of the code is the same for both playbooks?
Playbook1 YAML file:
---
- name: Install Cognos Analytics
  hosts: all
  become_method: dzdo
  become_user: root
  become_flags: 'su -'
  tasks:
    - name: Install Cognos Analytics Dependencies
      yum:
        name:
          - java-1.8.0-openjdk
          - glibc.i686
          - glibc.x86_64
          - libstdc++.i686
          - libstdc++.x86_64
          - nspr.i686
          - nspr.x86_64
          - nss.i686
          - nss.x86_64
Now Playbook2 (below) gives the following error when I try to run it; can someone help me with this?
---
- name: Install Cognos Analytics
  hosts: all
  become_method: dzdo
  become_user: root
  become_flags: 'su -'
  tasks:
    - name: Installing Cognos Analytics
      command: /apps/Softwares/ca_instl_lnxi38664_2.0.2003191.bin -f /apps/Softwares/cognosresponsefile.properties -i silent
      args:
        chdir: /apps/SilentInstall
Error Log:
TASK [Installing Cognos Analytics] **************************************************************************************************************************
task path: /etc/ansible/Cognos.yml:9
<10.x.x.x> ESTABLISH SSH CONNECTION FOR USER: jughead
<10.x.x.x> SSH: EXEC sshpass -d8 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="jughead"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/a67e55b20e 10.x.x.x '/bin/sh -c '"'"'echo ~jughead && sleep 0'"'"''
<10.x.x.x> (0, '/home/jughead\n', '')
<10.x.x.x> ESTABLISH SSH CONNECTION FOR USER: jughead
<10.x.x.x> SSH: EXEC sshpass -d8 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="jughead"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/a67e55b20e 10.x.x.x '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/jughead/.ansible/tmp `"&& mkdir "` echo /home/jughead/.ansible/tmp/ansible-tmp-1624909377.53-12390-42703578539020 `" && echo ansible-tmp-1624909377.53-12390-42703578539020="` echo /home/jughead/.ansible/tmp/ansible-tmp-1624909377.53-12390-42703578539020 `" ) && sleep 0'"'"''
<10.x.x.x> (0, 'ansible-tmp-1624909377.53-12390-42703578539020=/home/jughead/.ansible/tmp/ansible-tmp-1624909377.53-12390-42703578539020\n', '')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/commands/command.py
<10.x.x.x> PUT /root/.ansible/tmp/ansible-local-12346tPaVOe/tmpXlWUhD TO /home/jughead/.ansible/tmp/ansible-tmp-1624909377.53-12390-42703578539020/AnsiballZ_command.py
<10.x.x.x> SSH: EXEC sshpass -d8 sftp -o BatchMode=no -b - -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="jughead"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/a67e55b20e '[10.x.x.x]'
<10.x.x.x> (0, 'sftp> put /root/.ansible/tmp/ansible-local-12346tPaVOe/tmpXlWUhD /home/jughead/.ansible/tmp/ansible-tmp-1624909377.53-12390-42703578539020/AnsiballZ_command.py\n', '')
<10.x.x.x> ESTABLISH SSH CONNECTION FOR USER: jughead
<10.x.x.x> SSH: EXEC sshpass -d8 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="jughead"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/a67e55b20e 10.x.x.x '/bin/sh -c '"'"'chmod u+x /home/jughead/.ansible/tmp/ansible-tmp-1624909377.53-12390-42703578539020/ /home/jughead/.ansible/tmp/ansible-tmp-1624909377.53-12390-42703578539020/AnsiballZ_command.py && sleep 0'"'"''
<10.x.x.x> (0, '', '')
<10.x.x.x> ESTABLISH SSH CONNECTION FOR USER: jughead
<10.x.x.x> SSH: EXEC sshpass -d8 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="jughead"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/a67e55b20e -tt 10.x.x.x '/bin/sh -c '"'"'/usr/bin/python /home/jughead/.ansible/tmp/ansible-tmp-1624909377.53-12390-42703578539020/AnsiballZ_command.py && sleep 0'"'"''
<10.x.x.x> (1, '\r\n{"changed": true, "end": "2021-06-28 15:43:16.115767", "stdout": "", "cmd": ["/apps/Softwares/ca_instl_lnxi38664_2.0.2003191.bin", "-f", "/apps/Softwares/cognosresponsefile.properties", "-i", "silent"], "failed": true, "delta": "0:00:18.049758", "stderr": "", "rc": 255, "invocation": {"module_args": {"creates": null, "executable": null, "_uses_shell": false, "strip_empty_ends": true, "_raw_params": "/apps/Softwares/ca_instl_lnxi38664_2.0.2003191.bin -f /apps/Softwares/cognosresponsefile.properties -i silent", "removes": null, "argv": null, "warn": true, "chdir": "/apps/SilentInstall", "stdin_add_newline": true, "stdin": null}}, "start": "2021-06-28 15:42:58.066009", "msg": "non-zero return code"}\r\n', 'Shared connection to 10.x.x.x closed.\r\n')
<10.x.x.x> Failed to connect to the host via ssh: Shared connection to 10.x.x.x closed.
<10.x.x.x> ESTABLISH SSH CONNECTION FOR USER: jughead
<10.x.x.x> SSH: EXEC sshpass -d8 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="jughead"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/a67e55b20e 10.x.x.x '/bin/sh -c '"'"'rm -f -r /home/jughead/.ansible/tmp/ansible-tmp-1624909377.53-12390-42703578539020/ > /dev/null 2>&1 && sleep 0'"'"''
<10.x.x.x> (0, '', '')
fatal: [10.x.x.x]: FAILED! => {
"changed": true,
"cmd": [
"/apps/Softwares/ca_instl_lnxi38664_2.0.2003191.bin",
"-f",
"/apps/Softwares/cognosresponsefile.properties",
"-i",
"silent"
],
"delta": "0:00:18.049758",
"end": "2021-06-28 15:43:16.115767",
"invocation": {
"module_args": {
"_raw_params": "/apps/Softwares/ca_instl_lnxi38664_2.0.2003191.bin -f /apps/Softwares/cognosresponsefile.properties -i silent",
"_uses_shell": false,
"argv": null,
"chdir": "/apps/SilentInstall",
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"stdin_add_newline": true,
"strip_empty_ends": true,
"warn": true
}
},
"msg": "non-zero return code",
"rc": 255,
"start": "2021-06-28 15:42:58.066009",
"stderr": "",
"stderr_lines": [],
"stdout": "",
"stdout_lines": []
}
Seeing as the only thing that makes sense is that Playbook 1 somehow affects Playbook 2, I searched for your issue and found this: https://access.redhat.com/solutions/475513
And this:
https://support.oracle.com/knowledge/Oracle%20Database%20Products/2543805_1.html
Unfortunately, neither of these offers a full solution, just hints. Assuming that the installation of NSS is the issue, you could try to find out from its logs why sshd might be failing.
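As a diagnostic aid, here is a sketch (not from the original answer; the variable name ca_install is an assumption): the verbose log shows the installer itself returning rc=255, so you can keep the play alive on failure and dump the full result to compare against the installer's own logs:
- name: Installing Cognos Analytics
  command: /apps/Softwares/ca_instl_lnxi38664_2.0.2003191.bin -f /apps/Softwares/cognosresponsefile.properties -i silent
  args:
    chdir: /apps/SilentInstall
  register: ca_install    # capture rc, stdout, stderr
  ignore_errors: yes      # keep going so the debug task below runs

- name: Show installer result
  debug:
    var: ca_install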

Ansible: how to connect the controller host to another host, both running in the same VirtualBox?

I have deployed 2 Ubuntu 18.04 machines in VirtualBox running on Windows 10.
Ansible 2.9.6 is installed on the controller host.
Now I am stuck trying to connect the controller host to the other host.
My /etc/ansible/hosts is defined as below. localhost is the controller; staging is the other Ubuntu VM in VirtualBox.
localhost ansible_connection=local ansible_python_interpreter=/usr/bin/python3
staging ansible_host=10.0.2.15 ansible_ssh_pass=1234 ansible_ssh_user=dinesh ansible_python_interpreter=/usr/bin/python3
My ansible.cfg is defined as below:
[defaults]
host_key_checking = false
[defaults]
transport = ssh
[ssh_connection]
#ssh_args = -o ForwardAgent=yes
My playbook cloning.yml is below. I am just trying to clone a public Git repo.
First I loosen the permissions on the folder I am trying to clone into, since the error stated permission denied, but I don't think that is the actual solution.
---
- hosts: staging
  become: true
  become_user: dinesh
  gather_facts: no
  tasks:
    - name: Change file permission to liberal
      command: /usr/bin/find . -type f -exec chmod 777 -- {} +
      args:
        chdir: /usr/local/src/
      register: output
    - debug: var=output.stdout_lines
    - name: pull from git
      git:
        repo: https://github.com/fossology/fossology.git
        dest: /usr/local/src/fossology
        update: yes
        version: master
        force: yes
    - name: git status
      command: /usr/bin/git rev-parse HEAD
      args:
        chdir: /usr/local/src/fossology
      register: output
    - debug: var=output.stdout_lines
    - name: start the docker
      docker_compose:
        project_src: usr/local/src/fossology
        state: present
Error output:
TASK [pull from git] ***************************************************************************************************************************************************************************************
task path: /home/dinesh/Documents/fossy_initial_setup.yml:13
<10.0.2.15> ESTABLISH SSH CONNECTION FOR USER: dinesh
<10.0.2.15> SSH: EXEC sshpass -d10 ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o 'User="dinesh"' -o ConnectTimeout=10 10.0.2.15 '/bin/sh -c '"'"'echo ~dinesh && sleep 0'"'"''
<10.0.2.15> (0, '/home/dinesh\n', '')
<10.0.2.15> ESTABLISH SSH CONNECTION FOR USER: dinesh
<10.0.2.15> SSH: EXEC sshpass -d10 ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o 'User="dinesh"' -o ConnectTimeout=10 10.0.2.15 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/dinesh/.ansible/tmp/ansible-tmp-1584285396.54-187145294823947 `" && echo ansible-tmp-1584285396.54-187145294823947="` echo /home/dinesh/.ansible/tmp/ansible-tmp-1584285396.54-187145294823947 `" ) && sleep 0'"'"''
<10.0.2.15> (0, 'ansible-tmp-1584285396.54-187145294823947=/home/dinesh/.ansible/tmp/ansible-tmp-1584285396.54-187145294823947\n', '')
Using module file /usr/lib/python2.7/dist-packages/ansible/modules/source_control/git.py
<10.0.2.15> PUT /home/dinesh/.ansible/tmp/ansible-local-21868PFeQ7k/tmpssQrbR TO /home/dinesh/.ansible/tmp/ansible-tmp-1584285396.54-187145294823947/AnsiballZ_git.py
<10.0.2.15> SSH: EXEC sshpass -d10 sftp -o BatchMode=no -b - -o ForwardAgent=yes -o StrictHostKeyChecking=no -o 'User="dinesh"' -o ConnectTimeout=10 '[10.0.2.15]'
<10.0.2.15> (0, 'sftp> put /home/dinesh/.ansible/tmp/ansible-local-21868PFeQ7k/tmpssQrbR /home/dinesh/.ansible/tmp/ansible-tmp-1584285396.54-187145294823947/AnsiballZ_git.py\n', '')
<10.0.2.15> ESTABLISH SSH CONNECTION FOR USER: dinesh
<10.0.2.15> SSH: EXEC sshpass -d10 ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o 'User="dinesh"' -o ConnectTimeout=10 10.0.2.15 '/bin/sh -c '"'"'chmod u+x /home/dinesh/.ansible/tmp/ansible-tmp-1584285396.54-187145294823947/ /home/dinesh/.ansible/tmp/ansible-tmp-1584285396.54-187145294823947/AnsiballZ_git.py && sleep 0'"'"''
<10.0.2.15> (0, '', '')
<10.0.2.15> ESTABLISH SSH CONNECTION FOR USER: dinesh
<10.0.2.15> SSH: EXEC sshpass -d10 ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o 'User="dinesh"' -o ConnectTimeout=10 -tt 10.0.2.15 '/bin/sh -c '"'"'/usr/bin/python3 /home/dinesh/.ansible/tmp/ansible-tmp-1584285396.54-187145294823947/AnsiballZ_git.py && sleep 0'"'"''
<10.0.2.15> (1, '\r\n{"cmd": "/usr/bin/git clone --origin origin https://github.com/fossology/fossology.git /usr/local/src/fossology", "rc": 1, "stdout": "", "stderr": "Cloning into \'/usr/local/src/fossology\'...\\n/usr/local/src/fossology/.git: Permission denied\\n", "msg": "Cloning into \'/usr/local/src/fossology\'...\\n/usr/local/src/fossology/.git: Permission denied", "failed": true, "invocation": {"module_args": {"force": true, "dest": "/usr/local/src/fossology", "update": true, "repo": "https://github.com/fossology/fossology.git", "version": "master", "remote": "origin", "clone": true, "verify_commit": false, "gpg_whitelist": [], "accept_hostkey": false, "bare": false, "recursive": true, "track_submodules": false, "refspec": null, "reference": null, "depth": null, "key_file": null, "ssh_opts": null, "executable": null, "umask": null, "archive": null, "separate_git_dir": null}}}\r\n', 'Connection to 10.0.2.15 closed.\r\n')
<10.0.2.15> Failed to connect to the host via ssh: Connection to 10.0.2.15 closed.
<10.0.2.15> ESTABLISH SSH CONNECTION FOR USER: dinesh
<10.0.2.15> SSH: EXEC sshpass -d10 ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o 'User="dinesh"' -o ConnectTimeout=10 10.0.2.15 '/bin/sh -c '"'"'rm -f -r /home/dinesh/.ansible/tmp/ansible-tmp-1584285396.54-187145294823947/ > /dev/null 2>&1 && sleep 0'"'"''
<10.0.2.15> (0, '', '')
fatal: [fossology_staging]: FAILED! => {
"changed": false,
"cmd": "/usr/bin/git clone --origin origin https://github.com/fossology/fossology.git /usr/local/src/fossology",
"invocation": {
"module_args": {
"accept_hostkey": false,
"archive": null,
"bare": false,
"clone": true,
"depth": null,
"dest": "/usr/local/src/fossology",
"executable": null,
"force": true,
"gpg_whitelist": [],
"key_file": null,
"recursive": true,
"reference": null,
"refspec": null,
"remote": "origin",
"repo": "https://github.com/fossology/fossology.git",
"separate_git_dir": null,
"ssh_opts": null,
"track_submodules": false,
"umask": null,
"update": true,
"verify_commit": false,
"version": "master"
}
},
"msg": "Cloning into '/usr/local/src/fossology'...\n/usr/local/src/fossology/.git: Permission denied",
"rc": 1,
"stderr": "Cloning into '/usr/local/src/fossology'...\n/usr/local/src/fossology/.git: Permission denied\n",
"stderr_lines": [
"Cloning into '/usr/local/src/fossology'...",
"/usr/local/src/fossology/.git: Permission denied"
],
"stdout": "",
"stdout_lines": []
}
PLAY RECAP *************************************************************************************************************************************************************************************************
fossology_staging : ok=2 changed=1 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
Either the connection is not happening or the connection via SSH is wrong. But when I do:
dinesh@dinesh-VirtualBox:~/Documents$ ansible staging -m ping
staging | SUCCESS => {
"changed": false,
"ping": "pong"
}
dinesh@dinesh-VirtualBox:~/Documents$
What am I doing wrong here?
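One observation, not from the original thread: the verbose log shows git itself failing with Permission denied under /usr/local/src, and become_user: dinesh escalates to the same unprivileged user, so the "Connection to 10.0.2.15 closed" line is just the session ending after the module failed. A minimal sketch, assuming the directory is root-owned, that escalates to root for the clone instead:
- hosts: staging
  become: true          # default become_user is root
  gather_facts: no
  tasks:
    - name: pull from git
      git:
        repo: https://github.com/fossology/fossology.git
        dest: /usr/local/src/fossology
        update: yes
        version: master
        force: yes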

Shell script execution is not working on the remote server in Ansible (previous tasks executed successfully)

I am not able to execute a shell script remotely in Ansible. However, previous tasks in the same role (filebeat) execute on the remote server successfully. I am running the following on the local server 172.28.28.6 to install and run Filebeat on the remote server 172.28.28.81.
Playbook: install-filebeat.yml:
- hosts: filebeat-servers
  remote_user: wwwadm
  sudo: yes
  roles:
    - { role: /vagrant/roles/filebeat }
Role filebeat: main.yml:
---
# tasks file for filebeat
- name: "Extract Filebeat"
  unarchive:
    src: "{{ tmp_artifact_cache }}/{{ filebeat_archive }}"
    remote_src: yes
    dest: "{{ filebeat_root_dir }}"
    extra_opts: ['--transform=s,/*[^/]*,{{ filebeat_ver }},i', '--show-stored-names']
  become: yes
  become_user: "{{ filebeat_install_as }}"
  when: not ansible_check_mode
  tags: [ 'filebeat' ]

- name: Configure Filebeat
  template:
    src: "filebeat.yml.j2"
    dest: "{{ filebeat_install_dir }}/filebeat.yml"
    mode: 0775
  become: yes
  become_user: "{{ filebeat_install_as }}"
  tags: [ 'filebeat' ]

- name: 'Filebeat startup script'
  template:
    src: "startup.sh.j2"
    dest: "{{ filebeat_install_dir }}/bin/startup.sh"
    mode: 0755
  become: yes
  become_user: "{{ filebeat_install_as }}"
  tags: [ 'filebeat', 'start' ]

# This one does not get executed at all:
- name: "Start Filebeat"
  # shell: "{{ filebeat_install_dir }}/bin/startup.sh"
  command: "sh {{ filebeat_install_dir }}/bin/startup.sh"
  become: yes
  become_user: "{{ filebeat_install_as }}"
The role's defaults file:
# defaults file for filebeat
filebeat_ver: "6.6.0"
filebeat_archive: "filebeat-{{ filebeat_ver }}-linux-x86_64.tar.gz"
filebeat_archive_checksum : "sha1:d38d8fea7e9915582720280eb0118b7d92569b23"
filebeat_url: "https://artifacts.elastic.co/downloads/beats/filebeat/{{ filebeat_archive }}"
filebeat_root_dir: "{{ apps_home }}/filebeat"
filebeat_data_dir: "{{ apps_data }}/filebeat"
filebeat_log_dir: "{{ apps_logs }}/filebeat"
filebeat_install_dir: "{{ filebeat_root_dir }}/{{ filebeat_ver }}"
filebeat_cert_dir: "/etc/pki/tls/certs"
filebeat_ssl_certificate_file: "logstash.crt"
filebeat_ssl_key_file: "logstash.key"
filebeat_install_as: "{{ install_user | default('wwwadm') }}"
filebeat_set_as_current: yes
filebeat_force_clean_install: no
filebeat_java_home: "{{ sw_home }}/jdk"
inventory/local/hosts:
localhost ansible_connection=local
[filebeat-servers]
172.28.28.81 ansible_user=vagrant ansible_connection=ssh
Filebeat is installed and the changes are made on the remote server, except for the last step, which is the execution of the shell script.
When running the playbook as follows:
ansible-playbook -i /vagrant/inventory/local install-filebeat.yml -vvv
I get the following output related to the shell execution:
TASK [/vagrant/roles/filebeat : Start Filebeat] ***************************************************************************************************************************************************************
task path: /vagrant/roles/filebeat/tasks/main.yml:184
<172.28.28.81> ESTABLISH SSH CONNECTION FOR USER: vagrant
<172.28.28.81> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 -o ControlPath=/home/vagrant/.ansible/cp/f66f05c055 172.28.28.81 '/bin/sh -c '"'"'echo ~vagrant && sleep 0'"'"''
<172.28.28.81> (0, '/home/vagrant\n', '')
<172.28.28.81> ESTABLISH SSH CONNECTION FOR USER: vagrant
<172.28.28.81> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 -o ControlPath=/home/vagrant/.ansible/cp/f66f05c055 172.28.28.81 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /var/tmp/ansible-tmp-1550178583.24-35955954120606 `" && echo ansible-tmp-1550178583.24-35955954120606="` echo /var/tmp/ansible-tmp-1550178583.24-35955954120606 `" ) && sleep 0'"'"''
<172.28.28.81> (0, 'ansible-tmp-1550178583.24-35955954120606=/var/tmp/ansible-tmp-1550178583.24-35955954120606\n', '')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/commands/command.py
<172.28.28.81> PUT /home/vagrant/.ansible/tmp/ansible-local-13658UX7cBC/tmpFzf2Ll TO /var/tmp/ansible-tmp-1550178583.24-35955954120606/AnsiballZ_command.py
<172.28.28.81> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 -o ControlPath=/home/vagrant/.ansible/cp/f66f05c055 '[172.28.28.81]'
<172.28.28.81> (0, 'sftp> put /home/vagrant/.ansible/tmp/ansible-local-13658UX7cBC/tmpFzf2Ll /var/tmp/ansible-tmp-1550178583.24-35955954120606/AnsiballZ_command.py\n', '')
<172.28.28.81> ESTABLISH SSH CONNECTION FOR USER: vagrant
<172.28.28.81> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 -o ControlPath=/home/vagrant/.ansible/cp/f66f05c055 172.28.28.81 '/bin/sh -c '"'"'setfacl -m u:wwwsvr:r-x /var/tmp/ansible-tmp-1550178583.24-35955954120606/ /var/tmp/ansible-tmp-1550178583.24-35955954120606/AnsiballZ_command.py && sleep 0'"'"''
<172.28.28.81> (0, '', '')
<172.28.28.81> ESTABLISH SSH CONNECTION FOR USER: vagrant
<172.28.28.81> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 -o ControlPath=/home/vagrant/.ansible/cp/f66f05c055 -tt 172.28.28.81 '/bin/sh -c '"'"'sudo -H -S -n -u wwwsvr /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-ntzchfzqggiteuqwzpiurlloddbdhevp; /usr/bin/python /var/tmp/ansible-tmp-1550178583.24-35955954120606/AnsiballZ_command.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<172.28.28.81> (0, '\r\n{"changed": true, "end": "2019-02-14 13:09:44.800191", "stdout": "Starting Filebeat", "cmd": ["sh", "/apps_ux/filebeat/6.6.0/bin/startup.sh"], "rc": 0, "start": "2019-02-14 13:09:43.792122", "stderr": "+ export JAVA_HOME=/sw_ux/jdk\\n+ JAVA_HOME=/sw_ux/jdk\\n+ echo \'Starting Filebeat\'\\n+ /apps_ux/filebeat/6.6.0/bin/filebeat -c /apps_ux/filebeat/6.6.0/config/filebeat.yml -path.home /apps_ux/filebeat/6.6.0 -path.config /apps_ux/filebeat/6.6.0/config -path.data /apps_data/filebeat -path.logs /apps_data/logs/filebeat", "delta": "0:00:01.008069", "invocation": {"module_args": {"warn": true, "executable": null, "_uses_shell": false, "_raw_params": "sh /apps_ux/filebeat/6.6.0/bin/startup.sh", "removes": null, "argv": null, "creates": null, "chdir": null, "stdin": null}}}\r\n', 'Shared connection to 172.28.28.81 closed.\r\n')
<172.28.28.81> ESTABLISH SSH CONNECTION FOR USER: vagrant
<172.28.28.81> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 -o ControlPath=/home/vagrant/.ansible/cp/f66f05c055 172.28.28.81 '/bin/sh -c '"'"'rm -f -r /var/tmp/ansible-tmp-1550178583.24-35955954120606/ > /dev/null 2>&1 && sleep 0'"'"''
<172.28.28.81> (0, '', '')
changed: [172.28.28.81] => {
"changed": true,
"cmd": [
"sh",
"/apps_ux/filebeat/6.6.0/bin/startup.sh"
],
"delta": "0:00:01.008069",
"end": "2019-02-14 13:09:44.800191",
"invocation": {
"module_args": {
"_raw_params": "sh /apps_ux/filebeat/6.6.0/bin/startup.sh",
"_uses_shell": false,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"warn": true
}
},
"rc": 0,
"start": "2019-02-14 13:09:43.792122",
"stderr": "+ export JAVA_HOME=/sw_ux/jdk\n+ JAVA_HOME=/sw_ux/jdk\n+ echo 'Starting Filebeat'\n+ /apps_ux/filebeat/6.6.0/bin/filebeat -c /apps_ux/filebeat/6.6.0/config/filebeat.yml -path.home /apps_ux/filebeat/6.6.0 -path.config /apps_ux/filebeat/6.6.0/config -path.data /apps_data/filebeat -path.logs /apps_data/logs/filebeat",
"stderr_lines": [
"+ export JAVA_HOME=/sw_ux/jdk",
"+ JAVA_HOME=/sw_ux/jdk",
"+ echo 'Starting Filebeat'",
"+ /apps_ux/filebeat/6.6.0/bin/filebeat -c /apps_ux/filebeat/6.6.0/config/filebeat.yml -path.home /apps_ux/filebeat/6.6.0 -path.config /apps_ux/filebeat/6.6.0/config -path.data /apps_data/filebeat -path.logs /apps_data/logs/filebeat"
],
"stdout": "Starting Filebeat",
"stdout_lines": [
"Starting Filebeat"
]
}
META: ran handlers
META: ran handlers
PLAY RECAP ****************************************************************************************************************************************************************************************************
172.28.28.81 : ok=18 changed=7 unreachable=0 failed=0
On the remote server:
[6.6.0:vagrant]$ cd bin
[bin:vagrant]$ ls -ltr
total 36068
-rwxr-xr-x. 1 wwwadm wwwadm 36927014 Jan 24 02:30 filebeat
-rwxr-xr-x. 1 wwwadm wwwadm 478 Feb 14 12:54 startup.sh
[bin:vagrant]$ pwd
/apps_ux/filebeat/6.6.0/bin
[bin:vagrant]$ more startup.sh
#!/usr/bin/env bash
set -x
export JAVA_HOME="/sw_ux/jdk"
#To save pid into a file is an open feature: https://github.com/elastic/logstash/issues/3577. There is no -p flag for filebeat to save the pid and then kill it.
echo 'Starting Filebeat'
/apps_ux/filebeat/6.6.0/bin/filebeat -c /apps_ux/filebeat/6.6.0/config/filebeat.yml -path.home /apps_ux/filebeat/6.6.0 -path.config /apps_ux/filebeat/6.6.0/config -path.data /apps_data/filebeat -path.logs /a
pps_data/logs/filebeat &
No running process is found when executing the ps command:
[bin:vagrant]$ ps -fea | grep filebeat | grep -v grep
However, if I connect to the remote server, I am able to run filebeat by executing the script with the user wwwadm and filebeat starts successfully:
[bin:wwwadm]$ pwd
/apps_ux/filebeat/6.6.0/bin
[bin:wwwadm]$ id
uid=778(wwwadm) gid=778(wwwadm) groups=778(wwwadm) context=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023
[bin:wwwadm]$ ./startup.sh
+ export JAVA_HOME=/sw_ux/jdk
+ JAVA_HOME=/sw_ux/jdk
+ echo 'Starting Filebeat'
Starting Filebeat
+ /apps_ux/filebeat/6.6.0/bin/filebeat -c /apps_ux/filebeat/6.6.0/config/filebeat.yml -path.home /apps_ux/filebeat/6.6.0 -path.config /apps_ux/filebeat/6.6.0/config -path.data /apps_data/filebeat -path.logs /apps_data/logs/filebeat
[bin:wwwadm]$ ps -fea | grep filebeat | grep -v grep
wwwadm 19160 1 0 15:12 pts/0 00:00:00 /apps_ux/filebeat/6.6.0/bin/filebeat -c /apps_ux/filebeat/6.6.0/config/filebeat.yml -path.home /apps_ux/filebeat/6.6.0 -path.config /apps_ux/filebeat/6.6.0/config -path.data /apps_data/filebeat -path.logs /apps_data/logs/filebeat
Thanks
You should use nohup to run it in the background, because when Ansible exits, all processes associated with the session will be terminated. nohup avoids this.
The corrected task is (using the shell module rather than command, since the output redirection and the trailing & need a shell to interpret them):
- name: "Start Filebeat"
  shell: "nohup sh {{ filebeat_install_dir }}/bin/startup.sh &>> startup.log &"
  become: yes
  become_user: "{{ filebeat_install_as }}"
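Another option, not part of the original answer, is Ansible's own fire-and-forget mechanism: async with poll: 0 detaches the task, so the process is not tied to the SSH session at all. A sketch:
- name: "Start Filebeat"
  command: "sh {{ filebeat_install_dir }}/bin/startup.sh"
  async: 45    # give the script up to 45 seconds
  poll: 0      # do not wait for it to finish (fire and forget)
  become: yes
  become_user: "{{ filebeat_install_as }}"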
You have to use the disown built-in command to inform the shell that it should not kill background processes when you disconnect; you can also use nohup for the same effect.
Having said that, you are for sure solving the wrong problem, because if^H^Hwhen filebeat falls over, there is nothing monitoring that service to keep it alive. You'll want to use systemd (or its equivalent on your system) to ensure that filebeat stays running, and by using the mechanism designed for that stuff, you side-step all the "disown or nohup" business that causes you to ask S.O. questions.
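A minimal sketch of that approach, reusing the variables from the question's defaults (the unit file, task names, and config path are assumptions, not from the original answer; adjust to where your filebeat.yml actually lives):
# roles/filebeat/templates/filebeat.service.j2
[Unit]
Description=Filebeat
After=network.target

[Service]
User={{ filebeat_install_as }}
ExecStart={{ filebeat_install_dir }}/bin/filebeat -c {{ filebeat_install_dir }}/filebeat.yml -path.home {{ filebeat_install_dir }} -path.data {{ filebeat_data_dir }} -path.logs {{ filebeat_log_dir }}
Restart=always

[Install]
WantedBy=multi-user.target
And the tasks to install and start it:
- name: Install Filebeat systemd unit
  template:
    src: filebeat.service.j2
    dest: /etc/systemd/system/filebeat.service
  become: yes

- name: Enable and start Filebeat
  systemd:
    name: filebeat
    state: started
    enabled: yes
    daemon_reload: yes
  become: yes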

Ansible EC2 add multiple hosts

I am trying to find and add hosts dynamically, like so:
---
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Gather EC2 remote facts.
      ec2_remote_facts:
        region: 'us-east-1'
      register: ec2_remote_facts
    - name: Debug.
      debug:
        msg: "{{ ec2_remote_facts }}"
    - name: get instances for tags
      add_host:
        name: "{{ item }}"
        group: dynamically_created_hosts
      with_items: |
        "{{ ec2_remote_facts.instances |
        selectattr('tags.AppName', 'defined') | selectattr('tags.AppName', 'equalto', 'sql') |
        selectattr('tags.AppType', 'defined') | selectattr('tags.AppType', 'equalto', 'infra') |
        map(attribute='private_ip_address') | list }}"

- hosts:
    - dynamically_created_hosts
  become: yes
  become_user: root
  serial: 1
  vars_files:
    - group_vars/all
  tasks:
    - name: run command
      shell: "uname -a"
I get the following when I run in verbose mode:
TASK [get instances for tags] **************************************************
task path: /Users/me/gitfork2/fornax/dynhst.yml:39
creating host via 'add_host': hostname="[u'10.112.114.241']"
changed: [localhost] => (item="[u'10.112.114.241']") => {"add_host": {"groups": ["dynamically_created_hosts"], "host_name": "\"[u'10.112.114.241']\"", "host_vars": {"group": "dynamically_created_hosts"}}, "changed": true, "invocation": {"module_args": {"group": "dynamically_created_hosts", "hostname": "\"[u'10.112.114.241']\""}, "module_name": "add_host"}, "item": "\"[u'10.112.114.241']\""}
PLAY [dynamically_created_hosts] ***********************************************
TASK [setup] *******************************************************************
<"[u'10.112.114.241']"> ESTABLISH SSH CONNECTION FOR USER: None
<"[u'10.112.114.241']"> SSH: ansible.cfg set ssh_args: (-F)(/Users/me/.ssh/config)
<"[u'10.112.114.241']"> SSH: ANSIBLE_HOST_KEY_CHECKING/host_key_checking disabled: (-o)(StrictHostKeyChecking=no)
<"[u'10.112.114.241']"> SSH: ansible_password/ansible_ssh_pass not set: (-o)(KbdInteractiveAuthentication=no)(-o)(PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey)(-o)(PasswordAuthentication=no)
<"[u'10.112.114.241']"> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=10)
<"[u'10.112.114.241']"> SSH: PlayContext set ssh_common_args: ()
<"[u'10.112.114.241']"> SSH: PlayContext set ssh_extra_args: ()
<"[u'10.112.114.241']"> SSH: EXEC ssh -C -vvv -F /Users/me/.ssh/config -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 '"[u'"'"'10.112.114.241'"'"']"' '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1509843772.58-176166317659656 `" && echo ansible-tmp-1509843772.58-176166317659656="` echo $HOME/.ansible/tmp/ansible-tmp-1509843772.58-176166317659656 `" ) && sleep 0'"'"''
fatal: ["[u'10.112.114.241']"]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh.", "unreachable": true}
to retry, use: --limit #./dynhst.retry
The odd thing I see here is:
SSH: EXEC ssh -C -vvv -F /Users/me/.ssh/config -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 '"[u'"'"'10.112.114.241'"'"']"' '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1509843772.58-176166317659656 `" && echo ansible-tmp-1509843772.58-176166317659656="` echo $HOME/.ansible/tmp/ansible-tmp-1509843772.58-176166317659656 `" ) && sleep 0'"'"''
It seems like it is trying to SSH into '"[u'"'"'10.112.114.241'"'"']"' ... it seems like dynamically_created_hosts is being used as a string and not as a list.
Any ideas why?
You pass a list (of IP addresses) to an argument name which requires a string:
hostname="[u'10.112.114.241']"
[ ] is a JSON representation of a list (single element in the example above).
If you want the first address from the list (and there seems to be no more than one for any of your hosts), then:
add_host:
  name: "{{ item[0] }}"
  group: dynamically_created_hosts
with_items: ...
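Alternatively, a sketch of fixing the root cause (an assumption, not from the original answer): the | block scalar plus the surrounding quotes turn the whole expression into one string. When the value is a single bare Jinja expression, Ansible hands with_items the actual list, so each IP becomes its own item:
- name: get instances for tags
  add_host:
    name: "{{ item }}"
    group: dynamically_created_hosts
  with_items: "{{ ec2_remote_facts.instances |
                  selectattr('tags.AppName', 'defined') | selectattr('tags.AppName', 'equalto', 'sql') |
                  selectattr('tags.AppType', 'defined') | selectattr('tags.AppType', 'equalto', 'infra') |
                  map(attribute='private_ip_address') | list }}"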

Not able to print complete output via Ansible

I have a command that needs to be executed remotely.
find /opt/cac/CI/release/releases -name "*.tar" -exec md5sum {} \;
The complete output is:
e5af5514887e8cbc08815936558a3220 /opt/cac/CI/release/releases/permission/4.00.00.04/CXP_902_8059_4.00.00.04_20161007-131208.tar
d35eae58399770627ba42f5b538a9cab /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20160927-052332.tar
784d035f9959f6a3006aaab202c13015 /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20160927-060837.tar
3234cc8944c1d26c1ff0fac844ae8674 /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20160927-062202.tar
431368042e9f5e62de37787cd2d05b08 /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20161008-173030.tar
I am trying to do the same using Ansible:
- hosts: tmoittecac
  tasks:
    - name: Report md5sum of Release files.
      shell: find /opt/cac/CI/release/releases -name "*.tar" -exec md5sum {} \;
      register: pre_md5_check
      tags: a
    - debug: msg={{ pre_md5_check.stdout_lines }}
      tags: b
But the output that I get is:
TASK: [debug msg={{ pre_md5_check.stdout_lines }}] ****************************
ok: [tmoittecac] => {
"msg": "['e5af5514887e8cbc08815936558a3220 /opt/cac/CI/release/releases/permission/4.00.00.04/CXP_902_8059_4.00.00.04_20161007-131208.tar',"
}
I am only getting the first line of the actual output.
Running the playbook in verbose mode gives me:
TASK: [Report md5sum of Release files.] ***************************************
<147.128.72.59> ESTABLISH CONNECTION FOR USER: local
<147.128.72.59> REMOTE_MODULE command find /opt/cac/CI/release/releases -name "*.tar" -exec md5sum {} \; #USE_SHELL
<147.128.72.59> EXEC sshpass -d7 ssh -C -tt -vvv -o ControlMaster=auto -o ControlPersist=60s -o ControlPath="/root/.ansible/cp/ansible-ssh-%h-%p-%r" -o StrictHostKeyChecking=no -o GSSAPIAuthentication=no -o PubkeyAuthentication=no -o User=local -o ConnectTimeout=10 147.128.72.59 /bin/sh -c 'mkdir -p $HOME/.ansible/tmp/ansible-tmp-1476216014.64-144818429840055 && echo $HOME/.ansible/tmp/ansible-tmp-1476216014.64-144818429840055'
<147.128.72.59> PUT /tmp/tmpm6eavD TO /opt/home/local/.ansible/tmp/ansible-tmp-1476216014.64-144818429840055/command
<147.128.72.59> EXEC sshpass -d7 ssh -C -tt -vvv -o ControlMaster=auto -o ControlPersist=60s -o ControlPath="/root/.ansible/cp/ansible-ssh-%h-%p-%r" -o StrictHostKeyChecking=no -o GSSAPIAuthentication=no -o PubkeyAuthentication=no -o User=local -o ConnectTimeout=10 147.128.72.59 /bin/sh -c 'LANG=C LC_CTYPE=C /usr/bin/python /opt/home/local/.ansible/tmp/ansible-tmp-1476216014.64-144818429840055/command; rm -rf /opt/home/local/.ansible/tmp/ansible-tmp-1476216014.64-144818429840055/ >/dev/null 2>&1'
changed: [tmoittecac] => {"changed": true, "cmd": "find /opt/cac/CI/release/releases -name \"*.tar\" -exec md5sum {} \\;", "delta": "0:00:01.172660", "end": "2016-10-12 03:53:14.020937", "rc": 0, "start": "2016-10-12 03:53:12.848277", "stderr": "", "stdout": "e5af5514887e8cbc08815936558a3220 /opt/cac/CI/release/releases/permission/4.00.00.04/CXP_902_8059_4.00.00.04_20161007-131208.tar\nd35eae58399770627ba42f5b538a9cab /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20160927-052332.tar\n784d035f9959f6a3006aaab202c13015 /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20160927-060837.tar\n3234cc8944c1d26c1ff0fac844ae8674 /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20160927-062202.tar\n431368042e9f5e62de37787cd2d05b08 /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20161008-173030.tar\n7f637400276cb25f7f3b2f869d915dc7 /opt/cac/CI/release/releases/notification/4.00.00.03/CXP_902_8347_4.00.00.03_20160928-070050.tar\nfa4a0aea0c096c703f2a9a741d2d1152 /opt/cac/CI/release/releases/user-preference/4.08.01.00/CXP_902_8717_4.08.01.00_20160929-034340.tar\n34f084c617f49123fc0edef358d15784 /opt/cac/CI/release/releases/captcha/2.08.01.00/CXP_902_8881_2.08.01.00_20160929-043449.tar\n2041d873f0619f1c9a4c8e419156753e /opt/cac/CI/release/releases/consumerProfile/3.08.02.00/CXP_902_8057_3.08.02.00_20161008-063249.tar\n7a0a9efe4232eebbb5c05e5b84fb6bec /opt/cac/CI/release/releases/consumerProfile/3.08.02.00/CXP_902_8057_3.08.02.00_20161008-071824.tar", "warnings": []}
TASK: [debug msg={{ pre_md5_check.stdout_lines }}] ****************************
<147.128.72.59> ESTABLISH CONNECTION FOR USER: local
ok: [tmoittecac] => {
"msg": "['e5af5514887e8cbc08815936558a3220 /opt/cac/CI/release/releases/permission/4.00.00.04/CXP_902_8059_4.00.00.04_20161007-131208.tar',"
}
Is there something I am missing?
More info:
tasks:
  - name: Check md5sums of WAR files.
    shell: find /opt/cac/CI/release/releases -name "*.tar" -exec md5sum {} \;
    register: MD5SUMS
    tags: a
  - name: Output md5sum of WAR files.
    debug: var={{ MD5SUMS }}
    tags: b
Output:
GATHERING FACTS ***************************************************************
ok: [tmoittecac]
TASK: [Check md5sums of WAR files.] *******************************************
changed: [tmoittecac]
TASK: [Output md5sum of WAR files.] *******************************************
ok: [tmoittecac] => {
"var": {
"{'changed':": "{'changed':"
}
}
PLAY RECAP ********************************************************************
tmoittecac : ok=3 changed=1 unreachable=0 failed=0
The Ansible version is:
ansible --version
ansible 1.9.4
The issue was solved by upgrading the Ansible version to 2.1.
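Separately, a note that is not part of the original fix: in the "More info" playbook, debug's var option expects a bare variable name, not a Jinja expression, so on any Ansible version that task prints more usefully as:
- name: Output md5sum of WAR files.
  debug:
    var: MD5SUMS.stdout_lines
  tags: b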
