Privilege escalation with Ansible using a bastion/jump server

Are:
become: true
become_user: root
ignored when using a jump host? I'm connecting to my destination server using:
"ANSIBLE_SSH_COMMON_ARGS": "-o ProxyCommand='ssh -p 22 -W %h:%p -q user@ip_obfuscated -i /home/semaphore/.ssh/jump.priv'"
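For reference, the same proxy settings can also live in the inventory instead of an environment variable; a minimal sketch, where the jump user, host, and key path are placeholders mirroring the command above:

```yaml
# group_vars/all.yml (sketch; values are placeholders)
ansible_ssh_common_args: >-
  -o ProxyCommand='ssh -p 22 -W %h:%p -q user@jump_host
  -i /home/semaphore/.ssh/jump.priv'
```

The folded scalar (>-) joins the lines with single spaces, so the rendered value is the same one-line ProxyCommand.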
When I run with debug enabled, the whole connection looks like this:
<ip_obfuscated> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o 'IdentityFile="/tmp/semaphore/access_key_136706914"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="terraform"' -o ConnectTimeout=10 -o 'ProxyCommand=ssh -p 22 -W %h:%p -q user#ip_obfuscated -i /home/semaphore/.ssh/jump.priv' -o 'ControlPath="/tmp/semaphore/.ansible/cp/1713c02767"' ip_obfuscated '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/terraform/.ansible/tmp `"&& mkdir "` echo /home/terraform/.ansible/tmp/ansible-tmp-1668542130.3374832-2167-135826106011908 `" && echo ansible-tmp-1668542130.3374832-2167-135826106011908="` echo /home/terraform/.ansible/tmp/ansible-tmp-1668542130.3374832-2167-135826106011908 `" ) && sleep 0'"'"''
This works just fine, but I'm always connected as an unprivileged user and can't escalate privileges. Look at this playbook and its result:
---
- name: Remove user from Linux server
  hosts: all
  become: true
  tasks:
    # Use whoami to determine the current user
    - name: Determine current user
      shell: whoami
      register: current_user

    # Print the current user
    - name: Print current user
      debug:
        msg: "Current user is {{ current_user.stdout }}"

    - name: Determine current user
      shell: sudo whoami
      register: current_user2

    # Print the current user
    - name: Print current user
      debug:
        msg: "Current user is {{ current_user2.stdout }}"
Result:
TASK [Print current user] ******************************************************
ok: [ip_obfuscated] => {
    "msg": "Current user is terraform"
}

TASK [Determine current user] **************************************************
changed: [ip_obfuscated]

TASK [Print current user] ******************************************************
ok: [ip_obfuscated] => {
    "msg": "Current user is root"
}
I'm using up-to-date Ansible.
I've tried many combinations of become, become_user, and become_method: set globally, set on the task itself, even set in the inventory. I have no idea why it won't become root on its own.
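For completeness, privilege escalation can also be switched on globally in ansible.cfg rather than per play; a minimal sketch of the relevant section, mirroring the become settings tried above:

```ini
# ansible.cfg (sketch)
[privilege_escalation]
become = True
become_method = sudo
become_user = root
become_ask_pass = False
```

These defaults apply to every play unless overridden at play or task level.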

Related

Shared connection to host closed

I am trying to change the user for a certain task but am unable to make it work. I am getting the error Shared connection to host closed.
Here is my playbook:
cat test1.yml
- hosts: hcmbox
  tasks:
    - name: Find out my identity Bourne
      command: "whoami"
      register: idoutput
      become: true
      become_user: ebs1

    - debug: msg="{{ idoutput.stdout }}"
I get this error:
ansible-playbook -i inventory test1.yml
PLAY [hcmbox] *************************************************************************************************************************************************************************
TASK [Gathering Facts] ****************************************************************************************************************************************************************
ok: [srvdb1.localdomain]
TASK [Find out my identity Bourne] ****************************************************************************************************************************************************
fatal: [srvdb1.localdomain]: FAILED! => {"changed": false, "module_stderr": "Shared connection to srvdb1.localdomain closed.\r\n", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
PLAY RECAP ****************************************************************************************************************************************************************************
srvdb1.localdomain : ok=1 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
I then removed the following two lines from the playbook:
become: true
become_user: ebs1
and executed it again; here is the output:
PLAY [hcmbox] *************************************************************************************************************************************************************************
TASK [Gathering Facts] ****************************************************************************************************************************************************************
ok: [srvdb1.localdomain]
TASK [Find out my identity Bourne] ****************************************************************************************************************************************************
changed: [srvdb1.localdomain]
TASK [debug] **************************************************************************************************************************************************************************
ok: [srvdb1.localdomain] => {
"msg": "oracle"
}
PLAY RECAP ****************************************************************************************************************************************************************************
srvdb1.localdomain : ok=3 changed=1 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
The trust and everything looks OK to me:
[oracle@ansctrlsrv epd3]$ ansible all -i 'srvdb1,' -m command -a 'whoami' -u ebs1
srvdb1 | CHANGED | rc=0 >>
ebs1
Even plain ssh works fine; I checked it:
[oracle@ansctrlsrv epd3]$ ssh ebs1@srvdb1
Last login: Tue Feb 4 10:35:51 2020 from ansctrlsrv.localdomain
[ebs1@srvdb1 ~]$
I am at a loss as to why I am getting this error: Shared connection to srvdb1.localdomain closed.
When I run with -vvv, this is what I see:
ansible-playbook -i inventory -vvv test3.yml
ansible-playbook 2.8.4
config file = /stage/ap/ansible/epd3/ansible.cfg
configured module search path = [u'/oracle/app/oracle/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /bin/ansible-playbook
python version = 2.7.5 (default, Aug 7 2019, 08:19:52) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39.0.1)]
Using /stage/ap/ansible/epd3/ansible.cfg as config file
host_list declined parsing /stage/ap/ansible/epd3/inventory as it did not pass it's verify_file() method
script declined parsing /stage/ap/ansible/epd3/inventory as it did not pass it's verify_file() method
auto declined parsing /stage/ap/ansible/epd3/inventory as it did not pass it's verify_file() method
Parsed /stage/ap/ansible/epd3/inventory inventory source with ini plugin
PLAYBOOK: test3.yml *******************************************************************************************************************************************************************
1 plays in test3.yml
PLAY [hcmbox] *************************************************************************************************************************************************************************
META: ran handlers
TASK [command] ************************************************************************************************************************************************************************
task path: /stage/ap/ansible/epd3/test3.yml:10
<dbsrv1.localdomain> ESTABLISH SSH CONNECTION FOR USER: None
<dbsrv1.localdomain> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/oracle/app/oracle/.ansible/cp/9c1b54644d dbsrv1.localdomain '/bin/sh -c '"'"'echo ~ && sleep 0'"'"''
<dbsrv1.localdomain> (0, '/oracle/app/oracle\n', '')
<dbsrv1.localdomain> ESTABLISH SSH CONNECTION FOR USER: None
<dbsrv1.localdomain> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/oracle/app/oracle/.ansible/cp/9c1b54644d dbsrv1.localdomain '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /var/tmp/ansible-tmp-1580841289.87-193361522034615 `" && echo ansible-tmp-1580841289.87-193361522034615="` echo /var/tmp/ansible-tmp-1580841289.87-193361522034615 `" ) && sleep 0'"'"''
<dbsrv1.localdomain> (0, 'ansible-tmp-1580841289.87-193361522034615=/var/tmp/ansible-tmp-1580841289.87-193361522034615\n', '')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/commands/command.py
<dbsrv1.localdomain> PUT /oracle/app/oracle/.ansible/tmp/ansible-local-23567JjjVQn/tmpwYZDSF TO /var/tmp/ansible-tmp-1580841289.87-193361522034615/AnsiballZ_command.py
<dbsrv1.localdomain> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/oracle/app/oracle/.ansible/cp/9c1b54644d '[dbsrv1.localdomain]'
<dbsrv1.localdomain> (0, 'sftp> put /oracle/app/oracle/.ansible/tmp/ansible-local-23567JjjVQn/tmpwYZDSF /var/tmp/ansible-tmp-1580841289.87-193361522034615/AnsiballZ_command.py\n', '')
<dbsrv1.localdomain> ESTABLISH SSH CONNECTION FOR USER: None
<dbsrv1.localdomain> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/oracle/app/oracle/.ansible/cp/9c1b54644d dbsrv1.localdomain '/bin/sh -c '"'"'setfacl -m u:ebs1:r-x /var/tmp/ansible-tmp-1580841289.87-193361522034615/ /var/tmp/ansible-tmp-1580841289.87-193361522034615/AnsiballZ_command.py && sleep 0'"'"''
<dbsrv1.localdomain> (0, '', '')
<dbsrv1.localdomain> ESTABLISH SSH CONNECTION FOR USER: None
<dbsrv1.localdomain> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/oracle/app/oracle/.ansible/cp/9c1b54644d -tt dbsrv1.localdomain '/bin/sh -c '"'"'sudo -H -S -n -u ebs1 /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-qtuzljmaohdcqewvlstmcepjcdwdxsiy ; /usr/bin/python /var/tmp/ansible-tmp-1580841289.87-193361522034615/AnsiballZ_command.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<dbsrv1.localdomain> (1, '', 'Shared connection to dbsrv1.localdomain closed.\r\n')
<dbsrv1.localdomain> Failed to connect to the host via ssh: Shared connection to dbsrv1.localdomain closed.
<dbsrv1.localdomain> ESTABLISH SSH CONNECTION FOR USER: None
<dbsrv1.localdomain> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/oracle/app/oracle/.ansible/cp/9c1b54644d dbsrv1.localdomain '/bin/sh -c '"'"'rm -f -r /var/tmp/ansible-tmp-1580841289.87-193361522034615/ > /dev/null 2>&1 && sleep 0'"'"''
<dbsrv1.localdomain> (0, '', '')
failed: [dbsrv1.localdomain] (item=ebs1) => {
"ansible_loop_var": "item",
"changed": false,
"item": "ebs1",
"module_stderr": "Shared connection to dbsrv1.localdomain closed.\r\n",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"rc": 1
}
PLAY RECAP ****************************************************************************************************************************************************************************
dbsrv1.localdomain : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
Which Python version is installed on your server?
Try installing python3 and adding the path to Python 3 in your inventory file.
It would look something like this:
srvdb1.localdomain ansible_python_interpreter=/usr/bin/python3
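If every host in the group needs the same interpreter, the variable can also be set once at group level in an INI inventory; a sketch, assuming the group is named hcmbox as in the play:

```ini
[hcmbox]
srvdb1.localdomain

[hcmbox:vars]
ansible_python_interpreter=/usr/bin/python3
```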

Shell script execution not working on remote server with Ansible (previous tasks executed successfully)

I am not able to execute a shell script remotely with Ansible, even though previous tasks in the same role (filebeat) execute successfully on the remote server. I am running the following from the local server 172.28.28.6 to install and run Filebeat on the remote server 172.28.28.81.
Playbook: install-filebeat.yml:
- hosts: filebeat-servers
  remote_user: wwwadm
  sudo: yes
  roles:
    - { role: /vagrant/roles/filebeat }
Role filebeat: main.yml:
---
# tasks file for filebeat
- name: "Extract Filebeat"
  unarchive:
    src: "{{ tmp_artifact_cache }}/{{ filebeat_archive }}"
    remote_src: yes
    dest: "{{ filebeat_root_dir }}"
    extra_opts: ['--transform=s,/*[^/]*,{{ filebeat_ver }},i', '--show-stored-names']
  become: yes
  become_user: "{{ filebeat_install_as }}"
  when: not ansible_check_mode
  tags: [ 'filebeat' ]

- name: Configure Filebeat
  template:
    src: "filebeat.yml.j2"
    dest: "{{ filebeat_install_dir }}/filebeat.yml"
    mode: 0775
  become: yes
  become_user: "{{ filebeat_install_as }}"
  tags: [ 'filebeat' ]

- name: 'Filebeat startup script'
  template:
    src: "startup.sh.j2"
    dest: "{{ filebeat_install_dir }}/bin/startup.sh"
    mode: 0755
  become: yes
  become_user: "{{ filebeat_install_as }}"
  tags: [ 'filebeat', 'start' ]

# This one does not get executed at all:
- name: "Start Filebeat"
  # shell: "{{ filebeat_install_dir }}/bin/startup.sh"
  command: "sh {{ filebeat_install_dir }}/bin/startup.sh"
  become: yes
  become_user: "{{ filebeat_install_as }}"
Role defaults:
# defaults file for filebeat
filebeat_ver: "6.6.0"
filebeat_archive: "filebeat-{{ filebeat_ver }}-linux-x86_64.tar.gz"
filebeat_archive_checksum : "sha1:d38d8fea7e9915582720280eb0118b7d92569b23"
filebeat_url: "https://artifacts.elastic.co/downloads/beats/filebeat/{{ filebeat_archive }}"
filebeat_root_dir: "{{ apps_home }}/filebeat"
filebeat_data_dir: "{{ apps_data }}/filebeat"
filebeat_log_dir: "{{ apps_logs }}/filebeat"
filebeat_install_dir: "{{ filebeat_root_dir }}/{{ filebeat_ver }}"
filebeat_cert_dir: "/etc/pki/tls/certs"
filebeat_ssl_certificate_file: "logstash.crt"
filebeat_ssl_key_file: "logstash.key"
filebeat_install_as: "{{ install_user | default('wwwadm') }}"
filebeat_set_as_current: yes
filebeat_force_clean_install: no
filebeat_java_home: "{{ sw_home }}/jdk"
inventory/local/hosts:
localhost ansible_connection=local
[filebeat-servers]
172.28.28.81 ansible_user=vagrant ansible_connection=ssh
Filebeat is installed and the changes are applied on the remote server, except for the last step: executing the shell script. When running the playbook as follows:
ansible-playbook -i /vagrant/inventory/local install-filebeat.yml -vvv
I get the following output related to the shell execution:
TASK [/vagrant/roles/filebeat : Start Filebeat] ***************************************************************************************************************************************************************
task path: /vagrant/roles/filebeat/tasks/main.yml:184
<172.28.28.81> ESTABLISH SSH CONNECTION FOR USER: vagrant
<172.28.28.81> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 -o ControlPath=/home/vagrant/.ansible/cp/f66f05c055 172.28.28.81 '/bin/sh -c '"'"'echo ~vagrant && sleep 0'"'"''
<172.28.28.81> (0, '/home/vagrant\n', '')
<172.28.28.81> ESTABLISH SSH CONNECTION FOR USER: vagrant
<172.28.28.81> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 -o ControlPath=/home/vagrant/.ansible/cp/f66f05c055 172.28.28.81 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /var/tmp/ansible-tmp-1550178583.24-35955954120606 `" && echo ansible-tmp-1550178583.24-35955954120606="` echo /var/tmp/ansible-tmp-1550178583.24-35955954120606 `" ) && sleep 0'"'"''
<172.28.28.81> (0, 'ansible-tmp-1550178583.24-35955954120606=/var/tmp/ansible-tmp-1550178583.24-35955954120606\n', '')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/commands/command.py
<172.28.28.81> PUT /home/vagrant/.ansible/tmp/ansible-local-13658UX7cBC/tmpFzf2Ll TO /var/tmp/ansible-tmp-1550178583.24-35955954120606/AnsiballZ_command.py
<172.28.28.81> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 -o ControlPath=/home/vagrant/.ansible/cp/f66f05c055 '[172.28.28.81]'
<172.28.28.81> (0, 'sftp> put /home/vagrant/.ansible/tmp/ansible-local-13658UX7cBC/tmpFzf2Ll /var/tmp/ansible-tmp-1550178583.24-35955954120606/AnsiballZ_command.py\n', '')
<172.28.28.81> ESTABLISH SSH CONNECTION FOR USER: vagrant
<172.28.28.81> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 -o ControlPath=/home/vagrant/.ansible/cp/f66f05c055 172.28.28.81 '/bin/sh -c '"'"'setfacl -m u:wwwsvr:r-x /var/tmp/ansible-tmp-1550178583.24-35955954120606/ /var/tmp/ansible-tmp-1550178583.24-35955954120606/AnsiballZ_command.py && sleep 0'"'"''
<172.28.28.81> (0, '', '')
<172.28.28.81> ESTABLISH SSH CONNECTION FOR USER: vagrant
<172.28.28.81> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 -o ControlPath=/home/vagrant/.ansible/cp/f66f05c055 -tt 172.28.28.81 '/bin/sh -c '"'"'sudo -H -S -n -u wwwsvr /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-ntzchfzqggiteuqwzpiurlloddbdhevp; /usr/bin/python /var/tmp/ansible-tmp-1550178583.24-35955954120606/AnsiballZ_command.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<172.28.28.81> (0, '\r\n{"changed": true, "end": "2019-02-14 13:09:44.800191", "stdout": "Starting Filebeat", "cmd": ["sh", "/apps_ux/filebeat/6.6.0/bin/startup.sh"], "rc": 0, "start": "2019-02-14 13:09:43.792122", "stderr": "+ export JAVA_HOME=/sw_ux/jdk\\n+ JAVA_HOME=/sw_ux/jdk\\n+ echo \'Starting Filebeat\'\\n+ /apps_ux/filebeat/6.6.0/bin/filebeat -c /apps_ux/filebeat/6.6.0/config/filebeat.yml -path.home /apps_ux/filebeat/6.6.0 -path.config /apps_ux/filebeat/6.6.0/config -path.data /apps_data/filebeat -path.logs /apps_data/logs/filebeat", "delta": "0:00:01.008069", "invocation": {"module_args": {"warn": true, "executable": null, "_uses_shell": false, "_raw_params": "sh /apps_ux/filebeat/6.6.0/bin/startup.sh", "removes": null, "argv": null, "creates": null, "chdir": null, "stdin": null}}}\r\n', 'Shared connection to 172.28.28.81 closed.\r\n')
<172.28.28.81> ESTABLISH SSH CONNECTION FOR USER: vagrant
<172.28.28.81> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 -o ControlPath=/home/vagrant/.ansible/cp/f66f05c055 172.28.28.81 '/bin/sh -c '"'"'rm -f -r /var/tmp/ansible-tmp-1550178583.24-35955954120606/ > /dev/null 2>&1 && sleep 0'"'"''
<172.28.28.81> (0, '', '')
changed: [172.28.28.81] => {
"changed": true,
"cmd": [
"sh",
"/apps_ux/filebeat/6.6.0/bin/startup.sh"
],
"delta": "0:00:01.008069",
"end": "2019-02-14 13:09:44.800191",
"invocation": {
"module_args": {
"_raw_params": "sh /apps_ux/filebeat/6.6.0/bin/startup.sh",
"_uses_shell": false,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"warn": true
}
},
"rc": 0,
"start": "2019-02-14 13:09:43.792122",
"stderr": "+ export JAVA_HOME=/sw_ux/jdk\n+ JAVA_HOME=/sw_ux/jdk\n+ echo 'Starting Filebeat'\n+ /apps_ux/filebeat/6.6.0/bin/filebeat -c /apps_ux/filebeat/6.6.0/config/filebeat.yml -path.home /apps_ux/filebeat/6.6.0 -path.config /apps_ux/filebeat/6.6.0/config -path.data /apps_data/filebeat -path.logs /apps_data/logs/filebeat",
"stderr_lines": [
"+ export JAVA_HOME=/sw_ux/jdk",
"+ JAVA_HOME=/sw_ux/jdk",
"+ echo 'Starting Filebeat'",
"+ /apps_ux/filebeat/6.6.0/bin/filebeat -c /apps_ux/filebeat/6.6.0/config/filebeat.yml -path.home /apps_ux/filebeat/6.6.0 -path.config /apps_ux/filebeat/6.6.0/config -path.data /apps_data/filebeat -path.logs /apps_data/logs/filebeat"
],
"stdout": "Starting Filebeat",
"stdout_lines": [
"Starting Filebeat"
]
}
META: ran handlers
META: ran handlers
PLAY RECAP ****************************************************************************************************************************************************************************************************
172.28.28.81 : ok=18 changed=7 unreachable=0 failed=0
On remote server:
[6.6.0:vagrant]$ cd bin
[bin:vagrant]$ ls -ltr
total 36068
-rwxr-xr-x. 1 wwwadm wwwadm 36927014 Jan 24 02:30 filebeat
-rwxr-xr-x. 1 wwwadm wwwadm 478 Feb 14 12:54 startup.sh
[bin:vagrant]$ pwd
/apps_ux/filebeat/6.6.0/bin
[bin:vagrant]$ more startup.sh
#!/usr/bin/env bash
set -x
export JAVA_HOME="/sw_ux/jdk"
#To save pid into a file is an open feature: https://github.com/elastic/logstash/issues/3577. There is no -p flag for filebeat to save the pid and then kill it.
echo 'Starting Filebeat'
/apps_ux/filebeat/6.6.0/bin/filebeat -c /apps_ux/filebeat/6.6.0/config/filebeat.yml -path.home /apps_ux/filebeat/6.6.0 -path.config /apps_ux/filebeat/6.6.0/config -path.data /apps_data/filebeat -path.logs /apps_data/logs/filebeat &
No running process is found when executing ps:
[bin:vagrant]$ ps -fea | grep filebeat | grep -v grep
However, if I connect to the remote server, I can run Filebeat by executing the script as the user wwwadm, and Filebeat starts successfully:
[bin:wwwadm]$ pwd
/apps_ux/filebeat/6.6.0/bin
[bin:wwwadm]$ id
uid=778(wwwadm) gid=778(wwwadm) groups=778(wwwadm) context=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023
[bin:wwwadm]$ ./startup.sh
+ export JAVA_HOME=/sw_ux/jdk
+ JAVA_HOME=/sw_ux/jdk
+ echo 'Starting Filebeat'
Starting Filebeat
+ /apps_ux/filebeat/6.6.0/bin/filebeat -c /apps_ux/filebeat/6.6.0/config/filebeat.yml -path.home /apps_ux/filebeat/6.6.0 -path.config /apps_ux/filebeat/6.6.0/config -path.data /apps_data/filebeat -path.logs /apps_data/logs/filebeat
[bin:wwwadm]$ ps -fea | grep filebeat | grep -v grep
wwwadm 19160 1 0 15:12 pts/0 00:00:00 /apps_ux/filebeat/6.6.0/bin/filebeat -c /apps_ux/filebeat/6.6.0/config/filebeat.yml -path.home /apps_ux/filebeat/6.6.0 -path.config /apps_ux/filebeat/6.6.0/config -path.data /apps_data/filebeat -path.logs /apps_data/logs/filebeat
Thanks
You should use nohup to run it in the background: when Ansible exits, all processes associated with its session are terminated, and nohup prevents that. The corrected task is:
- name: "Start Filebeat"
  # shell (not command) so that the redirection and trailing & are interpreted
  shell: "nohup sh {{ filebeat_install_dir }}/bin/startup.sh >> startup.log 2>&1 &"
  become: yes
  become_user: "{{ filebeat_install_as }}"
You have to use the disown built-in to tell the shell not to kill background processes when you disconnect; nohup has the same effect.
Having said that, you are almost certainly solving the wrong problem: if (or rather, when) Filebeat falls over, nothing is monitoring that service to keep it alive. You'll want to use systemd (or its equivalent on your system) to ensure Filebeat stays running; by using the mechanism designed for exactly this, you side-step all the "disown or nohup" business that led to this question.
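As a sketch of that suggestion, a minimal systemd unit for Filebeat might look like the following. The paths are taken from the post; the unit name, User, and the Environment line (mirroring startup.sh) are assumptions:

```ini
# /etc/systemd/system/filebeat.service (sketch)
[Unit]
Description=Filebeat log shipper
After=network.target

[Service]
User=wwwadm
Environment=JAVA_HOME=/sw_ux/jdk
ExecStart=/apps_ux/filebeat/6.6.0/bin/filebeat \
    -c /apps_ux/filebeat/6.6.0/config/filebeat.yml \
    -path.home /apps_ux/filebeat/6.6.0 \
    -path.config /apps_ux/filebeat/6.6.0/config \
    -path.data /apps_data/filebeat \
    -path.logs /apps_data/logs/filebeat
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

After installing the unit, systemctl daemon-reload and systemctl enable --now filebeat would start it and keep it restarted on failure; Ansible's systemd module could then manage it instead of a startup script.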

Ansible EC2 add multiple hosts

I'm trying to find and add hosts dynamically, like so:
---
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Gather EC2 remote facts.
      ec2_remote_facts:
        region: 'us-east-1'
      register: ec2_remote_facts

    - name: Debug.
      debug:
        msg: "{{ ec2_remote_facts }}"

    - name: get instances for tags
      add_host:
        name: "{{ item }}"
        group: dynamically_created_hosts
      with_items: |
        "{{ ec2_remote_facts.instances |
        selectattr('tags.AppName', 'defined') | selectattr('tags.AppName', 'equalto', 'sql') |
        selectattr('tags.AppType', 'defined') | selectattr('tags.AppType', 'equalto', 'infra') |
        map(attribute='private_ip_address') | list }}"

- hosts:
    - dynamically_created_hosts
  become: yes
  become_user: root
  serial: 1
  vars_files:
    - group_vars/all
  tasks:
    - name: run command
      shell: "uname -a"
I get the following when I run in verbose mode:
TASK [get instances for tags] **************************************************
task path: /Users/me/gitfork2/fornax/dynhst.yml:39
creating host via 'add_host': hostname="[u'10.112.114.241']"
changed: [localhost] => (item="[u'10.112.114.241']") => {"add_host": {"groups": ["dynamically_created_hosts"], "host_name": "\"[u'10.112.114.241']\"", "host_vars": {"group": "dynamically_created_hosts"}}, "changed": true, "invocation": {"module_args": {"group": "dynamically_created_hosts", "hostname": "\"[u'10.112.114.241']\""}, "module_name": "add_host"}, "item": "\"[u'10.112.114.241']\""}
PLAY [dynamically_created_hosts] ***********************************************
TASK [setup] *******************************************************************
<"[u'10.112.114.241']"> ESTABLISH SSH CONNECTION FOR USER: None
<"[u'10.112.114.241']"> SSH: ansible.cfg set ssh_args: (-F)(/Users/me/.ssh/config)
<"[u'10.112.114.241']"> SSH: ANSIBLE_HOST_KEY_CHECKING/host_key_checking disabled: (-o)(StrictHostKeyChecking=no)
<"[u'10.112.114.241']"> SSH: ansible_password/ansible_ssh_pass not set: (-o)(KbdInteractiveAuthentication=no)(-o)(PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey)(-o)(PasswordAuthentication=no)
<"[u'10.112.114.241']"> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=10)
<"[u'10.112.114.241']"> SSH: PlayContext set ssh_common_args: ()
<"[u'10.112.114.241']"> SSH: PlayContext set ssh_extra_args: ()
<"[u'10.112.114.241']"> SSH: EXEC ssh -C -vvv -F /Users/me/.ssh/config -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 '"[u'"'"'10.112.114.241'"'"']"' '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1509843772.58-176166317659656 `" && echo ansible-tmp-1509843772.58-176166317659656="` echo $HOME/.ansible/tmp/ansible-tmp-1509843772.58-176166317659656 `" ) && sleep 0'"'"''
fatal: ["[u'10.112.114.241']"]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh.", "unreachable": true}
to retry, use: --limit #./dynhst.retry
The odd thing I see here is:
SSH: EXEC ssh -C -vvv -F /Users/me/.ssh/config -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 '"[u'"'"'10.112.114.241'"'"']"' '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1509843772.58-176166317659656 `" && echo ansible-tmp-1509843772.58-176166317659656="` echo $HOME/.ansible/tmp/ansible-tmp-1509843772.58-176166317659656 `" ) && sleep 0'"'"''
It seems to be trying to ssh into '"[u'"'"'10.112.114.241'"'"']"', as if dynamically_created_hosts were being used as a string and not as a list.
Any ideas why?
You pass a list (of IP addresses) to an argument name which requires a string:
hostname="[u'10.112.114.241']"
[ ] is a JSON representation of a list (single element in the example above).
If you want the first address from the list (and there seems to be only one per host), then:
add_host:
  name: "{{ item[0] }}"
  group: dynamically_created_hosts
with_items: ...
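An alternative sketch, addressing the same root cause: the block scalar (|) plus the inner quotes turn the whole expression into one string before the loop runs. Handing the expression to with_items directly lets Ansible template it into a real list:

```yaml
- name: get instances for tags
  add_host:
    name: "{{ item }}"
    group: dynamically_created_hosts
  with_items: >-
    {{ ec2_remote_facts.instances |
       selectattr('tags.AppName', 'defined') | selectattr('tags.AppName', 'equalto', 'sql') |
       selectattr('tags.AppType', 'defined') | selectattr('tags.AppType', 'equalto', 'infra') |
       map(attribute='private_ip_address') | list }}
```

Because the folded value is a single Jinja expression with no embedded quotes, it evaluates to the list itself, and each item is one IP address.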

Ansible throwing "Failed to connect to the host via ssh: Permission denied (publickey)." error with remote connections

I have the following playbook:
- hosts: localhost
  connection: local
  remote_user: test
  gather_facts: no
  vars_files:
    - files/aws_creds.yml
    - files/info.yml
  environment:
    AWS_ACCESS_KEY_ID: "{{ aws_id }}"
    AWS_SECRET_ACCESS_KEY: "{{ aws_key }}"
    s3cmd_access_key: "{{ aws_id }}"
    s3cmd_secret_key: "{{ aws_key }}"
  tasks:
    - name: Basic provisioning of EC2 instance
      ec2:
        assign_public_ip: no
        aws_access_key: "{{ aws_id }}"
        aws_secret_key: "{{ aws_key }}"
        region: "{{ aws_region }}"
        image: "{{ image_instance }}"
        instance_type: "{{ free_instance }}"
        key_name: "{{ ssh_keyname }}"
        count: 3
        state: present
        group_id: "{{ secgroup_id }}"
        vpc_subnet_id: "{{ private_subnet_id }}"
        wait: no
        instance_tags:
          Name: Dawny33Template
        #delete_on_termination: yes
      register: ec2

    - name: Add new instance to host group
      add_host:
        hostname: "{{ item.private_ip }}"
        groupname: launched
      with_items: "{{ ec2.instances }}"

    - name: Wait for SSH to come up
      wait_for:
        host: "{{ item.private_ip }}"
        port: 22
        delay: 60
        timeout: 320
        state: started
      with_items: "{{ ec2.instances }}"

- hosts: launched
  sudo: true
  remote_user: test
  gather_facts: yes
  vars_files:
    - files/aws_creds.yml
    - files/info.yml
  environment:
    AWS_ACCESS_KEY_ID: "{{ aws_id }}"
    AWS_SECRET_ACCESS_KEY: "{{ aws_key }}"
    s3cmd_access_key: "{{ aws_id }}"
    s3cmd_secret_key: "{{ aws_key }}"
  tasks:
    - name: Add file system for the volume
      command: mkfs -t ext4 /dev/xvdb
      sudo: yes

    - name: Create a directory for mounting
      command: mkdir /home/ec2-user/EncryptedEBS

    - name: Mount the volume
      command: mount /dev/xvdb /home/ec2-user/EncryptedEBS
      sudo: yes

    - name: Owning the mounted folder
      command: chown ec2-user /home/ec2-user/EncryptedEBS/lost+found/
      sudo: yes

    - name: check out a git repository
      git: repo={{ repo_url }} dest=/home/ec2-user/EncryptedEBS/GitRepo accept_hostkey=yes force=yes
      vars:
        repo_url: https://github.com/Dawny33/AnsibleExperiments
      become: yes

    - name: Go to the folder and execute command
      command: chmod 0755 /home/ec2-user/EncryptedEBS/GitRepo/processing.py
      become: yes
      become_user: root

    - name: Run Py script
      command: /home/ec2-user/EncryptedEBS/GitRepo/processing.py {{ N }} {{ bucket_name }}
      become: yes
      become_user: root
However, I get a "Permission denied" error when Ansible tries to connect to my remote hosts, even though I have defined the environment variables under environment.
Is there anything I did wrong here?
Error:
fatal: [10.0.1.62]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: Permission denied (publickey).\r\n",
"unreachable": true
}
fatal: [10.0.1.177]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: Permission denied (publickey).\r\n",
"unreachable": true
}
fatal: [10.0.1.151]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: Permission denied (publickey).\r\n",
"unreachable": true
}
Adding the complete -vvv output:
Using module file /usr/local/lib/python2.7/site-packages/ansible/modules/core/system/setup.py
<10.0.1.170> ESTABLISH SSH CONNECTION FOR USER: ec2-user
Using module file /usr/local/lib/python2.7/site-packages/ansible/modules/core/system/setup.py
<10.0.1.11> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<10.0.1.170> SSH: EXEC ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ec2-user -o ConnectTimeout=10 10.0.1.170 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-137345507492691 `" && echo ansible-tmp-1487158610.11-137345507492691="` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-137345507492691 `" ) && sleep 0'"'"''
<10.0.1.11> SSH: EXEC ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ec2-user -o ConnectTimeout=10 10.0.1.11 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-2307895121172 `" && echo ansible-tmp-1487158610.11-2307895121172="` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-2307895121172 `" ) && sleep 0'"'"''
Using module file /usr/local/lib/python2.7/site-packages/ansible/modules/core/system/setup.py
<10.0.1.45> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<10.0.1.45> SSH: EXEC ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ec2-user -o ConnectTimeout=10 10.0.1.45 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.12-3620848798638 `" && echo ansible-tmp-1487158610.12-3620848798638="` echo ~/.ansible/tmp/ansible-tmp-1487158610.12-3620848798638 `" ) && sleep 0'"'"''
<10.0.1.170> ssh_retry: attempt: 0, ssh return code is 255. cmd (/bin/sh -c '( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-137345507492691 `" && echo ansible-tmp-1487158610.11-137345507492691="` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-137345507492691 `" ) && sleep 0'...), pausing for 0 seconds
<10.0.1.170> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<10.0.1.170> SSH: EXEC ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ec2-user -o ConnectTimeout=10 10.0.1.170 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-137345507492691 `" && echo ansible-tmp-1487158610.11-137345507492691="` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-137345507492691 `" ) && sleep 0'"'"''
<10.0.1.11> ssh_retry: attempt: 0, ssh return code is 255. cmd (/bin/sh -c '( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-2307895121172 `" && echo ansible-tmp-1487158610.11-2307895121172="` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-2307895121172 `" ) && sleep 0'...), pausing for 0 seconds
<10.0.1.11> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<10.0.1.11> SSH: EXEC ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ec2-user -o ConnectTimeout=10 10.0.1.11 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-2307895121172 `" && echo ansible-tmp-1487158610.11-2307895121172="` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-2307895121172 `" ) && sleep 0'"'"''
<10.0.1.45> ssh_retry: attempt: 0, ssh return code is 255. cmd (/bin/sh -c '( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.12-3620848798638 `" && echo ansible-tmp-1487158610.12-3620848798638="` echo ~/.ansible/tmp/ansible-tmp-1487158610.12-3620848798638 `" ) && sleep 0'...), pausing for 0 seconds
<10.0.1.45> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<10.0.1.45> SSH: EXEC ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ec2-user -o ConnectTimeout=10 10.0.1.45 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.12-3620848798638 `" && echo ansible-tmp-1487158610.12-3620848798638="` echo ~/.ansible/tmp/ansible-tmp-1487158610.12-3620848798638 `" ) && sleep 0'"'"''
<10.0.1.170> ssh_retry: attempt: 1, ssh return code is 255. cmd (/bin/sh -c '( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-137345507492691 `" && echo ansible-tmp-1487158610.11-137345507492691="` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-137345507492691 `" ) && sleep 0'...), pausing for 1 seconds
<10.0.1.11> ssh_retry: attempt: 1, ssh return code is 255. cmd (/bin/sh -c '( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-2307895121172 `" && echo ansible-tmp-1487158610.11-2307895121172="` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-2307895121172 `" ) && sleep 0'...), pausing for 1 seconds
<10.0.1.45> ssh_retry: attempt: 1, ssh return code is 255. cmd (/bin/sh -c '( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.12-3620848798638 `" && echo ansible-tmp-1487158610.12-3620848798638="` echo ~/.ansible/tmp/ansible-tmp-1487158610.12-3620848798638 `" ) && sleep 0'...), pausing for 1 seconds
<10.0.1.170> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<10.0.1.170> SSH: EXEC ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ec2-user -o ConnectTimeout=10 10.0.1.170 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-137345507492691 `" && echo ansible-tmp-1487158610.11-137345507492691="` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-137345507492691 `" ) && sleep 0'"'"''
<10.0.1.11> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<10.0.1.11> SSH: EXEC ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ec2-user -o ConnectTimeout=10 10.0.1.11 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-2307895121172 `" && echo ansible-tmp-1487158610.11-2307895121172="` echo ~/.ansible/tmp/ansible-tmp-1487158610.11-2307895121172 `" ) && sleep 0'"'"''
<10.0.1.45> ESTABLISH SSH CONNECTION FOR USER: ec2-user
<10.0.1.45> SSH: EXEC ssh -o ForwardAgent=yes -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ec2-user -o ConnectTimeout=10 10.0.1.45 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo ~/.ansible/tmp/ansible-tmp-1487158610.12-3620848798638 `" && echo ansible-tmp-1487158610.12-3620848798638="` echo ~/.ansible/tmp/ansible-tmp-1487158610.12-3620848798638 `" ) && sleep 0'"'"''
fatal: [10.0.1.11]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: Permission denied (publickey).\r\n",
"unreachable": true
}
fatal: [10.0.1.170]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: Permission denied (publickey).\r\n",
"unreachable": true
}
fatal: [10.0.1.45]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: Permission denied (publickey).\r\n",
"unreachable": true
}
Don't forget: when you use ec2.py, you should add your .pem key to the SSH agent first, like this:
ssh-add /home/yourusername/.ssh/your.pem
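A minimal sketch of loading a key into the agent: the throwaway key generated below only stands in for your real .pem (in practice you would run `ssh-add /home/yourusername/.ssh/your.pem` against an already-running agent):

```shell
# Start an agent for this session and load a key into it.
eval "$(ssh-agent -s)" > /dev/null
keyfile="$(mktemp -u)"
ssh-keygen -t rsa -b 2048 -N "" -f "$keyfile" -q   # demo key only
chmod 600 "$keyfile"         # ssh-add refuses keys with loose permissions
ssh-add "$keyfile" 2> /dev/null
ssh-add -l                   # lists the fingerprints the agent now holds
```

`ssh-add -l` is a quick way to confirm the key actually made it into the agent before blaming Ansible for the failure.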
In my case, I was not sure why the key did not even allow a manual ssh login, so I generated a new key (.pem file) and used that instead. [Manual ssh worked with the new file.]
Now, the problem lies in the following block:
- hosts: launched
  sudo: true
  remote_user: test
  gather_facts: yes
I edited it to be:
- hosts: launched
  sudo: no
  connection: ssh
  remote_user: ec2-user
  gather_facts: yes
and it worked. The reason is straightforward: the connection has to be ssh, not local, and the remote user should be ec2-user for an Amazon Linux instance and ubuntu for an Ubuntu instance.
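On Ansible 2.x the deprecated `sudo:` keyword is replaced by `become:`; a sketch of the same corrected play in current syntax (host group and user names carried over from above):

```yaml
- hosts: launched
  connection: ssh
  remote_user: ec2-user
  become: no            # replaces the deprecated `sudo:` keyword
  gather_facts: yes
```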
Generate an SSH key pair with the ssh-keygen tool
and copy the ~/.ssh/id_rsa.pub key into the remote user's ~/.ssh/authorized_keys file.
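That key setup can be sketched locally (a scratch directory stands in for the remote user's `~/.ssh`; in practice `ssh-copy-id user@host` automates the copy when password login still works):

```shell
dir="$(mktemp -d)"                     # stand-in for the remote ~/.ssh
ssh-keygen -t rsa -b 2048 -N "" -f "$dir/id_rsa" -q
cat "$dir/id_rsa.pub" >> "$dir/authorized_keys"
chmod 700 "$dir"                       # sshd ignores authorized_keys when
chmod 600 "$dir/authorized_keys"       # the dir or file permissions are loose
grep -c 'ssh-rsa' "$dir/authorized_keys"   # one entry per authorized key
```

The permission bits matter: a world-readable `authorized_keys` or home directory is a common cause of exactly the `Permission denied (publickey)` error shown above.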

Not able to print complete output via Ansible

I have a command that needs to be executed remotely.
find /opt/cac/CI/release/releases -name "*.tar" -exec md5sum {} \;
The complete output is:
e5af5514887e8cbc08815936558a3220 /opt/cac/CI/release/releases/permission/4.00.00.04/CXP_902_8059_4.00.00.04_20161007-131208.tar
d35eae58399770627ba42f5b538a9cab /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20160927-052332.tar
784d035f9959f6a3006aaab202c13015 /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20160927-060837.tar
3234cc8944c1d26c1ff0fac844ae8674 /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20160927-062202.tar
431368042e9f5e62de37787cd2d05b08 /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20161008-173030.tar
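The same `find ... -exec md5sum` pipeline can be reproduced locally against scratch files to confirm it emits one line per matching file (the paths below are throwaway stand-ins for the release tarballs):

```shell
dir="$(mktemp -d)"
printf 'a' > "$dir/one.tar"
printf 'b' > "$dir/two.tar"
# Emits one "<md5>  <path>" line per *.tar file found
find "$dir" -name "*.tar" -exec md5sum {} \;
```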
I am trying to do the same using Ansible:
- hosts: tmoittecac
  tasks:
    - name: Report md5sum of Release files.
      shell: find /opt/cac/CI/release/releases -name "*.tar" -exec md5sum {} \;
      register: pre_md5_check
      tags: a
    - debug: msg={{ pre_md5_check.stdout_lines }}
      tags: b
But the output that I get is:
TASK: [debug msg={{ pre_md5_check.stdout_lines }}] ****************************
ok: [tmoittecac] => {
"msg": "['e5af5514887e8cbc08815936558a3220 /opt/cac/CI/release/releases/permission/4.00.00.04/CXP_902_8059_4.00.00.04_20161007-131208.tar',"
}
I am only getting the first line of the actual output.
Running the playbook in verbose mode gives me:
TASK: [Report md5sum of Release files.] ***************************************
<147.128.72.59> ESTABLISH CONNECTION FOR USER: local
<147.128.72.59> REMOTE_MODULE command find /opt/cac/CI/release/releases -name "*.tar" -exec md5sum {} \; #USE_SHELL
<147.128.72.59> EXEC sshpass -d7 ssh -C -tt -vvv -o ControlMaster=auto -o ControlPersist=60s -o ControlPath="/root/.ansible/cp/ansible-ssh-%h-%p-%r" -o StrictHostKeyChecking=no -o GSSAPIAuthentication=no -o PubkeyAuthentication=no -o User=local -o ConnectTimeout=10 147.128.72.59 /bin/sh -c 'mkdir -p $HOME/.ansible/tmp/ansible-tmp-1476216014.64-144818429840055 && echo $HOME/.ansible/tmp/ansible-tmp-1476216014.64-144818429840055'
<147.128.72.59> PUT /tmp/tmpm6eavD TO /opt/home/local/.ansible/tmp/ansible-tmp-1476216014.64-144818429840055/command
<147.128.72.59> EXEC sshpass -d7 ssh -C -tt -vvv -o ControlMaster=auto -o ControlPersist=60s -o ControlPath="/root/.ansible/cp/ansible-ssh-%h-%p-%r" -o StrictHostKeyChecking=no -o GSSAPIAuthentication=no -o PubkeyAuthentication=no -o User=local -o ConnectTimeout=10 147.128.72.59 /bin/sh -c 'LANG=C LC_CTYPE=C /usr/bin/python /opt/home/local/.ansible/tmp/ansible-tmp-1476216014.64-144818429840055/command; rm -rf /opt/home/local/.ansible/tmp/ansible-tmp-1476216014.64-144818429840055/ >/dev/null 2>&1'
changed: [tmoittecac] => {"changed": true, "cmd": "find /opt/cac/CI/release/releases -name \"*.tar\" -exec md5sum {} \\;", "delta": "0:00:01.172660", "end": "2016-10-12 03:53:14.020937", "rc": 0, "start": "2016-10-12 03:53:12.848277", "stderr": "", "stdout": "e5af5514887e8cbc08815936558a3220 /opt/cac/CI/release/releases/permission/4.00.00.04/CXP_902_8059_4.00.00.04_20161007-131208.tar\nd35eae58399770627ba42f5b538a9cab /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20160927-052332.tar\n784d035f9959f6a3006aaab202c13015 /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20160927-060837.tar\n3234cc8944c1d26c1ff0fac844ae8674 /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20160927-062202.tar\n431368042e9f5e62de37787cd2d05b08 /opt/cac/CI/release/releases/privacyvault/4.08.01.00/CXP_902_8185_4.08.01.00_20161008-173030.tar\n7f637400276cb25f7f3b2f869d915dc7 /opt/cac/CI/release/releases/notification/4.00.00.03/CXP_902_8347_4.00.00.03_20160928-070050.tar\nfa4a0aea0c096c703f2a9a741d2d1152 /opt/cac/CI/release/releases/user-preference/4.08.01.00/CXP_902_8717_4.08.01.00_20160929-034340.tar\n34f084c617f49123fc0edef358d15784 /opt/cac/CI/release/releases/captcha/2.08.01.00/CXP_902_8881_2.08.01.00_20160929-043449.tar\n2041d873f0619f1c9a4c8e419156753e /opt/cac/CI/release/releases/consumerProfile/3.08.02.00/CXP_902_8057_3.08.02.00_20161008-063249.tar\n7a0a9efe4232eebbb5c05e5b84fb6bec /opt/cac/CI/release/releases/consumerProfile/3.08.02.00/CXP_902_8057_3.08.02.00_20161008-071824.tar", "warnings": []}
TASK: [debug msg={{ pre_md5_check.stdout_lines }}] ****************************
<147.128.72.59> ESTABLISH CONNECTION FOR USER: local
ok: [tmoittecac] => {
"msg": "['e5af5514887e8cbc08815936558a3220 /opt/cac/CI/release/releases/permission/4.00.00.04/CXP_902_8059_4.00.00.04_20161007-131208.tar',"
}
Is there something I am missing?
More info:
tasks:
  - name: Check md5sums of WAR files.
    shell: find /opt/cac/CI/release/releases -name "*.tar" -exec md5sum {} \;
    register: MD5SUMS
    tags: a
  - name: Output md5sum of WAR files.
    debug: var={{ MD5SUMS }}
    tags: b
Output:
GATHERING FACTS ***************************************************************
ok: [tmoittecac]
TASK: [Check md5sums of WAR files.] *******************************************
changed: [tmoittecac]
TASK: [Output md5sum of WAR files.] *******************************************
ok: [tmoittecac] => {
"var": {
"{'changed':": "{'changed':"
}
}
PLAY RECAP ********************************************************************
tmoittecac : ok=3 changed=1 unreachable=0 failed=0
The Ansible version is:
ansible --version
ansible 1.9.4
The issue was solved by upgrading Ansible to version 2.1.
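For reference, on 2.x the `debug` module's `var` parameter expects a bare variable name rather than a Jinja2 expression, so the output task would normally be written as:

```yaml
- name: Output md5sum of WAR files.
  debug:
    var: MD5SUMS.stdout_lines
  tags: b
```

Using `{{ MD5SUMS }}` inside `var=` double-templates the value, which is why 1.9 printed the mangled `{'changed':` dictionary fragment above.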

Resources