Pipelining is being ignored by Ansible Tower - ansible

I am trying to enable pipelining, but no matter what I try, it doesn't work.
ansible.cfg contents:
[defaults]
transport = ssh
host_key_checking = false
retry_files_enabled = false
nocows=true
remote_user=ansible
# display_skipped_hosts = false
allow_world_readable_tmpfiles = True
warning = False
roles_path = ./roles
ansible_managed = Ansible managed: {file} modified on %Y-%m-%d %H:%M:%S by {uid} on {host}
callback_whitelist = profile_tasks
[privilege_escalation]
become_method = su
become_exe = 'sx su -'
[ssh_connection]
pipelining = true
ssh_args = -C -o ControlMaster=auto -o ControlPersist=600s -o ControlPath=/tmp/ansible-ssh-%%h-%%p-%%r
Ansible Tower version: 3.4.1 (by Red Hat)
Ansible versions tried: 2.9, 2.8 and 2.7
Ansible Tower runs on RHEL 6, and the VM against which I am trying to run the playbook is on RHEL 7.
I do not have access to the Tower VM to check its configs.
This is the log of my playbook running:
TASK [pipelining : Check Date] *************************************************
task path: /var/lib/awx/projects/_32459__project/roles/pipelining/tasks/main.yml:1
Friday 06 March 2020 14:46:33 +0000 (0:00:00.210) 0:00:00.210 **********
<myVM> ESTABLISH SSH CONNECTION FOR USER: ansible
<myVM> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=600s -o ControlPath=/tmp/ansible-ssh-%%h-%%p-%%r -o StrictHostKeyChecking=no -o 'User="ansible"' -o ConnectTimeout=10 myVM '/bin/sh -c '"'"'echo ~ansible && sleep 0'"'"''
<myVM> (0, '/home/ansible\n', "Warning: Permanently added 'myVM,10.37.30.170' (RSA) to the list of known hosts.")
<myVM> ESTABLISH SSH CONNECTION FOR USER: ansible
<myVM> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=600s -o ControlPath=/tmp/ansible-ssh-%%h-%%p-%%r -o StrictHostKeyChecking=no -o 'User="ansible"' -o ConnectTimeout=10 myVM '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/ansible/.ansible/tmp/ansible-tmp-1583505993.07-223021018622658 `" && echo ansible-tmp-1583505993.07-223021018622658="` echo /home/ansible/.ansible/tmp/ansible-tmp-1583505993.07-223021018622658 `" ) && sleep 0'"'"''
<myVM> (0, 'ansible-tmp-1583505993.07-223021018622658=/home/ansible/.ansible/tmp/ansible-tmp-1583505993.07-223021018622658\n', '')
<VMhost> Attempting python interpreter discovery
<myVM> ESTABLISH SSH CONNECTION FOR USER: ansible
<myVM> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=600s -o ControlPath=/tmp/ansible-ssh-%%h-%%p-%%r -o StrictHostKeyChecking=no -o 'User="ansible"' -o ConnectTimeout=10 myVM '/bin/sh -c '"'"'echo PLATFORM; uname; echo FOUND; command -v '"'"'"'"'"'"'"'"'/usr/bin/python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.5'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/libexec/platform-python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/bin/python3'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python'"'"'"'"'"'"'"'"'; echo ENDFOUND && sleep 0'"'"''
<myVM> (0, 'PLATFORM\nLinux\nFOUND\n/usr/bin/python\n/usr/bin/python2.7\n/usr/libexec/platform-python\n/usr/bin/python\nENDFOUND\n', '')
<myVM> ESTABLISH SSH CONNECTION FOR USER: ansible
<myVM> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=600s -o ControlPath=/tmp/ansible-ssh-%%h-%%p-%%r -o StrictHostKeyChecking=no -o 'User="ansible"' -o ConnectTimeout=10 myVM '/bin/sh -c '"'"'/usr/bin/python && sleep 0'"'"''
<myVM> (0, '{"osrelease_content": "NAME=\\"Red Hat Enterprise Linux Server\\"\\nVERSION=\\"7.7 (Maipo)\\"\\nID=\\"rhel\\"\\nID_LIKE=\\"fedora\\"\\nVARIANT=\\"Server\\"\\nVARIANT_ID=\\"server\\"\\nVERSION_ID=\\"7.7\\"\\nPRETTY_NAME=\\"Red Hat Enterprise Linux Server 7.7 (Maipo)\\"\\nANSI_COLOR=\\"0;31\\"\\nCPE_NAME=\\"cpe:/o:redhat:enterprise_linux:7.7:GA:server\\"\\nHOME_URL=\\"https://www.redhat.com/\\"\\nBUG_REPORT_URL=\\"https://bugzilla.redhat.com/\\"\\n\\nREDHAT_BUGZILLA_PRODUCT=\\"Red Hat Enterprise Linux 7\\"\\nREDHAT_BUGZILLA_PRODUCT_VERSION=7.7\\nREDHAT_SUPPORT_PRODUCT=\\"Red Hat Enterprise Linux\\"\\nREDHAT_SUPPORT_PRODUCT_VERSION=\\"7.7\\"\\n", "platform_dist_result": ["redhat", "7.7", "Maipo"]}\n', '')
Using module file /var/lib/awx/venv/ansible_2_8/lib/python2.7/site-packages/ansible/modules/commands/command.py
<myVM> PUT /var/lib/awx/.ansible/tmp/ansible-local-3omzq9J/tmpquCP2N TO /home/ansible/.ansible/tmp/ansible-tmp-1583505993.07-223021018622658/AnsiballZ_command.py
<myVM> SSH: EXEC sshpass -d10 sftp -o BatchMode=no -b - -C -o ControlMaster=auto -o ControlPersist=600s -o ControlPath=/tmp/ansible-ssh-%%h-%%p-%%r -o StrictHostKeyChecking=no -o 'User="ansible"' -o ConnectTimeout=10 '[myVM]'
<myVM> (0, 'sftp> put /var/lib/awx/.ansible/tmp/ansible-local-3omzq9J/tmpquCP2N /home/ansible/.ansible/tmp/ansible-tmp-1583505993.07-223021018622658/AnsiballZ_command.py\nUploading /var/lib/awx/.ansible/tmp/ansible-local-3omzq9J/tmpquCP2N to /home/ansible/.ansible/tmp/ansible-tmp-1583505993.07-223021018622658/AnsiballZ_command.py\n', '')
<myVM> ESTABLISH SSH CONNECTION FOR USER: ansible
<myVM> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=600s -o ControlPath=/tmp/ansible-ssh-%%h-%%p-%%r -o StrictHostKeyChecking=no -o 'User="ansible"' -o ConnectTimeout=10 myVM '/bin/sh -c '"'"'chmod u+x /home/ansible/.ansible/tmp/ansible-tmp-1583505993.07-223021018622658/ /home/ansible/.ansible/tmp/ansible-tmp-1583505993.07-223021018622658/AnsiballZ_command.py && sleep 0'"'"''
<myVM> (0, '', '')
<myVM> ESTABLISH SSH CONNECTION FOR USER: ansible
<myVM> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=600s -o ControlPath=/tmp/ansible-ssh-%%h-%%p-%%r -o StrictHostKeyChecking=no -o 'User="ansible"' -o ConnectTimeout=10 -tt myVM '/bin/sh -c '"'"'sx su - root -c '"'"'"'"'"'"'"'"'/bin/sh -c '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-iaimhvmledvpyuy ; /usr/bin/python /home/ansible/.ansible/tmp/ansible-tmp-1583505993.07-223021018622658/AnsiballZ_command.py'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"''"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<myVM> (0, 'Last login: Fri Mar 6 15:12:26 2020 from 10.77.228.130\r\r\n\r\n{"changed": true, "end": "2020-03-06 15:46:35.478978", "stdout": "Fri Mar 6 15:46:35 CET 2020", "cmd": "date", "rc": 0, "start": "2020-03-06 15:46:35.475245", "stderr": "", "delta": "0:00:00.003733", "invocation": {"module_args": {"creates": null, "executable": null, "_uses_shell": true, "strip_empty_ends": true, "_raw_params": "date", "removes": null, "argv": null, "warn": true, "chdir": null, "stdin_add_newline": true, "stdin": null}}}\r\n', 'Shared connection to myVM closed.\r\n')
<myVM> ESTABLISH SSH CONNECTION FOR USER: ansible
<myVM> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=600s -o ControlPath=/tmp/ansible-ssh-%%h-%%p-%%r -o StrictHostKeyChecking=no -o 'User="ansible"' -o ConnectTimeout=10 myVM '/bin/sh -c '"'"'rm -f -r /home/ansible/.ansible/tmp/ansible-tmp-1583505993.07-223021018622658/ > /dev/null 2>&1 && sleep 0'"'"''
<myVM> (0, '', '')
I was expecting to see only one SSH: EXEC and no PUT.
Am I doing something wrong? I did try Mitogen and it works, except for the become part, so I can't use it. I was hoping to enable pipelining instead.
Another thing to mention: the RHEL 7 VM, where the playbook will check the date, write a file and delete it, has this line in /etc/sudoers: Defaults !requiretty
I also tried setting ansible_ssh_pipelining: true as a per-task variable, but with the same result.
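For reference, such a per-task override would look roughly like this (a sketch of the Check Date task from the role above; `ansible_ssh_pipelining` is the connection-variable counterpart of the ini setting):

```yaml
- name: Check Date
  command: date
  vars:
    ansible_ssh_pipelining: true
```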
Could pipelining be disabled at a higher level in Tower? Am I tired and making mistakes? I am going crazy with this...

The reason is the use of su as the become_method: it does not support pipelining.
The following comment can be found in the code:
# su does not work with pipelining
https://github.com/ansible/ansible/blob/v2.10.3/lib/ansible/plugins/action/__init__.py
Ref: https://github.com/ansible/ansible/issues/35698
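If changing the escalation method is an option, sudo-based escalation does work with pipelining. A minimal sketch of the relevant ansible.cfg sections (assuming sudo is set up for the ansible user on the targets; the rest of the config stays as posted):

```ini
[privilege_escalation]
become_method = sudo

[ssh_connection]
pipelining = true
```

Note that pipelining with sudo also requires requiretty to be disabled on the targets, which is already the case here (Defaults !requiretty).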

Related

ansible succeeds but doesn't start flocked, piped, detached docker compose via bash script

Expected behaviour
Ansible starts docker-compose with flock and logging to a file via the bash script, without delay.
Observed behaviour
The service does not get launched via Ansible. Monit starts the service correctly via the bash script, but delayed.
Description
I run updates via Ansible and want to restart/rebuild docker-compose with the bash script via Ansible, without delay if possible.
Bash script
#! /bin/bash
lockfile="/var/run/lock/captain-hook"
export ENVIRONMENT=production
cd /home/captain-hook/captain-hook
flock -n "$lockfile" -c "docker-compose up --build --force-recreate >> /var/log/captain-hook/log 2>&1 &"
exit 0
Ansible task file
- name: stop container
  command: docker stop captain_hook

- name: Sleep for 10 seconds
  wait_for:
    timeout: 10

- name: start docker-compose
  shell: |
    /etc/monit/scripts/start_captain_hook.sh
  args:
    chdir: /home/captain-hook/captain-hook
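If the goal is for Ansible to fire the script and not wait on it, one option (not in the original playbook; a sketch) is Ansible's async/poll mechanism:

```yaml
- name: start docker-compose without waiting
  shell: /etc/monit/scripts/start_captain_hook.sh
  args:
    chdir: /home/captain-hook/captain-hook
  async: 600  # allow the background job up to 10 minutes
  poll: 0     # fire and forget: Ansible returns immediately
```

With poll: 0 the task returns as soon as the job is launched; the script keeps running on the target after the SSH session ends.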
debug output from ansible
TASK [captain-hook : start docker-compose] *****************************************************************************************************
task path: /home/axel/Dropbox/0_Programming/trading/trade-cloud-ansible/roles/captain-hook/tasks/code-update.yml:24
<168.119.100.44> ESTABLISH SSH CONNECTION FOR USER: root
<168.119.100.44> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/tmp/%h-%r 168.119.100.44 '/bin/sh -c '"'"'echo ~root && sleep 0'"'"''
<168.119.100.44> (0, b'/root\n', b'')
<168.119.100.44> ESTABLISH SSH CONNECTION FOR USER: root
<168.119.100.44> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/tmp/%h-%r 168.119.100.44 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1663325556.880422-52394-195122740762188 `" && echo ansible-tmp-1663325556.880422-52394-195122740762188="` echo /root/.ansible/tmp/ansible-tmp-1663325556.880422-52394-195122740762188 `" ) && sleep 0'"'"''
<168.119.100.44> (0, b'ansible-tmp-1663325556.880422-52394-195122740762188=/root/.ansible/tmp/ansible-tmp-1663325556.880422-52394-195122740762188\n', b'')
Using module file /usr/lib/python3/dist-packages/ansible/modules/command.py
<168.119.100.44> PUT /home/axel/.ansible/tmp/ansible-local-52322ijhvkvis/tmplnis9k7e TO /root/.ansible/tmp/ansible-tmp-1663325556.880422-52394-195122740762188/AnsiballZ_command.py
<168.119.100.44> SSH: EXEC sshpass -d10 sftp -o BatchMode=no -b - -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/tmp/%h-%r '[168.119.100.44]'
<168.119.100.44> (0, b'sftp> put /home/axel/.ansible/tmp/ansible-local-52322ijhvkvis/tmplnis9k7e /root/.ansible/tmp/ansible-tmp-1663325556.880422-52394-195122740762188/AnsiballZ_command.py\n', b'')
<168.119.100.44> ESTABLISH SSH CONNECTION FOR USER: root
<168.119.100.44> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/tmp/%h-%r 168.119.100.44 '/bin/sh -c '"'"'chmod u+x /root/.ansible/tmp/ansible-tmp-1663325556.880422-52394-195122740762188/ /root/.ansible/tmp/ansible-tmp-1663325556.880422-52394-195122740762188/AnsiballZ_command.py && sleep 0'"'"''
<168.119.100.44> (0, b'', b'')
<168.119.100.44> ESTABLISH SSH CONNECTION FOR USER: root
<168.119.100.44> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/tmp/%h-%r -tt 168.119.100.44 '/bin/sh -c '"'"'/usr/bin/python3 /root/.ansible/tmp/ansible-tmp-1663325556.880422-52394-195122740762188/AnsiballZ_command.py && sleep 0'"'"''
<168.119.100.44> (0, b'\r\n{"cmd": "/etc/monit/scripts/start_captain_hook.sh\\n", "stdout": "", "stderr": "", "rc": 0, "start": "2022-09-16 10:52:37.660030", "end": "2022-09-16 10:52:37.669701", "delta": "0:00:00.009671", "changed": true, "invocation": {"module_args": {"chdir": "/home/captain-hook/captain-hook", "_raw_params": "/etc/monit/scripts/start_captain_hook.sh\\n", "_uses_shell": true, "warn": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}\r\n', b'Shared connection to 168.119.100.44 closed.\r\n')
<168.119.100.44> ESTABLISH SSH CONNECTION FOR USER: root
<168.119.100.44> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/tmp/%h-%r 168.119.100.44 '/bin/sh -c '"'"'rm -f -r /root/.ansible/tmp/ansible-tmp-1663325556.880422-52394-195122740762188/ > /dev/null 2>&1 && sleep 0'"'"''
<168.119.100.44> (0, b'', b'')
changed: [trade-cloud] => {
    "changed": true,
    "cmd": "/etc/monit/scripts/start_captain_hook.sh\n",
    "delta": "0:00:00.009671",
    "end": "2022-09-16 10:52:37.669701",
    "invocation": {
        "module_args": {
            "_raw_params": "/etc/monit/scripts/start_captain_hook.sh\n",
            "_uses_shell": true,
            "argv": null,
            "chdir": "/home/captain-hook/captain-hook",
            "creates": null,
            "executable": null,
            "removes": null,
            "stdin": null,
            "stdin_add_newline": true,
            "strip_empty_ends": true,
            "warn": true
        }
    },
    "rc": 0,
    "start": "2022-09-16 10:52:37.660030",
    "stderr": "",
    "stderr_lines": [],
    "stdout": "",
    "stdout_lines": []
}
TASK [captain-hook : report result] ************************************************************************************************************
task path: /home/axel/Dropbox/0_Programming/trading/trade-cloud-ansible/roles/captain-hook/tasks/code-update.yml:31
ok: [trade-cloud] => {
    "shell_result.stdout_lines": []
}
Monit file to launch the start script
# {{ ansible_managed }}
#==================== check start-captain-hook is running =======================
CHECK PROGRAM captainHook WITH PATH /etc/monit/scripts/check_captain_hook.sh
START PROGRAM = "/etc/monit/scripts/start_captain_hook.sh"
IF status != 0 FOR 1 CYCLES THEN START
Is there a way to make this work?
Thanks to https://stackoverflow.com/users/2123530/%ce%b2-%ce%b5%ce%b7%ce%bf%ce%b9%cf%84-%ce%b2%ce%b5 and https://stackoverflow.com/users/9401096/zeitounator for your input!
I now run docker-compose detached and log to journald instead of piping to a log file.
bash script
#! /bin/bash
lockfile="/var/run/lock/captain-hook"
export ENVIRONMENT=production
cd /home/captain-hook/captain-hook
flock -n "$lockfile" -c "docker-compose up --build --force-recreate -d"
exit 0
Logging to journald is set up via this answer (Access logs of a killed docker container) and the documentation: https://docs.docker.com/config/containers/logging/configure/
/etc/docker/daemon.json
{
  "log-driver": "journald"
}
Changed behavior
Journald only records the Docker logs, so I lose the information about how the container is built when Monit starts the service. When Ansible runs the bash script, I now get the build info on the Ansible machine (my laptop ^^).
Ansible does not flock successfully via the bash script, but that's OK.
Monit also checks whether the container is up via docker top captain_hook. If that ever fails (which it has in the past), Monit will correctly use flock.
The worst thing that can happen is that docker top captain_hook fails. The container would get killed once by a restart via Monit. I can live with that.

Ansible: Error "Line has invalid autocommand"

I just installed Ansible and am trying a simple ping, but I'm getting errors.
Below is the output of the command:
test-switch | FAILED! => {
    "ansible_facts": {
        "discovered_interpreter_python": "/usr/bin/python"
    },
    "changed": false,
    "module_stderr": "Shared connection to test-switch closed.\r\n",
    "module_stdout": "\r\nLine has invalid autocommand \"/bin/sh -c '/usr/bin/python '\"'\"'Line has invalid autocommand \"/bin/sh -c '\"'\"'\"'\"'\"'\"'\"'\"'( umask 77 && mkdir -p \"` echo Line has invalid autocommand \"/bin/sh -c '\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'echo ~**** && sleep 0'\"'\"'\"'\"'\"",
    "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
    "rc": 0
}
Here is some extra info:
user@server:~$ sudo ansible test -m ping -i test-hosts -vvv
ansible [core 2.13.3]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3/dist-packages/ansible
ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible
python version = 3.10.4 (main, Jun 29 2022, 12:14:53) [GCC 11.2.0]
jinja version = 3.0.3
libyaml = True
Using /etc/ansible/ansible.cfg as config file
host_list declined parsing /home/user/test-hosts as it did not pass its verify_file() method
script declined parsing /home/user/test-hosts as it did not pass its verify_file() method
auto declined parsing /home/user/test-hosts as it did not pass its verify_file() method
Parsed /home/user/test-hosts inventory source with ini plugin
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
META: ran handlers
<test-switch> ESTABLISH SSH CONNECTION FOR USER: ****
<test-switch> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o 'User="****"' -o ConnectTimeout=30 -o 'ControlPath="/root/.ansible/cp/e7f6d09650"' test-switch '/bin/sh -c '"'"'echo ~**** && sleep 0'"'"''
<test-switch> (0, b'\r\nLine has invalid autocommand "/bin/sh -c \'echo ~**** && sleep 0\'"', b'')
<test-switch> ESTABLISH SSH CONNECTION FOR USER: ****
<test-switch> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o 'User="****"' -o ConnectTimeout=30 -o 'ControlPath="/root/.ansible/cp/e7f6d09650"' test-switch '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'echo ~**** && sleep 0'"'"'"'"'"'"'"'"'"/.ansible/tmp `"&& mkdir "` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'echo ~**** && sleep 0'"'"'"'"'"'"'"'"'"/.ansible/tmp/ansible-tmp-1662641931.939476-3032-59266234632066 `" && echo ansible-tmp-1662641931.939476-3032-59266234632066="` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'echo ~**** && sleep 0'"'"'"'"'"'"'"'"'"/.ansible/tmp/ansible-tmp-1662641931.939476-3032-59266234632066 `" ) && sleep 0'"'"''
<test-switch> (0, b'\r\nLine has invalid autocommand "/bin/sh -c \'( umask 77 && mkdir -p "` echo Line has invalid autocommand "/bin/sh -c \'"\'"\'echo ~**** && sleep 0\'"\'"\'"/.ansible/tmp `"&& mkdir "` echo Line has invalid autocommand "/bin/sh -c \'"\'"\'echo ~**** && sleep 0\'"\'"\'"/.ansible/tmp/a"', b'')
<test-switch> Attempting python interpreter discovery
<test-switch> ESTABLISH SSH CONNECTION FOR USER: ****
<test-switch> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o 'User="****"' -o ConnectTimeout=30 -o 'ControlPath="/root/.ansible/cp/e7f6d09650"' test-switch '/bin/sh -c '"'"'echo PLATFORM; uname; echo FOUND; command -v '"'"'"'"'"'"'"'"'python3.10'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.9'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.8'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.5'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/bin/python3'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/libexec/platform-python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/bin/python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python'"'"'"'"'"'"'"'"'; echo ENDFOUND && sleep 0'"'"''
<test-switch> (0, b'\r\nLine has invalid autocommand "/bin/sh -c \'echo PLATFORM; uname; echo FOUND; command -v \'"\'"\'python3.10\'"\'"\'; command -v \'"\'"\'python3.9\'"\'"\'; command -v \'"\'"\'python3.8\'"\'"\'; command -v \'"\'"\'python3.7\'"\'"\'; command -v \'"\'"\'python3.6\'"\'"\'; command -v \'"\'"\'python3.5\'"\'"\'; command -v \'"\'"\'"', b'')
[WARNING]: Unhandled error in Python interpreter discovery for host test-switch: unexpected output from Python interpreter discovery
Using module file /usr/lib/python3/dist-packages/ansible/modules/ping.py
<test-switch> PUT /root/.ansible/tmp/ansible-local-3029qijo7fpq/tmp0dni2qsr TO Line has invalid autocommand "/bin/sh -c '( umask 77 && mkdir -p "` echo Line has invalid autocommand "/bin/sh -c '"'"'echo ~**** && sleep 0'"'"'"/.ansible/tmp `"&& mkdir "` echo Line has invalid autocommand "/bin/sh -c '"'"'echo ~**** && sleep 0'"'"'"/.ansible/tmp/a"/AnsiballZ_ping.py
<test-switch> SSH: EXEC sshpass -d10 sftp -o BatchMode=no -b - -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o 'User="****"' -o ConnectTimeout=30 -o 'ControlPath="/root/.ansible/cp/e7f6d09650"' '[test-switch]'
[WARNING]: sftp transfer mechanism failed on [test-switch]. Use ANSIBLE_DEBUG=1 to see detailed information
<test-switch> SSH: EXEC sshpass -d10 scp -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o 'User="****"' -o ConnectTimeout=30 -o 'ControlPath="/root/.ansible/cp/e7f6d09650"' /root/.ansible/tmp/ansible-local-3029qijo7fpq/tmp0dni2qsr '[test-switch]:'"'"'Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'( umask 77 && mkdir -p "` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'echo ~**** && sleep 0'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"/.ansible/tmp `"&& mkdir "` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'echo ~**** && sleep 0'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"/.ansible/tmp/a"/AnsiballZ_ping.py'"'"''
[WARNING]: scp transfer mechanism failed on [test-switch]. Use ANSIBLE_DEBUG=1 to see detailed information
<test-switch> ESTABLISH SSH CONNECTION FOR USER: ****
<test-switch> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o 'User="****"' -o ConnectTimeout=30 -o 'ControlPath="/root/.ansible/cp/e7f6d09650"' test-switch 'dd of=Line has invalid autocommand "/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'echo ~**** && sleep 0'"'"'"'"'"'"'"'"'"/.ansible/tmp `"&& mkdir "` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'echo ~**** && sleep 0'"'"'"'"'"'"'"'"'"/.ansible/tmp/a"/AnsiballZ_ping.py bs=65536'
<test-switch> (0, b'\r\nLine has invalid autocommand "dd of=Line has invalid autocommand "/bin/sh -c \'( umask 77 && mkdir -p "` echo Line has invalid autocommand "/bin/sh -c \'"\'"\'echo ~**** && sleep 0\'"\'"\'"/.ansible/tmp `"&& mkdir "` echo Line has invalid autocommand "/bin/sh -c \'"\'"\'echo ~svc_ansib"', b'')
<test-switch> (0, b'\r\nLine has invalid autocommand "dd of=Line has invalid autocommand "/bin/sh -c \'( umask 77 && mkdir -p "` echo Line has invalid autocommand "/bin/sh -c \'"\'"\'echo ~**** && sleep 0\'"\'"\'"/.ansible/tmp `"&& mkdir "` echo Line has invalid autocommand "/bin/sh -c \'"\'"\'echo ~svc_ansib"', b'')
<test-switch> ESTABLISH SSH CONNECTION FOR USER: ****
<test-switch> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o 'User="****"' -o ConnectTimeout=30 -o 'ControlPath="/root/.ansible/cp/e7f6d09650"' test-switch '/bin/sh -c '"'"'chmod u+x '"'"'"'"'"'"'"'"'Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'( umask 77 && mkdir -p "` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'echo ~**** && sleep 0'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"/.ansible/tmp `"&& mkdir "` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'echo ~**** && sleep 0'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"/.ansible/tmp/a"/'"'"'"'"'"'"'"'"' '"'"'"'"'"'"'"'"'Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'( umask 77 && mkdir -p "` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'echo ~**** && sleep 0'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"/.ansible/tmp `"&& mkdir "` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'echo ~**** && 
sleep 0'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"/.ansible/tmp/a"/AnsiballZ_ping.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
<test-switch> (0, b'\r\nLine has invalid autocommand "/bin/sh -c \'chmod u+x \'"\'"\'Line has invalid autocommand "/bin/sh -c \'"\'"\'"\'"\'"\'"\'"\'"\'( umask 77 && mkdir -p "` echo Line has invalid autocommand "/bin/sh -c \'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'echo ~**** && sleep 0\'"\'"\'"\'"\'"\'"\'"\'"', b'')
<test-switch> ESTABLISH SSH CONNECTION FOR USER: ****
<test-switch> SSH: EXEC sshpass -d10 ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o 'User="****"' -o ConnectTimeout=30 -o 'ControlPath="/root/.ansible/cp/e7f6d09650"' -tt test-switch '/bin/sh -c '"'"'/usr/bin/python '"'"'"'"'"'"'"'"'Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'( umask 77 && mkdir -p "` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'echo ~**** && sleep 0'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"/.ansible/tmp `"&& mkdir "` echo Line has invalid autocommand "/bin/sh -c '"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'echo ~**** && sleep 0'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"'"/.ansible/tmp/a"/AnsiballZ_ping.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
<test-switch> (0, b'\r\nLine has invalid autocommand "/bin/sh -c \'/usr/bin/python \'"\'"\'Line has invalid autocommand "/bin/sh -c \'"\'"\'"\'"\'"\'"\'"\'"\'( umask 77 && mkdir -p "` echo Line has invalid autocommand "/bin/sh -c \'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'"\'echo ~**** && sleep 0\'"\'"\'"\'"\'"', b'Shared connection to test-switch closed.\r\n')
[WARNING]: Platform unknown on host test-switch is using the discovered Python interpreter at /usr/bin/python, but future installation of another Python interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.13/reference_appendices/interpreter_discovery.html for more information.
test-switch | FAILED! => {
    "ansible_facts": {
        "discovered_interpreter_python": "/usr/bin/python"
    },
    "changed": false,
    "module_stderr": "Shared connection to test-switch closed.\r\n",
    "module_stdout": "\r\nLine has invalid autocommand \"/bin/sh -c '/usr/bin/python '\"'\"'Line has invalid autocommand \"/bin/sh -c '\"'\"'\"'\"'\"'\"'\"'\"'( umask 77 && mkdir -p \"` echo Line has invalid autocommand \"/bin/sh -c '\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'\"'echo ~**** && sleep 0'\"'\"'\"'\"'\"",
    "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
    "rc": 0
}
user@server:~$
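The exploding '"'"' runs in the log are not corruption: they are how one single-quoted /bin/sh -c command gets embedded inside another. The switch rejects each command and echoes it back as an error, Ansible takes that output as (for example) the remote home directory, and re-quotes it into the next command, so the escapes multiply. A minimal Python sketch of the quoting step (command string is hypothetical):

```python
import shlex

# A hypothetical remote command; Ansible wraps every remote command
# in /bin/sh -c '...', quoting it for the outer shell.
cmd = "echo ~user && sleep 0"
wrapped = "/bin/sh -c " + shlex.quote(cmd)

# When that wrapped command ends up embedded in another command
# (as happens when the switch echoes it back), each embedded
# single quote becomes the five-character sequence '"'"'.
twice = "/bin/sh -c " + shlex.quote(wrapped)

print(wrapped)  # /bin/sh -c 'echo ~user && sleep 0'
print(twice)    # /bin/sh -c '/bin/sh -c '"'"'echo ~user && sleep 0'"'"''
```

Each round trip multiplies the escapes, which is why the error message snowballs in the later SSH attempts above.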
The command actually works when I add some extra info in the inventory file.
These are the lines I added:
ansible_connection=network_cli
ansible_network_os=ios
ansible_port=22
and the ping worked:
user#server:~$ ansible test -m ping -i test-hosts
[WARNING]: ansible-pylibssh not installed, falling back to paramiko
test-switch | SUCCESS => {
    "changed": false,
    "ping": "pong"
}
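Put together, a minimal INI inventory for such a device might look like this (group and host names taken from the command above; the values are illustrative):

```ini
[test]
test-switch ansible_connection=network_cli ansible_network_os=ios ansible_port=22
```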
According to the output provided, it seems that you are trying to establish an SSH connection to a switch. Such devices may not have the capabilities needed to run Python scripts.
As stated in ping module – Try to connect to host, verify a usable python and return pong on success, the module "is NOT ICMP ping, ... just a trivial test module that requires Python on the remote-node".

Ansible can't stop managed service

I have a Spring Boot application managed as a systemd service on a Red Hat Enterprise Linux Server 7.7 (Maipo) cluster. The service unit configuration is all right: I can start and stop the service unit by hand on the system that I'm trying to command and control with Ansible 2.8.5.
The process owner is tomcat, and I'm using another user (deployer) that can "become" tomcat and run commands on the hosts. That works fine for some other actions, but it fails when I add the tasks to manage the service (I've tried with both the systemd and service modules):
# ./ansible/roles/boot-core/tasks/main.yml
---
- name: "Deploy/Install new application"
  block:
    # - name: "Make sure {{ service_id }} is stopped"
    #   systemd:
    #     name: "{{ service_id }}"
    #     state: stopped
    - name: "Make sure {{ service_id }} is stopped"
      service:
        name: "{{ service_id }}"
        state: stopped
    # - name: "Make sure {{ service_id }} is enabled and started"
    #   systemd:
    #     enabled: yes
    #     name: "{{ service_id }}"
    #     state: started
    - name: "Make sure {{ service_id }} is enabled and started"
      service:
        enabled: yes
        name: "{{ service_id }}"
        state: started
# ./ansible/site.yml
---
- hosts: webservers
  any_errors_fatal: true
  become_user: tomcat
  become: yes
  force_handlers: true
  gather_facts: no
  roles:
    - boot-core
...and this is how I'm running the playbook as deployer (on a GitLab pipeline, the syntax is different, so I convert it here to what it would look like in a UN*X shell):
$ eval $(ssh-agent -s)
$ ssh-add <(echo "${PRIVATE_SSH_KEY}")
$ ansible-playbook -vvv \
    --extra-vars CI_PIPELINE_ID="${CI_PIPELINE_ID}" \
    --extra-vars CI_PROJECT_DIR="${CI_PROJECT_DIR}" \
    --inventory-file "${CI_PROJECT_DIR}/infrastructure/ansible/inventories/${ANSIBLE_INVENTORY}" \
    --limit webservers \
    --user deployer \
    "${CI_PROJECT_DIR}/infrastructure/ansible/site.yml"
This is what's getting printed in the logs:
TASK [boot-core : Make sure boot-core is stopped] ****************************
task path: /builds/x80486/boot-core/infrastructure/ansible/roles/boot-core/tasks/main.yml:58
<unixvm001> ESTABLISH SSH CONNECTION FOR USER: deployer
<unixvm001> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="deployer"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/ea5c024329 unixvm001 '/bin/sh -c '"'"'echo ~deployer && sleep 0'"'"''
<unixvm001> (0, '/usr/local/home/deployer\n', "Warning: Permanently added 'unixvm001,10.5.177.1' (ECDSA) to the list of known hosts.\r\n\t\t\t\n")
<unixvm001> ESTABLISH SSH CONNECTION FOR USER: deployer
<unixvm001> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="deployer"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/ea5c024329 unixvm001 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /var/tmp/ansible-tmp-1571756098.47-177915759067620 `" && echo ansible-tmp-1571756098.47-177915759067620="` echo /var/tmp/ansible-tmp-1571756098.47-177915759067620 `" ) && sleep 0'"'"''
<unixvm001> (0, 'ansible-tmp-1571756098.47-177915759067620=/var/tmp/ansible-tmp-1571756098.47-177915759067620\n', '')
<unixvm001> Attempting python interpreter discovery
<unixvm001> ESTABLISH SSH CONNECTION FOR USER: deployer
<unixvm001> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="deployer"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/ea5c024329 unixvm001 '/bin/sh -c '"'"'echo PLATFORM; uname; echo FOUND; command -v '"'"'"'"'"'"'"'"'/usr/bin/python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python3.5'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.7'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python2.6'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/libexec/platform-python'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'/usr/bin/python3'"'"'"'"'"'"'"'"'; command -v '"'"'"'"'"'"'"'"'python'"'"'"'"'"'"'"'"'; echo ENDFOUND && sleep 0'"'"''
<unixvm001> (0, 'PLATFORM\nLinux\nFOUND\n/usr/bin/python\n/usr/bin/python2.7\n/usr/libexec/platform-python\n/usr/bin/python\nENDFOUND\n', '')
<unixvm001> ESTABLISH SSH CONNECTION FOR USER: deployer
<unixvm001> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="deployer"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/ea5c024329 unixvm001 '/bin/sh -c '"'"'/usr/bin/python && sleep 0'"'"''
<unixvm001> (0, '{"osrelease_content": "NAME=\\"Red Hat Enterprise Linux Server\\"\\nVERSION=\\"7.7 (Maipo)\\"\\nID=\\"rhel\\"\\nID_LIKE=\\"fedora\\"\\nVARIANT=\\"Server\\"\\nVARIANT_ID=\\"server\\"\\nVERSION_ID=\\"7.7\\"\\nPRETTY_NAME=\\"Red Hat Enterprise Linux Server 7.7 (Maipo)\\"\\nANSI_COLOR=\\"0;31\\"\\nCPE_NAME=\\"cpe:/o:redhat:enterprise_linux:7.7:GA:server\\"\\nHOME_URL=\\"https://www.redhat.com/\\"\\nBUG_REPORT_URL=\\"https://bugzilla.redhat.com/\\"\\n\\nREDHAT_BUGZILLA_PRODUCT=\\"Red Hat Enterprise Linux 7\\"\\nREDHAT_BUGZILLA_PRODUCT_VERSION=7.7\\nREDHAT_SUPPORT_PRODUCT=\\"Red Hat Enterprise Linux\\"\\nREDHAT_SUPPORT_PRODUCT_VERSION=\\"7.7\\"\\n", "platform_dist_result": ["redhat", "7.7", "Maipo"]}\n', '')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/system/setup.py
<unixvm001> PUT /root/.ansible/tmp/ansible-local-4292hN1DYR/tmpYM8xgh TO /var/tmp/ansible-tmp-1571756098.47-177915759067620/AnsiballZ_setup.py
<unixvm001> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="deployer"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/ea5c024329 '[unixvm001]'
<unixvm001> (0, 'sftp> put /root/.ansible/tmp/ansible-local-4292hN1DYR/tmpYM8xgh /var/tmp/ansible-tmp-1571756098.47-177915759067620/AnsiballZ_setup.py\n', '')
<unixvm001> ESTABLISH SSH CONNECTION FOR USER: deployer
<unixvm001> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="deployer"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/ea5c024329 unixvm001 '/bin/sh -c '"'"'setfacl -m u:tomcat:r-x /var/tmp/ansible-tmp-1571756098.47-177915759067620/ /var/tmp/ansible-tmp-1571756098.47-177915759067620/AnsiballZ_setup.py && sleep 0'"'"''
<unixvm001> (0, '', '')
<unixvm001> ESTABLISH SSH CONNECTION FOR USER: deployer
<unixvm001> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="deployer"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/ea5c024329 -tt unixvm001 '/bin/sh -c '"'"'sudo -H -S -n -u tomcat /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-ldekvnpzwgrgribssedqdqimvuzpvozm ; /usr/bin/python /var/tmp/ansible-tmp-1571756098.47-177915759067620/AnsiballZ_setup.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<unixvm001> (0, '\r\n{"invocation": {"module_args": {"filter": "ansible_service_mgr", "gather_subset": ["!all"], "fact_path": "/etc/ansible/facts.d", "gather_timeout": 10}}, "ansible_facts": {"ansible_service_mgr": "systemd"}}\r\n', 'Shared connection to unixvm001 closed.\r\n')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/system/systemd.py
<unixvm001> PUT /root/.ansible/tmp/ansible-local-4292hN1DYR/tmpuOv4ys TO /var/tmp/ansible-tmp-1571756098.47-177915759067620/AnsiballZ_systemd.py
<unixvm001> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="deployer"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/ea5c024329 '[unixvm001]'
<unixvm001> (0, 'sftp> put /root/.ansible/tmp/ansible-local-4292hN1DYR/tmpuOv4ys /var/tmp/ansible-tmp-1571756098.47-177915759067620/AnsiballZ_systemd.py\n', '')
<unixvm001> ESTABLISH SSH CONNECTION FOR USER: deployer
<unixvm001> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="deployer"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/ea5c024329 unixvm001 '/bin/sh -c '"'"'setfacl -m u:tomcat:r-x /var/tmp/ansible-tmp-1571756098.47-177915759067620/ /var/tmp/ansible-tmp-1571756098.47-177915759067620/AnsiballZ_systemd.py && sleep 0'"'"''
<unixvm001> (0, '', '')
<unixvm001> ESTABLISH SSH CONNECTION FOR USER: deployer
<unixvm001> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="deployer"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/ea5c024329 -tt unixvm001 '/bin/sh -c '"'"'sudo -H -S -n -u tomcat /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-nnwsmnabevfloceodiibjgkauxvxykgu ; /usr/bin/python /var/tmp/ansible-tmp-1571756098.47-177915759067620/AnsiballZ_systemd.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<unixvm001> (1, '\x1b[1;31m==== AUTHENTICATING FOR org.freedesktop.systemd1.manage-units ===\r\n\x1b[0mAuthentication is required to manage system services or units.\r\nAuthenticating as: Unix Admin (rsc_sys)\r\nPassword: \r\n{"msg": "Unable to stop service boot-core: Failed to stop boot-core.service: Connection timed out\\nSee system logs and \'systemctl status boot-core.service\' for details.\\n", "failed": true, "invocation": {"module_args": {"no_block": false, "force": null, "name": "boot-core", "daemon_reexec": false, "enabled": null, "daemon_reload": false, "state": "stopped", "masked": null, "scope": null, "user": null}}}\r\n', 'Shared connection to unixvm001 closed.\r\n')
<unixvm001> Failed to connect to the host via ssh: Shared connection to unixvm001 closed.
<unixvm001> ESTABLISH SSH CONNECTION FOR USER: deployer
<unixvm001> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="deployer"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/ea5c024329 unixvm001 '/bin/sh -c '"'"'rm -f -r /var/tmp/ansible-tmp-1571756098.47-177915759067620/ > /dev/null 2>&1 && sleep 0'"'"''
<unixvm001> (0, '', '')
fatal: [unixvm001]: FAILED! => {
"ansible_facts": {
"discovered_interpreter_python": "/usr/bin/python"
},
"changed": false,
"invocation": {
"module_args": {
"daemon_reexec": false,
"daemon_reload": false,
"enabled": null,
"force": null,
"masked": null,
"name": "boot-core",
"no_block": false,
"scope": null,
"state": "stopped",
"user": null
}
},
"msg": "Unable to stop service boot-core: Failed to stop boot-core.service: Connection timed out\nSee system logs and 'systemctl status boot-core.service' for details.\n"
}
NO MORE HOSTS LEFT *************************************************************
PLAY RECAP *********************************************************************
unixvm001 : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
I remember seeing this message (Authentication is required to manage system services or units. Authenticating as: Unix Admin (rsc_sys)) before, when sudo privileges were not applied correctly on the hosts and I was trying to start/stop the service unit by hand, but I'm not sure why it's showing up with Ansible here.
This is what I get when I do sudo -l:
[deployer@unixvm001 ~]$ sudo -l
Matching Defaults entries for deployer on unixvm001:
ignore_dot, !mail_no_user, !root_sudo, !syslog, timestamp_timeout=10, logfile=/var/log/sudo.log, pwfeedback, passwd_timeout=5, passwd_tries=3, umask_override,
umask=0027, log_host, visiblepw, env_keep+=SSH_AUTH_SOCK, ignore_dot, !mail_no_user, !root_sudo, !syslog, timestamp_timeout=10, logfile=/var/log/sudo.log, pwfeedback,
passwd_timeout=5, passwd_tries=3, umask_override, umask=0027, log_host, visiblepw, env_keep+=SSH_AUTH_SOCK
User deployer may run the following commands on unixvm001:
(root) NOPASSWD: /sbin/multipath -ll, /sbin/ifconfig -a, /usr/bin/ipmitool lan print, /usr/sbin/dmidecode -s system-product-name, /usr/sbin/dmidecode -s
system-serial-number, /usr/bin/last, /usr/sbin/nscd -i hosts, /usr/local/bin/ports, /bin/cat /var/log/dmesg
(oem) NOPASSWD: /usr/oem/agent/agent_inst/bin/emctl, /opt/oracle-oem/bin/emctl, /usr/oem/bin/emctl, /opt/oracle-oem/agent/agent_inst/bin/emctl,
/u01/oracle/agent/agent_inst/bin/emctl
(tomcat) NOPASSWD: ALL, !/bin/su
(root) NOPASSWD: /bin/systemctl * tomcat*, /bin/view /var/log/messages, /bin/systemctl * boot-core*, /bin/systemctl daemon-reload
(tomcat) NOPASSWD: /bin/systemctl * boot-core*
Again, on the hosts I can do: sudo /bin/systemctl stop boot-core.service (same with start) and everything is fine, although if I do only systemctl stop boot-core.service I would get the same error message:
[deployer@unixvm001 ~]$ systemctl stop boot-core.service
==== AUTHENTICATING FOR org.freedesktop.systemd1.manage-units ===
Authentication is required to manage system services or units.
Authenticating as: Unix Admin (rsc_sys)
Password:
Any clues what's going on here? I believe the sudo privileges should be tweaked, but I'm not entirely sure.
UPDATE:
I modified the Ansible script (just for testing) to use the command module:
- name: "Make sure {{ service_id }} is stopped"
  command: "sudo systemctl stop {{ service_id }}"
- name: "Make sure {{ service_id }} is started"
  command: "sudo systemctl start {{ service_id }}"
...and it "does work" (though I have to keep sudo in the command; it does not work using become: yes with sudo removed from the command):
Oct 24 13:29:35 : deployer : HOST=unixvm001 : TTY=pts/2 ; PWD=/usr/local/home/deployer ; USER=tomcat ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acdbbxcaetxxlfgnnbvtmrxcofktyjnw ; /usr/bin/python /var/tmp/ansible-tmp-1571938173.5-172296377610468/AnsiballZ_command.py
Oct 24 13:29:36 : tomcat : HOST=unixvm001 : TTY=pts/2 ; PWD=/usr/local/home/tomcat/.ansible/tmp/ansible-moduletmp-1571938175.42-_jtzB0 ; USER=root ; COMMAND=/usr/bin/systemctl stop boot-core.service
Oct 24 13:29:37 : deployer : HOST=unixvm001 : TTY=pts/2 ; PWD=/usr/local/home/deployer ; USER=tomcat ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utdnsysqmyzkactqhadmmoiujwounyru ; /usr/bin/python /var/tmp/ansible-tmp-1571938176.98-167412210657077/AnsiballZ_command.py
Oct 24 13:29:37 : tomcat : HOST=unixvm001 : TTY=pts/2 ; PWD=/usr/local/home/tomcat/.ansible/tmp/ansible-moduletmp-1571938177.75-k7qyDh ; USER=root ; COMMAND=/usr/bin/systemctl start boot-core.service
---
Oct 24 13:29:37 unixvm001 python: ansible-command Invoked with creates=None executable=None _uses_shell=False strip_empty_ends=True _raw_params=sudo systemctl start boot-core.service removes=None argv=None warn=True chdir=None stdin_add_newline=True stdin=None
Oct 24 13:29:37 unixvm001 python: ansible-command [WARNING] Consider using 'become', 'become_method', and 'become_user' rather than running sudo
It sounds like you're running the playbook as the tomcat user but then trying to manage a service; this won't work. If you SSH into that machine as the tomcat user and try to run a systemctl command without escalating privileges, it won't work by hand either. It seems as though you're telling the playbook to do one thing, then doing a completely different thing by hand and calling them equivalent, when one of them isn't working properly. I suspect this is not the case (but I could be wrong, and bugs do happen).
You could break this up into multiple plays, each running as a different user, or divide the tasks with the appropriate become options. Alternatively, you can set privilege escalation for those tasks specifically (or at the block level): https://docs.ansible.com/ansible/latest/user_guide/become.html
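A block-level sketch of that suggestion, under assumptions worth flagging: it escalates to root rather than tomcat for the service tasks, and it presumes sudo allows that escalation. Note that Ansible's become runs the module's Python wrapper, not /bin/systemctl itself, so command-restricted NOPASSWD rules like the ones in your sudo -l output may not match.

```yaml
# Sketch: privilege escalation set at the block level instead of the play level.
# Assumes deployer -> root escalation is permitted for the module's Python wrapper.
- name: "Deploy/Install new application"
  become: yes
  become_user: root
  block:
    - name: "Make sure {{ service_id }} is stopped"
      service:
        name: "{{ service_id }}"
        state: stopped
```

The rest of the play can keep become_user: tomcat; only the tasks inside this block escalate to root.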

Generate SSL Certificates Failed - Installation Error

I am trying to install IBM Cloud Private CE 2.1.0.3 on my local machine (single node) and I encounter following error during the installation process (verbose mode with -vvv is on):
TASK [certificates : Generating ssl certificates] ******************************
task path: /installer/playbook/roles/certificates/tasks/certificate.yaml:27
Using module file /usr/lib/python2.7/site-packages/ansible/modules/commands/command.py
<127.0.0.1> ESTABLISH SSH CONNECTION FOR USER: user
<127.0.0.1> SSH: EXEC sshpass -d15 ssh -C -o CheckHostIP=no -o LogLevel=ERROR -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o Port=22 -o 'IdentityFile="/installer/cluster/ssh_key"' -o User=user -o ConnectTimeout=60 -oPubkeyAuthentication=no 127.0.0.1 '/bin/bash -c '"'"'echo ~ && sleep 0'"'"''
<127.0.0.1> (0, '/home/user\n', '')
<127.0.0.1> ESTABLISH SSH CONNECTION FOR USER: user
<127.0.0.1> SSH: EXEC sshpass -d15 ssh -C -o CheckHostIP=no -o LogLevel=ERROR -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o Port=22 -o 'IdentityFile="/installer/cluster/ssh_key"' -o User=user -o ConnectTimeout=60 -oPubkeyAuthentication=no 127.0.0.1 '/bin/bash -c '"'"'( umask 77 && mkdir -p "` echo /home/user/.ansible/tmp/ansible-tmp-1528437619.25-45151919681624 `" && echo ansible-tmp-1528437619.25-45151919681624="` echo /home/user/.ansible/tmp/ansible-tmp-1528437619.25-45151919681624 `" ) && sleep 0'"'"''
<127.0.0.1> (0, 'ansible-tmp-1528437619.25-45151919681624=/home/user/.ansible/tmp/ansible-tmp-1528437619.25-45151919681624\n', '')
<127.0.0.1> PUT /root/.ansible/tmp/ansible-local-14gnUpht/tmp_VvSJv TO /home/user/.ansible/tmp/ansible-tmp-1528437619.25-45151919681624/command.py
<127.0.0.1> SSH: EXEC sshpass -d15 sftp -o BatchMode=no -b - -C -o CheckHostIP=no -o LogLevel=ERROR -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o Port=22 -o 'IdentityFile="/installer/cluster/ssh_key"' -o User=user -o ConnectTimeout=60 -oPubkeyAuthentication=no '[127.0.0.1]'
<127.0.0.1> (0, 'sftp> put /root/.ansible/tmp/ansible-local-14gnUpht/tmp_VvSJv /home/user/.ansible/tmp/ansible-tmp-1528437619.25-45151919681624/command.py\n', '')
<127.0.0.1> ESTABLISH SSH CONNECTION FOR USER: user
<127.0.0.1> SSH: EXEC sshpass -d15 ssh -C -o CheckHostIP=no -o LogLevel=ERROR -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o Port=22 -o 'IdentityFile="/installer/cluster/ssh_key"' -o User=user -o ConnectTimeout=60 -oPubkeyAuthentication=no 127.0.0.1 '/bin/bash -c '"'"'chmod u+x /home/user/.ansible/tmp/ansible-tmp-1528437619.25-45151919681624/ /home/user/.ansible/tmp/ansible-tmp-1528437619.25-45151919681624/command.py && sleep 0'"'"''
<127.0.0.1> (0, '', '')
<127.0.0.1> ESTABLISH SSH CONNECTION FOR USER: user
<127.0.0.1> SSH: EXEC sshpass -d15 ssh -C -o CheckHostIP=no -o LogLevel=ERROR -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o Port=22 -o 'IdentityFile="/installer/cluster/ssh_key"' -o User=user -o ConnectTimeout=60 -oPubkeyAuthentication=no -tt 127.0.0.1 '/bin/bash -c '"'"'sudo -H -S -p "[sudo via ansible, key=iwoajxmkcbeqtvsibemwhzxoiowwbdgi] password: " -u root /bin/bash -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-iwoajxmkcbeqtvsibemwhzxoiowwbdgi; /usr/bin/python /home/user/.ansible/tmp/ansible-tmp-1528437619.25-45151919681624/command.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<127.0.0.1> (1, '\r\n\r\n{"changed": true, "end": "2018-06-08 08:00:21.114115", "stdout": "", "cmd": "CERT_DIR=/installer/cluster/cfc-certs /installer/playbook/roles/certificates/files/kubernetes/make-ca-cert.sh 127.0.0.1 IP:127.0.0.1,IP:10.0.0.1,DNS:kubernetes,DNS:kubernetes.default,DNS:kubernetes.default.svc,DNS:kubernetes.default.svc.cluster.local,DNS:mycluster.icp", "failed": true, "delta": "0:00:00.003196", "stderr": "/bin/bash: /installer/playbook/roles/certificates/files/kubernetes/make-ca-cert.sh: No such file or directory", "rc": 127, "invocation": {"module_args": {"warn": true, "executable": "/bin/bash", "_uses_shell": true, "_raw_params": "CERT_DIR=/installer/cluster/cfc-certs /installer/playbook/roles/certificates/files/kubernetes/make-ca-cert.sh 127.0.0.1 IP:127.0.0.1,IP:10.0.0.1,DNS:kubernetes,DNS:kubernetes.default,DNS:kubernetes.default.svc,DNS:kubernetes.default.svc.cluster.local,DNS:mycluster.icp", "removes": null, "creates": "/installer/cluster/cfc-certs/server.key", "chdir": null, "stdin": null}}, "start": "2018-06-08 08:00:21.110919", "msg": "non-zero return code"}\r\n', 'Connection to 127.0.0.1 closed.\r\n')
<127.0.0.1> ESTABLISH SSH CONNECTION FOR USER: user
<127.0.0.1> SSH: EXEC sshpass -d15 ssh -C -o CheckHostIP=no -o LogLevel=ERROR -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o Port=22 -o 'IdentityFile="/installer/cluster/ssh_key"' -o User=user -o ConnectTimeout=60 -oPubkeyAuthentication=no 127.0.0.1 '/bin/bash -c '"'"'rm -f -r /home/user/.ansible/tmp/ansible-tmp-1528437619.25-45151919681624/ > /dev/null 2>&1 && sleep 0'"'"''
<127.0.0.1> (0, '', '')
fatal: [127.0.0.1]: FAILED! => {
"changed": true,
"cmd": "CERT_DIR=/installer/cluster/cfc-certs /installer/playbook/roles/certificates/files/kubernetes/make-ca-cert.sh 127.0.0.1 IP:127.0.0.1,IP:10.0.0.1,DNS:kubernetes,DNS:kubernetes.default,DNS:kubernetes.default.svc,DNS:kubernetes.default.svc.cluster.local,DNS:mycluster.icp",
"delta": "0:00:00.003196",
"end": "2018-06-08 08:00:21.114115",
"invocation": {
"module_args": {
"_raw_params": "CERT_DIR=/installer/cluster/cfc-certs /installer/playbook/roles/certificates/files/kubernetes/make-ca-cert.sh 127.0.0.1 IP:127.0.0.1,IP:10.0.0.1,DNS:kubernetes,DNS:kubernetes.default,DNS:kubernetes.default.svc,DNS:kubernetes.default.svc.cluster.local,DNS:mycluster.icp",
"_uses_shell": true,
"chdir": null,
"creates": "/installer/cluster/cfc-certs/server.key",
"executable": "/bin/bash",
"removes": null,
"stdin": null,
"warn": true
}
},
"msg": "non-zero return code",
"rc": 127,
"start": "2018-06-08 08:00:21.110919",
"stderr": "/bin/bash: /installer/playbook/roles/certificates/files/kubernetes/make-ca-cert.sh: No such file or directory",
"stderr_lines": [
"/bin/bash: /installer/playbook/roles/certificates/files/kubernetes/make-ca-cert.sh: No such file or directory"
],
"stdout": "",
"stdout_lines": []
}
NO MORE HOSTS LEFT *************************************************************
PLAY RECAP *********************************************************************
127.0.0.1 : ok=44 changed=13 unreachable=0 failed=1
Playbook run took 0 days, 0 hours, 2 minutes, 32 seconds
user@kim:/opt/ibm-cloud-private-ce-2.1.0.3/cluster$
It says "no such file or directory" for the "make-ca-cert.sh" file in the installer directory, but I don't quite know what I am supposed to do to fix this.
I am very thankful for any help!
Kim, could you share your ICP hosts file and your /etc/hosts file? Normally, you cannot specify '127.0.0.1' in the ICP hosts file for installation. Thanks.
Did you copy the SSH key for the ansible user or root (whichever you are using) to the $install_dir/cluster/ssh_key file?
For example, mine is /opt/ibm-cloud-private-ce-2.1.0.3/cluster/ssh_key
Specifically, steps 6 and 8 in Step 2 here:
https://www.ibm.com/support/knowledgecenter/en/SSBS6K_2.1.0.3/installing/install_containers_CE.html#setup
HTH
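The key setup from that answer can be sketched roughly as below. All paths here are stand-ins chosen for illustration (the real install directory is whatever $install_dir is on your machine), and on a real install you would also authorize the public key on each cluster node.

```shell
# Sketch of the ssh_key setup step (paths are illustrative stand-ins).
install_dir=/tmp/icp-demo                       # stand-in for the real install dir
mkdir -p "$install_dir/cluster"
rm -f /tmp/icp_demo_key /tmp/icp_demo_key.pub   # avoid interactive overwrite prompt
ssh-keygen -t rsa -b 4096 -N "" -f /tmp/icp_demo_key -q
# The installer expects the *private* key at $install_dir/cluster/ssh_key:
cp /tmp/icp_demo_key "$install_dir/cluster/ssh_key"
# On a real install, authorize the public key on each node, e.g.:
# ssh-copy-id -i /tmp/icp_demo_key.pub root@<node-ip>
ls -l "$install_dir/cluster/ssh_key"
```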

stopping a service using ansible

I am using Ansible to test against a VM where I have already installed nginx. I tried stopping the service using the command below, and the resulting status shows that the process stopped. However, on the target server I can see that the process is still running (and has been running for a few days). I have the correct server in the ansible command and am checking the right server. Any thoughts on why the command would report that the service has stopped even when it does not seem to have done so?
ansible testserver -vvv -m service -a "name=nginx state=stopped"
Using /home/test/devops/ansible.cfg as config file
<ec2-xx.xxx.xxx.xxx.compute-1.amazonaws.com> ESTABLISH SSH CONNECTION FOR USER: test
<ec2-xx.xxx.xxx.xxx.compute-1.amazonaws.com> SSH: EXEC ssh -C -q -o ControlMaster=auto -o ControlPersist=60s -o Port=1234 -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=test -o ConnectTimeout=10 -o ControlPath=/home/test/.ansible/cp/ansible-ssh-%h-%p-%r ec2-xx.xxx.xxx.xxx.compute-1.amazonaws.com '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1467905869.39-108785461246651 `" && echo ansible-tmp-1467905869.39-108785461246651="` echo $HOME/.ansible/tmp/ansible-tmp-1467905869.39-108785461246651 `" ) && sleep 0'"'"''
<ec2-xx.xxx.xxx.xxx.compute-1.amazonaws.com> PUT /tmp/tmpqpCm5g TO /home/ali/.ansible/tmp/ansible-tmp-1467905869.39-108785461246651/service
<ec2-xx.xxx.xxx.xxx.compute-1.amazonaws.com> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o Port=1234 -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ali -o ConnectTimeout=10 -o ControlPath=/home/test/.ansible/cp/ansible-ssh-%h-%p-%r '[ec2-52-87-166-241.compute-1.amazonaws.com]'
<ec2-xx.xxx.xxx.xxx.compute-1.amazonaws.com> ESTABLISH SSH CONNECTION FOR USER: test
<ec2-xx.xxx.xxx.xxx.compute-1.amazonaws.com> SSH: EXEC ssh -C -q -o ControlMaster=auto -o ControlPersist=60s -o Port=1234 -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=ali -o ConnectTimeout=10 -o ControlPath=/home/ali/.ansible/cp/ansible-ssh-%h-%p-%r -tt ec2-xx.xxx.xxx.xxx.compute-1.amazonaws.com '/bin/sh -c '"'"'LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8 LC_MESSAGES=en_US.UTF-8 /usr/bin/python /home/test/.ansible/tmp/ansible-tmp-1467905869.39-108785461246651/service; rm -rf "/home/test/.ansible/tmp/ansible-tmp-1467905869.39-108785461246651/" > /dev/null 2>&1 && sleep 0'"'"''
testserver | SUCCESS => {
"changed": true,
"invocation": {
"module_args": {
"arguments": "",
"enabled": null,
"name": "nginx",
"pattern": null,
"runlevel": "default",
"sleep": null,
"state": "stopped"
},
"module_name": "service"
},
"name": "nginx",
"state": "stopped"
}
In a playbook:
- name: Stop nginx service
  service:
    name: nginx
    state: stopped
Or as an ad-hoc command to stop the service:
ansible testserver -m service -a "name=nginx state=stopped"
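One thing worth checking in the question's scenario: the ad-hoc command connects as an unprivileged user, and if nginx was started by root, the service module generally needs privilege escalation to actually stop it. A hedged variant of the same playbook with become enabled:

```yaml
# Sketch: same task, but with privilege escalation enabled
# (assumes nginx runs as root, which is typical).
- hosts: testserver
  become: yes
  tasks:
    - name: Stop nginx service
      service:
        name: nginx
        state: stopped
```

The ad-hoc equivalent would add the -b (--become) flag: ansible testserver -b -m service -a "name=nginx state=stopped"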

Resources