Ansible hangs on job submission step (Ansible z/OS Core)

I'm using Ansible on zLinux. I have a playbook that uses the zos_job_submit module from the IBM z/OS core collection.
The module is used with a job that generates random data to the JES spool.
//SPOOL1 JOB (UU999999999,1103),'DART JOB',CLASS=0,
// REGION=0M,MSGCLASS=R,TIME=5, LINES=(999999,WARNING),
// NOTIFY=&SYSUID
//* Automatic process will kill the job and cleanup spool.
//* author: xxxxxx , xxxxxxxxx
//STEPNAME EXEC PGM=BPXBATCH
//STDERR DD SYSOUT=*
//STDOUT DD SYSOUT=*
//STDPARM DD *
SH cat /dev/urandom
This was working fine until a few days ago, when it started to freeze up and error out. It still submits the job, but it fails to return the job's output after the job starts running and then errors out.
Here is the playbook I'm using (stripped down to only the offending task):
# Author: xxxxxxxxxxxxxx
- name: "DART JES CHAOS EVENT"
hosts: all # WARNING: USE WITH --LIMIT <target> OTHERWISE ALL HOSTS IN INVENTORY WILL BE TARGETED!
vars:
all_jobs:
jobs: [ ]
jobs_file_location: "jobs/{{inventory_hostname}}"
tasks:
- name: "Submit job tasks"
block:
- name: Submit job
ibm.ibm_zos_core.zos_job_submit:
src: "{{uss_jcl_path}}"
location: LOCAL
wait: false
vars:
uss_jcl_path: "{{jcl_lib}}/{{job_jcl}}"
Here is the log using -vvv
ansible-playbook 2.9.27
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/adshome/svc.dart/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /bin/ansible-playbook
python version = 2.7.5 (default, May 27 2022, 07:27:39) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
PLAYBOOK: debug_event.yml **************************************************************************************************************************************************
Positional arguments: /XXXX/dart/ansible/playbooks/debug_event.yml
subset: nwrd
become_method: sudo
inventory: (u'/XXXX/dart/ansible/inventory', u'/XXXX/dart/ansible/auth_inventory')
forks: 5
tags: (u'all',)
extra_vars: (u'jcl_lib=/XXXX/dart/ansible/playbooks/jcl job_jcl=SPOOL',)
verbosity: 4
connection: smart
timeout: 10
1 plays in /XXXX/dart/ansible/playbooks/debug_event.yml
PLAY [DART JES CHAOS EVENT] ************************************************************************************************************************************************
TASK [Submit job] **********************************************************************************************************************************************************
task path: /XXXX/dart/ansible/playbooks/debug_event.yml:12
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/tempfile.py
Pipelining is enabled.
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
<XXXX.XXX.COM> (0, '\n{"changed": true, "path": "/tmp/ansible.nzzb29wz", "uid": 10001120, "gid": 7212, "owner": "XXXXXXX", "group": "GLTCMF", "mode": "0600", "state": "file", "size": 0, "invocation": {"module_args": {"state": "file", "prefix": "ansible.", "suffix": "", "path": null}}}\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'echo ~XXXXXXX && sleep 0'"'"''
<XXXX.XXX.COM> (0, '/u/XXXXXXX\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /u/XXXXXXX/.ansible/tmp `"&& mkdir "` echo /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409 `" && echo ansible-tmp-1658270106.84-62001-187469557526409="` echo /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409 `" ) && sleep 0'"'"''
<XXXX.XXX.COM> (0, 'ansible-tmp-1658270106.84-62001-187469557526409=/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/stat.py
Pipelining is enabled.
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
<XXXX.XXX.COM> (0, '\n{"changed": false, "stat": {"exists": true, "path": "/tmp/ansible.nzzb29wz", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 10001120, "gid": 7212, "size": 0, "inode": 10691, "dev": 3248, "nlink": 1, "atime": 1658270106, "mtime": 1658270106, "ctime": 1658270106, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "XXXXXXX", "gr_name": "GLTCMF", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "mimetype": "unknown", "charset": "unknown", "version": null, "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"checksum_algorithm": "sha1", "get_checksum": true, "follow": false, "path": "/tmp/ansible.nzzb29wz", "get_md5": false, "get_mime": true, "get_attributes": true}}}\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<XXXX.XXX.COM> PUT /XXXX/dart/ansible/playbooks/jcl/SPOOL TO /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source
<XXXX.XXX.COM> SSH: EXEC sftp -b - -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 '[XXXX.XXX.COM]'
<XXXX.XXX.COM> (0, 'sftp> put /XXXX/dart/ansible/playbooks/jcl/SPOOL /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "posix-rename#openssh.com" revision 1\r\ndebug2: Server supports extension "statvfs#openssh.com" revision 2\r\ndebug2: Server supports extension "fstatvfs#openssh.com" revision 2\r\ndebug2: Server supports extension "hardlink#openssh.com" revision 1\r\ndebug2: Server supports extension "fsync#openssh.com" revision 1\r\ndebug3: Sent message fd 5 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . -> /u/XXXXXXX size 0\r\ndebug3: Looking up /XXXX/dart/ansible/playbooks/jcl/SPOOL\r\ndebug3: Sent message fd 5 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:395\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 4 395 bytes at 0\r\ndebug3: Sent message SSH2_FXP_CLOSE I:4\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'chmod u+x /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/ /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source && sleep 0'"'"''
<XXXX.XXX.COM> (0, '', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/copy.py
Pipelining is enabled.
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
<XXXX.XXX.COM> (0, '\n{"dest": "/tmp/ansible.nzzb29wz", "src": "/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source", "md5sum": "e18be97081c5a8d8ae1e1c57eb0d2123", "checksum": "b220730079e63acfee97f2b694ae2d31d3074083", "changed": true, "uid": 10001120, "gid": 7212, "owner": "XXXXXXX", "group": "GLTCMF", "mode": "0600", "state": "file", "size": 395, "invocation": {"module_args": {"src": "/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source", "dest": "/tmp/ansible.nzzb29wz", "_original_basename": "SPOOL", "mode": "0600", "backup": false, "force": true, "follow": false, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "checksum": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}}\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /adshome/svc.dart/.ansible/collections/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py
Pipelining is enabled.
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
<XXXX.XXX.COM> (1, '', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\nTraceback (most recent call last):\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 800, in run_module\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py", line 58, in job_output\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py", line 73, in _get_job_output\nRuntimeError: Failed to retrieve job output. RC: -9 Error: \n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File "<stdin>", line 102, in <module>\n File "<stdin>", line 94, in _ansiballz_main\n File "<stdin>", line 40, in invoke_module\n File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 205, in run_module\n return _run_module_code(code, init_globals, run_name, mod_spec)\n File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 96, in _run_module_code\n mod_name, mod_spec, pkg_name, script_name)\n File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 85, in _run_code\n exec(code, run_globals)\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 894, in <module>\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 890, in main\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 805, in run_module\nIndexError: tuple index out of range\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\n')
<XXXX.XXX.COM> Failed to connect to the host via ssh: OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: /etc/ssh/ssh_config line 60: Applying options for *
debug1: auto-mux: Trying existing master
debug2: fd 4 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug3: mux_client_forwards: request forwardings: 0 local, 0 remote
debug3: mux_client_request_session: entering
debug3: mux_client_request_alive: entering
debug3: mux_client_request_alive: done pid = 61998
debug3: mux_client_request_session: session request sent
debug1: mux_client_request_session: master session id: 2
Traceback (most recent call last):
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 800, in run_module
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py", line 58, in job_output
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py", line 73, in _get_job_output
RuntimeError: Failed to retrieve job output. RC: -9 Error:
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 102, in <module>
File "<stdin>", line 94, in _ansiballz_main
File "<stdin>", line 40, in invoke_module
File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 205, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 894, in <module>
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 890, in main
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 805, in run_module
IndexError: tuple index out of range
debug3: mux_client_read_packet: read header failed: Broken pipe
debug2: Received exit status from master 1
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'rm -f -r /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/ > /dev/null 2>&1 && sleep 0'"'"''
<XXXX.XXX.COM> (0, '', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
fatal: [XXXX.XXX.COM]: FAILED! => {
"changed": true,
"checksum": "b220730079e63acfee97f2b694ae2d31d3074083",
"dest": "/tmp/ansible.nzzb29wz",
"gid": 7212,
"group": "GLTCMF",
"invocation": {
"module_args": {
"_original_basename": "SPOOL",
"attributes": null,
"backup": false,
"checksum": null,
"content": null,
"delimiter": null,
"dest": "/tmp/ansible.nzzb29wz",
"directory_mode": null,
"follow": false,
"force": true,
"group": null,
"local_follow": null,
"mode": "0600",
"owner": null,
"regexp": null,
"remote_src": null,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": "/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source",
"unsafe_writes": false,
"validate": null
}
},
"md5sum": "e18be97081c5a8d8ae1e1c57eb0d2123",
"mode": "0600",
"module_stderr": "OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\nTraceback (most recent call last):\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py\", line 800, in run_module\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py\", line 58, in job_output\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py\", line 73, in _get_job_output\nRuntimeError: Failed to retrieve job output. RC: -9 Error: \n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"<stdin>\", line 102, in <module>\n File \"<stdin>\", line 94, in _ansiballz_main\n File \"<stdin>\", line 40, in invoke_module\n File \"/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py\", line 205, in run_module\n return _run_module_code(code, init_globals, run_name, mod_spec)\n File \"/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py\", line 96, in _run_module_code\n mod_name, mod_spec, pkg_name, script_name)\n File \"/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py\", line 85, in _run_code\n exec(code, run_globals)\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py\", line 894, in <module>\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py\", line 890, in main\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py\", line 805, in run_module\nIndexError: tuple index out of range\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\n",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"owner": "XXXXXXX",
"rc": 1,
"size": 395,
"src": "/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source",
"state": "file",
"uid": 10001120
}
PLAY RECAP *****************************************************************************************************************************************************************
XXXX.XXX.COM : ok=1 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0

I have a few comments and hopefully we can resolve this.
There are a few things that confuse me and make me wonder how this ever worked, unless I am missing some data points.
The playbook is using the option location: LOCAL, yet the src appears to be a path in USS (/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source). The LOCAL option is meant for files located on the control node (where the Ansible engine is running, in your case zLinux). Can you confirm that you do have the file on the control node?
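For illustration only, a sketch of the two forms (the first path is the one from your log; the USS path is hypothetical, and the USS choice is assumed from the collection docs):
# JCL kept on the control node (zLinux in your case)
- ibm.ibm_zos_core.zos_job_submit:
    src: /XXXX/dart/ansible/playbooks/jcl/SPOOL
    location: LOCAL
# JCL that already resides in a USS path on the managed z/OS node (hypothetical path)
- ibm.ibm_zos_core.zos_job_submit:
    src: /u/XXXXXXX/jcl/SPOOL
    location: USS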
When I look at the verbose output of what is passed to the managed node (z/OS), I see no environment variables for the required dependency ZOAU. The zos_job_submit module uses the ZOAU APIs; I am also questioning how this worked before without those environment variables, or have your vars been corrupted (see below)?
SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
I am expecting environment vars similar to this:
environment_vars:
  _BPXK_AUTOCVT: "ON"
  ZOAU_HOME: "{{ ZOAU }}"
  PYTHONPATH: "{{ ZOAU }}/lib"
  LIBPATH: "{{ ZOAU }}/lib:{{ PYZ }}/lib:/lib:/usr/lib:."
  PATH: "{{ ZOAU }}/bin:{{ PYZ }}/bin:/bin:/var/bin"
  _CEE_RUNOPTS: "FILETAG(AUTOCVT,AUTOTAG) POSIX(ON)"
  _TAG_REDIR_ERR: "txt"
  _TAG_REDIR_IN: "txt"
  _TAG_REDIR_OUT: "txt"
  LANG: "C"
Based on the trace it looks like you are using the latest version of the z/OS core collection, 1.4.0-beta.1, yet you are running Anaconda Python 3.6 from Rocket; that version of Python has long been out of support.
For your version of the collection you will want to reference its requirements, and probably start with a simple example to ensure it's functional before doing something more complex; this is a good one to start with.
I am happy to help but supporting what appears to be a mismatch of requirements can be challenging.

Related

ansible synchronize module fails to get directory from remote to local - failed: No such file or directory

I wish to copy /web/playbooks/automation/misc/filecopyprod from mysourceuser#mysourcehost to the destination mydestuser#mydesthost under /web/playbooks/automation/misc/filecopy/tmpfiles/500/.
It is evident that both the source and destination are present and have good permissions.
[mydestuser#mydesthost ~]$ ssh mysourceuser#mysourcehost ls -ld '/web/playbooks/automation/misc/filecopyprod'
##################################################################
# *** This Server is using Centrify *** #
# *** Remember to use your Active Directory account *** #
# *** password when logging in *** #
##################################################################
drwxrwxr-x 3 mysourceuser mysourceuser 209 Sep 26 14:58 /web/playbooks/automation/misc/filecopyprod
[mydestuser#mydesthost ~]$ ls -ld /web/playbooks/automation/misc/filecopy/tmpfiles/500/
drwxr-xr-x 2 mydestuser aces 6 Sep 26 14:13 /web/playbooks/automation/misc/filecopy/tmpfiles/500/
Here is my playbook that runs on mydesthost and pulls files and folders from the remote server mysourceuser#mysourcehost to the local server mydestuser#mydesthost:
- name: Copying from "{{ inventory_hostname }}" to this ansible server.
  tags: validate
  synchronize:
    src: "'{{ item }}'"
    dest: "{{ playbook_dir }}/tmpfiles/{{ Latest_Build_Number }}/"
    mode: pull
    copy_links: yes
  with_items:
    - "{{ source_file_new.splitlines() }}"
To run the above playbook:
ansible-playbook /web/playbooks/automation/misc/filecopy/copyfiles.yml -e "source_file_new='$source_file_new'" -e "Latest_Build_Number='500'"
Output of my run:
TASK [Copying from "mysourcehost" to this ansible server.] **********************
task path: /web/playbooks/automation/misc/filecopy/copyfiles.yml:218
Monday 26 September 2022 14:13:02 -0500 (0:00:00.047) 0:00:03.084 ******
redirecting (type: action) ansible.builtin.synchronize to ansible.posix.synchronize
redirecting (type: action) ansible.builtin.synchronize to ansible.posix.synchronize
<mysourcehost> ESTABLISH LOCAL CONNECTION FOR USER: mydestuser
<mysourcehost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81 `"&& mkdir "` echo /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597 `" && echo ansible-tmp-1664219583.0005133-20679-105296975361597="` echo /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597 `" ) && sleep 0'
Using module file /home/mydestuser/.ansible/collections/ansible_collections/ansible/posix/plugins/modules/synchronize.py
<mysourcehost> PUT /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/tmpxhpyaf0m TO /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/AnsiballZ_synchronize.py
<mysourcehost> EXEC /bin/sh -c 'chmod u+x /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/ /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/AnsiballZ_synchronize.py && sleep 0'
<mysourcehost> EXEC /bin/sh -c '/usr/local/bin/python3.8 /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/AnsiballZ_synchronize.py && sleep 0'
<mysourcehost> EXEC /bin/sh -c 'rm -f -r /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/ > /dev/null 2>&1 && sleep 0'
failed: [mysourcehost] (item=/web/playbooks/automation/misc/filecopyprod) => {
"ansible_loop_var": "item",
"changed": false,
"cmd": "/bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format=<<CHANGED>>%i %n%L mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod' /web/playbooks/automation/misc/filecopy/tmpfiles/500/",
"invocation": {
"module_args": {
"_local_rsync_password": null,
"_local_rsync_path": "rsync",
"_substitute_controller": false,
"archive": true,
"checksum": false,
"compress": true,
"copy_links": true,
"delete": false,
"dest": "/web/playbooks/automation/misc/filecopy/tmpfiles/500/",
"dest_port": null,
"dirs": false,
"existing_only": false,
"group": null,
"link_dest": null,
"links": null,
"mode": "pull",
"owner": null,
"partial": false,
"perms": null,
"private_key": null,
"recursive": null,
"rsync_opts": [],
"rsync_path": null,
"rsync_timeout": 0,
"set_remote_user": true,
"src": "mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod'",
"ssh_args": null,
"ssh_connection_multiplexing": false,
"times": null,
"verify_host": false
}
},
"item": "/web/playbooks/automation/misc/filecopyprod",
"msg": "Warning: Permanently added 'mysourcehost' (ED25519) to the list of known hosts.\r\n\nThis system is for the use by authorized users only. All data contained\non all systems is owned by the company and may be monitored, intercepted,\nrecorded, read, copied, or captured in any manner and disclosed in any\nmanner, by authorized company personnel. Users (authorized or unauthorized)\nhave no explicit or implicit expectation of privacy. Unauthorized or improper\nuse of this system may result in administrative, disciplinary action, civil\nand criminal penalties. Use of this system by any user, authorized or\nunauthorized, constitutes express consent to this monitoring, interception,\nrecording, reading, copying, or capturing and disclosure.\n\nIF YOU DO NOT CONSENT, LOG OFF NOW.\n\n##################################################################\n# *** This Server is using Centrify *** #\n# *** Remember to use your Active Directory account *** #\n# *** password when logging in *** #\n##################################################################\n\nrsync: change_dir \"/home/mysourceuser//'/web/playbooks/automation/misc\" failed: No such file or directory (2)\nrsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1658) [Receiver=3.1.2]\nrsync: [Receiver] write error: Broken pipe (32)\n",
"rc": 23
}
From the output I took the rsync command in question and tried to run it manually on my playbook host mydestuser#mydesthost, and I get a similar error:
"/bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format=<<CHANGED>>%i %n%L mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod' /web/playbooks/automation/misc/filecopy/tmpfiles/500/"
Output:
bash: /bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format=<<CHANGED>>%i %n%L mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod' /web/playbooks/automation/misc/filecopy/tmpfiles/500/: No such file or directory
Upon suggestion from StackOverflow I quoted --out-format but I continue to get the same error. See snapshot of the error in the output below:
"/bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format='<<CHANGED>>%i %n%L' mysourceuser#mysourcehost:'/tmp/myfolder' /tmp/myfolder1"
Can you please suggest?
Your full command was this
/bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format=<<CHANGED>>%i %n%L mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod' /web/playbooks/automation/misc/filecopy/tmpfiles/500/
You've omitted to quote arguments containing spaces, so when the shell parses the line it splits at those spaces, leading to syntax errors when rsync tries to understand the line.
Fix the --rsh parameter, which contains spaces, by changing it to this:
--rsh='/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null'
Fix the --out-format parameter, which contains whitespace and shell special characters, by changing it to this:
--out-format='<<CHANGED>>%i %n%L'
In a later example, you've put double quotes around the entire command, so the shell is trying to execute the entire command as a single entity. For example, in the first line below the shell splits the line at spaces and then executes the command echo with the parameter hello. In the second line the shell sees the quoted string and treats it as a single entity; it then tries to execute a command called echo hello, i.e. not the command echo with a parameter hello but a command with a literal space character in its name:
echo hello # → hello
"echo hello" # → -bash: echo hello: command not found
Rule: if a command or parameter contains a space or other special shell character and it's to be considered as a single item, it must be quoted.
The playbook works with older versions of rsync. With the latest version it started to fail as reported here.
I changed
synchronize:
  src: "'{{ item }}'"
to
synchronize:
  src: "{{ item }}"
and the error was gone; issue resolved with the latest rsync.

ansible: debug only output from tasks?

When I use ansible-playbook -vvvv, it shows all stdout for all running tasks. But it also shows noise about how each command is run through SSH. Is there a way to use verbosity to just show each task's stdout without any other noise?
Currently it looks like this:
<myhost> ESTABLISH SSH CONNECTION FOR USER: root
<myhost> SSH: EXEC ssh -vvv -o ControlMaster=auto -o ControlPersist=30m -o PreferredAuthentications=publickey -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/tmp/ansible-ssh-%h-%p-%r myhost '/bin/sh -c '"'"'rm -f -r /var/tmp/ansible-tmp-1333333.89364-1154635-13434444/ > /dev/null 2>&1 && sleep 0'"'"''
<myhost> (0, b'', b'OpenSSH_8.2p1 Ubuntu-4ubuntu0.4, OpenSSL 1.1.1f 31 Mar 2020\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: include /etc/ssh/ssh_config.d/*.conf matched no files\r\ndebug1: /etc/ssh/ssh_config line 21: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 1136353\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
changed: [dev] => {
"changed": true,
"cmd": [
"/opt/odoo/.local/bin/docker-compose",
"rm",
"-fsv",
"odoo"
],
"delta": "0:00:03.287055",
"end": "2022-01-25 08:09:25.105235",
"invocation": {
"module_args": {
"_raw_params": "/opt/odoo/.local/bin/docker-compose rm -fsv odoo",
"_uses_shell": false,
"argv": null,
"chdir": "/opt/odoo/app",
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"stdin_add_newline": true,
"strip_empty_ends": true,
"warn": false
}
},
"msg": "",
"rc": 0,
"start": "2022-01-25 08:09:21.818180",
"stderr": "Stopping app_odoo_1 ... \r\nStopping app_odoo_1 ... done\r\nRemoving app_odoo_1 ... \r\nRemoving app_odoo_1 ... done",
"stderr_lines": [
"Stopping app_odoo_1 ... ",
"Stopping app_odoo_1 ... done",
"Removing app_odoo_1 ... ",
"Removing app_odoo_1 ... done"
],
"stdout": "Going to remove app_odoo_1",
"stdout_lines": [
"Going to remove app_odoo_1"
]
}
Is there a way to just keep the JSON part without explicitly specifying a debug arg for each task?
When using Ansible in Verbose Mode like ansible-playbook -vvvv, it shows how each command is run through SSH and it also shows all stdout for all running tasks.
verbose mode (-vvv for more, -vvvv to enable connection debugging)
This is so far the intended behavior.
Is there a way to use verbosity to just show tasks stdout without any other noise?
Even the parameters -v or -vv produce a kind of noise. This is also the intended behavior, since the feature is for debugging functionality outside of tasks. See for example the source and search it for vvv or verbose.
Is there a way to just keep JSON part without explicitly specifying debug arg for each task?
I am not aware of any feature like that.
Currently I assume it would be necessary to use the debug module, or to use the Playbook Debugger for debugging tasks. With that it would be possible to get only the stdout of a task.
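There is no global switch for this, but if per-task changes are acceptable, a register-plus-debug sketch (reusing the docker-compose command from your output as an example) shows only the fields you care about:
- name: Remove the odoo container quietly
  command: /opt/odoo/.local/bin/docker-compose rm -fsv odoo
  args:
    chdir: /opt/odoo/app
  register: compose_rm

- name: Show only its stdout
  debug:
    var: compose_rm.stdout_lines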
Further Readings
How to debug Ansible issues?
Debugging modules
Aside from Ansible, there might be a way of filtering the output, for example via | grep -A2 stdout, or awk 'p; / =>/ {p=1}', or sed '0,/ => /d', and then | jq -M -r '.stdout', respectively | jq -M -r '.stdout_lines'.
Thanks to
Delete everything before pattern including pattern using awk or sed
Using jq to fetch key value from JSON output

Ansible error Shared connection to myhost1 closed. when using raw module

My Ansible target server is SunOS.
I get the Ansible error "Shared connection to myhost1 closed." when using the raw module.
The error does not show when I change the module to shell; however, the execution of the script start.sh does not happen (evident from the output of the ps command), hence I wish to use raw.
- name: "START ADP SERVICES"
raw: "source ~/.profile; sh /web/external_products/springboot/{{ vars[ environ + '_folder'] }}/veladpservice/bin/start.sh veladpservice.jar {{ vars[ environ + '_folder'] }} {{ allpass }}"
Output when using shell module:
TASK [START ADAPTER SERVICES] **************************************************
task path: /web/playbooks/automation/veladp/va_action.yml:32
changed: [myhost1] => {"changed": true, "cmd": "source ~/.profile; sh /web/external_products/springboot/stg/veladpservice/bin/start.sh veladpservice.jar stg WEB_USER apps_user_2021_2378 MSPW435 MSPW435 MSPW445 MSPW445 PETWEB440 Temp_45678 MSPW460 Temp_3456789012 MSPW430 Temp_1234567890 MSPW455 Temp_09876", "delta": "0:00:01.009433", "end": "2021-10-28 05:47:51.277868", "rc": 0, "start": "2021-10-28 05:47:50.268435", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
When using raw module:
TASK [START adp SERVICES] *********************************************************************************
task path: /web/playbooks/automation/veladprestart/va_action.yml:32
<myhost1> ESTABLISH SSH CONNECTION FOR USER: myuser1
<myhost1> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="myuser1"' -o ConnectTimeout=10 -o StrictHostKeyChecking=no -o ControlPath=/home/myuser1/.ansible/cp/cc89a5347e -tt myhost1 'source ~/.profile; /web/external_products/springboot/stg/veladpservice/bin/start.sh veladpservice.jar stg MSP_WEB_USER Marsh_apps_user_2021_2378 MSPW435 MSPW435 MSPW445 MSPW445 PETWEB440 Temp_345678 MSPW460 Temp_3456789012 MSPW430 Temp_1234567890 MSPW455 Temp_0987654321'
<myhost1> (0, b'', b"OpenSSH_8.4p1 (CentrifyDC build 5.7.1-346) , OpenSSL 1.1.1g 21 Apr 2020\r\ndebug1: Reading configuration data /etc/centrifydc/ssh/ssh_config\r\ndebug1: /etc/centrifydc/ssh/ssh_config line 3: Applying options for *\r\ndebug3: expanded UserKnownHostsFile '~/.ssh/known_hosts' -> '/home/myuser1/.ssh/known_hosts'\r\ndebug3: expanded UserKnownHostsFile '~/.ssh/known_hosts2' -> '/home/myuser1/.ssh/known_hosts2'\r\ndebug1: Authenticator provider $SSH_SK_PROVIDER did not resolve; disabling\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 21893\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to myhost1 closed.\r\n")
changed: [myhost1] => {
"changed": true,
"rc": 0,
"stderr": "OpenSSH_8.4p1 (CentrifyDC build 5.7.1-346) , OpenSSL 1.1.1g 21 Apr 2020\r\ndebug1: Reading configuration data /etc/centrifydc/ssh/ssh_config\r\ndebug1: /etc/centrifydc/ssh/ssh_config line 3: Applying options for *\r\ndebug3: expanded UserKnownHostsFile '~/.ssh/known_hosts' -> '/home/myuser1/.ssh/known_hosts'\r\ndebug3: expanded UserKnownHostsFile '~/.ssh/known_hosts2' -> '/home/myuser1/.ssh/known_hosts2'\r\ndebug1: Authenticator provider $SSH_SK_PROVIDER did not resolve; disabling\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 21893\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to myhost1 closed.\r\n",
"stderr_lines": [
"OpenSSH_8.4p1 (CentrifyDC build 5.7.1-346) , OpenSSL 1.1.1g 21 Apr 2020",
"debug1: Reading configuration data /etc/centrifydc/ssh/ssh_config",
"debug1: /etc/centrifydc/ssh/ssh_config line 3: Applying options for *",
"debug3: expanded UserKnownHostsFile '~/.ssh/known_hosts' -> '/home/myuser1/.ssh/known_hosts'",
"debug3: expanded UserKnownHostsFile '~/.ssh/known_hosts2' -> '/home/myuser1/.ssh/known_hosts2'",
"debug1: Authenticator provider $SSH_SK_PROVIDER did not resolve; disabling",
"debug1: auto-mux: Trying existing master",
"debug2: fd 3 setting O_NONBLOCK",
"debug2: mux_client_hello_exchange: master version 4",
"debug3: mux_client_forwards: request forwardings: 0 local, 0 remote",
"debug3: mux_client_request_session: entering",
"debug3: mux_client_request_alive: entering",
"debug3: mux_client_request_alive: done pid = 21893",
"debug3: mux_client_request_session: session request sent",
"debug1: mux_client_request_session: master session id: 2",
"debug3: mux_client_read_packet: read header failed: Broken pipe",
"debug2: Received exit status from master 0",
"Shared connection to myhost1 closed."
],
"stdout": "",
"stdout_lines": []
}
Update after the suggestion:
I also tried detaching the process using disown as below, but ps still does not show the process running:
- name: "START ADAPTER SERVICES"
  shell: "source ~/.profile && sh /web/external_products/springboot/{{ vars[ environ + '_folder'] }}/velocityadapterservice/bin/start.sh velocityadapterservice.jar {{ vars[ environ + '_folder'] }} {{ allpass }} &; disown %%"
~/.profile and start.sh both have 744 permissions and the owner is myuser1.
Note: Running the same process manually, as is, works!
Can you please suggest?
#mdaniel's suggestion gave me the clue. However, nohup is what resolved the issue; I am still not sure about disown. Below is the solution:
- name: "START ADAPTER SERVICES"
  shell: "source ~/.profile && nohup /web/external_products/springboot/{{ vars[ environ + '_folder'] }}/velocityadapterservice/bin/start.sh velocityadapterservice.jar {{ vars[ environ + '_folder'] }} {{ allpass }} &"

Ansible fortios_config backups not working

I use Fortinet for firewall automation, but I get the error "Error reading running config". I already followed this issue: https://github.com/ansible/ansible/issues/33392
but did not find any solution. Please tell me what I am doing wrong.
Ansible version: 2.7.0
Python version: 2.7.5
Fortinet: 60E
FortiOS version: 6.0.2
Here is what I am trying:
FortiOS.yml playbook:
---
- name: FortiOS Firewall Settings
  hosts: fortiFW
  connection: local
  vars_files:
    - /etc/ansible/vars/FortiOS_Settings_vars.yml
  tasks:
    - name: Backup current config
      fortios_config:
        host: 192.168.1.99
        username: admin
        password: Password#123
        backup: yes
        backup_path: /etc/ansible/forti_backup
Here is what I get as error:
ok: [192.168.1.99]
META: ran handlers
Read vars_file '/etc/ansible/vars/FortiOS_Settings_vars.yml'
TASK [Backup current config] ****************************************************************************************************************************************************************************************************************
task path: /etc/ansible/FortiOS_Settings_test.yml:8
<192.168.1.99> ESTABLISH LOCAL CONNECTION FOR USER: root
<192.168.1.99> EXEC /bin/sh -c 'echo ~root && sleep 0'
<192.168.1.99> EXEC /bin/sh -c '( umask 77 && mkdir -p "echo /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226" && echo ansible-tmp-1539674386.05-16470854685226="echo /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226" ) && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/network/fortios/fortios_config.py
<192.168.1.99> PUT /root/.ansible/tmp/ansible-local-6154Uq5Dmw/tmpt6JukB TO /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226/AnsiballZ_fortios_config.py
<192.168.1.99> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226/ /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226/AnsiballZ_fortios_config.py && sleep 0'
<192.168.1.99> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226/AnsiballZ_fortios_config.py && sleep 0'
<192.168.1.99> EXEC /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226/ > /dev/null 2>&1 && sleep 0'
The full traceback is:
WARNING: The below traceback may not be related to the actual failure.
  File "/tmp/ansible_fortios_config_payload_b6IQmy/main.py", line 132, in main
    f.load_config(path=module.params['filter'])
  File "/usr/lib/python2.7/site-packages/pyFG/fortios.py", line 212, in load_config
    config_text = self.execute_command(command)
  File "/usr/lib/python2.7/site-packages/pyFG/fortios.py", line 154, in execute_command
    output = output + self._read_wrapper(o)
  File "/usr/lib/python2.7/site-packages/pyFG/fortios.py", line 120, in _read_wrapper
    return py23_compat.text_type(data)
fatal: [192.168.1.99]: FAILED! => {
"changed": false,
"invocation": {
"module_args": {
"backup": true,
"backup_filename": null,
"backup_path": "/etc/ansible/forti_backup",
"config_file": null,
"file_mode": false,
"filter": "",
"host": "192.168.1.99",
"password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
"src": null,
"timeout": 60,
"username": "admin",
"vdom": null
}
},
"msg": "Error reading running config" }
When working with this module, I had the same issue. I looked into the source code of the module and found that this error occurs when filter is set to "" (an empty string). You can get facts about the device when changing filter to something like "firewall address", but then you will only get back the options from exactly that section, as if you had typed "show firewall address" on the CLI of the device.
I'm currently working on a solution to use Ansible for FortiGate automation, but it's not looking good. E.g. FortiGates additionally do not support Netconf, so you can't use Netconf to send commands to the device.
So you're not doing anything wrong; the module is either not optimized, or my guess is that the full configuration is too big to be read by the module, so you have to use the filter option to shrink it.
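For example, a sketch of the workaround, restricting the task from the question to a single section (same connection parameters as above; only filter is new):
- name: Backup only the firewall address section
  fortios_config:
    host: 192.168.1.99
    username: admin
    password: Password#123
    filter: "firewall address"
    backup: yes
    backup_path: /etc/ansible/forti_backup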

ansible 2.3.0 - No authentication methods available

I have installed ansible on ubuntu server using:
pip install git+git://github.com/ansible/ansible.git#devel
So my current version is:
ansible 2.3.0
config file = /etc/ansible/ansible.cfg
configured module search path = Default w/o overrides
In configuration file I have:
# uncomment this to disable SSH key host checking
host_key_checking = False
And when I run ansible I get:
Using /etc/ansible/ansible.cfg as config file
SSH password:
PLAYBOOK: test1.yml ************************************************************
1 plays in test1.yml
PLAY [testowy playbook] ********************************************************
TASK [show version] ************************************************************
task path: /home/mszczesniak/test1.yml:8
Using module file /usr/local/lib/python2.7/dist-packages/ansible/modules/core/network/ios/ios_command.py
<10.27.200.80> ESTABLISH LOCAL CONNECTION FOR USER: root
<10.27.200.80> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo $HOME/.ansible/tmp/ansible-tmp-1477054111.24-214066525349846 `" && echo ansible-tmp-1477054111.24-214066525349846="` echo $HOME/.ansible/tmp/ansible-tmp-1477054111.24-214066525349846 `" ) && sleep 0'
<10.27.200.80> PUT /tmp/tmpNfII7q TO /home/mszczesniak/.ansible/tmp/ansible-tmp-1477054111.24-214066525349846/ios_command.py
<10.27.200.80> EXEC /bin/sh -c 'chmod u+x /home/mszczesniak/.ansible/tmp/ansible-tmp-1477054111.24-214066525349846/ /home/mszczesniak/.ansible/tmp/ansible-tmp-1477054111.24-214066525349846/ios_command.py && sleep 0'
<10.27.200.80> EXEC /bin/sh -c '/usr/bin/python /home/mszczesniak/.ansible/tmp/ansible-tmp-1477054111.24-214066525349846/ios_command.py; rm -rf "/home/mszczesniak/.ansible/tmp/ansible-tmp-1477054111.24-214066525349846/" > /dev/null 2>&1 && sleep 0'
An exception occurred during task execution. The full traceback is:
Traceback (most recent call last):
File "/tmp/ansible_zKWzh_/ansible_module_ios_command.py", line 237, in <module>
main()
File "/tmp/ansible_zKWzh_/ansible_module_ios_command.py", line 200, in main
runner.add_command(**cmd)
File "/tmp/ansible_zKWzh_/ansible_modlib.zip/ansible/module_utils/netcli.py", line 147, in add_command
File "/tmp/ansible_zKWzh_/ansible_modlib.zip/ansible/module_utils/network.py", line 116, in cli
File "/tmp/ansible_zKWzh_/ansible_modlib.zip/ansible/module_utils/network.py", line 147, in connect
File "/tmp/ansible_zKWzh_/ansible_modlib.zip/ansible/module_utils/ios.py", line 180, in connect
File "/tmp/ansible_zKWzh_/ansible_modlib.zip/ansible/module_utils/shell.py", line 230, in connect
File "/tmp/ansible_zKWzh_/ansible_modlib.zip/ansible/module_utils/shell.py", line 100, in open
File "/usr/lib/python2.7/dist-packages/paramiko/client.py", line 367, in connect
look_for_keys, gss_auth, gss_kex, gss_deleg_creds, gss_host)
File "/usr/lib/python2.7/dist-packages/paramiko/client.py", line 585, in _auth
raise SSHException('No authentication methods available')
paramiko.ssh_exception.SSHException: No authentication methods available
fatal: [10.27.200.80]: FAILED! => {
"changed": false,
"failed": true,
"invocation": {
"module_name": "ios_command"
},
"module_stderr": "Traceback (most recent call last):\n File \"/tmp/ansible_zKWzh_/ansible_module_ios_command.py\", line 237, in <module>\n main()\n File \"/tmp/ansible_zKWzh_/ansible_module_ios_command.py\", line 200, in main\n runner.add_command(**cmd)\n File \"/tmp/ansible_zKWzh_/ansible_modlib.zip/ansible/module_utils/netcli.py\", line 147, in add_command\n File \"/tmp/ansible_zKWzh_/ansible_modlib.zip/ansible/module_utils/network.py\", line 116, in cli\n File \"/tmp/ansible_zKWzh_/ansible_modlib.zip/ansible/module_utils/network.py\", line 147, in connect\n File \"/tmp/ansible_zKWzh_/ansible_modlib.zip/ansible/module_utils/ios.py\", line 180, in connect\n File \"/tmp/ansible_zKWzh_/ansible_modlib.zip/ansible/module_utils/shell.py\", line 230, in connect\n File \"/tmp/ansible_zKWzh_/ansible_modlib.zip/ansible/module_utils/shell.py\", line 100, in open\n File \"/usr/lib/python2.7/dist-packages/paramiko/client.py\", line 367, in connect\n look_for_keys, gss_auth, gss_kex, gss_deleg_creds, gss_host)\n File \"/usr/lib/python2.7/dist-packages/paramiko/client.py\", line 585, in _auth\n raise SSHException('No authentication methods available')\nparamiko.ssh_exception.SSHException: No authentication methods available\n",
"module_stdout": "",
"msg": "MODULE FAILURE"
}
to retry, use: --limit #/home/mszczesniak/test1.retry
PLAY RECAP *********************************************************************
10.27.200.80 : ok=0 changed=0 unreachable=0 failed=1
What is wrong? It looks like the option in the config file is not taken, or maybe there is a problem in the dev 2.3.0 version?
This exception is raised when you do not provide any means for authentication. The Paramiko SSH Client doesn't know what method to use and hence raises SSHException('No authentication methods available').
You should provide either a password or a private key (or both), for the SSHClient to work. Else, it's just clueless.
A little extra:
If you look at the code, you can see that this exception is raised when none of the possible auth methods have been tried.
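For the ios_* modules of that era, credentials were typically supplied through the module's provider argument (or via --ask-pass / ansible_ssh_pass in the inventory). A hedged sketch with placeholder credentials:
- name: show version
  ios_command:
    commands:
      - show version
    provider:
      host: "{{ inventory_hostname }}"
      username: admin                  # placeholder
      password: "{{ ios_password }}"   # placeholder; prefer a vaulted variable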
If we add the parameters below:
ansible_user=rakesh
ansible_password=xxxxx
it gives an "unable to set terminal parameters" error.
