ansible: debug only output from tasks?

When I run ansible-playbook with -vvvv, it shows all stdout for all running tasks. However, it also shows noise about how each command is run through SSH. Is there a way to use verbosity to show only the tasks' stdout without any other noise?
Currently it shows like this:
<myhost> ESTABLISH SSH CONNECTION FOR USER: root
<myhost> SSH: EXEC ssh -vvv -o ControlMaster=auto -o ControlPersist=30m -o PreferredAuthentications=publickey -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/tmp/ansible-ssh-%h-%p-%r myhost '/bin/sh -c '"'"'rm -f -r /var/tmp/ansible-tmp-1333333.89364-1154635-13434444/ > /dev/null 2>&1 && sleep 0'"'"''
<myhost> (0, b'', b'OpenSSH_8.2p1 Ubuntu-4ubuntu0.4, OpenSSL 1.1.1f 31 Mar 2020\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: include /etc/ssh/ssh_config.d/*.conf matched no files\r\ndebug1: /etc/ssh/ssh_config line 21: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 1136353\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
changed: [dev] => {
"changed": true,
"cmd": [
"/opt/odoo/.local/bin/docker-compose",
"rm",
"-fsv",
"odoo"
],
"delta": "0:00:03.287055",
"end": "2022-01-25 08:09:25.105235",
"invocation": {
"module_args": {
"_raw_params": "/opt/odoo/.local/bin/docker-compose rm -fsv odoo",
"_uses_shell": false,
"argv": null,
"chdir": "/opt/odoo/app",
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"stdin_add_newline": true,
"strip_empty_ends": true,
"warn": false
}
},
"msg": "",
"rc": 0,
"start": "2022-01-25 08:09:21.818180",
"stderr": "Stopping app_odoo_1 ... \r\nStopping app_odoo_1 ... done\r\nRemoving app_odoo_1 ... \r\nRemoving app_odoo_1 ... done",
"stderr_lines": [
"Stopping app_odoo_1 ... ",
"Stopping app_odoo_1 ... done",
"Removing app_odoo_1 ... ",
"Removing app_odoo_1 ... done"
],
"stdout": "Going to remove app_odoo_1",
"stdout_lines": [
"Going to remove app_odoo_1"
]
}
Is there a way to keep just the JSON part without explicitly specifying a debug argument for each task?

When using Ansible in verbose mode like ansible-playbook -vvvv, it shows how each command is run through SSH as well as all stdout for all running tasks.
verbose mode (-vvv for more, -vvvv to enable connection debugging)
This is so far the intended behavior.
Is there a way to use verbosity to just show tasks stdout without any other noise?
Even the parameters -v or -vv produce a kind of noise. This is also intended behavior, since the feature is meant for debugging functionality outside of tasks. See, for example, the source (query vvv or query verbose).
Is there a way to just keep JSON part without explicitly specifying debug arg for each task?
I am not aware of any feature like that.
Currently I assume it would be necessary to use the debug module, or to use the Playbook Debugger for debugging tasks. With that it would be possible to get only the stdout of a task.
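As a minimal sketch of that approach (the command and variable names below are only illustrative, modeled on the task from the question):
- name: Remove the odoo container
  command: /opt/odoo/.local/bin/docker-compose rm -fsv odoo
  register: compose_rm

- name: Show only the command output
  debug:
    var: compose_rm.stdout_lines
Run without extra verbosity, only the debug task prints the registered stdout_lines.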
Further Readings
How to debug Ansible issues?
Debugging modules
Aside from Ansible, there might be a way of filtering the output, for example via | grep -A2 stdout or awk 'p; / =>/ {p=1}', sed '0,/ => /d', and then | jq -M -r '.stdout' or | jq -M -r '.stdout_lines' respectively.
Thanks to
Delete everything before pattern including pattern using awk or sed
Using jq to fetch key value from JSON output
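As a rough illustration of the simplest of those filters (crude, and dependent on how your callback formats the result, so expect to tweak it):
ansible-playbook playbook.yml -vvvv | grep -A2 stdout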

Related

ansible synchronize module fails to get directory from remote to local - failed: No such file or directory

I wish to copy /web/playbooks/automation/misc/filecopyprod from mysourceuser#mysourcehost to the destination mydestuser#mydesthost under the location /web/playbooks/automation/misc/filecopy/tmpfiles/500/.
It is evident that both the source and destination are present and have good permissions.
[mydestuser#mydesthost ~]$ ssh mysourceuser#mysourcehost ls -ld '/web/playbooks/automation/misc/filecopyprod'
##################################################################
# *** This Server is using Centrify *** #
# *** Remember to use your Active Directory account *** #
# *** password when logging in *** #
##################################################################
drwxrwxr-x 3 mysourceuser mysourceuser 209 Sep 26 14:58 /web/playbooks/automation/misc/filecopyprod
[mydestuser#mydesthost ~]$ ls -ld /web/playbooks/automation/misc/filecopy/tmpfiles/500/
drwxr-xr-x 2 mydestuser aces 6 Sep 26 14:13 /web/playbooks/automation/misc/filecopy/tmpfiles/500/
Here is my playbook that runs on mydesthost and pulls files and folders from the remote server mysourceuser#mysourcehost to the local server mydestuser#mydesthost:
- name: Copying from "{{ inventory_hostname }}" to this ansible server.
  tags: validate
  synchronize:
    src: "'{{ item }}'"
    dest: "{{ playbook_dir }}/tmpfiles/{{ Latest_Build_Number }}/"
    mode: pull
    copy_links: yes
  with_items:
    - "{{ source_file_new.splitlines() }}"
To run the above playbook:
ansible-playbook /web/playbooks/automation/misc/filecopy/copyfiles.yml -e "source_file_new='$source_file_new'" -e "Latest_Build_Number='500'"
Output of my run:
TASK [Copying from "mysourcehost" to this ansible server.] **********************
task path: /web/playbooks/automation/misc/filecopy/copyfiles.yml:218
Monday 26 September 2022 14:13:02 -0500 (0:00:00.047) 0:00:03.084 ******
redirecting (type: action) ansible.builtin.synchronize to ansible.posix.synchronize
redirecting (type: action) ansible.builtin.synchronize to ansible.posix.synchronize
<mysourcehost> ESTABLISH LOCAL CONNECTION FOR USER: mydestuser
<mysourcehost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81 `"&& mkdir "` echo /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597 `" && echo ansible-tmp-1664219583.0005133-20679-105296975361597="` echo /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597 `" ) && sleep 0'
Using module file /home/mydestuser/.ansible/collections/ansible_collections/ansible/posix/plugins/modules/synchronize.py
<mysourcehost> PUT /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/tmpxhpyaf0m TO /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/AnsiballZ_synchronize.py
<mysourcehost> EXEC /bin/sh -c 'chmod u+x /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/ /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/AnsiballZ_synchronize.py && sleep 0'
<mysourcehost> EXEC /bin/sh -c '/usr/local/bin/python3.8 /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/AnsiballZ_synchronize.py && sleep 0'
<mysourcehost> EXEC /bin/sh -c 'rm -f -r /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/ > /dev/null 2>&1 && sleep 0'
failed: [mysourcehost] (item=/web/playbooks/automation/misc/filecopyprod) => {
"ansible_loop_var": "item",
"changed": false,
"cmd": "/bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format=<<CHANGED>>%i %n%L mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod' /web/playbooks/automation/misc/filecopy/tmpfiles/500/",
"invocation": {
"module_args": {
"_local_rsync_password": null,
"_local_rsync_path": "rsync",
"_substitute_controller": false,
"archive": true,
"checksum": false,
"compress": true,
"copy_links": true,
"delete": false,
"dest": "/web/playbooks/automation/misc/filecopy/tmpfiles/500/",
"dest_port": null,
"dirs": false,
"existing_only": false,
"group": null,
"link_dest": null,
"links": null,
"mode": "pull",
"owner": null,
"partial": false,
"perms": null,
"private_key": null,
"recursive": null,
"rsync_opts": [],
"rsync_path": null,
"rsync_timeout": 0,
"set_remote_user": true,
"src": "mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod'",
"ssh_args": null,
"ssh_connection_multiplexing": false,
"times": null,
"verify_host": false
}
},
"item": "/web/playbooks/automation/misc/filecopyprod",
"msg": "Warning: Permanently added 'mysourcehost' (ED25519) to the list of known hosts.\r\n\nThis system is for the use by authorized users only. All data contained\non all systems is owned by the company and may be monitored, intercepted,\nrecorded, read, copied, or captured in any manner and disclosed in any\nmanner, by authorized company personnel. Users (authorized or unauthorized)\nhave no explicit or implicit expectation of privacy. Unauthorized or improper\nuse of this system may result in administrative, disciplinary action, civil\nand criminal penalties. Use of this system by any user, authorized or\nunauthorized, constitutes express consent to this monitoring, interception,\nrecording, reading, copying, or capturing and disclosure.\n\nIF YOU DO NOT CONSENT, LOG OFF NOW.\n\n##################################################################\n# *** This Server is using Centrify *** #\n# *** Remember to use your Active Directory account *** #\n# *** password when logging in *** #\n##################################################################\n\nrsync: change_dir \"/home/mysourceuser//'/web/playbooks/automation/misc\" failed: No such file or directory (2)\nrsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1658) [Receiver=3.1.2]\nrsync: [Receiver] write error: Broken pipe (32)\n",
"rc": 23
}
From the output I took the rsync command in question and tried to run it manually on my playbook host mydestuser#mydesthost, and I get a similar error:
"/bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format=<<CHANGED>>%i %n%L mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod' /web/playbooks/automation/misc/filecopy/tmpfiles/500/"
Output:
bash: /bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format=<<CHANGED>>%i %n%L mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod' /web/playbooks/automation/misc/filecopy/tmpfiles/500/: No such file or directory
Upon a suggestion from StackOverflow I quoted --out-format, but I continue to get the same error. See a snapshot of the error in the output below:
"/bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format='<<CHANGED>>%i %n%L' mysourceuser#mysourcehost:'/tmp/myfolder' /tmp/myfolder1"
Can you please suggest?
Your full command was this
/bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format=<<CHANGED>>%i %n%L mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod' /web/playbooks/automation/misc/filecopy/tmpfiles/500/
You've omitted to quote arguments containing spaces, so when the shell parses the line it splits at those spaces, leading to syntax errors when rsync tries to understand the line.
Fix the --rsh parameter, which contains spaces, by changing it to this:
--rsh='/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null'
Fix the --out-format parameter, which contains whitespace and shell special characters by changing it to this:
--out-format='<<CHANGED>>%i %n%L'
In a later example, you've put double quotes around the entire command, so the shell tries to execute the entire command as a single entity. For example, in the first line below the shell splits the line at spaces and executes the command echo with the parameter hello. In the second line the shell sees the quoted string and treats it as a single entity; it then tries to execute a command called echo hello, which is not the command echo with the parameter hello but a command with a literal space character in its name:
echo hello # → hello
"echo hello" # → -bash: echo hello: command not found
Rule: if a command or parameter contains a space or other special shell character and it's to be considered as a single item, it must be quoted.
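Applied to the command from your output, the fully quoted form would look something like this (a sketch; hostnames and paths are exactly as in your log):
/bin/rsync --delay-updates -F --compress --copy-links --archive \
  --rsh='/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null' \
  --out-format='<<CHANGED>>%i %n%L' \
  mysourceuser#mysourcehost:/web/playbooks/automation/misc/filecopyprod \
  /web/playbooks/automation/misc/filecopy/tmpfiles/500/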
The playbook works with older versions of rsync. With the latest version it started to fail, as reported here.
Changed
synchronize:
  src: "'{{ item }}'"
to
synchronize:
  src: "{{ item }}"
and the error was gone; issue resolved with the latest rsync.
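For reference, a sketch of the task with the extra quoting removed (otherwise unchanged from the question):
- name: Copying from "{{ inventory_hostname }}" to this ansible server.
  tags: validate
  synchronize:
    src: "{{ item }}"
    dest: "{{ playbook_dir }}/tmpfiles/{{ Latest_Build_Number }}/"
    mode: pull
    copy_links: yes
  with_items:
    - "{{ source_file_new.splitlines() }}"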

Ansible hangs on job submission step Ansible ZOS Core

I'm using Ansible on zLinux. I have a playbook that uses the zos_job_submit module from the IBM z/OS core collection.
The module is used with a job that generates random data to the JES spool.
//SPOOL1 JOB (UU999999999,1103),'DART JOB',CLASS=0,
// REGION=0M,MSGCLASS=R,TIME=5, LINES=(999999,WARNING),
// NOTIFY=&SYSUID
//* Automatic process will kill the job and cleanup spool.
//* author: xxxxxx , xxxxxxxxx
//STEPNAME EXEC PGM=BPXBATCH
//STDERR DD SYSOUT=*
//STDOUT DD SYSOUT=*
//STDPARM DD *
SH cat /dev/urandom
This was working fine until a few days ago, when it started to freeze up and error out. It still submits the job, but it fails to return its output after the job starts running and then errors out.
Here is the playbook I'm using (stripped down to only the offending task):
# Author: xxxxxxxxxxxxxx
- name: "DART JES CHAOS EVENT"
hosts: all # WARNING: USE WITH --LIMIT <target> OTHERWISE ALL HOSTS IN INVENTORY WILL BE TARGETED!
vars:
all_jobs:
jobs: [ ]
jobs_file_location: "jobs/{{inventory_hostname}}"
tasks:
- name: "Submit job tasks"
block:
- name: Submit job
ibm.ibm_zos_core.zos_job_submit:
src: "{{uss_jcl_path}}"
location: LOCAL
wait: false
vars:
uss_jcl_path: "{{jcl_lib}}/{{job_jcl}}"
Here is the log using -vvv
ansible-playbook 2.9.27
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/adshome/svc.dart/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /bin/ansible-playbook
python version = 2.7.5 (default, May 27 2022, 07:27:39) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
PLAYBOOK: debug_event.yml **************************************************************************************************************************************************
Positional arguments: /XXXX/dart/ansible/playbooks/debug_event.yml
subset: nwrd
become_method: sudo
inventory: (u'/XXXX/dart/ansible/inventory', u'/XXXX/dart/ansible/auth_inventory')
forks: 5
tags: (u'all',)
extra_vars: (u'jcl_lib=/XXXX/dart/ansible/playbooks/jcl job_jcl=SPOOL',)
verbosity: 4
connection: smart
timeout: 10
1 plays in /XXXX/dart/ansible/playbooks/debug_event.yml
PLAY [DART JES CHAOS EVENT] ************************************************************************************************************************************************
TASK [Submit job] **********************************************************************************************************************************************************
task path: /XXXX/dart/ansible/playbooks/debug_event.yml:12
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/tempfile.py
Pipelining is enabled.
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
<XXXX.XXX.COM> (0, '\n{"changed": true, "path": "/tmp/ansible.nzzb29wz", "uid": 10001120, "gid": 7212, "owner": "XXXXXXX", "group": "GLTCMF", "mode": "0600", "state": "file", "size": 0, "invocation": {"module_args": {"state": "file", "prefix": "ansible.", "suffix": "", "path": null}}}\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'echo ~XXXXXXX && sleep 0'"'"''
<XXXX.XXX.COM> (0, '/u/XXXXXXX\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /u/XXXXXXX/.ansible/tmp `"&& mkdir "` echo /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409 `" && echo ansible-tmp-1658270106.84-62001-187469557526409="` echo /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409 `" ) && sleep 0'"'"''
<XXXX.XXX.COM> (0, 'ansible-tmp-1658270106.84-62001-187469557526409=/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/stat.py
Pipelining is enabled.
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
<XXXX.XXX.COM> (0, '\n{"changed": false, "stat": {"exists": true, "path": "/tmp/ansible.nzzb29wz", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 10001120, "gid": 7212, "size": 0, "inode": 10691, "dev": 3248, "nlink": 1, "atime": 1658270106, "mtime": 1658270106, "ctime": 1658270106, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "XXXXXXX", "gr_name": "GLTCMF", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "mimetype": "unknown", "charset": "unknown", "version": null, "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"checksum_algorithm": "sha1", "get_checksum": true, "follow": false, "path": "/tmp/ansible.nzzb29wz", "get_md5": false, "get_mime": true, "get_attributes": true}}}\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<XXXX.XXX.COM> PUT /XXXX/dart/ansible/playbooks/jcl/SPOOL TO /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source
<XXXX.XXX.COM> SSH: EXEC sftp -b - -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 '[XXXX.XXX.COM]'
<XXXX.XXX.COM> (0, 'sftp> put /XXXX/dart/ansible/playbooks/jcl/SPOOL /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "posix-rename#openssh.com" revision 1\r\ndebug2: Server supports extension "statvfs#openssh.com" revision 2\r\ndebug2: Server supports extension "fstatvfs#openssh.com" revision 2\r\ndebug2: Server supports extension "hardlink#openssh.com" revision 1\r\ndebug2: Server supports extension "fsync#openssh.com" revision 1\r\ndebug3: Sent message fd 5 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . -> /u/XXXXXXX size 0\r\ndebug3: Looking up /XXXX/dart/ansible/playbooks/jcl/SPOOL\r\ndebug3: Sent message fd 5 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:395\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 4 395 bytes at 0\r\ndebug3: Sent message SSH2_FXP_CLOSE I:4\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'chmod u+x /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/ /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source && sleep 0'"'"''
<XXXX.XXX.COM> (0, '', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/copy.py
Pipelining is enabled.
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
<XXXX.XXX.COM> (0, '\n{"dest": "/tmp/ansible.nzzb29wz", "src": "/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source", "md5sum": "e18be97081c5a8d8ae1e1c57eb0d2123", "checksum": "b220730079e63acfee97f2b694ae2d31d3074083", "changed": true, "uid": 10001120, "gid": 7212, "owner": "XXXXXXX", "group": "GLTCMF", "mode": "0600", "state": "file", "size": 395, "invocation": {"module_args": {"src": "/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source", "dest": "/tmp/ansible.nzzb29wz", "_original_basename": "SPOOL", "mode": "0600", "backup": false, "force": true, "follow": false, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "checksum": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}}\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /adshome/svc.dart/.ansible/collections/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py
Pipelining is enabled.
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
<XXXX.XXX.COM> (1, '', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\nTraceback (most recent call last):\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 800, in run_module\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py", line 58, in job_output\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py", line 73, in _get_job_output\nRuntimeError: Failed to retrieve job output. RC: -9 Error: \n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File "<stdin>", line 102, in <module>\n File "<stdin>", line 94, in _ansiballz_main\n File "<stdin>", line 40, in invoke_module\n File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 205, in run_module\n return _run_module_code(code, init_globals, run_name, mod_spec)\n File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 96, in _run_module_code\n mod_name, mod_spec, pkg_name, script_name)\n File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 85, in _run_code\n exec(code, run_globals)\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 894, in <module>\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 890, in main\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 805, in run_module\nIndexError: tuple index out of range\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\n')
<XXXX.XXX.COM> Failed to connect to the host via ssh: OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: /etc/ssh/ssh_config line 60: Applying options for *
debug1: auto-mux: Trying existing master
debug2: fd 4 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug3: mux_client_forwards: request forwardings: 0 local, 0 remote
debug3: mux_client_request_session: entering
debug3: mux_client_request_alive: entering
debug3: mux_client_request_alive: done pid = 61998
debug3: mux_client_request_session: session request sent
debug1: mux_client_request_session: master session id: 2
Traceback (most recent call last):
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 800, in run_module
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py", line 58, in job_output
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py", line 73, in _get_job_output
RuntimeError: Failed to retrieve job output. RC: -9 Error:
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 102, in <module>
File "<stdin>", line 94, in _ansiballz_main
File "<stdin>", line 40, in invoke_module
File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 205, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 894, in <module>
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 890, in main
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 805, in run_module
IndexError: tuple index out of range
debug3: mux_client_read_packet: read header failed: Broken pipe
debug2: Received exit status from master 1
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'rm -f -r /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/ > /dev/null 2>&1 && sleep 0'"'"''
<XXXX.XXX.COM> (0, '', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
fatal: [XXXX.XXX.COM]: FAILED! => {
"changed": true,
"checksum": "b220730079e63acfee97f2b694ae2d31d3074083",
"dest": "/tmp/ansible.nzzb29wz",
"gid": 7212,
"group": "GLTCMF",
"invocation": {
"module_args": {
"_original_basename": "SPOOL",
"attributes": null,
"backup": false,
"checksum": null,
"content": null,
"delimiter": null,
"dest": "/tmp/ansible.nzzb29wz",
"directory_mode": null,
"follow": false,
"force": true,
"group": null,
"local_follow": null,
"mode": "0600",
"owner": null,
"regexp": null,
"remote_src": null,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": "/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source",
"unsafe_writes": false,
"validate": null
}
},
"md5sum": "e18be97081c5a8d8ae1e1c57eb0d2123",
"mode": "0600",
"module_stderr": "OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\nTraceback (most recent call last):\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py\", line 800, in run_module\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py\", line 58, in job_output\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py\", line 73, in _get_job_output\nRuntimeError: Failed to retrieve job output. RC: -9 Error: \n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"<stdin>\", line 102, in <module>\n File \"<stdin>\", line 94, in _ansiballz_main\n File \"<stdin>\", line 40, in invoke_module\n File \"/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py\", line 205, in run_module\n return _run_module_code(code, init_globals, run_name, mod_spec)\n File \"/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py\", line 96, in _run_module_code\n mod_name, mod_spec, pkg_name, script_name)\n File \"/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py\", line 85, in _run_code\n exec(code, run_globals)\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py\", line 894, in <module>\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py\", line 890, in main\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py\", line 805, in run_module\nIndexError: tuple index out of range\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\n",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"owner": "XXXXXXX",
"rc": 1,
"size": 395,
"src": "/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source",
"state": "file",
"uid": 10001120
}
PLAY RECAP *****************************************************************************************************************************************************************
XXXX.XXX.COM : ok=1 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
I have a few comments and hopefully we can resolve this.
There are a few things that confuse me and make me wonder how this ever worked, unless I am missing some data points.
The playbook is using the option location: LOCAL, and the src: appears to be a directory in USS (u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source). The option LOCAL is meant for files located on the control node (where the Ansible engine is running; in your case zLinux). Can you confirm that you do have the file on the control node?
When I look at the verbose environment vars passed to the managed node (z/OS), I see no environment vars for the required dependency ZOAU. The zos_job_submit module uses the ZOAU APIs; I am also questioning how this worked before without the environment vars, or whether your vars have been corrupted (see below).
SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
I am expecting environment vars similar to this:
environment_vars:
  _BPXK_AUTOCVT: "ON"
  ZOAU_HOME: "{{ ZOAU }}"
  PYTHONPATH: "{{ ZOAU }}/lib"
  LIBPATH: "{{ ZOAU }}/lib:{{ PYZ }}/lib:/lib:/usr/lib:."
  PATH: "{{ ZOAU }}/bin:{{ PYZ }}/bin:/bin:/var/bin"
  _CEE_RUNOPTS: "FILETAG(AUTOCVT,AUTOTAG) POSIX(ON)"
  _TAG_REDIR_ERR: "txt"
  _TAG_REDIR_IN: "txt"
  _TAG_REDIR_OUT: "txt"
  LANG: "C"
Based on the trace it looks like you are using the latest z/OS core collection, version 1.4.0-beta.1, yet you are using Anaconda Python 3.6 from Rocket; that version of Python has long been removed from support.
For your version of the collection you will want to reference these requirements and probably start with a simple example to ensure it is functional before doing something more complex; this is a good one to start with.
I am happy to help but supporting what appears to be a mismatch of requirements can be challenging.

shell script gives incorrect output when invoked using Ansible shell module

I have a check.sh script that I wish to run on the target nodes:
cat check.sh
str=`echo $1 | sed -e 's#[\][\]n# #g'`
echo $str>check.row
It is supposed to replace \n with a single whitespace in the argument and save the result in the check.row file.
When I run it manually on the target server I get good output, as shown below:
bash -x ./check.sh '/fin/app/01/scripts\\n/fin/app/01/sql'
++ echo '/fin/app/01/scripts\\n/fin/app/01/sql'
++ sed -e 's#[\][\]n# #g'
+ str='/fin/app/01/scripts /fin/app/01/sql'
+ echo /fin/app/01/scripts /fin/app/01/sql
The generated check.row looks good, as shown below:
[user1#remotehost1 ~]$ cat check.row
/fin/app/01/scripts /fin/app/01/sql
However, when I run the same using the Ansible shell or command module, I do not get the expected results.
Below is my playbook:
tasks:
  - copy:
      src: "{{ playbook_dir }}/files/check.sh"
      dest: "~/"
      mode: 0754
  - set_fact:
      install_dir: "{{ hostvars[\'localhost\'][\'command_result\'].stdout.split('\t')[2] }}"
  - shell: "bash -x ~/check.sh '{{ install_dir }}' > ~/check_rollback.log"
See ansible's debug output below:
changed: [10.8.44.55] => {
"changed": true,
"cmd": "bash -x ~/check.sh '/fin/app/01/scripts\\n/fin/app/01/sql' > ~/check_rollback.log",
"delta": "0:00:00.118943",
"end": "2019-09-04 10:50:16.503745",
"invocation": {
"module_args": {
"_raw_params": "bash -x ~/check.sh '/fin/app/01/scripts\\n/fin/app/01/sql' > ~/check_rollback.log",
"_uses_shell": true,
"argv": null,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"stdin_add_newline": true,
"strip_empty_ends": true,
"warn": true
}
},
"rc": 0,
"start": "2019-09-04 10:50:16.384802",
"stderr": "++ echo '/fin/app/01/scripts\\n/fin/app/01/sql'\n++ sed -e 's#[\\][\\]n# #g'\n+ str='/fin/app/01/scripts\\n/fin/app/01/sql'\n+ echo '/fin/app/01/scripts\\n/fin/app/01/sql'",
"stderr_lines": [
"++ echo '/fin/app/01/scripts\\n/fin/app/01/sql'",
"++ sed -e 's#[\\][\\]n# #g'",
"+ str='/fin/app/01/scripts\\n/fin/app/01/sql'",
"+ echo '/fin/app/01/scripts\\n/fin/app/01/sql'"
],
"stdout": "",
"stdout_lines": [] }
And here is the check.row file output from ansible's run:
[user1#remotehost1 ~]$ cat check.row
/fin/app/01/scripts\\n/fin/app/01/sql
As you can see, instead of a single whitespace it is now printing \n.
I am on the latest version of Ansible. One can replicate this issue easily.
Can you please suggest why I am getting this issue and how to fix it?
First of all, you are using the shell module, in which only the shell command should be specified; you have incorrectly used bash in it.
shell: "bash -x ~/check.sh '{{ install_dir }}' > ~/check_rollback.log"
Secondly, as you can see, your task has resulted in an error, as seen in your attached output: stdout is empty and we can see an error in stderr.
Thirdly, if you want to use bash you can use the command module, as shown below:
- command: "bash -x ~/check.sh '{{ install_dir }}' > ~/check_rollback.log"
I also suggest the following changes to your check.sh script:
#!/bin/bash
echo $1 # You can check the value that is passed to the script
str=$(echo "$1" | sed -e 's/\\n/ /g') # Use quotes around your variable
echo "$str" > check.row
And it is working fine.
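Put together, a sketch of the adjusted tasks; it registers the output instead of using a shell redirect, since the command module does not process redirection:
- copy:
    src: "{{ playbook_dir }}/files/check.sh"
    dest: "~/"
    mode: 0754
- command: "bash -x ~/check.sh '{{ install_dir }}'"
  register: check_result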

Ansible fortios_config backups not working

I use Fortinet for firewall automation, but I get the error "Error reading running config". I already followed this website: https://github.com/ansible/ansible/issues/33392
but did not find any solution. Please tell me what I am doing wrong.
Ansible version: 2.7.0
Python version: 2.7.5
Fortinet: 60E
FortiOS version: 6.0.2
Here is what I am trying:
FortiOS.yml playbook:
---
- name: FortiOS Firewall Settings
  hosts: fortiFW
  connection: local
  vars_files:
    - /etc/ansible/vars/FortiOS_Settings_vars.yml
  tasks:
    - name: Backup current config
      fortios_config:
        host: 192.168.1.99
        username: admin
        password: Password#123
        backup: yes
        backup_path: /etc/ansible/forti_backup
Here is what I get as error:
ok: [192.168.1.99]
META: ran handlers
Read vars_file '/etc/ansible/vars/FortiOS_Settings_vars.yml'
TASK [Backup current config] ****************************************************************************************************************************************************************************************************
task path: /etc/ansible/FortiOS_Settings_test.yml:8
<192.168.1.99> ESTABLISH LOCAL CONNECTION FOR USER: root
<192.168.1.99> EXEC /bin/sh -c 'echo ~root && sleep 0'
<192.168.1.99> EXEC /bin/sh -c '( umask 77 && mkdir -p "echo /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226" && echo ansible-tmp-1539674386.05-16470854685226="echo /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226" ) && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/network/fortios/fortios_config.py
<192.168.1.99> PUT /root/.ansible/tmp/ansible-local-6154Uq5Dmw/tmpt6JukB TO /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226/AnsiballZ_fortios_config.py
<192.168.1.99> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226/ /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226/AnsiballZ_fortios_config.py && sleep 0'
<192.168.1.99> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226/AnsiballZ_fortios_config.py && sleep 0'
<192.168.1.99> EXEC /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1539674386.05-16470854685226/ > /dev/null 2>&1 && sleep 0'
The full traceback is:
WARNING: The below traceback may not be related to the actual failure.
  File "/tmp/ansible_fortios_config_payload_b6IQmy/main.py", line 132, in main
    f.load_config(path=module.params['filter'])
  File "/usr/lib/python2.7/site-packages/pyFG/fortios.py", line 212, in load_config
    config_text = self.execute_command(command)
  File "/usr/lib/python2.7/site-packages/pyFG/fortios.py", line 154, in execute_command
    output = output + self._read_wrapper(o)
  File "/usr/lib/python2.7/site-packages/pyFG/fortios.py", line 120, in _read_wrapper
    return py23_compat.text_type(data)
fatal: [192.168.1.99]: FAILED! => {
"changed": false,
"invocation": {
"module_args": {
"backup": true,
"backup_filename": null,
"backup_path": "/etc/ansible/forti_backup",
"config_file": null,
"file_mode": false,
"filter": "",
"host": "192.168.1.99",
"password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
"src": null,
"timeout": 60,
"username": "admin",
"vdom": null
}
},
"msg": "Error reading running config" }
When working with this module, I had the same issue. I looked into the source code of the module and found that this error occurs when filter is set to "" (an empty string). You can get facts about the device when changing filter to something like "firewall address", but then you will only get back the options from exactly that, as if you had typed "show firewall address" on the CLI of the device.
I'm currently working on a solution to use Ansible for FortiGate automation, but it's not looking good. E.g. FortiGates additionally do not support Netconf, so you can't use Netconf to send commands to the device.
So you're not doing anything wrong; the module is either not optimized, or, as I guess, the full configuration is too big to be read by the module, so you have to use the filter option to shrink it.
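To illustrate that workaround, a sketch of your backup task with a filter set; "firewall address" is just an example section:
- name: Backup firewall address section
  fortios_config:
    host: 192.168.1.99
    username: admin
    password: Password#123
    filter: "firewall address"
    backup: yes
    backup_path: /etc/ansible/forti_backup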

How to pass multiple executables in ansible within a shell module?

I am trying to pass a prompt answer of y in Ansible when it executes a command.
When I run it manually on the server, it asks for a prompt.
The issue is that for the command to run I need to pass the executable /bin/bash:
command: source /etc/profile.d/tableau_server.sh && tsm pending-changes apply
while for the expect command to run I need to pass /usr/bin/expect.
My question: how can I pass two executables in Ansible, so that the command uses /bin/bash and the expect prompt uses /usr/bin/expect? And since the error is because I am using source, what alternative can I use?
Update: I don't know why, but I am not able to pass --ignore-prompt; it gives an error:
ubuntu#ip-xx-xxx-xx-xx:~$ tsm pending-changes apply --ignore-prompt
Unrecognized option: --ignore-prompt
Please help me with a solution!
ubuntu#ip-xx-xxx-xx-xx:~$ tsm pending-changes apply
This operation will perform a server restart. Are you sure you wish to continue?
(y/n):
My ansible script:
shell: |
  source /etc/profile.d/tableau_server.sh && tsm pending-changes apply
  expect "This operation will perform a server restart. Are you sure you wish to continue?\n(y/n):"
  send "y\n"
  exit 0
args:
  executable: /usr/bin/expect
args:
  executable: /bin/bash/expect
when: inventory_hostname == "xx.xxx.xx.xx"
ERROR:
changed: [xx.xxx.xxx.xx] => {
"changed": true,
"cmd": "source /etc/profile.d/tableau_server.sh && tsm pending-changes apply\n expect \"This operation will perform a server restart. Are you sure you wish to continue?\\n(y/n):\"\n send \"y\\n\"\n exit 0",
"delta": "0:00:00.034824",
"end": "2018-08-20 17:29:41.457700",
"invocation": {
"module_args": {
"_raw_params": "source /etc/profile.d/tableau_server.sh && tsm pending-changes apply\n expect \"This operation will perform a server restart. Are you sure you wish to continue?\\n(y/n):\"\n send \"y\\n\"\n exit 0",
"_uses_shell": true,
"argv": null,
"chdir": null,
"creates": null,
"executable": "/usr/bin/expect",
"removes": null,
"stdin": null,
"warn": true
}
},
"rc": 0,
"start": "2018-08-20 17:29:41.422876",
"stderr": "wrong # args: should be \"source ?-encoding name? fileName\"\n while executing\n\"source /etc/profile.d/tableau_server.sh && tsm pending-changes apply\"",
"stderr_lines": [
"wrong # args: should be \"source ?-encoding name? fileName\"",
" while executing",
"\"source /etc/profile.d/tableau_server.sh && tsm pending-changes apply\""
],
"stdout": "",
"stdout_lines": []
I would say you are doing far too much with bash commands and '&&' inside command; none of this feels idempotent.
Can I recommend going back to the drawing board with this? I would recommend creating the command using the 'creates' parameter so Ansible can tell whether it needs to run.
https://docs.ansible.com/ansible/2.6/modules/command_module.html
Or alternatively, check beforehand with a separate task and register the result, so you can tell whether the command needs running.
Regarding this instance of your issue, the command
tsm pending-changes apply
should, as per https://onlinehelp.tableau.com/current/server-linux/en-us/cli_pending-changes.htm, support
tsm pending-changes apply --ignore-prompt
which will then not prompt for a yes and will not need expect.
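Putting that together, a minimal sketch, assuming your TSM version recognizes --ignore-prompt; the commented creates path is only a placeholder guard you would point at something the run actually produces:
- name: Apply pending TSM changes without prompting
  shell: source /etc/profile.d/tableau_server.sh && tsm pending-changes apply --ignore-prompt
  args:
    executable: /bin/bash
    # creates: /var/tmp/tsm_changes_applied
  when: inventory_hostname == "xx.xxx.xx.xx"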
I solved my issue by passing an -r option.
- name: Initialize and Start Tableau Server
  shell: source /etc/profile.d/tableau_server.sh && tsm pending-changes apply -r -u ubuntu -p '{{ tableau_server_admin_password }}'
  args:
    executable: /bin/bash
  when: inventory_hostname == "xx.xxx.xx.xx"
