What is the proper syntax for starting a session between a remote device and my ansible control node if the device is behind a PAT boundary? - ansible

I am new to Ansible. Currently, whenever I try to run my playbook, I get the error "fatal: [E3]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Received disconnect from 76.81.200.163 port 30003:2: Bad string Length\r\nDisconnected from 76.81.200.163 port 30003", "unreachable": true}"
This is the same across all devices in the inventory file.
This is meant to help audit networking equipment behind a gateway we have set up. I've tried adjusting the syntax of the inventory file to read ansible_ssh_user instead of ansible_user, but that seemed to yield the same results.
---
- hosts: switches
  vars:
    ansible_network_os: icx
    ansible_user: MyUsername
    ansible_become: True
    ansible_password: MyPassword
    ansible_become_method: enable
    ansible_become_pass: MyPassword
    ansible_command_timeout: 60
  tasks:
    - name: Collect the default facts
      icx_facts:
        gather_subset:
          - default
      register: result

    - name: Collect the hardware facts
      icx_facts:
        gather_subset:
          - hardware
      register: result

    - name: Collect the config facts
      icx_facts:
        gather_subset:
          - config
      register: result

    - name: Collect the interfaces facts
      icx_facts:
        gather_subset:
          - interfaces
      register: result

    - name: Collect all the facts
      icx_facts:
        gather_subset:
          - all
      register: result

    - name: debug
      debug:
        var: result
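Note that every icx_facts task above registers into the same result variable, so the final debug task only ever shows whatever the last task returned. A minimal sketch with separate register names (default_facts and hardware_facts are just illustrative names, not part of the original playbook):

    - name: Collect the default facts
      icx_facts:
        gather_subset:
          - default
      register: default_facts

    - name: Collect the hardware facts
      icx_facts:
        gather_subset:
          - hardware
      register: hardware_facts

    - name: Show both results
      debug:
        msg:
          - "{{ default_facts }}"
          - "{{ hardware_facts }}"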
Below is the inventory file:
[switches]
E3 ansible_port=30003 ansible_host=76.81.200.163
E4 ansible_port=30004 ansible_host=76.81.200.163
E5 ansible_port=30005 ansible_host=76.81.200.163
E6 ansible_port=30006 ansible_host=76.81.200.163
E7 ansible_port=30007 ansible_host=76.81.200.163
E8 ansible_port=30008 ansible_host=76.81.200.163
E9 ansible_port=30009 ansible_host=76.81.200.163
E10 ansible_port=30010 ansible_host=76.81.200.163
E11 ansible_port=30011 ansible_host=76.81.200.163
E12 ansible_port=30012 ansible_host=76.81.200.163
E13 ansible_port=30013 ansible_host=76.81.200.163
E14 ansible_port=30014 ansible_host=76.81.200.163
E15 ansible_port=30015 ansible_host=76.81.200.163
E16 ansible_port=30016 ansible_host=76.81.200.163
E17 ansible_port=30017 ansible_host=76.81.200.163
E18 ansible_port=30018 ansible_host=76.81.200.163
E19 ansible_port=30019 ansible_host=76.81.200.163
E20 ansible_port=30020 ansible_host=76.81.200.163
E21 ansible_port=30021 ansible_host=76.81.200.163
E22 ansible_port=30022 ansible_host=76.81.200.163
E23 ansible_port=30023 ansible_host=76.81.200.163
E24 ansible_port=30024 ansible_host=76.81.200.163
E25 ansible_port=30025 ansible_host=76.81.200.163
E26 ansible_port=30026 ansible_host=76.81.200.163
E27 ansible_port=30027 ansible_host=76.81.200.163
E28 ansible_port=30028 ansible_host=76.81.200.163
E29 ansible_port=30029 ansible_host=76.81.200.163
E30 ansible_port=30030 ansible_host=76.81.200.163
E31 ansible_port=30031 ansible_host=76.81.200.163
E32 ansible_port=30032 ansible_host=76.81.200.163
E33 ansible_port=30033 ansible_host=76.81.200.163
E34 ansible_port=30034 ansible_host=76.81.200.163
E35 ansible_port=30035 ansible_host=76.81.200.163
E36 ansible_port=30036 ansible_host=76.81.200.163
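Each switch is reached through its own forwarded port on the gateway, so a quick way to sanity-check one of these PAT mappings outside of Ansible is a plain SSH call; for example, for the E3 entry above:

ssh -p 30003 MyUsername@76.81.200.163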
The command being used is as follows:
sudo ansible-playbook showFacts.yml -i hosts
The goal was to audit the equipment and see what information could be found on the switches with the commands in the playbook.
I ran the command with -vvvv and received the following output
<76.81.200.163> ESTABLISH SSH CONNECTION FOR USER: MyUserName
<76.81.200.163> SSH: ansible.cfg set ssh_args: (-C)(-o)(ControlMaster=auto)(-o)(ControlPersist=60s)
<76.81.200.163> SSH: ANSIBLE_HOST_KEY_CHECKING/host_key_checking disabled: (-o)(StrictHostKeyChecking=no)
<76.81.200.163> SSH: ANSIBLE_REMOTE_PORT/remote_port/ansible_port set: (-o)(Port=30014)
<76.81.200.163> SSH: ANSIBLE_REMOTE_USER/remote_user/ansible_user/user/-u set: (-o)(User="MyUserName")
<76.81.200.163> SSH: ANSIBLE_TIMEOUT/timeout set: (-o)(ConnectTimeout=10)
<76.81.200.163> SSH: PlayContext set ssh_common_args: ()
<76.81.200.163> SSH: PlayContext set ssh_extra_args: ()
<76.81.200.163> SSH: found only ControlPersist; added ControlPath: (-o)(ControlPath=/home/aws/.ansible/cp/7fcfcd09f0)
<76.81.200.163> SSH: EXEC sshpass -d10 ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o Port=30014 -o 'User="MyUserName"' -o ConnectTimeout=10 -o ControlPath=/home/aws/.ansible/cp/7fcfcd09f0 76.81.200.163 '/bin/sh -c '"'"'echo ~MyUserName && sleep 0'"'"''
<76.81.200.163> (255, '', 'OpenSSH_7.2p2 Ubuntu-4ubuntu2.8, OpenSSL 1.0.2g 1 Mar 2016\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: Applying options for *\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: auto-mux: Trying existing master\r\ndebug1: Control socket "/home/aws/.ansible/cp/fe4276fa4d" does not exist\r\ndebug2: resolving "76.81.200.163" port 30010\r\ndebug2: ssh_connect_direct: needpriv 0\r\ndebug1: Connecting to 76.81.200.163 [76.81.200.163] port 30010.\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug1: fd 3 clearing O_NONBLOCK\r\ndebug1: Connection established.\r\ndebug3: timeout: 9931 ms remain after connect\r\ndebug1: permanently_set_uid: 0/0\r\ndebug1: key_load_public: No such file or directory\r\ndebug1: identity file /root/.ssh/id_dsa type -1\r\ndebug1: key_load_public: No such file or directory\r\ndebug1: identity file /root/.ssh/id_dsa-cert type -1\r\ndebug1: Enabling compatibility mode for protocol 2.0\r\ndebug1: Local version string SSH-2.0-OpenSSH_7.2p2 Ubuntu-4ubuntu2.8\r\ndebug1: Remote protocol version 2.0, remote software version RomSShell_5.40\r\ndebug1: no match: RomSShell_5.40\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug1: Authenticating to 76.81.200.163:30010 as \'MyUserName\'\r\ndebug3: put_host_port: [76.81.200.163]:30010\r\ndebug3: hostkeys_foreach: reading file "/root/.ssh/known_hosts"\r\ndebug3: record_hostkey: found key type RSA in file /root/.ssh/known_hosts:16\r\ndebug3: load_hostkeys: loaded 1 keys from [76.81.200.163]:30010\r\ndebug3: order_hostkeyalgs: prefer hostkeyalgs: ssh-rsa-cert-v01#openssh.com,rsa-sha2-512,rsa-sha2-256,ssh-rsa\r\ndebug3: send packet: type 20\r\ndebug1: SSH2_MSG_KEXINIT sent\r\ndebug3: receive packet: type 20\r\ndebug1: SSH2_MSG_KEXINIT received\r\ndebug2: local client KEXINIT proposal\r\ndebug2: KEX algorithms: curve25519-sha256#libssh.org,ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1,ext-info-c\r\ndebug2: host key algorithms: ssh-rsa-cert-v01#openssh.com,rsa-sha2-512,rsa-sha2-256,ssh-rsa,ecdsa-sha2-nistp256-cert-v01#openssh.com,ecdsa-sha2-nistp384-cert-v01#openssh.com,ecdsa-sha2-nistp521-cert-v01#openssh.com,ssh-ed25519-cert-v01#openssh.com,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521,ssh-ed25519\r\ndebug2: ciphers ctos: chacha20-poly1305#openssh.com,aes128-ctr,aes192-ctr,aes256-ctr,aes128-gcm#openssh.com,aes256-gcm#openssh.com,aes128-cbc,aes192-cbc,aes256-cbc,3des-cbc\r\ndebug2: ciphers stoc: chacha20-poly1305#openssh.com,aes128-ctr,aes192-ctr,aes256-ctr,aes128-gcm#openssh.com,aes256-gcm#openssh.com,aes128-cbc,aes192-cbc,aes256-cbc,3des-cbc\r\ndebug2: MACs ctos: umac-64-etm#openssh.com,umac-128-etm#openssh.com,hmac-sha2-256-etm#openssh.com,hmac-sha2-512-etm#openssh.com,hmac-sha1-etm#openssh.com,umac-64#openssh.com,umac-128#openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-sha1\r\ndebug2: MACs stoc: umac-64-etm#openssh.com,umac-128-etm#openssh.com,hmac-sha2-256-etm#openssh.com,hmac-sha2-512-etm#openssh.com,hmac-sha1-etm#openssh.com,umac-64#openssh.com,umac-128#openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-sha1\r\ndebug2: compression ctos: zlib#openssh.com,zlib,none\r\ndebug2: compression stoc: zlib#openssh.com,zlib,none\r\ndebug2: languages ctos: \r\ndebug2: languages stoc: \r\ndebug2: first_kex_follows 0 \r\ndebug2: reserved 0 \r\ndebug2: peer server KEXINIT proposal\r\ndebug2: KEX algorithms: 
diffie-hellman-group14-sha1,diffie-hellman-group1-sha1\r\ndebug2: host key algorithms: ssh-rsa\r\ndebug2: ciphers ctos: aes256-cbc,aes192-cbc,aes128-cbc,aes256-ctr,aes192-ctr,aes128-ctr,3des-cbc\r\ndebug2: ciphers stoc: aes256-cbc,aes192-cbc,aes128-cbc,aes256-ctr,aes192-ctr,aes128-ctr,3des-cbc\r\ndebug2: MACs ctos: hmac-sha1\r\ndebug2: MACs stoc: hmac-sha1\r\ndebug2: compression ctos: none\r\ndebug2: compression stoc: none\r\ndebug2: languages ctos: \r\ndebug2: languages stoc: \r\ndebug2: first_kex_follows 0 \r\ndebug2: reserved 0 \r\ndebug1: kex: algorithm: diffie-hellman-group14-sha1\r\ndebug1: kex: host key algorithm: ssh-rsa\r\ndebug1: kex: server->client cipher: aes128-ctr MAC: hmac-sha1 compression: none\r\ndebug1: kex: client->server cipher: aes128-ctr MAC: hmac-sha1 compression: none\r\ndebug1: sending SSH2_MSG_KEXDH_INIT\r\ndebug2: bits set: 991/2048\r\ndebug3: send packet: type 30\r\ndebug1: expecting SSH2_MSG_KEXDH_REPLY\r\ndebug3: receive packet: type 31\r\ndebug1: Server host key: ssh-rsa SHA256:MPoSxa389tZ42pQuBeYhXnkud6aQRtnmpDcaT9SR7WQ\r\ndebug3: put_host_port: [76.81.200.163]:30010\r\ndebug3: put_host_port: [76.81.200.163]:30010\r\ndebug3: hostkeys_foreach: reading file "/root/.ssh/known_hosts"\r\ndebug3: record_hostkey: found key type RSA in file /root/.ssh/known_hosts:16\r\ndebug3: load_hostkeys: loaded 1 keys from [76.81.200.163]:30010\r\ndebug3: hostkeys_foreach: reading file "/root/.ssh/known_hosts"\r\ndebug3: record_hostkey: found key type RSA in file /root/.ssh/known_hosts:16\r\ndebug3: load_hostkeys: loaded 1 keys from [76.81.200.163]:30010\r\ndebug1: Host \'[76.81.200.163]:30010\' is known and matches the RSA host key.\r\ndebug1: Found key in /root/.ssh/known_hosts:16\r\ndebug2: bits set: 1036/2048\r\ndebug3: send packet: type 21\r\ndebug2: set_newkeys: mode 1\r\ndebug1: rekey after 4294967296 blocks\r\ndebug1: SSH2_MSG_NEWKEYS sent\r\ndebug1: expecting SSH2_MSG_NEWKEYS\r\ndebug3: receive packet: type 21\r\ndebug1: SSH2_MSG_NEWKEYS received\r\ndebug2: set_newkeys: mode 0\r\ndebug1: rekey after 4294967296 blocks\r\ndebug2: key: /root/.ssh/id_dsa ((nil))\r\ndebug3: send packet: type 5\r\ndebug3: receive packet: type 6\r\ndebug2: service_accept: ssh-userauth\r\ndebug1: SSH2_MSG_SERVICE_ACCEPT received\r\ndebug3: send packet: type 50\r\ndebug3: receive packet: type 51\r\ndebug1: Authentications that can continue: publickey,password,keyboard-interactive\r\ndebug3: start over, passed a different list publickey,password,keyboard-interactive\r\ndebug3: preferred gssapi-keyex,gssapi-with-mic,publickey,keyboard-interactive,password\r\ndebug3: authmethod_lookup publickey\r\ndebug3: remaining preferred: keyboard-interactive,password\r\ndebug3: authmethod_is_enabled publickey\r\ndebug1: Next authentication method: publickey\r\ndebug1: Trying private key: /root/.ssh/id_dsa\r\ndebug3: no such identity: /root/.ssh/id_dsa: No such file or directory\r\ndebug2: we did not send a packet, disable method\r\ndebug3: authmethod_lookup keyboard-interactive\r\ndebug3: remaining preferred: password\r\ndebug3: authmethod_is_enabled keyboard-interactive\r\ndebug1: Next authentication method: keyboard-interactive\r\ndebug2: userauth_kbdint\r\ndebug3: send packet: type 50\r\ndebug2: we sent a keyboard-interactive packet, wait for reply\r\ndebug3: receive packet: type 60\r\ndebug2: input_userauth_info_req\r\ndebug2: input_userauth_info_req: num_prompts 1\r\ndebug3: send packet: type 61\r\ndebug3: receive packet: type 52\r\ndebug1: Authentication succeeded (keyboard-interactive).\r\nAuthenticated 
to 76.81.200.163 ([76.81.200.163]:30010).\r\ndebug1: setting up multiplex master socket\r\ndebug3: muxserver_listen: temporary control path /home/aws/.ansible/cp/fe4276fa4d.TXOAtXG8SQgQI5BF\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug3: fd 4 is O_NONBLOCK\r\ndebug3: fd 4 is O_NONBLOCK\r\ndebug1: channel 0: new [/home/aws/.ansible/cp/fe4276fa4d]\r\ndebug3: muxserver_listen: mux listener channel 0 fd 4\r\ndebug2: fd 3 setting TCP_NODELAY\r\ndebug3: ssh_packet_set_tos: set IP_TOS 0x08\r\ndebug1: control_persist_detach: backgrounding master process\r\ndebug2: control_persist_detach: background process is 17612\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug1: forking to background\r\ndebug1: Entering interactive session.\r\ndebug1: pledge: id\r\ndebug2: set_control_persist_exit_time: schedule exit in 60 seconds\r\ndebug1: multiplexing control connection\r\ndebug2: fd 5 setting O_NONBLOCK\r\ndebug3: fd 5 is O_NONBLOCK\r\ndebug1: channel 1: new [mux-control]\r\ndebug3: channel_post_mux_listener: new mux channel 1 fd 5\r\ndebug3: mux_master_read_cb: channel 1: hello sent\r\ndebug2: set_control_persist_exit_time: cancel scheduled exit\r\ndebug3: mux_master_read_cb: channel 1 packet type 0x00000001 len 4\r\ndebug2: process_mux_master_hello: channel 1 slave version 4\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_master_read_cb: channel 1 packet type 0x10000004 len 4\r\ndebug2: process_mux_alive_check: channel 1: alive check\r\ndebug3: mux_client_request_alive: done pid = 17614\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_master_read_cb: channel 1 packet type 0x10000002 len 381\r\ndebug2: process_mux_new_session: channel 1: request tty 0, X 0, agent 0, subsys 0, term "xterm", cmd "/bin/sh -c \'( umask 77 && mkdir -p "` echo \\001Protocol error, doesn\'"\'"\'t start with scp!/.ansible/tmp/ansible-tmp-1565800866.89-171947226254301 `" && echo ansible-tmp-1565800866.89-171947226254301="` echo \\001Protocol error, doesn\'"\'"\'t start with scp!/.ansible/tmp/ansible-tmp-1565800866.89-171947226254301 `" ) && sleep 0\'", env 1\r\ndebug3: process_mux_new_session: got fds stdin 6, stdout 7, stderr 8\r\ndebug2: fd 7 setting O_NONBLOCK\r\ndebug2: fd 8 setting O_NONBLOCK\r\ndebug1: channel 2: new [client-session]\r\ndebug2: process_mux_new_session: channel_new: 2 linked to control channel 1\r\ndebug2: channel 2: send open\r\ndebug3: send packet: type 90\r\ndebug3: receive packet: type 91\r\ndebug2: callback start\r\ndebug2: client_session2_setup: id 2\r\ndebug1: Sending environment.\r\ndebug1: Sending env LANG = en_US.UTF-8\r\ndebug2: channel 2: request env confirm 0\r\ndebug3: send packet: type 98\r\ndebug1: Sending command: /bin/sh -c \'( umask 77 && mkdir -p "` echo \\001Protocol error, doesn\'"\'"\'t start with scp!/.ansible/tmp/ansible-tmp-1565800866.89-171947226254301 `" && echo ansible-tmp-1565800866.89-171947226254301="` echo \\001Protocol error, doesn\'"\'"\'t start with scp!/.ansible/tmp/ansible-tmp-1565800866.89-171947226254301 `" ) && sleep 0\'\r\ndebug2: channel 2: request exec confirm 1\r\ndebug3: send packet: type 98\r\ndebug3: mux_session_confirm: sending success reply\r\ndebug2: callback done\r\ndebug2: channel 2: open confirm rwindow 8192 rmax 8192\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: receive packet: type 99\r\ndebug2: channel_input_status_confirm: 
type 99 id 2\r\ndebug2: exec request accepted on channel 2\r\ndebug3: receive packet: type 1\r\nReceived disconnect from 76.81.200.163 port 30010:2: Bad string Length\r\nDisconnected from 76.81.200.163 port 30010\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Control master terminated unexpectedly\r\n')
fatal: [E10]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: OpenSSH_7.2p2 Ubuntu-4ubuntu2.8, OpenSSL 1.0.2g 1 Mar 2016\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: Applying options for *\r\ndebug3: kex names ok: [diffie-hellman-group1-sha1]\r\ndebug1: auto-mux: Trying existing master\r\ndebug1: Control socket \"/home/aws/.ansible/cp/fe4276fa4d\" does not exist\r\ndebug2: resolving \"76.81.200.163\" port 30010\r\ndebug2: ssh_connect_direct: needpriv 0\r\ndebug1: Connecting to 76.81.200.163 [76.81.200.163] port 30010.\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug1: fd 3 clearing O_NONBLOCK\r\ndebug1: Connection established.\r\ndebug3: timeout: 9931 ms remain after connect\r\ndebug1: permanently_set_uid: 0/0\r\ndebug1: key_load_public: No such file or directory\r\ndebug1: identity file /root/.ssh/id_dsa type -1\r\ndebug1: key_load_public: No such file or directory\r\ndebug1: identity file /root/.ssh/id_dsa-cert type -1\r\ndebug1: Enabling compatibility mode for protocol 2.0\r\ndebug1: Local version string SSH-2.0-OpenSSH_7.2p2 Ubuntu-4ubuntu2.8\r\ndebug1: Remote protocol version 2.0, remote software version RomSShell_5.40\r\ndebug1: no match: RomSShell_5.40\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug1: Authenticating to 76.81.200.163:30010 as 'MyUserName'\r\ndebug3: put_host_port: [76.81.200.163]:30010\r\ndebug3: hostkeys_foreach: reading file \"/root/.ssh/known_hosts\"\r\ndebug3: record_hostkey: found key type RSA in file /root/.ssh/known_hosts:16\r\ndebug3: load_hostkeys: loaded 1 keys from [76.81.200.163]:30010\r\ndebug3: order_hostkeyalgs: prefer hostkeyalgs: ssh-rsa-cert-v01#openssh.com,rsa-sha2-512,rsa-sha2-256,ssh-rsa\r\ndebug3: send packet: type 20\r\ndebug1: SSH2_MSG_KEXINIT sent\r\ndebug3: receive packet: type 20\r\ndebug1: SSH2_MSG_KEXINIT received\r\ndebug2: local client KEXINIT proposal\r\ndebug2: KEX algorithms: curve25519-sha256#libssh.org,ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1,ext-info-c\r\ndebug2: host key algorithms: ssh-rsa-cert-v01#openssh.com,rsa-sha2-512,rsa-sha2-256,ssh-rsa,ecdsa-sha2-nistp256-cert-v01#openssh.com,ecdsa-sha2-nistp384-cert-v01#openssh.com,ecdsa-sha2-nistp521-cert-v01#openssh.com,ssh-ed25519-cert-v01#openssh.com,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521,ssh-ed25519\r\ndebug2: ciphers ctos: chacha20-poly1305#openssh.com,aes128-ctr,aes192-ctr,aes256-ctr,aes128-gcm#openssh.com,aes256-gcm#openssh.com,aes128-cbc,aes192-cbc,aes256-cbc,3des-cbc\r\ndebug2: ciphers stoc: chacha20-poly1305#openssh.com,aes128-ctr,aes192-ctr,aes256-ctr,aes128-gcm#openssh.com,aes256-gcm#openssh.com,aes128-cbc,aes192-cbc,aes256-cbc,3des-cbc\r\ndebug2: MACs ctos: umac-64-etm#openssh.com,umac-128-etm#openssh.com,hmac-sha2-256-etm#openssh.com,hmac-sha2-512-etm#openssh.com,hmac-sha1-etm#openssh.com,umac-64#openssh.com,umac-128#openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-sha1\r\ndebug2: MACs stoc: umac-64-etm#openssh.com,umac-128-etm#openssh.com,hmac-sha2-256-etm#openssh.com,hmac-sha2-512-etm#openssh.com,hmac-sha1-etm#openssh.com,umac-64#openssh.com,umac-128#openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-sha1\r\ndebug2: compression ctos: zlib#openssh.com,zlib,none\r\ndebug2: compression stoc: zlib#openssh.com,zlib,none\r\ndebug2: languages ctos: \r\ndebug2: languages stoc: \r\ndebug2: first_kex_follows 0 \r\ndebug2: reserved 0 \r\ndebug2: peer server KEXINIT proposal\r\ndebug2: KEX algorithms: 
diffie-hellman-group14-sha1,diffie-hellman-group1-sha1\r\ndebug2: host key algorithms: ssh-rsa\r\ndebug2: ciphers ctos: aes256-cbc,aes192-cbc,aes128-cbc,aes256-ctr,aes192-ctr,aes128-ctr,3des-cbc\r\ndebug2: ciphers stoc: aes256-cbc,aes192-cbc,aes128-cbc,aes256-ctr,aes192-ctr,aes128-ctr,3des-cbc\r\ndebug2: MACs ctos: hmac-sha1\r\ndebug2: MACs stoc: hmac-sha1\r\ndebug2: compression ctos: none\r\ndebug2: compression stoc: none\r\ndebug2: languages ctos: \r\ndebug2: languages stoc: \r\ndebug2: first_kex_follows 0 \r\ndebug2: reserved 0 \r\ndebug1: kex: algorithm: diffie-hellman-group14-sha1\r\ndebug1: kex: host key algorithm: ssh-rsa\r\ndebug1: kex: server->client cipher: aes128-ctr MAC: hmac-sha1 compression: none\r\ndebug1: kex: client->server cipher: aes128-ctr MAC: hmac-sha1 compression: none\r\ndebug1: sending SSH2_MSG_KEXDH_INIT\r\ndebug2: bits set: 991/2048\r\ndebug3: send packet: type 30\r\ndebug1: expecting SSH2_MSG_KEXDH_REPLY\r\ndebug3: receive packet: type 31\r\ndebug1: Server host key: ssh-rsa SHA256:MPoSxa389tZ42pQuBeYhXnkud6aQRtnmpDcaT9SR7WQ\r\ndebug3: put_host_port: [76.81.200.163]:30010\r\ndebug3: put_host_port: [76.81.200.163]:30010\r\ndebug3: hostkeys_foreach: reading file \"/root/.ssh/known_hosts\"\r\ndebug3: record_hostkey: found key type RSA in file /root/.ssh/known_hosts:16\r\ndebug3: load_hostkeys: loaded 1 keys from [76.81.200.163]:30010\r\ndebug3: hostkeys_foreach: reading file \"/root/.ssh/known_hosts\"\r\ndebug3: record_hostkey: found key type RSA in file /root/.ssh/known_hosts:16\r\ndebug3: load_hostkeys: loaded 1 keys from [76.81.200.163]:30010\r\ndebug1: Host '[76.81.200.163]:30010' is known and matches the RSA host key.\r\ndebug1: Found key in /root/.ssh/known_hosts:16\r\ndebug2: bits set: 1036/2048\r\ndebug3: send packet: type 21\r\ndebug2: set_newkeys: mode 1\r\ndebug1: rekey after 4294967296 blocks\r\ndebug1: SSH2_MSG_NEWKEYS sent\r\ndebug1: expecting SSH2_MSG_NEWKEYS\r\ndebug3: receive packet: type 21\r\ndebug1: SSH2_MSG_NEWKEYS received\r\ndebug2: set_newkeys: mode 0\r\ndebug1: rekey after 4294967296 blocks\r\ndebug2: key: /root/.ssh/id_dsa ((nil))\r\ndebug3: send packet: type 5\r\ndebug3: receive packet: type 6\r\ndebug2: service_accept: ssh-userauth\r\ndebug1: SSH2_MSG_SERVICE_ACCEPT received\r\ndebug3: send packet: type 50\r\ndebug3: receive packet: type 51\r\ndebug1: Authentications that can continue: publickey,password,keyboard-interactive\r\ndebug3: start over, passed a different list publickey,password,keyboard-interactive\r\ndebug3: preferred gssapi-keyex,gssapi-with-mic,publickey,keyboard-interactive,password\r\ndebug3: authmethod_lookup publickey\r\ndebug3: remaining preferred: keyboard-interactive,password\r\ndebug3: authmethod_is_enabled publickey\r\ndebug1: Next authentication method: publickey\r\ndebug1: Trying private key: /root/.ssh/id_dsa\r\ndebug3: no such identity: /root/.ssh/id_dsa: No such file or directory\r\ndebug2: we did not send a packet, disable method\r\ndebug3: authmethod_lookup keyboard-interactive\r\ndebug3: remaining preferred: password\r\ndebug3: authmethod_is_enabled keyboard-interactive\r\ndebug1: Next authentication method: keyboard-interactive\r\ndebug2: userauth_kbdint\r\ndebug3: send packet: type 50\r\ndebug2: we sent a keyboard-interactive packet, wait for reply\r\ndebug3: receive packet: type 60\r\ndebug2: input_userauth_info_req\r\ndebug2: input_userauth_info_req: num_prompts 1\r\ndebug3: send packet: type 61\r\ndebug3: receive packet: type 52\r\ndebug1: Authentication succeeded 
(keyboard-interactive).\r\nAuthenticated to 76.81.200.163 ([76.81.200.163]:30010).\r\ndebug1: setting up multiplex master socket\r\ndebug3: muxserver_listen: temporary control path /home/aws/.ansible/cp/fe4276fa4d.TXOAtXG8SQgQI5BF\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug3: fd 4 is O_NONBLOCK\r\ndebug3: fd 4 is O_NONBLOCK\r\ndebug1: channel 0: new [/home/aws/.ansible/cp/fe4276fa4d]\r\ndebug3: muxserver_listen: mux listener channel 0 fd 4\r\ndebug2: fd 3 setting TCP_NODELAY\r\ndebug3: ssh_packet_set_tos: set IP_TOS 0x08\r\ndebug1: control_persist_detach: backgrounding master process\r\ndebug2: control_persist_detach: background process is 17612\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug1: forking to background\r\ndebug1: Entering interactive session.\r\ndebug1: pledge: id\r\ndebug2: set_control_persist_exit_time: schedule exit in 60 seconds\r\ndebug1: multiplexing control connection\r\ndebug2: fd 5 setting O_NONBLOCK\r\ndebug3: fd 5 is O_NONBLOCK\r\ndebug1: channel 1: new [mux-control]\r\ndebug3: channel_post_mux_listener: new mux channel 1 fd 5\r\ndebug3: mux_master_read_cb: channel 1: hello sent\r\ndebug2: set_control_persist_exit_time: cancel scheduled exit\r\ndebug3: mux_master_read_cb: channel 1 packet type 0x00000001 len 4\r\ndebug2: process_mux_master_hello: channel 1 slave version 4\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_master_read_cb: channel 1 packet type 0x10000004 len 4\r\ndebug2: process_mux_alive_check: channel 1: alive check\r\ndebug3: mux_client_request_alive: done pid = 17614\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_master_read_cb: channel 1 packet type 0x10000002 len 381\r\ndebug2: process_mux_new_session: channel 1: request tty 0, X 0, agent 0, subsys 0, term \"xterm\", cmd \"/bin/sh -c '( umask 77 && mkdir -p \"` echo \\001Protocol error, doesn'\"'\"'t start with scp!/.ansible/tmp/ansible-tmp-1565800866.89-171947226254301 `\" && echo ansible-tmp-1565800866.89-171947226254301=\"` echo \\001Protocol error, doesn'\"'\"'t start with scp!/.ansible/tmp/ansible-tmp-1565800866.89-171947226254301 `\" ) && sleep 0'\", env 1\r\ndebug3: process_mux_new_session: got fds stdin 6, stdout 7, stderr 8\r\ndebug2: fd 7 setting O_NONBLOCK\r\ndebug2: fd 8 setting O_NONBLOCK\r\ndebug1: channel 2: new [client-session]\r\ndebug2: process_mux_new_session: channel_new: 2 linked to control channel 1\r\ndebug2: channel 2: send open\r\ndebug3: send packet: type 90\r\ndebug3: receive packet: type 91\r\ndebug2: callback start\r\ndebug2: client_session2_setup: id 2\r\ndebug1: Sending environment.\r\ndebug1: Sending env LANG = en_US.UTF-8\r\ndebug2: channel 2: request env confirm 0\r\ndebug3: send packet: type 98\r\ndebug1: Sending command: /bin/sh -c '( umask 77 && mkdir -p \"` echo \\001Protocol error, doesn'\"'\"'t start with scp!/.ansible/tmp/ansible-tmp-1565800866.89-171947226254301 `\" && echo ansible-tmp-1565800866.89-171947226254301=\"` echo \\001Protocol error, doesn'\"'\"'t start with scp!/.ansible/tmp/ansible-tmp-1565800866.89-171947226254301 `\" ) && sleep 0'\r\ndebug2: channel 2: request exec confirm 1\r\ndebug3: send packet: type 98\r\ndebug3: mux_session_confirm: sending success reply\r\ndebug2: callback done\r\ndebug2: channel 2: open confirm rwindow 8192 rmax 8192\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: receive packet: type 
99\r\ndebug2: channel_input_status_confirm: type 99 id 2\r\ndebug2: exec request accepted on channel 2\r\ndebug3: receive packet: type 1\r\nReceived disconnect from 76.81.200.163 port 30010:2: Bad string Length\r\nDisconnected from 76.81.200.163 port 30010\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Control master terminated unexpectedly",
"unreachable": true
}

I managed to figure it out. The key change was setting ansible_connection: network_cli in the playbook vars (and moving the credentials into the inventory), so Ansible talks to the switches through the network CLI connection plugin instead of trying to run modules over a plain SSH shell, which the switches reject.
The following is the inventory file:
[test:vars]
ansible_user=username
ansible_password=password
ansible_become_pass=password
[test]
Manseau-E5 ansible_port=30005 ansible_host=67.53.178.51
Manseau-E6 ansible_port=30006 ansible_host=67.53.178.51
Manseau-E7 ansible_port=30007 ansible_host=67.53.178.51
Manseau-E8 ansible_port=30008 ansible_host=67.53.178.51
Then the actual playbook:
---
- hosts: test
  #gather_facts: no
  vars:
    ansible_network_os: icx
    ansible_connection: network_cli
    ansible_become: True
    ansible_become_method: enable
    ansible_command_timeout: 60
  tasks:
    - name: Check for Legacy AAA
      icx_config:
        lines:
          - radius-client coa host 52.39.117.1 key 2 $Zl5ucm5nUGlebi0=
          - radius-server host 52.41.63.155 auth-port 1812 acct-port 1813 default key 2 $Zl5ucm5nUGlebi0= dot1x mac-auth web-auth
      check_mode: True
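The same connection settings can also be kept with the inventory group instead of in the playbook vars, which leaves the playbook reusable across groups; a sketch using the same values as above:

[test:vars]
ansible_user=username
ansible_password=password
ansible_become_pass=password
ansible_connection=network_cli
ansible_network_os=icx
ansible_become=True
ansible_become_method=enable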

Related

Ansible hangs on job submission step Ansible ZOS Core

I'm using Ansible on zLinux. I have a playbook that uses the zos_job_submit module from the z/OS Ansible core collection.
The module is used with a job that writes random data to the JES spool.
//SPOOL1 JOB (UU999999999,1103),'DART JOB',CLASS=0,
// REGION=0M,MSGCLASS=R,TIME=5, LINES=(999999,WARNING),
// NOTIFY=&SYSUID
//* Automatic process will kill the job and cleanup spool.
//* author: xxxxxx , xxxxxxxxx
//STEPNAME EXEC PGM=BPXBATCH
//STDERR DD SYSOUT=*
//STDOUT DD SYSOUT=*
//STDPARM DD *
SH cat /dev/urandom
This was working fine until a few days ago, when it started to freeze up and error out. It still submits the job, but it fails to return the job's output once the job starts running, and then it errors out.
Here is the playbook I'm using (stripped down to only the offending task):
# Author: xxxxxxxxxxxxxx
- name: "DART JES CHAOS EVENT"
  hosts: all # WARNING: USE WITH --LIMIT <target> OTHERWISE ALL HOSTS IN INVENTORY WILL BE TARGETED!
  vars:
    all_jobs:
      jobs: [ ]
    jobs_file_location: "jobs/{{inventory_hostname}}"
  tasks:
    - name: "Submit job tasks"
      block:
        - name: Submit job
          ibm.ibm_zos_core.zos_job_submit:
            src: "{{uss_jcl_path}}"
            location: LOCAL
            wait: false
          vars:
            uss_jcl_path: "{{jcl_lib}}/{{job_jcl}}"
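For context, the command that produced the log below (reconstructed roughly from the -vvv header, with the same path redaction) was along these lines:

ansible-playbook /XXXX/dart/ansible/playbooks/debug_event.yml \
  -i /XXXX/dart/ansible/inventory -i /XXXX/dart/ansible/auth_inventory \
  --limit nwrd \
  -e "jcl_lib=/XXXX/dart/ansible/playbooks/jcl job_jcl=SPOOL" \
  -vvv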
Here is the log using -vvv
ansible-playbook 2.9.27
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/adshome/svc.dart/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/site-packages/ansible
executable location = /bin/ansible-playbook
python version = 2.7.5 (default, May 27 2022, 07:27:39) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
PLAYBOOK: debug_event.yml **************************************************************************************************************************************************
Positional arguments: /XXXX/dart/ansible/playbooks/debug_event.yml
subset: nwrd
become_method: sudo
inventory: (u'/XXXX/dart/ansible/inventory', u'/XXXX/dart/ansible/auth_inventory')
forks: 5
tags: (u'all',)
extra_vars: (u'jcl_lib=/XXXX/dart/ansible/playbooks/jcl job_jcl=SPOOL',)
verbosity: 4
connection: smart
timeout: 10
1 plays in /XXXX/dart/ansible/playbooks/debug_event.yml
PLAY [DART JES CHAOS EVENT] ************************************************************************************************************************************************
TASK [Submit job] **********************************************************************************************************************************************************
task path: /XXXX/dart/ansible/playbooks/debug_event.yml:12
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/tempfile.py
Pipelining is enabled.
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
<XXXX.XXX.COM> (0, '\n{"changed": true, "path": "/tmp/ansible.nzzb29wz", "uid": 10001120, "gid": 7212, "owner": "XXXXXXX", "group": "GLTCMF", "mode": "0600", "state": "file", "size": 0, "invocation": {"module_args": {"state": "file", "prefix": "ansible.", "suffix": "", "path": null}}}\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'echo ~XXXXXXX && sleep 0'"'"''
<XXXX.XXX.COM> (0, '/u/XXXXXXX\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /u/XXXXXXX/.ansible/tmp `"&& mkdir "` echo /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409 `" && echo ansible-tmp-1658270106.84-62001-187469557526409="` echo /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409 `" ) && sleep 0'"'"''
<XXXX.XXX.COM> (0, 'ansible-tmp-1658270106.84-62001-187469557526409=/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/stat.py
Pipelining is enabled.
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
<XXXX.XXX.COM> (0, '\n{"changed": false, "stat": {"exists": true, "path": "/tmp/ansible.nzzb29wz", "mode": "0600", "isdir": false, "ischr": false, "isblk": false, "isreg": true, "isfifo": false, "islnk": false, "issock": false, "uid": 10001120, "gid": 7212, "size": 0, "inode": 10691, "dev": 3248, "nlink": 1, "atime": 1658270106, "mtime": 1658270106, "ctime": 1658270106, "wusr": true, "rusr": true, "xusr": false, "wgrp": false, "rgrp": false, "xgrp": false, "woth": false, "roth": false, "xoth": false, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": false, "pw_name": "XXXXXXX", "gr_name": "GLTCMF", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "mimetype": "unknown", "charset": "unknown", "version": null, "attributes": [], "attr_flags": ""}, "invocation": {"module_args": {"checksum_algorithm": "sha1", "get_checksum": true, "follow": false, "path": "/tmp/ansible.nzzb29wz", "get_md5": false, "get_mime": true, "get_attributes": true}}}\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<XXXX.XXX.COM> PUT /XXXX/dart/ansible/playbooks/jcl/SPOOL TO /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source
<XXXX.XXX.COM> SSH: EXEC sftp -b - -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 '[XXXX.XXX.COM]'
<XXXX.XXX.COM> (0, 'sftp> put /XXXX/dart/ansible/playbooks/jcl/SPOOL /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug2: Remote version: 3\r\ndebug2: Server supports extension "posix-rename#openssh.com" revision 1\r\ndebug2: Server supports extension "statvfs#openssh.com" revision 2\r\ndebug2: Server supports extension "fstatvfs#openssh.com" revision 2\r\ndebug2: Server supports extension "hardlink#openssh.com" revision 1\r\ndebug2: Server supports extension "fsync#openssh.com" revision 1\r\ndebug3: Sent message fd 5 T:16 I:1\r\ndebug3: SSH_FXP_REALPATH . -> /u/XXXXXXX size 0\r\ndebug3: Looking up /XXXX/dart/ansible/playbooks/jcl/SPOOL\r\ndebug3: Sent message fd 5 T:17 I:2\r\ndebug3: Received stat reply T:101 I:2\r\ndebug1: Couldn\'t stat remote file: No such file or directory\r\ndebug3: Sent message SSH2_FXP_OPEN I:3 P:/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source\r\ndebug3: Sent message SSH2_FXP_WRITE I:4 O:0 S:395\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: In write loop, ack for 4 395 bytes at 0\r\ndebug3: Sent message SSH2_FXP_CLOSE I:4\r\ndebug3: SSH2_FXP_STATUS 0\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'chmod u+x /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/ /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source && sleep 0'"'"''
<XXXX.XXX.COM> (0, '', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/copy.py
Pipelining is enabled.
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
<XXXX.XXX.COM> (0, '\n{"dest": "/tmp/ansible.nzzb29wz", "src": "/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source", "md5sum": "e18be97081c5a8d8ae1e1c57eb0d2123", "checksum": "b220730079e63acfee97f2b694ae2d31d3074083", "changed": true, "uid": 10001120, "gid": 7212, "owner": "XXXXXXX", "group": "GLTCMF", "mode": "0600", "state": "file", "size": 395, "invocation": {"module_args": {"src": "/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source", "dest": "/tmp/ansible.nzzb29wz", "_original_basename": "SPOOL", "mode": "0600", "backup": false, "force": true, "follow": false, "unsafe_writes": false, "content": null, "validate": null, "directory_mode": null, "remote_src": null, "local_follow": null, "checksum": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null, "regexp": null, "delimiter": null}}}\n', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
Using module file /adshome/svc.dart/.ansible/collections/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py
Pipelining is enabled.
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
<XXXX.XXX.COM> (1, '', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\nTraceback (most recent call last):\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 800, in run_module\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py", line 58, in job_output\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py", line 73, in _get_job_output\nRuntimeError: Failed to retrieve job output. RC: -9 Error: \n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File "<stdin>", line 102, in <module>\n File "<stdin>", line 94, in _ansiballz_main\n File "<stdin>", line 40, in invoke_module\n File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 205, in run_module\n return _run_module_code(code, init_globals, run_name, mod_spec)\n File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 96, in _run_module_code\n mod_name, mod_spec, pkg_name, script_name)\n File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 85, in _run_code\n exec(code, run_globals)\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 894, in <module>\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 890, in main\n File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 805, in run_module\nIndexError: tuple index out of range\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\n')
<XXXX.XXX.COM> Failed to connect to the host via ssh: OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: /etc/ssh/ssh_config line 60: Applying options for *
debug1: auto-mux: Trying existing master
debug2: fd 4 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug3: mux_client_forwards: request forwardings: 0 local, 0 remote
debug3: mux_client_request_session: entering
debug3: mux_client_request_alive: entering
debug3: mux_client_request_alive: done pid = 61998
debug3: mux_client_request_session: session request sent
debug1: mux_client_request_session: master session id: 2
Traceback (most recent call last):
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 800, in run_module
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py", line 58, in job_output
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py", line 73, in _get_job_output
RuntimeError: Failed to retrieve job output. RC: -9 Error:
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 102, in <module>
File "<stdin>", line 94, in _ansiballz_main
File "<stdin>", line 40, in invoke_module
File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 205, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 894, in <module>
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 890, in main
File "/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py", line 805, in run_module
IndexError: tuple index out of range
debug3: mux_client_read_packet: read header failed: Broken pipe
debug2: Received exit status from master 1
<XXXX.XXX.COM> ESTABLISH SSH CONNECTION FOR USER: XXXXXXX
<XXXX.XXX.COM> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'rm -f -r /u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/ > /dev/null 2>&1 && sleep 0'"'"''
<XXXX.XXX.COM> (0, '', 'OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
fatal: [XXXX.XXX.COM]: FAILED! => {
"changed": true,
"checksum": "b220730079e63acfee97f2b694ae2d31d3074083",
"dest": "/tmp/ansible.nzzb29wz",
"gid": 7212,
"group": "GLTCMF",
"invocation": {
"module_args": {
"_original_basename": "SPOOL",
"attributes": null,
"backup": false,
"checksum": null,
"content": null,
"delimiter": null,
"dest": "/tmp/ansible.nzzb29wz",
"directory_mode": null,
"follow": false,
"force": true,
"group": null,
"local_follow": null,
"mode": "0600",
"owner": null,
"regexp": null,
"remote_src": null,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": "/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source",
"unsafe_writes": false,
"validate": null
}
},
"md5sum": "e18be97081c5a8d8ae1e1c57eb0d2123",
"mode": "0600",
"module_stderr": "OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 60: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 4 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 61998\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\nTraceback (most recent call last):\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py\", line 800, in run_module\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py\", line 58, in job_output\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/module_utils/job.py\", line 73, in _get_job_output\nRuntimeError: Failed to retrieve job output. RC: -9 Error: \n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"<stdin>\", line 102, in <module>\n File \"<stdin>\", line 94, in _ansiballz_main\n File \"<stdin>\", line 40, in invoke_module\n File \"/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py\", line 205, in run_module\n return _run_module_code(code, init_globals, run_name, mod_spec)\n File \"/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py\", line 96, in _run_module_code\n mod_name, mod_spec, pkg_name, script_name)\n File \"/usr/lpp/izoda/anaconda/lib/python3.6/runpy.py\", line 85, in _run_code\n exec(code, run_globals)\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py\", line 894, in <module>\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py\", line 890, in main\n File \"/tmp/ansible_ibm.ibm_zos_core.zos_job_submit_payload_aplq15qp/ansible_ibm.ibm_zos_core.zos_job_submit_payload.zip/ansible_collections/ibm/ibm_zos_core/plugins/modules/zos_job_submit.py\", line 805, in run_module\nIndexError: tuple index out of range\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\n",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"owner": "XXXXXXX",
"rc": 1,
"size": 395,
"src": "/u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source",
"state": "file",
"uid": 10001120
}
PLAY RECAP *****************************************************************************************************************************************************************
XXXX.XXX.COM : ok=1 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
I have a few comments and hopefully we can resolve this.
There are a few things that confuse me and make me wonder how this ever worked, unless I am missing some data points.
The playbook uses the option location: LOCAL, yet the src appears to be a directory in USS (u/XXXXXXX/.ansible/tmp/ansible-tmp-1658270106.84-62001-187469557526409/source). The LOCAL option is meant for files located on the control node (where the Ansible engine is running, in your case zLinux). Can you confirm that you do have the file on the control node?
When I look at the verbose output, I see no environment variables for the required dependency ZOAU being passed to the managed node (z/OS). The zos_job_submit module uses the ZOAU APIs, so I also question how this worked before without those environment variables, or whether your variables have been corrupted (see below).
SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/XXXX/dart/ansible/ansible_id_rsa"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="XXXXXXX"' -o ConnectTimeout=10 -o ControlPath=/adshome/svc.dart/.ansible/cp/7924694a90 XXXX.XXX.COM '/bin/sh -c '"'"'/usr/lpp/izoda/anaconda/bin/python3 && sleep 0'"'"''
I am expecting environment vars similar to this:
environment_vars:
  _BPXK_AUTOCVT: "ON"
  ZOAU_HOME: "{{ ZOAU }}"
  PYTHONPATH: "{{ ZOAU }}/lib"
  LIBPATH: "{{ ZOAU }}/lib:{{ PYZ }}/lib:/lib:/usr/lib:."
  PATH: "{{ ZOAU }}/bin:{{ PYZ }}/bin:/bin:/var/bin"
  _CEE_RUNOPTS: "FILETAG(AUTOCVT,AUTOTAG) POSIX(ON)"
  _TAG_REDIR_ERR: "txt"
  _TAG_REDIR_IN: "txt"
  _TAG_REDIR_OUT: "txt"
  LANG: "C"
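These are typically passed to the module through the environment keyword on the task (or on the whole play). A minimal sketch, assuming environment_vars is defined as a variable (for example in group_vars) and that ZOAU and PYZ point at the ZOAU and Python installation paths on the z/OS host:

- name: Submit job
  ibm.ibm_zos_core.zos_job_submit:
    src: "{{uss_jcl_path}}"
    location: LOCAL
    wait: false
  environment: "{{ environment_vars }}"
  vars:
    uss_jcl_path: "{{jcl_lib}}/{{job_jcl}}"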
Based on the trace it looks like you are using the latest Ansible Core collection version 1.4.0-beta.1 yet you are using Anaconda Python 3.6 from Rocket, that version of Python has long been removed from support.
For your version of the collection you will want to reference these requirements, and probably start with a simple example to make sure the setup is functional before doing something more complex; this is a good one to start with.
I am happy to help, but supporting what appears to be a mismatched set of requirements can be challenging.

Ansible error Shared connection to myhost1 closed. when using raw module

My Ansible target server is SunOS.
I get the Ansible error "Shared connection to myhost1 closed." when using the raw module.
The error does not appear when I change the module to shell; however, the script start.sh is not actually executed (evident from the output of ps), hence I wish to use raw.
- name: "START ADP SERVICES"
raw: "source ~/.profile; sh /web/external_products/springboot/{{ vars[ environ + '_folder'] }}/veladpservice/bin/start.sh veladpservice.jar {{ vars[ environ + '_folder'] }} {{ allpass }}"
Output when using shell module:
TASK [START ADAPTER SERVICES] **************************************************
task path: /web/playbooks/automation/veladp/va_action.yml:32
changed: [myhost1] => {"changed": true, "cmd": "source ~/.profile; sh /web/external_products/springboot/stg/veladpservice/bin/start.sh veladpservice.jar stg WEB_USER apps_user_2021_2378 MSPW435 MSPW435 MSPW445 MSPW445 PETWEB440 Temp_45678 MSPW460 Temp_3456789012 MSPW430 Temp_1234567890 MSPW455 Temp_09876", "delta": "0:00:01.009433", "end": "2021-10-28 05:47:51.277868", "rc": 0, "start": "2021-10-28 05:47:50.268435", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
When using raw module:
TASK [START adp SERVICES] *********************************************************************************
task path: /web/playbooks/automation/veladprestart/va_action.yml:32
<myhost1> ESTABLISH SSH CONNECTION FOR USER: myuser1
<myhost1> SSH: EXEC ssh -vvv -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="myuser1"' -o ConnectTimeout=10 -o StrictHostKeyChecking=no -o ControlPath=/home/myuser1/.ansible/cp/cc89a5347e -tt myhost1 'source ~/.profile; /web/external_products/springboot/stg/veladpservice/bin/start.sh veladpservice.jar stg MSP_WEB_USER Marsh_apps_user_2021_2378 MSPW435 MSPW435 MSPW445 MSPW445 PETWEB440 Temp_345678 MSPW460 Temp_3456789012 MSPW430 Temp_1234567890 MSPW455 Temp_0987654321'
<myhost1> (0, b'', b"OpenSSH_8.4p1 (CentrifyDC build 5.7.1-346) , OpenSSL 1.1.1g 21 Apr 2020\r\ndebug1: Reading configuration data /etc/centrifydc/ssh/ssh_config\r\ndebug1: /etc/centrifydc/ssh/ssh_config line 3: Applying options for *\r\ndebug3: expanded UserKnownHostsFile '~/.ssh/known_hosts' -> '/home/myuser1/.ssh/known_hosts'\r\ndebug3: expanded UserKnownHostsFile '~/.ssh/known_hosts2' -> '/home/myuser1/.ssh/known_hosts2'\r\ndebug1: Authenticator provider $SSH_SK_PROVIDER did not resolve; disabling\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 21893\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to myhost1 closed.\r\n")
changed: [myhost1] => {
"changed": true,
"rc": 0,
"stderr": "OpenSSH_8.4p1 (CentrifyDC build 5.7.1-346) , OpenSSL 1.1.1g 21 Apr 2020\r\ndebug1: Reading configuration data /etc/centrifydc/ssh/ssh_config\r\ndebug1: /etc/centrifydc/ssh/ssh_config line 3: Applying options for *\r\ndebug3: expanded UserKnownHostsFile '~/.ssh/known_hosts' -> '/home/myuser1/.ssh/known_hosts'\r\ndebug3: expanded UserKnownHostsFile '~/.ssh/known_hosts2' -> '/home/myuser1/.ssh/known_hosts2'\r\ndebug1: Authenticator provider $SSH_SK_PROVIDER did not resolve; disabling\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 21893\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\nShared connection to myhost1 closed.\r\n",
"stderr_lines": [
"OpenSSH_8.4p1 (CentrifyDC build 5.7.1-346) , OpenSSL 1.1.1g 21 Apr 2020",
"debug1: Reading configuration data /etc/centrifydc/ssh/ssh_config",
"debug1: /etc/centrifydc/ssh/ssh_config line 3: Applying options for *",
"debug3: expanded UserKnownHostsFile '~/.ssh/known_hosts' -> '/home/myuser1/.ssh/known_hosts'",
"debug3: expanded UserKnownHostsFile '~/.ssh/known_hosts2' -> '/home/myuser1/.ssh/known_hosts2'",
"debug1: Authenticator provider $SSH_SK_PROVIDER did not resolve; disabling",
"debug1: auto-mux: Trying existing master",
"debug2: fd 3 setting O_NONBLOCK",
"debug2: mux_client_hello_exchange: master version 4",
"debug3: mux_client_forwards: request forwardings: 0 local, 0 remote",
"debug3: mux_client_request_session: entering",
"debug3: mux_client_request_alive: entering",
"debug3: mux_client_request_alive: done pid = 21893",
"debug3: mux_client_request_session: session request sent",
"debug1: mux_client_request_session: master session id: 2",
"debug3: mux_client_read_packet: read header failed: Broken pipe",
"debug2: Received exit status from master 0",
"Shared connection to myhost1 closed."
],
"stdout": "",
"stdout_lines": []
}
Update after the suggestion:
I also tried detaching the process with disown, as shown below, but ps still does not show the process running:
- name: "START ADAPTER SERVICES"
shell: "source ~/.profile && sh /web/external_products/springboot/{{ vars[ environ + '_folder'] }}/velocityadapterservice/bin/start.sh velocityadapterservice.jar {{ vars[ environ + '_folder'] }} {{ allpass }} &; disown %%"
Both ~/.profile and start.sh have 744 permissions and are owned by myuser1.
Note: running the same command manually works!
Can you please suggest a fix?
@mdaniel's suggestion gave me the clue; in the end, nohup resolved the issue (I am still not sure about disown). Below is the solution.
- name: "START ADAPTER SERVICES"
shell: "source ~/.profile && nohup /web/external_products/springboot/{{ vars[ environ + '_folder'] }}/velocityadapterservice/bin/start.sh velocityadapterservice.jar {{ vars[ environ + '_folder'] }} {{ allpass }} &"

Is there support of hmac-md5-96 in setkey ipsec tools?

I want to use "hmac-md5-96" algorithm to create Security Associations at client side. I am using setkey ipsec tools. while adding spd entry, It is giving syntax error and unable to identify hmac-md5-96
I have tried keyed-md5 which is also not supported.
setkey -c << EOF
add $pcscf $ue esp $spi_uc -m transport -E aes-cbc $ck -A hmac-md5-96 "1234567890123456" ;
spdadd $pcscf/32[$port_ps] $ue/32[$port_uc] tcp -P in ipsec esp/transport//require ;
spdadd $pcscf/32[$port_ps] $ue/32[$port_uc] udp -P in ipsec esp/transport//require ;
EOF
Use ip xfrm state add instead of setkey, i.e.:
ip xfrm state add src $pcscf dst $ue proto esp spi $spi_uc enc "cbc(aes)" $ck auth-trunc "hmac(md5)" "1234567890123456" 96 mode transport
For some dummy parameters it creates the following SAD entry:
src 11.22.33.44 dst 22.33.44.55
proto esp spi 0x00000457 reqid 0 mode transport
replay-window 0
auth-trunc hmac(md5) 0x31323334353637383930313233343536 96
enc cbc(aes) 0x3131313131313131313131313131313131313131313131313131313131313131
anti-replay context: seq 0x0, oseq 0x0, bitmap 0x00000000
sel src 0.0.0.0/0 dst 0.0.0.0/0
The spdadd policy entries can be expressed the same way with ip xfrm policy add (using dir in and a template with proto esp and mode transport). Good luck!

etcd server outputs some errors

I started a single etcd v3 server. etcdctl get/put works fine, but the server console keeps printing the error logs below; how can I fix this?
2017-06-19 09:26:42.225787 I | etcdserver/api/v3rpc: grpc: addrConn.resetTransport failed to create client transport: connection error: desc = "transport: dial tcp [::]:2379: connect: cannot assign requested address"; Reconnecting to {[::]:2379 <nil>}
2017-06-19 09:27:01.415397 I | etcdserver/api/v3rpc: grpc: addrConn.resetTransport failed to create client transport: connection error: desc = "transport: dial tcp [::]:2379: connect: cannot assign requested address"; Reconnecting to {[::]:2379 <nil>}
2017-06-19 09:27:08.209185 I | etcdserver/api/v3rpc: grpc: addrConn.resetTransport failed to create client transport: connection error: desc = "transport: dial tcp [::]:2379: connect: cannot assign requested address"; Reconnecting to {[::]:2379 <nil>}
2017-06-19 09:27:22.992708 I | etcdserver/api/v3rpc: grpc: addrConn.resetTransport failed to create client transport: connection error: desc = "transport: dial tcp [::]:2379: connect: cannot assign requested address"; Reconnecting to {[::]:2379 <nil>}
2017-06-19 09:27:40.156845 I | etcdserver/api/v3rpc: grpc: addrConn.resetTransport failed to create client transport: connection error: desc = "transport: dial tcp [::]:2379: connect: cannot assign requested address"; Reconnecting to {[::]:2379 <nil>}
The server start script is:
docker run \
--rm \
-p 2379:2379 \
-p 2380:2380 \
--name etcd-v3 \
--volume=/data/docker/etcd-sigle-data:/etcd-data \
quay.io/coreos/etcd:v3.2.0 \
/usr/local/bin/etcd \
--name my-etcd \
--data-dir /etcd-data \
--listen-client-urls http://0.0.0.0:2379 \
--advertise-client-urls http://xxx.xxx.xxx.xxx:2379 \
--listen-peer-urls http://0.0.0.0:2380 \
--initial-advertise-peer-urls http://xxx.xxx.xxx.xxx:2380 \
--initial-cluster my-etcd=http://xxx.xxx.xxx.xxx:2380 \
--initial-cluster-token etcd-cluster-token \
--initial-cluster-state new \
--auto-compaction-retention 1
Looks like the trouble is related to connecting via IPv6. Try switching to IPv4 by specifying a proper IP address instead of the 0.0.0.0 wildcard in --listen-client-urls.

PureFtpd passive port range doesn't deliver listening address to client

I'm trying to configure my pureftpd behind the firewall to act as a passive ftp/TLS server.
Acting machines:
Server: 192.168.3.220 (internal network, default route to the router at 192.168.3.1)
Configuration: PureFTPd with PassivePorts 64000 64300 and MasqueradeAddress ww.xx.yy.zz (the public address configured on the router)
Router: internal address 192.168.3.1; a DNAT rule (PREROUTING chain) translates ww.xx.yy.zz tcp/21,64000:64300 to 192.168.3.220, and the FORWARD chain accepts these packets in both directions.
Client1: external server with fixed public IP
Client2: NATed machine somewhere - on 192.168.5.x network
Scenario1:
- Client1: connect OK, login OK; issuing 'ls' gets as far as PASV, and then:
---> PASV
GNUTLS: REC[0x28ecce0]: Sending Packet[9] Application Data(23) with length: 6
GNUTLS: REC[0x28ecce0]: Sent Packet[10] Application Data(23) with length: 37
GNUTLS: ASSERT: gnutls_buffers.c:322
GNUTLS: ASSERT: gnutls_buffers.c:322
GNUTLS: REC[0x28ecce0]: Expected Packet[9] Application Data(23) with length: 65536
GNUTLS: REC[0x28ecce0]: Received Packet[9] Application Data(23) with length: 64
GNUTLS: REC[0x28ecce0]: Decrypted Packet[9] Application Data(23) with length: 31
<--- 200 Protection set to Private
---> LIST
---> ABOR
An interesting detail: the 227 reply from the server, which I can see in PureFTPd's paranoid log, never shows up on the client side; the client only sees 200 Protection set to Private.
The client then waits roughly 30 seconds and reconnects using ACTIVE (!) mode to run ls.
Scenario2:
- using Client2 (the client output is partly in Czech; translations are added in parentheses):
---> USER xxxxxx
<--- 331 Password required for xxxxxx
---> PASS XXXX
<--- 230 User xxxxxx logged in
---> PWD
<--- 230 Ls oi a:2013-01-03 21:19:00
---> PBSZ 0
<--- 257 "/" is the current directory
---> PROT P
<--- 200 PBSZ 0 successful
---> PASV
<--- 200 Protection set to Private
---> LIST
---> ABOR
---- Přerušený datový socket bude uzavřen (means: the interrupted data socket will be closed)
---- Řídicí socket bude uzavřen (means: the control socket will be closed)
---- Pasivní režim bude vypnut (means: passive mode will be turned off)
---- dns cache hit
---- Navazuje se spojení na ftp1.xxxxxxxxx.cz (ww.xx.yy.zz) port 21 (means: establishing a connection to ftp1.xxxxxxxxx.cz (ww.xx.yy.zz) port 21)
<--- 220 ww.xx.yy.zz FTP server ready
...
---> USER xxxxxx
<--- 331 Password required for xxxxxx
---> PASS XXXX
<--- 230 User xxxxxx logged in
---> PWD
<--- 230 Ls oi a:2013-01-03 21:19:22
---> PBSZ 0
<--- 257 "/" is the current directory
---> PROT P
<--- 200 PBSZ 0 successful
---> PORT 192,168,5,xx,185,136
<--- 200 Protection set to Private
---> LIST
<--- 500 Illegal PORT command
---- Closing data socket
---> QUIT
ls: Nepřekonatelná chyba: 500 Illegal PORT command (means: ls: fatal error: 500 Illegal PORT command)
<--- 425 Unable to build data connection: Connection refused
iptables on the NAT machine does not increase my accounting counters for ports 64000:64300, so I assume no passive connection is being made at all.
So... the real problem was the second 230 reply:
---> PWD
<--- 230 Ls oi a:2013-01-03 21:19:22
This is a known issue with PureFTPd 1.3.3a (the default in Debian Squeeze): as the session above shows, the replies are shifted by one command, so the client never receives the 227 reply to PASV.
The solution was to compile PureFTPd from Wheezy (1.3.4a-2); now everything works fine.
Thank you to everyone who tried to figure out what was going on.

Resources