sendEmail.pl[11242]: ERROR => ERROR => SMTP-AUTH: Authentication to smtp.gmail.com:587 failed

I am executing this command:
sendEmail.pl -s "smtp.gmail.com:587" -o tls=yes -xu xyz#gmail.com -xp password -f 'Test mail <xyz#gmail.com>' -t abc#gmail.com -u "Test report $cu_dt" -m "Hi all
Thanks,\nNote: This is a system generated automated report" -a /home/ubuntu/mongo-reports/test-report.txt
This command has always worked for me, but for the last two days I have been getting the error below:
sendEmail.pl[11242]: ERROR => ERROR => SMTP-AUTH: Authentication to smtp.gmail.com:587 failed
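Nothing in the question pins down the cause, but one way to check whether the credentials themselves are being rejected, independently of sendEmail.pl, is to talk to the SMTP server by hand (a sketch; it assumes the openssl client is installed, and the base64 strings are placeholders you generate yourself):
# Open a STARTTLS session to Gmail's submission port
openssl s_client -starttls smtp -crlf -connect smtp.gmail.com:587
# Inside the session, try the login manually:
#   EHLO test
#   AUTH LOGIN
#   <base64 of the username>    e.g. printf 'xyz@gmail.com' | base64
#   <base64 of the password>
# A "535 5.7.8 Username and Password not accepted" reply means the server itself is
# rejecting the login (for Gmail this often means an app password is required),
# rather than sendEmail.pl misbehaving.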

Related

ansible synchronize module fails to get directory from remote to local - failed: No such file or directory

I wish to copy /web/playbooks/automation/misc/filecopyprod from mysourceuser#mysourcehost to mydestuser#mydesthost, under /web/playbooks/automation/misc/filecopy/tmpfiles/500/ on the destination.
It is evident that both the source and the destination exist and have suitable permissions:
[mydestuser#mydesthost ~]$ ssh mysourceuser#mysourcehost ls -ld '/web/playbooks/automation/misc/filecopyprod'
##################################################################
# *** This Server is using Centrify *** #
# *** Remember to use your Active Directory account *** #
# *** password when logging in *** #
##################################################################
drwxrwxr-x 3 mysourceuser mysourceuser 209 Sep 26 14:58 /web/playbooks/automation/misc/filecopyprod
[mydestuser#mydesthost ~]$ ls -ld /web/playbooks/automation/misc/filecopy/tmpfiles/500/
drwxr-xr-x 2 mydestuser aces 6 Sep 26 14:13 /web/playbooks/automation/misc/filecopy/tmpfiles/500/
Here is the task in my playbook that runs on mydesthost and pulls files and folders from the remote server mysourceuser#mysourcehost to the local server mydestuser#mydesthost:
- name: Copying from "{{ inventory_hostname }}" to this ansible server.
  tags: validate
  synchronize:
    src: "'{{ item }}'"
    dest: "{{ playbook_dir }}/tmpfiles/{{ Latest_Build_Number }}/"
    mode: pull
    copy_links: yes
  with_items:
    - "{{ source_file_new.splitlines() }}"
To run the above playbook:
ansible-playbook /web/playbooks/automation/misc/filecopy/copyfiles.yml -e "source_file_new='$source_file_new'" -e "Latest_Build_Number='500'"
Output of my run:
TASK [Copying from "mysourcehost" to this ansible server.] **********************
task path: /web/playbooks/automation/misc/filecopy/copyfiles.yml:218
Monday 26 September 2022 14:13:02 -0500 (0:00:00.047) 0:00:03.084 ******
redirecting (type: action) ansible.builtin.synchronize to ansible.posix.synchronize
redirecting (type: action) ansible.builtin.synchronize to ansible.posix.synchronize
<mysourcehost> ESTABLISH LOCAL CONNECTION FOR USER: mydestuser
<mysourcehost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81 `"&& mkdir "` echo /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597 `" && echo ansible-tmp-1664219583.0005133-20679-105296975361597="` echo /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597 `" ) && sleep 0'
Using module file /home/mydestuser/.ansible/collections/ansible_collections/ansible/posix/plugins/modules/synchronize.py
<mysourcehost> PUT /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/tmpxhpyaf0m TO /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/AnsiballZ_synchronize.py
<mysourcehost> EXEC /bin/sh -c 'chmod u+x /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/ /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/AnsiballZ_synchronize.py && sleep 0'
<mysourcehost> EXEC /bin/sh -c '/usr/local/bin/python3.8 /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/AnsiballZ_synchronize.py && sleep 0'
<mysourcehost> EXEC /bin/sh -c 'rm -f -r /home/mydestuser/.ansible/tmp/ansible-local-20463qmilic81/ansible-tmp-1664219583.0005133-20679-105296975361597/ > /dev/null 2>&1 && sleep 0'
failed: [mysourcehost] (item=/web/playbooks/automation/misc/filecopyprod) => {
"ansible_loop_var": "item",
"changed": false,
"cmd": "/bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format=<<CHANGED>>%i %n%L mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod' /web/playbooks/automation/misc/filecopy/tmpfiles/500/",
"invocation": {
"module_args": {
"_local_rsync_password": null,
"_local_rsync_path": "rsync",
"_substitute_controller": false,
"archive": true,
"checksum": false,
"compress": true,
"copy_links": true,
"delete": false,
"dest": "/web/playbooks/automation/misc/filecopy/tmpfiles/500/",
"dest_port": null,
"dirs": false,
"existing_only": false,
"group": null,
"link_dest": null,
"links": null,
"mode": "pull",
"owner": null,
"partial": false,
"perms": null,
"private_key": null,
"recursive": null,
"rsync_opts": [],
"rsync_path": null,
"rsync_timeout": 0,
"set_remote_user": true,
"src": "mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod'",
"ssh_args": null,
"ssh_connection_multiplexing": false,
"times": null,
"verify_host": false
}
},
"item": "/web/playbooks/automation/misc/filecopyprod",
"msg": "Warning: Permanently added 'mysourcehost' (ED25519) to the list of known hosts.\r\n\nThis system is for the use by authorized users only. All data contained\non all systems is owned by the company and may be monitored, intercepted,\nrecorded, read, copied, or captured in any manner and disclosed in any\nmanner, by authorized company personnel. Users (authorized or unauthorized)\nhave no explicit or implicit expectation of privacy. Unauthorized or improper\nuse of this system may result in administrative, disciplinary action, civil\nand criminal penalties. Use of this system by any user, authorized or\nunauthorized, constitutes express consent to this monitoring, interception,\nrecording, reading, copying, or capturing and disclosure.\n\nIF YOU DO NOT CONSENT, LOG OFF NOW.\n\n##################################################################\n# *** This Server is using Centrify *** #\n# *** Remember to use your Active Directory account *** #\n# *** password when logging in *** #\n##################################################################\n\nrsync: change_dir \"/home/mysourceuser//'/web/playbooks/automation/misc\" failed: No such file or directory (2)\nrsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1658) [Receiver=3.1.2]\nrsync: [Receiver] write error: Broken pipe (32)\n",
"rc": 23
}
From the output I took the rsync command in question and tried to run it manually on my playbook host mydestuser#mydesthost, and I get a similar error:
"/bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format=<<CHANGED>>%i %n%L mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod' /web/playbooks/automation/misc/filecopy/tmpfiles/500/"
Output:
bash: /bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format=<<CHANGED>>%i %n%L mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod' /web/playbooks/automation/misc/filecopy/tmpfiles/500/: No such file or directory
Following a suggestion on Stack Overflow I quoted --out-format, but I continue to get the same error. The command I tried is below:
"/bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format='<<CHANGED>>%i %n%L' mysourceuser#mysourcehost:'/tmp/myfolder' /tmp/myfolder1"
Can you please suggest?
Your full command was this:
/bin/rsync --delay-updates -F --compress --copy-links --archive --rsh=/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null --out-format=<<CHANGED>>%i %n%L mysourceuser#mysourcehost:'/web/playbooks/automation/misc/filecopyprod' /web/playbooks/automation/misc/filecopy/tmpfiles/500/
You've omitted to quote arguments containing spaces, so when the shell parses the line it splits at those spaces, leading to syntax errors when rsync tries to understand the line.
Fix the --rsh parameter, which contains spaces, by changing it to this:
--rsh='/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null'
Fix the --out-format parameter, which contains whitespace and shell special characters, by changing it to this:
--out-format='<<CHANGED>>%i %n%L'
In a later example, you've put double quotes around the entire command, so the shell tries to execute the whole command as a single entity. Compare the two lines below: in the first, the shell splits the line at spaces and executes the command echo with the parameter hello. In the second, the shell sees the quoted string and treats it as a single entity; it then tries to execute a command literally named "echo hello" - not the command echo with the parameter hello, but a command name with a literal space character in the middle:
echo hello # → hello
"echo hello" # → -bash: echo hello: command not found
Rule: if a command or parameter contains a space or other special shell character and it's to be considered as a single item, it must be quoted.
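Putting both fixes together, the manually-run command would look roughly like this (a sketch only, using the paths and the user#host placeholder from the question, with # standing in for @; note that it is entered as a plain command line, not wrapped in one outer pair of double quotes, and the single quotes embedded in the remote path are dropped as well):
/bin/rsync --delay-updates -F --compress --copy-links --archive \
  --rsh='/usr/share/centrifydc/bin/ssh -S none -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null' \
  --out-format='<<CHANGED>>%i %n%L' \
  mysourceuser#mysourcehost:/web/playbooks/automation/misc/filecopyprod \
  /web/playbooks/automation/misc/filecopy/tmpfiles/500/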
The playbook works for older versions of rsync.
With the latest version it started to fail as reported here.
Changed
synchronize:
  src: "'{{ item }}'"
to
synchronize:
  src: "{{ item }}"
and the error was gone; the issue is resolved with the latest rsync.
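For context (an assumption on my part, not something stated in the thread): rsync changed its remote argument handling around version 3.2.4, so quote characters embedded in the source path are no longer stripped on the remote side but are treated as part of the path, which would explain the change_dir failure on .../'/web/... above. A rough illustration, with /tmp/dest/ as a stand-in destination and # again standing in for @:
# Old behaviour: the remote side stripped the extra single quotes, so this used to work
rsync -a mysourceuser#mysourcehost:"'/web/playbooks/automation/misc/filecopyprod'" /tmp/dest/
# New behaviour: the embedded quotes become part of the requested path and rsync fails
# with "change_dir ... failed: No such file or directory"; passing the bare path works:
rsync -a mysourceuser#mysourcehost:/web/playbooks/automation/misc/filecopyprod /tmp/dest/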

ansible: debug only output from tasks?

When I use ansible-playbook -vvvv, it shows all stdout for all running tasks. However, it also shows noise about how each command is run through SSH. Is there a way to use verbosity to show just the tasks' stdout without any other noise?
Currently it shows something like this:
<myhost> ESTABLISH SSH CONNECTION FOR USER: root
<myhost> SSH: EXEC ssh -vvv -o ControlMaster=auto -o ControlPersist=30m -o PreferredAuthentications=publickey -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/tmp/ansible-ssh-%h-%p-%r myhost '/bin/sh -c '"'"'rm -f -r /var/tmp/ansible-tmp-1333333.89364-1154635-13434444/ > /dev/null 2>&1 && sleep 0'"'"''
<myhost> (0, b'', b'OpenSSH_8.2p1 Ubuntu-4ubuntu0.4, OpenSSL 1.1.1f 31 Mar 2020\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 19: include /etc/ssh/ssh_config.d/*.conf matched no files\r\ndebug1: /etc/ssh/ssh_config line 21: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 1136353\r\ndebug3: mux_client_request_session: session request sent\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 0\r\n')
changed: [dev] => {
"changed": true,
"cmd": [
"/opt/odoo/.local/bin/docker-compose",
"rm",
"-fsv",
"odoo"
],
"delta": "0:00:03.287055",
"end": "2022-01-25 08:09:25.105235",
"invocation": {
"module_args": {
"_raw_params": "/opt/odoo/.local/bin/docker-compose rm -fsv odoo",
"_uses_shell": false,
"argv": null,
"chdir": "/opt/odoo/app",
"creates": null,
"executable": null,
"removes": null,
"stdin": null,
"stdin_add_newline": true,
"strip_empty_ends": true,
"warn": false
}
},
"msg": "",
"rc": 0,
"start": "2022-01-25 08:09:21.818180",
"stderr": "Stopping app_odoo_1 ... \r\nStopping app_odoo_1 ... done\r\nRemoving app_odoo_1 ... \r\nRemoving app_odoo_1 ... done",
"stderr_lines": [
"Stopping app_odoo_1 ... ",
"Stopping app_odoo_1 ... done",
"Removing app_odoo_1 ... ",
"Removing app_odoo_1 ... done"
],
"stdout": "Going to remove app_odoo_1",
"stdout_lines": [
"Going to remove app_odoo_1"
]
}
Is there a way to just keep the JSON part without explicitly specifying a debug argument for each task?
When using Ansible in verbose mode, such as ansible-playbook -vvvv, it shows how each command is run through SSH, and it also shows all stdout for all running tasks.
verbose mode (-vvv for more, -vvvv to enable connection debugging)
This is so far the intended behavior.
Is there a way to use verbosity to just show tasks stdout without any other noise?
Even the parameters -v or -vv produce a kind of noise. This is also intended behavior, since the feature is for debugging functionality outside of tasks. See, for example, the source code (query vvv or query verbose).
Is there a way to just keep the JSON part without explicitly specifying a debug argument for each task?
I am not aware of any feature like that.
Currently I assume it would be necessary to use the debug module, or to use the Playbook Debugger for debugging tasks. With that it would be possible to get only the stdout of a task.
Further Readings
How to debug Ansible issues?
Debugging modules
Aside from Ansible, there might be a way of filtering the output itself, for example via | grep -A2 stdout, or via awk 'p; / =>/ {p=1}' or sed '0,/ => /d' followed by | jq -M -r '.stdout' (or | jq -M -r '.stdout_lines', respectively); see the sketch after the links below.
Thanks to
Delete everything before pattern including pattern using awk or sed
Using jq to fetch key value from JSON output
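A rough illustration of that post-processing idea (a sketch only; site.yml is just a stand-in for the playbook, GNU sed and jq are assumed to be available, and each task result is assumed to be printed as a block starting with "... => {" and ending with a lone "}", as in the output above):
# Extract each "... => { ... }" result block from the verbose run, turn its first
# line into a bare "{", and print only the stdout_lines of each result with jq.
ansible-playbook site.yml -v 2>/dev/null \
  | sed -n '/ => {$/,/^}$/p' \
  | sed 's/.* => {$/{/' \
  | jq -r '.stdout_lines[]?'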

greenplum 6.9 gpexpand fails with an error, but there is no error log

20201118:18:13:10:008884 gpexpand:master27:gpadmin-[INFO]:-Configuring new segments (primary)
20201118:18:13:10:008884 gpexpand:master27:gpadmin-[INFO]:-{'slave34': '/opt/data/gpdatap3/gpseg17:6002:true:false:19:17::-1:', 'slave33': '/opt/data/gpdatap3/gpseg16:6002:true:false:18:16::-1:', 'slave32': '/opt/data/gpdatap3/gpseg15:6002:true:false:17:15::-1:', 'slave31': '/opt/data/gpdatap3/gpseg14:6002:true:false:16:14::-1:', 'slave28': '/opt/data/gpdatap3/gpseg12:6002:true:false:14:12::-1:', 'slave29': '/opt/data/gpdatap3/gpseg13:6002:true:false:15:13::-1:'}
20201118:18:13:42:008884 gpexpand:master27:gpadmin-[ERROR]:-gpexpand failed: ExecutionError: 'Error Executing Command: ' occurred. Details: 'ssh -o StrictHostKeyChecking=no -o ServerAliveInterval=60 slave33 ". /usr/local/greenplum-db-6.9.0/greenplum_path.sh; env GPSESSID=0000000000 GPERA=None $GPHOME/bin/pg_ctl -D /opt/data/gpdatap3/gpseg16 -l /opt/data/gpdatap3/gpseg16/pg_log/startup.log -w -t 600 -o \" -p 6002 -c gp_role=utility -M \" start 2>&1"' cmd had rc=1 completed=True halted=False
stdout='waiting for server to start.... stopped waiting
pg_ctl: could not start server
Examine the log output.
'
stderr=''
Exiting...
20201118:18:13:42:008884 gpexpand:master27:gpadmin-[ERROR]:-Please run 'gpexpand -r' to rollback to the original state.
20201118:18:13:42:008884 gpexpand:master27:gpadmin-[INFO]:-Shutting down gpexpand...
Any help?
Thanks a lot
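The gpexpand output only says that pg_ctl could not start the new segment and asks you to examine the log output. As a first step (a sketch, not part of the original post; the host and path are taken from the error message above), read the segment's startup log on the failing host:
# Inspect the startup log of the segment that failed to start on slave33
ssh slave33 'tail -n 100 /opt/data/gpdatap3/gpseg16/pg_log/startup.log'
# Typical causes of "pg_ctl: could not start server" during gpexpand are a port
# already in use (6002 here), a missing or non-writable data directory, or
# resource limits; the startup log normally names the exact reason.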

“was unexpected at this time.”

I'm trying to run the following command but am encountering the "was unexpected at this time" error.
(echo COPY (SELECT ta.colA as name, ta.colB as user_e, ta.colC as user_n, ta.activation_dt, ta.creation_dt, MAX(tb.update_dt) as updated_at, MAX(tb.login_dt) as lastest_login, tc.colD as roleFROM tblA ta, tblB tb, tblC tc WHERE ta.id = tb.tb_id AND ta.tc_id = tc.id AND tc.colD <> 'Guest' GROUP BY ta.colA, ta.colB, ta.colC, ta.activation_dt, ta.creation_dt, tc.colD ORDER BY ta.colA, tc.colD^^^) TO 'E:\Details.csv' CSV DELIMITER ',' HEADER;) | psql -h localhost -p 8060 -U uname -d dbase
Looking for some insights please. Thank you.
Screenshot of error encountered
Try adding some quotes around the SQL, and lose the brackets:
echo "COPY ..." | psql -h localhost -p 8060 -U uname -d dbase
or use -c option:
psql -h localhost -p 8060 -U uname -d dbase -c "COPY ..."
I prefer -c because it works on all operating systems.
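Applied to the query from the question, the -c form would look roughly like this (a sketch; the table and column names are the question's placeholders, and the stray ^^^ carets from the original command are dropped). The single quotes around 'Guest', the file path, and the delimiter are fine inside the double-quoted string:
psql -h localhost -p 8060 -U uname -d dbase -c "COPY (SELECT ta.colA AS name, ta.colB AS user_e, ta.colC AS user_n, ta.activation_dt, ta.creation_dt, MAX(tb.update_dt) AS updated_at, MAX(tb.login_dt) AS lastest_login, tc.colD AS role FROM tblA ta, tblB tb, tblC tc WHERE ta.id = tb.tb_id AND ta.tc_id = tc.id AND tc.colD <> 'Guest' GROUP BY ta.colA, ta.colB, ta.colC, ta.activation_dt, ta.creation_dt, tc.colD ORDER BY ta.colA, tc.colD) TO 'E:\Details.csv' CSV DELIMITER ',' HEADER;"
If the CSV needs to be written by the client machine rather than by the database server, psql's \copy meta-command is the usual client-side alternative.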

docker curl socket inside container

I have a bash post-processing script for rtorrent.
In it I try to create a container, start it, and at the end remove it.
All of this is done via curl commands to the Docker socket, which I mounted into the container.
The script is executed successfully from rtorrent, and the curl command for Pushover works nicely.
But I get a curl: (7) Couldn't connect to server error message for the Docker curl commands.
I hope someone can point me in the right direction.
Log:
^#
---
^#/usr/local/bin/rtorrent-postprocess.sh /Pathtothedownload Nameofthedownload label
---
^#{"status":1,"request":"ec5c3c9c-5744-48f4-909b-68d38ec5e659"}curl: (7) Couldn't connect to server
curl: (7) Couldn't connect to server
curl: (7) Couldn't connect to server
curl: (7) Couldn't connect to server
--- Success ---
Script:
#!/bin/bash
# rtorrent postprocess Script by Tobias
export LANG=de_DE.UTF-8
# The file for logging events from this script
LOGFILE="/config/rtorrent-postprocess.log"
#LOGFILE="./debug.log"
# Path of the download
FOLDER="$1"
# Name of the download
NAME="$2"
# Label of the download
LABEL="$3"
# Media directory /data/Media
MEDIA="/data/Media"
# COMPLETE directory with label /data/torrent/completed/$3
COMPLETE="/data/torrent/completed/$3"
##############################################################################
function edate
{
echo "`date '+%Y-%m-%d %H:%M:%S'` $1" >> "$LOGFILE"
}
function pushover {
curl -s \
-F "token=xxxxxxxxxxxxxxxx" \
-F "user=xxxxxxxxxxxxxxxxx" \
-F "message=$1 finished $2 $3 on `date +%d.%m.%y-%H:%m`" \
https://api.pushover.net/1/messages.json
}
edate " "
edate "Verzeichniss ist $COMPLETE"
edate "Name ist $NAME"
edate "Label ist $LABEL"
edate "rtorrent finished downloading $NAME"
pushover "rtorrent" "downloading" "$NAME"
edate "Starte Filebot - $COMPLETE/$NAME"
test_command() {
curl --unix-socket /var/run/docker.sock -X POST "http://localhost/containers/${NAME}/wait" -H "accept: application/json"
}
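# Create the rednoah/filebot container through the Docker Engine API on the mounted
# unix socket; the following calls then start it, wait for it to exit, and remove it.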
curl --unix-socket /var/run/docker.sock -H "Content-Type: application/json" -d '{ "Image": "rednoah/filebot", "Cmd": ["-script", "fn:amc", "--output", "/Media", "--action", "move", "-non-strict", "/volume1", "--log-file", "/opt/rtorrentvpn/config/filebot.log", "--conflict", "auto", "--def", "artwork=n", "seriesFormat=Serien/{localize.eng.n}/Season {s.pad(2)}/{localize.eng.n} - {s00e00} - {localize.deu.t}", "movieFormat=Filme/{localize.deu.n} ({y})/{localize.deu.n} ({y})", "musicFormat=Musik/{artist}/{album}/{fn}"], "HostConfig": { "Binds": ["'$COMPLETE/$NAME':/volume1", "data:/data", "/data/Media:/Media"]} }' "http://localhost/containers/create?name=${NAME}"
curl --unix-socket /var/run/docker.sock -X POST "http://localhost/containers/${NAME}/start" -H "accept: application/json"
if [ "$(test_command)" == "200" ]; then
edate "Status ist $test_command"
fi
curl --unix-socket /var/run/docker.sock -X DELETE "http://localhost/containers/${NAME}?force=true?v=true" -H "accept: application/json"
edate " "
edate "Filebot fertig"
I changed the PUID and GUID to the root ID, thanks to Robin479's comment. Now everything is running as expected.
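For anyone hitting the same symptom, a quick way to check whether the user inside the container can reach the mounted socket at all (a sketch; /version is a standard Docker Engine API endpoint, adjust the socket path if yours differs):
# Show socket ownership/permissions and the ids the script is running with
ls -l /var/run/docker.sock
id
# A minimal API call over the socket; if the script's user cannot access the
# socket, curl reports a connect failure much like the one in the log above
curl --unix-socket /var/run/docker.sock http://localhost/version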

Resources