I cannot create a good log file - ftp

Found the solution!
After a lot of Googling, I found a forum post from someone asking "How to: Add or display today's date from a shell script".
This is what I did.
I added the following to the beginning of my ftp script:
#!/bin/bash
TODAY=$(date)
HOST=$(hostname)
echo "--------------------------------------------"
echo "This script was run: $TODAY ON HOST:$HOST "
echo "--------------------------------------------"
# below is original code minus the #!/bin/sh
#
cd /folder where csv files are/
ftp -v -i -n 111.222.333.444 <<EOF
user mainuser dbuser
mput phas*.csv
bye
EOF
Now my log shows, on each cron run of the ftp script:
This script was run: Tue Nov 12 11:16:02 EST 2013 ON HOST:<MyServer's HostName>
In the crontab, I changed the logging entry to use an appending redirection (&>>) so the log is added to rather than rewritten on each run:
16 11 * * * /srv/phonedialer_tmp/ftp-date.sh &>> /srv/phonedialer_tmp/ftp-date.log
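Note for anyone copying this: &>> is a bash extension, and cron normally runs the job line with /bin/sh, so on systems where /bin/sh is not bash the portable equivalent (same paths as above) would be:
16 11 * * * /srv/phonedialer_tmp/ftp-date.sh >> /srv/phonedialer_tmp/ftp-date.log 2>&1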
I found a way to create a log file of daily ftp's by searching here:
./ftp_csv.sh 2>&1 > ftp_csv.log
It works great in that it records each time the cron job runs. However, what I cannot find is a way to insert the date/time of each event. As you can see below, it records the transferring of the files.
Is there a way I can add the date/timestamp to the beginning or end of each recorded event within the log file?
[stevek@localhost phonedialer_tmp]$ cat ftp_csv.log
Connected to 1.2.3.4 (1.2.3.4).
220 Microsoft FTP Service
331 Password required for mainuser.
230 User mainuser logged in.
221
Connected to 1.2.3.4 (1.2.3.4).
220 Microsoft FTP Service
331 Password required for mainuser.
230 User mainuser logged in.
221
Connected to 1.2.3.4 (1.2.3.4).
220 Microsoft FTP Service
331 Password required for mainuser.
230 User mainuser logged in.
221 ETC
Thanks so much for any information

Related

ECHO not working within a "TRUE" if condition

Being a relative beginner, I can't figure this out. I have a script that is started via cron. Within this script is an if/fi where I check whether a (yearly archive) directory does not exist. If it does not, I create the directory and ATTEMPT to echo that to the cron log file that is created for each run. The directory is created, but the echo does not appear in the log file.
Here is a snippet of the code in question.
035: yyyy=`date +%Y`
036: today=`date +%m/%d/%Y`
037: time=`date +%r` #+%l:%M:%S%P`
038: dayofweek=`date +%A`
039: numDayOfWeek=`date +%u`
040:
041: echo "Run Date/Time: $today $time"
042:
043: WFADIR="/data/ssa1/home1/NEI/GAP-EFT-FLAT/$yyyy"
044: if [ ! -d $WFADIR ] ; then
045: mkdir /data/ssa1/home1/NEI/GAP-EFT-FLAT/$yyyy
046: chmod 777 /data/ssa1/home1/NEI/GAP-EFT-FLAT/$yyyy
047: echo ""
048: echo "New folder $yyyy created in GAP-EFT-FLAT"
049: fi
050:
051: #display test variables for output
052: echo ""
053: echo "HOSTNAME..........: ${HOSTNAME^^}"
054: echo ""
055:
And here is the FULL log file.
Run Date/Time: 01/03/2023 08:00:01 AM
HOSTNAME..........: BASYSPROD
EFT contribution file found...
Calling expect script to transmit contribution file...
spawn sftp -P 22 -i privatekey.pem username@domain.com:/inbound/NATIO080_ACH_3
Connected to domain.com.
Changing to: /inbound/NATIO080_ACH_3
sftp> put B06737_CON_20230103
Uploading B06737_CON_20230103 to /inbound/NATIO080_ACH_3/B06737_CON_20230103
B06737_CON_20230103 0% 0 0.0KB/s --:-- ETA
B06737_CON_20230103 100% 2470 70.0KB/s 00:00
sftp> Returned from contribution expect script...
Archiving sent contribution file...
Sending email confirmation...
Process completed...
EFT 401K file found...
Calling expect script to transmit 401K file...
spawn sftp -P 22 -i privatekey.pem username@domain.com:/inbound/NATIO080_ACH_4
Connected to domain.com.
Changing to: /inbound/NATIO080_ACH_4
sftp> put B06736_401K_20230103
Uploading B06736_401K_20230103 to /inbound/NATIO080_ACH_4/B06736_401K_20230103
B06736_401K_20230103 0% 0 0.0KB/s --:-- ETA
B06736_401K_20230103 100% 7980 216.4KB/s 00:00
sftp> Returned from 401K expect script...
Archiving sent 401K file...
Sending email confirmation...
As you can see, the echo from line 41 is in the log file. Then, as this was the first run for 2023, the 2023 directory did not yet exist. It WAS created and the permissions were changed as well, with lines 45 and 46, respectively.
drwxrwxrwx. 2 neiauto staff 61 Jan 3 08:00 2023
So why do lines 47 and 48 appear not to execute, while the next echoes in the log file are from lines 52, 53 and 54, with the hostname display surrounded by blank lines?
I was expecting a blank line, and "New folder 2023 created in GAP-EFT-FLAT" to be echoed after the Run date/time (first) line of the log file, and before the host name display.
Very likely your directory already existed. Add an else branch that echoes "$WFADIR already exists" to your code to have your answer next year :-). My guess would be that the same code was run twice (on the same host, or on another host if shared disk space is used).
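A minimal sketch of that suggestion, reusing the poster's variable and paths; only the else branch is new:
WFADIR="/data/ssa1/home1/NEI/GAP-EFT-FLAT/$yyyy"
if [ ! -d "$WFADIR" ] ; then
    mkdir "$WFADIR"
    chmod 777 "$WFADIR"
    echo ""
    echo "New folder $yyyy created in GAP-EFT-FLAT"
else
    echo ""
    echo "$WFADIR already exists"
fi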

Mac OS X terminal mail: send multiple outputs in one mail

I have a backup script that runs every 2 weeks with cron on my Mac running macOS High Sierra.
That part works, and now I want to mail the log to myself using these 2 lines:
df -Ph /Volumes/USB_Storage >> "/Users/ralphschipper/Documents/Logs/rsync"$(date +"%Y-%m-%d").log
cat "/Users/ralphschipper/Documents/Logs/rsync"$(date +"%Y-%m-%d").log | /usr/bin/mail -s "Backuplog" user@gmail.com
The thing is: my backup starts at 10:00 PM on September 15, so the logfile is created on the 15th.
The backup finished at 1:00 AM on September 16, so a new logfile was created.
In the end, the mail was sent using the logfile that contains the df output from the 16th.
Does anyone know how to fix this?
Can I create a variable at the beginning of the process that stores the current date and use that?
Or can I send a mail that contains both the logfile and the df results?
Regards,
Ralph
Store the date you want to use in a variable at the start of the script (and do the same with the complete filename):
backupdate=$(date +"%Y-%m-%d")
backupfile="/Users/ralphschipper/Documents/Logs/rsync${backupdate}.log"
df -Ph /Volumes/USB_Storage >> "${backupfile}"
cat "${backupfile}" | /usr/bin/mail -s "Backuplog of ${backupdate}" user#gmail.com

script from cron doesn't run

I have a script:
-rwx------. 1 root root 135 Oct 15 12:00 /backup/purge.sh
#!/bin/bash
volume=`echo "list volumes" | bconsole|grep -i "Append\|Full"|awk '{print $4}'`
echo "purge volume=$volume yes" | bconsole
If I run it manually it runs.
If I put the script in the crontab it doesn't run; however, the log says it ran.
Oct 15 16:07:01 sdfdsfdsf CROND[36326]: (root) CMD (/backup/purge.sh)
The schedule:
07 16 * * * /backup/purge.sh
If I run manually:
/backup/purge.sh
Connecting to Director weewr:9101
1000 OK: 1 werewrewrewr Version: 7.0.5 (28 July 2014)
Enter a period to cancel a command.
purge volume=Vol-0001 yes
This command can be DANGEROUS!!!
It purges (deletes) all Files from a Job,
JobId, Client or Volume; or it purges (deletes)
all Jobs from a Client or Volume without regard
to retention periods. Normally you should use the
PRUNE command, which respects retention periods.
Automatically selected Catalog: MyCatalog
Using Catalog "MyCatalog"
1 File on Volume "Vol-0001" purged from catalog.
There are no more Jobs associated with Volume "Vol-0001". Marking it purged.
bconsole wasn't in cron's PATH, so I used the full path to the bconsole command, like this:
#!/bin/bash
volume=$(echo "list volumes" | /sbin/bconsole | grep -i "Append\|Full" | awk '{print $4}')
echo "purge volume=$volume yes" | /sbin/bconsole

FTP Client Output Response standard

After a successful FTP file transfer, the response used to be "226 File send OK", but it has suddenly changed to "226 Transfer complete".
I have the following questions:
Do FTP response codes follow any standard?
Can we customize the FTP response text for a specific status code?
Here is an example session showing the standard FTP response for a file transfer:
$ ftp canopus
Connected to canopus.austin.century.com.
220 canopus.austin.century.com FTP server (Version 4.1 Sat Nov 23 12:52:09 CST 1991) ready.
Name (canopus:eric): dee
331 Password required for dee.
Password:
230 User dee logged in.
ftp> pwd
257 "/home/dee" is current directory.
ftp> cd desktop
250 CWD command successful.
ftp> type ascii
200 Type set to A.
ftp> send typescript
200 PORT command successful.
150 Opening data connection for typescript (128.114.4.99,1412).
226 File send OK.
ftp> cdup
250 CWD command successful.
ftp> bye
221 Goodbye.
Note: suddenly the response text 226 File send OK has changed to 226 Transfer complete
You can find the details of the FTP reply codes on Wikipedia.
RFC 959, 4.2. FTP REPLIES:
An FTP reply consists of a three digit number (transmitted as
three alphanumeric characters) followed by some text. The number
is intended for use by automata to determine what state to enter
next; the text is intended for the human user. It is intended
that the three digits contain enough encoded information that the
user-process (the User-PI) will not need to examine the text and
may either discard it or pass it on to the user, as appropriate.
In particular, the text may be server-dependent, so there are
likely to be varying texts for each reply code.
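The practical consequence of that paragraph for scripting: test the three-digit code, not the human-readable text, because the text is server-dependent and can change (as it did here). A minimal sketch, assuming the verbose ftp output has been captured to a log such as the ftp_csv.log shown in the first question:
# succeed on the numeric reply code, regardless of the wording that follows it
if grep -q '^226 ' ftp_csv.log ; then
    echo "transfer completed"
else
    echo "no 226 reply found - transfer may have failed" >&2
fi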

shell script display grep results

I need some help with displaying how many times two strings are found on the same line! Let's say I want to search the file 'test.txt', which contains names and IPs. I want to enter a name as a parameter when running the script; the script will search the file for that name and check whether there's also an IP address on that line. I have tried using the 'grep' command, but I don't know how to display the results in a good way. I want it like this:
Name: John Doe IP: xxx.xxx.xx.x count: 3
The count is how many times this line was found. This is what my grep script looks like right now:
#!/bin/bash
echo "Searching $1 for the Name '$2'"
result=$(grep "$2" $1 | grep -E "(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)")
echo $result
I will run the script like 'sh search test.txt John'.
I'm having trouble displaying the information I get from the grep command; maybe there's a better way to do this?
EDIT:
Okay, I will try to explain a little better. Let's say I want to search a .log file; I want the script to search that file for a string the user enters as a parameter, e.g. if the user enters 'sh search test.log logged in', the script will search for the string "logged in" within the file 'test.log'. If the script finds this string on the same line as an IP address, the IP address is printed, along with how many times this combination was found.
And I simply don't know how to do it. I'm new to shell scripting and was hoping I could use grep along with regular expressions for this! I will keep on trying and update this question with an answer if I figure it out.
I don't have said file on my computer, but it looks something like this:
Apr 25 11:33:21 Admin CRON[2792]: pam_unix(cron:session): session opened for user 192.168.1.2 by (uid=0)
Apr 25 12:39:01 Admin CRON[2792]: pam_unix(cron:session): session closed for user 192.168.1.2
Apr 27 07:42:07 John CRON[2792]: pam_unix(cron:session): session opened for user 192.168.2.22 by (uid=0)
Apr 27 14:23:11 John CRON[2792]: pam_unix(cron:session): session closed for user 192.168.2.22
Apr 29 10:20:18 Admin CRON[2792]: pam_unix(cron:session): session opened for user 192.168.1.2 by (uid=0)
Apr 29 12:15:04 Admin CRON[2792]: pam_unix(cron:session): session closed for user 192.168.1.2
Here is a simple Awk script which does what you request, based on the log snippet you posted.
awk -v user="$2" '$4 == user { i[$11]++ }
END { for (a in i) printf ("Name: %s IP: %s count: %i\n", user, a, i[a]) }' "$1"
If the fourth whitespace-separated field in the log file matches the requested user name (which was passed to the shell script as its second parameter), add one to the count for the IP address (from field 11).
At the end, loop through all non-zero IP addresses, and print a summary for each. (The user name is obviously whatever was passed in, but matches your expected output.)
This is a very basic Awk script; if you think you want to learn more, I urge you to consult a simple introduction, rather than follow up here.
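For example, run against the log snippet above (assuming the Awk command is the body of the search script and it is invoked the same way as before), it would print one summary line per IP address seen for that user:
$ sh search test.log John
Name: John IP: 192.168.2.22 count: 2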
If you want a simpler grep-only solution, something like this provides the information in a different format:
grep "$2" "$1" |
grep -o -E '(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)' |
sort | uniq -c | sort -rn
The trick here is the -o option to the second grep, which extracts just the IP address from the matching line. It is, however, less precise than the Awk script; for example, a user named "sess" would match every input line in the log. You can improve on that slightly by using grep -w in the first grep (that still won't help against users named "pam"), but Awk really gives you a lot more control.
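Concretely, the -w variant only changes the first stage of the pipeline (a sketch):
# -w makes the first grep match the name only as a whole word
grep -w "$2" "$1" |
grep -o -E '(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)' |
sort | uniq -c | sort -rn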
My original answer is below this line, partly because it's tangentially useful, partly because it is required in order to understand the pesky comment thread below.
The following
result=$(command)
echo $result
is wrong. You need the second line to be
echo "$result"
but in addition, the detour over echo is superfluous; the simple way to write that is simply
command
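A quick illustration of why the quoting matters (a minimal sketch, not from the original post): an unquoted expansion goes through word splitting, so embedded newlines and runs of spaces are collapsed into single spaces.
result=$(printf 'one   line\nanother line\n')
echo $result      # prints: one line another line   (whitespace collapsed)
echo "$result"    # prints the two captured lines exactly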
