I am trying to build a script to monitor any modifications to files on my FTP site. The script is given below. I used wc -l to count the number of files in the directory and compare that against a constant, so that if anything changes (for example, if my co-worker uploads an update to my FTP) it will send me a notification. I am trying to run this from cron. But the script just hangs after the count and doesn't give me the expected result. Is there anything wrong with the code? I am just a beginner in Bash; could anyone help me solve this?
#!/usr/bin/bash
curl ftp://Sterst:abh89TbuOc#############################/Test/| wc -l ;
read b;
a=9
if [ "$b" != "$a" ];
then
echo "FTP dir has modified mail" -s "dir notification" sni912#######.com;
fi
A couple of notes about your code:
#!/usr/bin/bash
curl ftp://Sterst:abh89TbuOc#############################/Test/| wc -l ;
read b;
That does not do what you think it does: the wc output goes to stdout, not into the read command. Do this instead: b=$( curl ... | wc -l )
a=9
if [ "$b" != "$a" ];
Since the wc output will have some extra whitespace, better to do a numeric comparison:
if (( a != b ))
then
echo "FTP dir has modified mail" -s "dir notification" sni912#######.com;
You probably mean this:
echo "FTP dir has modified" | mail -s "dir notification" sni912#######.com;
I would write this:
listing=$( curl ftp://... )
num=$( echo "$listing" | wc -l )
if (( num != 9 )); then
mail -s "ftp dir modification" email#example.com <<END
FTP directory modification
$listing
END
fi
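Since you mention wanting to run this from cron, a minimal crontab entry could look like the following (the path /usr/local/bin/ftp_check.sh is just a placeholder for wherever you save the script):
# check the FTP directory every 15 minutes
*/15 * * * * /usr/local/bin/ftp_check.sh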
My code
#!/usr/local/bin/bash
listing=$( curl ftp://username:password#ftp.com/test/ )
num=$( echo "$listing" | wc -l )
if (( num != 9 ));
then
mail -s "ftp dir modification" email#example.com
fi `
I'm creating a tool to parse an input file to re-construct a mailx command (the server that creates the data for mailx cannot send emails, so I need to store the data in a file so another server can rebuild the command and send it). I could output the whole command to a file and execute that file on the other server, but that's hardly secure/safe: anyone could intercept the file and insert malicious stuff that would be run as root, since this parsing tool checks every minute for files to parse and email, using a systemd timer and service.
I have created the file, using 'markers / separators' with this format:
-------MESSAGE START-------
Email Body Text
Goes Here
-------MESSAGE END-------
-------SUBJECT START-------
Email Subject
-------SUBJECT END-------
-------ATTACHEMENT START-------
path to file to attach if supplied
-------ATTACHEMENT END-------
-------S OPTS START-------
list of mailx '-S' options, e.g. from=EMAILNAME <email#b.c> or sendwait, each one on a new line
-------S OPTS END-------
-------EMAIL LIST START-------
comma-separated string of recipient emails, e.g. email1,email2,email3
-------EMAIL LIST END-------
And I have a program to parse this file and rebuild, and run the mailx command:
#!/bin/bash
## Using systemd logging to journal for this as it's now being called as part of a service
## See: https://serverfault.com/questions/573946/how-can-i-send-a-message-to-the-systemd-journal-froma-the-command-line (kkm answer)
start_time="$(date +[%c])"
exec 4>&2 2> >(while read -r REPLY; do printf >&4 '<3>%s\n' "$REPLY"; done)
echo >&4 "<5>$start_time -- Started gpfs_flag_email.sh"
trap_exit(){
exec 2>&4
}
trap 'trap_exit' EXIT
email_flag_path="<PATH TO LOCATION>/email_flags/"
mailx_message_start="-------MESSAGE START-------"
mailx_message_end="-------MESSAGE END-------"
mailx_subject_start="-------SUBJECT START-------"
mailx_subject_end="-------SUBJECT END-------"
mailx_attachement_start="-------ATTACHEMENT START-------"
mailx_attachement_end="-------ATTACHEMENT END-------"
mailx_s_opts_start="-------S OPTS START-------"
mailx_s_opts_end="-------S OPTS END-------"
mailx_to_email_start="-------EMAIL LIST START-------"
mailx_to_email_end="-------EMAIL LIST END-------"
no_attachment=false
no_additional_opts=false
additional_args_switch="-S "
num_files_in_flag_path="$(find $email_flag_path -type f | wc -l)"
if [[ $num_files_in_flag_path -gt 0 ]]; then
for file in $email_flag_path*; do
email_message="$(awk "/$mailx_message_start/,/$mailx_message_end/" $file | egrep -v -- "$mailx_message_start|$mailx_message_end")"
email_subject="$(awk "/$mailx_subject_start/,/$mailx_subject_end/" $file | egrep -v -- "$mailx_subject_start|$mailx_subject_end")"
email_attachment="$(awk "/$mailx_attachement_start/,/$mailx_attachement_end/" $file | egrep -v -- "$mailx_attachement_start|$mailx_attachement_end")"
email_additional_opts="$(awk "/$mailx_s_opts_start/,/$mailx_s_opts_end/" $file | egrep -v -- "$mailx_s_opts_start|$mailx_s_opts_end")"
email_addresses="$(awk "/$mailx_to_email_start/,/$mailx_to_email_end/" $file | egrep -v -- "$mailx_to_email_start|$mailx_to_email_end" | tr -d '\n')"
if [[ -z "$email_message" || -z "$email_subject" || -z "$email_addresses" ]]; then
echo >&2 "MISSING DETAILS IN INPUT FILE $file.... Exiting With Error"
exit 1
fi
if [[ -z "$email_attachment" ]]; then
no_attachment=true
fi
if [[ -z "$email_additional_opts" ]]; then
no_additional_opts=true
else
additional_opts_string=""
while read -r line; do
if [[ ! $line =~ [^[:space:]] ]]; then
continue
else
additional_opts_string="$additional_opts_string \"${additional_args_switch} '$line'\""
fi
done <<<"$(echo "$email_additional_opts")"
additional_opts_string="$(echo ${additional_opts_string:1} | tr -d '\n')"
fi
if [[ $no_attachment = true ]]; then
if [[ $no_additional_opts = true ]]; then
echo "$email_message" | mailx -s "$email_subject" $email_addresses
else
echo "$email_message" | mailx -s "$email_subject" $additional_opts_string $email_addresses
fi
else
if [[ $no_additional_opts = true ]]; then
echo "$email_message" | mailx -s "$email_subject" -a $email_attachment $email_addresses
else
echo "$email_message" | mailx -s "$email_subject" -a $email_attachment $additional_opts_string $email_addresses
fi
fi
done
fi
find $email_flag_path -type f -delete
exit 0
There is however an issue with the above that I just can't work out: the -S opts completely screw up the email headers and I end up with emails being sent to the wrong people (I have set reply-to and from options, but the email header is jumbled and the reply-to email ends up in the To: field), like this:
To: Me <a#b.com>, sendwait#a.lan , -S#a.lan <-s#a.lan>, another-email <another#b.com>
All I'm trying to do is rebuild the command as if I'd typed it in the CLI:
echo "EMAIL BODY MESSAGE" | mailx -s "EMAIL SUBJECT" -S "from=EMAILNAME <email#b.c>" -S "replyto=EMAILNAME <email#b.c>" -S sendwait my.email#b.com
I've tried quoting in ' ' and " ", quoting the other mailx parameters around it, etc. I have written other tools that pass variables as input arguments, so I just cannot understand how I'm screwing this up...
Any help would be appreciated...
EDIT
Thanks to Gordon Davisson's really helpful comments I was able to not only fix it but also understand the fix, using an array and appropriately quoting the variables... the tip about using printf was really helpful in showing me what I was doing wrong and how to correct it :P
declare -a mailx_args_array
...
num_files_in_flag_path="$(find $email_flag_path -type f | wc -l)"
if [[ $num_files_in_flag_path -gt 0 ]]; then
for file in $email_flag_path*; do
....
mailx_args_array+=( -s "$email_subject" )
if [[ ! -z "$email_attachment" ]]; then
mailx_args_array+=( -a "$email_attachment" )
fi
if [[ ! -z "$email_additional_s_opts" ]]; then
while read -r s_opt_line; do
mailx_args_array+=( -S "$s_opt_line" )
done < <(echo "$email_additional_s_opts")
fi
mailx_args_array+=( "$email_addresses" )
echo "$email_message" | mailx "${mailx_args_array[#]}"
done
fi
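In case it helps anyone else: as I understand it, the printf tip is simply to print the argument array one element per line before calling mailx, so you can see exactly how the words split (this snippet is my own illustration, not part of the original script):
# show each mailx argument on its own line, wrapped in <> to expose stray whitespace
printf '<%s>\n' "${mailx_args_array[@]}"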
Here is my code, which will hit a URL, check whether the URL is working or not, and mail the respective webadmins:
#!/bin/bash
curl -s --head http://myurl | head -n 1 | grep "HTTP/1.[01] [23].." > /dev/null
if [ $? -eq 0 ]
then
url1="myurl2 is working fine"
else
url1="myurl2 is not working"
fi
curl -s --head http://myurl2 | head -n 1 | grep "HTTP/1.[01] [23].." > /dev/null
if [ $? -eq 0 ]
then
url2="myurl is working fine"
else
url2="myurl is not working"
fi
exec 1<>/dev/tcp/127.0.0.1/25
a=$(cat <<"MAILEND"
HELO local.domain.name
MAIL FROM: <send#mydomain.com>
RCPT TO: <recieve#mail2.com>
DATA
From: send#mydomain.com
To: recieve#mail2.com
Subject: test
$url1 $url2.
.
QUIT
.
MAILEND
)
IFS='
'
declare -a b=($a)
for x in "${b[@]}"
do
echo $x
sleep 1
done
However, in the mail I am getting the literal text $url1 $url2.
This must be because $url1 and $url2 are not getting substituted in $a.
Can someone please help?
Let me mention one thing: "mail" or "sendmail" mails get marked as spam in my domain, so sending mail with the above method is the only way that works fine.
I am expecting "myurl is working" and "myurl2 is not working",
as I have already stopped the httpd service for myurl2.
Thanks
Replace
cat <<"MAILEND"
with
cat <<MAILEND
Putting quotes around the identifier of a "here document" signals to the shell not to expand any variables. This is explained in the "Here Documents" section of man bash.
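A minimal sketch of the difference (my own illustration, not from the original script):
url1="myurl is working fine"
cat <<"MAILEND"   # quoted delimiter: the body is printed literally as $url1
$url1
MAILEND
cat <<MAILEND     # unquoted delimiter: $url1 expands to its value
$url1
MAILEND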
This is intended as an aside, but since comments don't support code formatting, posting this refactoring here.
( cat <<____HERE
From: send#mydomain.com
To: recieve#mail2.com
Subject: test
____HERE
while read label url; do
if curl -s --head "$url" | head -n 1 | grep "HTTP/1.[01] [23].." > /dev/null
then
echo "$label is working fine"
else
echo "$label is broken"
fi
done <<____HERE
myurl http://myurl2
myurl2 http://myurl
____HERE
) | sendmail -oi -t
I am assuming that the labels cross-matching each other's URLs is unintentional and/or incidental and/or beyond my limited comprehension.
So I have been struggling with this task for an eternity and still don't get what went wrong. This program doesn't seem to download ANY pdfs. At the same time, I checked the file that stores the final links - everything is stored correctly. I also checked $PDFURL; it stores the correct values. Any bash fans ready to help?
#!/bin/sh
#create a temporary directory where all the work will be conducted
TMPDIR=`mktemp -d /tmp/chiheisen.XXXXXXXXXX`
echo $TMPDIR
#no arguments given - error
if [ "$#" == "0" ]; then
exit 1
fi
# argument given, but wrong format
URL="$1"
#URL regex
URL_REG='(https?|ftp|file)://[-A-Za-z0-9\+&##/%?=~_|!:,.;]*[-A-Za-z0-9\+&##/%=~_|]'
if [[ ! $URL =~ $URL_REG ]]; then
exit 1
fi
# go to directory created
cd $TMPDIR
#download the html page
curl -s "$1" > htmlfile.html
#grep only links into temp.txt
cat htmlfile.html | grep -o -E 'href="([^"#]+)\.pdf"' | cut -d'"' -f2 > temp.txt
# iterate through lines in the file and try to download
# the pdf files that are there
cat temp.txt | while read PDFURL; do
#if this is an absolute URL, download the file directly
if [[ $PDFURL == *http* ]]
then
curl -s -f -O $PDFURL
err="$?"
if [ "$err" -ne 0 ]
then
echo ERROR "$(basename $PDFURL)">&2
else
echo "$(basename $PDFURL)"
fi
else
#update url - it is always relative to the first parameter in script
PDFURLU="$1""/""$(basename $PDFURL)"
curl -s -f -O $PDFURLU
err="$?"
if [ "$err" -ne 0 ]
then
echo ERROR "$(basename $PDFURLU)">&2
else
echo "$(basename $PDFURLU)"
fi
fi
done
#delete the files
rm htmlfile.html
rm temp.txt
P.S. Another minor problem I have just spotted. Maybe the problem is with the regex in the if? I would pretty much like to have something like this there:
if [[ $PDFURL =~ (https?|ftp|file):// ]]
but this doesn't work. I don't have unwanted parentheses there, so why?
P.P.S. I also ran this script on URLs beginning with http, and the program gave the desired output. However, it still doesn't pass the test.
Shell script to get mail if a file is modified
I am writing a script to get mail if a file has been modified:
recip="mungsesagar#gmail.com"
file="/root/sagar/ldapadd.sh"
#stat $file
last_modified=$(stat --printf=%y $file | cut -d. -f1)
#echo $last_modified
mail -s "File ldapadd.sh has changed" $recip
Now I get mail whenever I run this script, but I want to compare two variables so that I get mail only if the file was modified or its content changed.
How can I store the output in a variable to compare?
Thanks in advance
Sagar
I'd do it this way:
recip="you#example.com"
file="/root/sagar/ldapadd.sh"
ref="/var/tmp/mytimestamp.dummy"
if [ "$file" -nt "$ref" ]; then
mail -s "File ldapadd.sh has changed" $recip
fi
touch -r "$file" "$ref" # update our dummy file to match
The idea is to store the last seen timestamp of the file of interest by copying it to another file (using touch). Then we always know what the last time was, and can compare it against the current timestamp on the file and email as needed.
If I understand your question correctly, the logic can be changed by storing the output of "ls -ltr filename" in a temp file and comparing it with the current ls -ltr output, as sketched below.
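A minimal, untested sketch of that idea (the temp file path /tmp/ldapadd.listing is just a placeholder):
file="/root/sagar/ldapadd.sh"
recip="mungsesagar#gmail.com"
current="$(ls -ltr "$file")"
# compare against the listing saved on the previous run (if any)
if [ -f /tmp/ldapadd.listing ] && [ "$current" != "$(cat /tmp/ldapadd.listing)" ]; then
    echo "$file has changed" | mail -s "File ldapadd.sh has changed" "$recip"
fi
echo "$current" > /tmp/ldapadd.listing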
I would use stat to check when the file was last modified:
#!/bin/bash
# file to watch; its last-modified timestamp is cached in timestamp.txt
file="/root/sagar/ldapadd.sh"
# note: stat -f/-t is BSD/macOS syntax; on GNU/Linux use: stat -c %Y "$file"
if [ ! -f timestamp.txt ]; then
    stat -f %Sm -t %Y%m%d%H%M%S "$file" > timestamp.txt
else
    timestamp=$(stat -f %Sm -t %Y%m%d%H%M%S "$file")
    filetime=$(cat timestamp.txt)
    if [ "$filetime" != "$timestamp" ]; then
        echo "$file has been modified" >> /tmp/email.txt
        mail -s "File has changed" "email#domain.com" < /tmp/email.txt
        # remember the new timestamp so we only get one mail per change
        echo "$timestamp" > timestamp.txt
    fi
fi
I want to have two youtube-dl processes (or as many as possible) running in parallel. Please show me how. Thanks in advance.
#!/bin/bash
#package: youtube-dl axel
#file that contains youtube links
FILE="/srv/backup/temp/youtube.txt"
#number of lines in FILE
COUNTER=`wc -l $FILE | cut -f1 -d' '`
#download destination
cd /srv/backup/transmission/completed
if [[ -s $FILE ]]; then
while [ $COUNTER -gt 0 ]; do
#get video link
URL=`head -n 1 $FILE`
#get video name
NAME=`youtube-dl --get-filename -o "%(title)s.%(ext)s" "$URL" --restrict-filenames`
#real video url
vURL=`youtube-dl --get-url $URL`
#remove first link
sed -i 1d $FILE
#download file
axel -n 10 -o "$NAME" $vURL &
#update number of lines
COUNTER=`wc -l $FILE | cut -f1 -d' '`
done
else
break
fi
This ought to work with GNU Parallel:
cd /srv/backup/transmission/completed
parallel -j0 'axel -n 10 -o $(youtube-dl --get-filename -o "%(title)s.%(ext)s" "{}" --restrict-filenames) $(youtube-dl --get-url {})' :::: /srv/backup/temp/youtube.txt
Learn more: https://www.youtube.com/playlist?list=PL284C9FF2488BC6D1
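GNU Parallel's -j0 means "run as many jobs as possible"; if you really only want two downloads at a time, as the question mentions, replace it with -j2:
parallel -j2 'axel -n 10 -o $(youtube-dl --get-filename -o "%(title)s.%(ext)s" "{}" --restrict-filenames) $(youtube-dl --get-url {})' :::: /srv/backup/temp/youtube.txt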
Solution
You need to run your command in a subshell, i.e. put your command into ( cmd ) &.
Definition
A shell script can itself launch subprocesses. These subshells let the
script do parallel processing, in effect executing multiple subtasks
simultaneously.
Code
For you it will look like this, I guess (I added quotes around $vURL):
( axel -n 10 -o "$NAME" "$vURL" ) &
I don't know if it is the best way, but you can define a function and then call it in the background,
something like this:
#!/bin/bash
#package: youtube-dl axel
#file that contains youtube links
FILE="/srv/backup/temp/youtube.txt"
# define a function
download_video() {
sleep 3
echo $1
}
while read -r line; do
# call it in background, with &
download_video $line &
done < $FILE
The script ends quickly, but the functions still run in the background; after 3 seconds they will print their echoes.
I also used read and a while loop to simplify the file reading.
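One thing worth noting (my addition, not part of the original answer): the script exits while the downloads may still be running; if you want it to block until they all finish, put a wait after the loop:
# at the end of the script, after the while loop:
wait   # blocks until all background download_video jobs have finished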
Here's my take on it. By avoiding several commands you should see some minor improvement in speed though it might not be noticeable. I did add error checking which can save you time on broken URLs.
#file that contains youtube links
FILE="/srv/backup/temp/youtube.txt"
while read URL ; do
[ -z "$URL" ] && continue
#get video name
if NAME=$(youtube-dl --get-filename -o "%(title)s.%(ext)s" "$URL" --restrict-filenames) ; then
#real video url
if vURL=$(youtube-dl --get-url $URL) ; then
#download file
axel -n 10 -o "$NAME" $vURL &
else
echo "Could not get vURL from $URL"
fi
else
echo "Could not get NAME from $URL"
fi
done < "$FILE"
By request, here's my proposal for paralleling the vURL and NAME fetching as well as the download. Note: Since the download depends on both vURL and NAME there is no point in creating three processes, two gives you about the best return. Below I've put the NAME fetch in its own process, but if it turned out that vURL was consistently faster, there might be a small payoff in swapping it with the NAME fetch. (That way the while loop in the download process won't waste even a second sleeping.) Note 2: This is fairly crude, and untested, it's just off the cuff and probably needs work. And there's probably a much cooler way in any case. Be afraid...
#!/bin/bash
#file that contains youtube links
FILE="/srv/backup/temp/youtube.txt"
GetName () { # URL, filename
if NAME=$(youtube-dl --get-filename -o "%(title)s.%(ext)s" "$1" --restrict-filenames) ; then
# Create a sourceable file with NAME value
echo "NAME='$NAME'" > "$2"
else
echo "Could not get NAME from $1"
fi
}
Download () { # URL, filename
if vURL=$(youtube-dl --get-url $1) ; then
# Wait to see if GetName's file appears
timeout=300 # Wait up to 5 minutes, adjust this if needed
while (( timeout-- )) ; do
if [ -f "$2" ] ; then
source "$2"
rm "$2"
#download file
if axel -n 10 -o "$NAME" "$vURL" ; then
echo "Download of $NAME from $1 finished"
return 0
else
echo "Download of $NAME from $1 failed"
fi
fi
sleep 1
done
echo "Download timed out waiting for file $2"
else
echo "Could not get vURL from $1"
fi
return 1
}
filebase="tempfile${$}_"
filecount=0
while read URL ; do
[ -z "$URL" ] && continue
filename="$filebase$filecount"
[ -f "$filename" ] && rm "$filename" # Just in case
(( filecount++ ))
( GetName "$URL" "$filename" ) &
( Download "$URL" "$filename" ) &
done < "$FILE"