If condition logical error with multiple conditions in a cron loop - bash

I need to run 2 loops via cron, every 5 minutes:
*/5 * * * *
Cron loop 1 runs every 5 minutes and checks whether the file has been uploaded or not ($yesterday means a file named after the previous day's date).
The 1st cron loop works fine for me; the 2nd one I can't get right. The 2nd loop has 3 conditions:
1. It should run when it finds $yesterday.zip
2. It should run only once after $yesterday.zip appears (because cron fires every 5 minutes, it would otherwise keep running once $yesterday.zip is found)
3. It should not run from 00:00 until $yesterday.zip has been downloaded
(The $yesterday file has no fixed download time, so I run cron every 5 minutes.)
I made this (posting it below so you guys don't think I didn't make an effort; I just need an if statement for cron covering these 3 conditions):
FILE=/fullpath/$yesterday.zip
if test -f "$FILE"; then
    touch /fullpath/loop2.txt              ########## for loop 2
    echo "I am the best"
else
    cd /fullpath/
    wget -r -np -nH "url/$yesterday.zip"   ########## it should be a 20+ MB file
    find . -name "*.zip" -type 'f' -size -160k -delete   ########## delete it if some garbage was downloaded
    rm -rf /fullpath/loop2.txt             ########## delete the file so loop 2 stops running every 5 minutes
fi
FILE2=/fullpath/loop2.txt
if test -f "$FILE2"; then
    echo -e "Script will work only once" | mailx -v -s "Script will work only once" myemail#gmail.com
else
    echo "full script work"
    touch /fullpath/loop2.txt
fi
You guys can ignore my code above and simply let me know an if statement for a loop with these 3 conditions.

I would use something like this:
if lockfile -r0 /tmp/lockfile_$(date +%F); then   # only run if there's no lockfile for the day
    cd /fullpath/
    # while we don't have a correct file (absent or truncated)
    while [[ ! -f "$yesterday.zip" ]] || [[ $(stat -c %s "$yesterday.zip") -lt 20971520 ]]; do
        wget -r -np -nH "url/$yesterday.zip"   # try to download it
        if [ $? -ne 0 ]; then                  # if the file isn't available yet
            rm /tmp/lockfile_$(date +%F)       # delete the lock so the download is attempted again in 5 minutes
        fi
    done
fi
Note that the size check uses -lt: inside [[ ]], < is a string comparison, not a numeric one.
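The three conditions can also be sketched with a simple marker file, without a lockfile utility. This is a minimal demo under assumptions: the path and the `date`-based naming are hypothetical, and a `touch` stands in for the real download.

```shell
yesterday=$(date -d yesterday +%F)   # assumed naming: file carries yesterday's date
zip="/tmp/demo_$yesterday.zip"       # hypothetical stand-in for /fullpath/$yesterday.zip
stamp="$zip.done"                    # marker: the once-a-day work already ran

touch "$zip"                         # demo only: pretend the download just finished

run_once() {
  if [ -f "$zip" ] && [ ! -f "$stamp" ]; then   # conditions 1 and 3: zip must exist
    touch "$stamp"                              # condition 2: never run again today
    echo "loop 2 ran"
  else
    echo "skipped"
  fi
}

first=$(run_once)    # the 5-minute cron tick right after the zip appears
second=$(run_once)   # every later tick the same day
echo "$first / $second"
rm -f "$zip" "$stamp"
```

Deleting the marker when the new day's zip is missing (as your `rm -rf /fullpath/loop2.txt` does) re-arms the check for the next day.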

Related

How to write mv command that will work as cron job

I'm running Centos 7. I need to have a cron job that moves everything from /media/tmp to /media/tv except the .grab and Transcode folders. Yesterday I thought that the following worked, but today it moves the Transcode folder as well.
mv -n /media/tmp/*!(Transcode)!(.grab) /media/tv/
I've found that the above does not work as a cron job, as the '(' causes an error. I learned that I needed to escape those, but now I get
mv: cannot stat ‘/media/tmp/!(Transcode)!(.grab)’: No such file or directory
My current attempt at a bash script is
#!/bin/bash
mv -n '/media/tmp'/*!\(Transcode\)!\(.grab\) '/media/tv/'
My understanding is that the * is the problem, but using either ' or " on the file path doesn't seem to fix it like that post I found said it would.
Any ideas on how to get this to work correctly?
You're trying to use extglob, which may not be enabled for cron's shell. I would avoid that option entirely, iterating over the glob and skipping unwanted names with a negated =~ regex match.
for file in /media/tmp/*; do
    [[ ! "$file" =~ Transcode|\.grab$ ]] && mv -n "$file" /media/tv/
done
I'd just do it as something simple like (untested):
mkdir -p /media/tv || exit 1
for i in /media/tmp/*; do
    case $(basename "$i") in
        Transcode|.grab ) ;;
        * ) mv -n -- "$i" /media/tv ;;
    esac
done
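If you do want the extglob route, it can work from cron as long as the script runs under bash and enables the option itself. A sketch with throwaway directories (the mktemp paths are stand-ins for /media/tmp and /media/tv):

```shell
shopt -s extglob                      # cron-launched shells often lack this; enable it explicitly
src=$(mktemp -d); dst=$(mktemp -d)    # stand-ins for /media/tmp and /media/tv
mkdir "$src/Transcode" "$src/.grab"
touch "$src/show1.mkv" "$src/show2.mkv"
cd "$src"
mv -n -- !(Transcode|.grab) "$dst"/   # one combined pattern, no escaping needed
```

Note the single combined pattern `!(Transcode|.grab)` rather than two adjacent `!(…)` groups; also, `.grab` is a dotfile, so ordinary globs would skip it anyway.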

Bash execute a for loop while time is outside working hours

I am trying to make the script below execute a Restore binary between 17:00 and 07:00 for each folder whose name starts with EAR_* in /backup_local/ARCHIVES/, but for some reason it is not working as expected: the for loop does not break when the time condition becomes invalid.
Should I add the while loop inside the for loop?
#! /usr/bin/bash
#set -x
while :; do
    currenttime=$(date +%H:%M)
    if [[ "$currenttime" > "17:00" ]] || [[ "$currenttime" < "07:00" ]]; then
        for path in /backup_local/ARCHIVES/EAR_*; do
            [ -d "${path}" ] || continue # if not a directory, skip
            dirname="$(basename "${path}")"
            nohup /Restore -a /backup_local/ARCHIVES -c -I 0 -force -v > /backup_local/$dirname.txt &
            wait $!
            if [ $? -eq 0 ]; then
                rm -rf $path
                rm /backup_local/$dirname.txt
                echo $dirname >> /backup_local/completed.txt
            fi
        done &
    else
        echo "Restore can be ran only outside working hours!"
        break
    fi
done &
your script looks like this in pseudo-code:
START
IF within workinghours
    EXIT
ELSE
    RUN /Restore FOR EACH backupdir
GOTO START
The script only checks the time once, before starting a restore run (which calls /Restore for each directory to restore in a for loop).
It will keep starting the for loop until the working hours begin. Then it will exit.
E.g. if you have 3 folders to restore, each taking 2 hours, and you start the script at midnight: the script checks whether it's outside working hours (it is) and starts the restore of the first folder (at 0:00); after two hours of work it starts the restore of the 2nd folder (at 2:00), and after another two hours the restore of the 3rd folder (at 4:00). Once the 3rd folder has been restored, it checks the working hours again. Since it's now only 6:00, that is, outside the working hours, it starts the restore of the first folder again (at 6:00), then the 2nd folder (at 8:00), then the 3rd folder (at 10:00).
It's noon when the next check against the working hours happens; since 12:00 falls within 7:00..17:00, the script now stops, with an error message.
You probably only want the restore to run once for each folder, and stop proceeding to the next folder if the working hours start.
#!/bin/bash
for path in /backup_local/ARCHIVES/EAR_*/; do
    currenttime=$(date +%H:%M)
    if [[ "$currenttime" > "07:00" ]] && [[ "$currenttime" < "17:00" ]]; then
        echo "Not restoring inside working hours!" 1>&2
        break
    fi
    dirname="$(basename "${path}")"
    /Restore -a /backup_local/ARCHIVES -c -I 0 -force -v > /backup_local/$dirname.txt
    # handle exit code
done
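One caveat with `date +%H:%M` strings: both sides of the comparison must be zero-padded ("7:00" would sort after "17:00" and after "08:30"). A sketch that sidesteps this with numeric comparison; `in_working_hours` is a hypothetical helper name:

```shell
# Succeeds when the HHMM argument falls inside 07:00-17:00
in_working_hours() {
  local t=$((10#$1))   # 10# forces base 10, so "0830" isn't rejected as octal
  [ "$t" -ge 700 ] && [ "$t" -lt 1700 ]
}

if in_working_hours "$(date +%H%M)"; then
  echo "inside working hours"
else
  echo "outside working hours"
fi
```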
update
I've just noticed your liberal spread of & for backgrounding jobs.
This is presumably to allow running the script from a remote shell. Don't.
What this will really do is:
it will run all the iterations over the restore directories in parallel. This might create a bottleneck on your storage (if the directories to restore to/from share the same hardware)
it will background the entire loop-to-restore and immediately return to the out-of-hours check. If the check succeeds, it spawns another loop-to-restore (and backgrounds it), then returns to the out-of-hours check and spawns yet another backgrounded loop-to-restore.
Before dawn you will probably have a few thousand background jobs trying to restore directories. More likely you've exceeded your resources and the processes get killed.
My example script above has omitted all the backgrounding (and the nohup).
If you want to run the script from a remote shell (and exit the shell after launching it), just run it with
nohup restore-script.sh &
Alternatively you could use
echo "restore-script.sh" | at now
or use a cron-job (if applicable)
The shebang contains an unwanted space. On my Ubuntu, bash is found at /bin/bash.
Find out where yours is located:
type bash
The while loop breaks in my test; replace the #!/bin/bash path below with the result of the previous command:
#!/bin/bash --
#set -x
while : ; do
    currenttime=$(date +%H:%M)
    if [[ "$currenttime" > "17:00" ]] || [[ "$currenttime" < "07:00" ]]; then
        for path in /backup_local/ARCHIVES/EAR_*; do
            [ -d "${path}" ] || continue # if not a directory, skip
            dirname="$(basename "${path}")"
            nohup /Restore -a /backup_local/ARCHIVES -c -I 0 -force -v > /backup_local/$dirname.txt &
            wait $!
            if [ $? -eq 0 ]; then
                rm -rf $path
                rm /backup_local/$dirname.txt
                echo $dirname >> /backup_local/completed.txt
            fi
        done &
    else
        echo "Restore can be ran only outside working hours!"
        break
    fi
done &

bash check for subdirectories under directory

This is my first day scripting. I use Linux, but needed a script and have been racking my brain until finally asking for help. I need to check a directory that already has directories present, to see if any new directories are added that are not expected.
OK, I think I have got this as simple as possible. The below works but displays all files in the directory as well. I will keep working at it unless someone can tell me how not to list the files too. I tried ls -d but then it just does the echo "nothing new". I feel like an idiot and should have got this sooner.
#!/bin/bash
workingdirs=`ls ~/ | grep -viE "temp1|temp2|temp3"`
if [ -d "$workingdirs" ]
then
    echo "nothing new"
else
    echo "The following directories are now present"
    echo ""
    echo "$workingdirs"
fi
If you want to take some action when a new directory is created, use inotifywait. If you just want to check that the directories that exist are the ones you expect, you could do something like:
trap 'rm -f $TMPDIR/manifest' 0
# Create the expected values. Really, you should hand edit
# the manifest, but this is just for demonstration.
find "$Workingdir" -maxdepth 1 -type d > $TMPDIR/manifest
while true; do
    sleep 60 # Check every 60 seconds. Modify period as needed, or
             # (recommended) use inotifywait
    if ! find "$Workingdir" -maxdepth 1 -type d | cmp - $TMPDIR/manifest; then
        : Unexpected directories exist or have been removed
    fi
done
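Building on the manifest idea, `comm` can name the unexpected directories instead of just detecting that something changed. A self-contained sketch; the throwaway `mktemp` tree stands in for your real working directory:

```shell
workdir=$(mktemp -d)                      # stand-in for $Workingdir
mkdir "$workdir/temp1" "$workdir/temp2"
find "$workdir" -mindepth 1 -maxdepth 1 -type d | sort > "$workdir/manifest"

mkdir "$workdir/intruder"                 # a new, unexpected directory appears
new=$(find "$workdir" -mindepth 1 -maxdepth 1 -type d | sort \
      | comm -13 "$workdir/manifest" -)   # keep only lines unique to the fresh listing
echo "new: $new"
```

`comm` requires both inputs sorted, hence the `sort` on each listing.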
The shell script below will show whether each directory is present or not.
#!/bin/bash
Workingdir=/root/working/
knowndir1=/root/working/temp1
knowndir2=/root/working/temp2
knowndir3=/root/working/temp3
my=/home/learning/perl
arr=($Workingdir $knowndir1 $knowndir2 $knowndir3 $my) # creating an array
for i in "${arr[@]}" # checking each element in the array
do
    if [ -d $i ]
    then
        echo "directory $i present"
    else
        echo "directory $i not present"
    fi
done
output:
directory /root/working/ not present
directory /root/working/temp1 not present
directory /root/working/temp2 not present
directory /root/working/temp3 not present
**directory /home/learning/perl present**
This will save the available directories in a list to a file. When you run the script a second time, it will report directories that have been deleted or added.
#!/bin/sh
dirlist="$HOME/dirlist" # dir list file for saving state between runs
topdir='/some/path'     # the directory you want to keep track of
tmpfile=$(mktemp)
find "$topdir" -type d -print | sort -o "$tmpfile"
if [ -f "$dirlist" ] && ! cmp -s "$dirlist" "$tmpfile"; then
    echo 'Directories added:'
    comm -1 -3 "$dirlist" "$tmpfile"
    echo 'Directories removed:'
    comm -2 -3 "$dirlist" "$tmpfile"
else
    echo 'No changes'
fi
mv "$tmpfile" "$dirlist"
The script will have problems with directories that have very exotic names (containing newlines).
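For the newline-in-name case, GNU find and sort can keep the entries NUL-delimited end to end (the `-print0`/`-z` flags are GNU extensions, so this assumes GNU tools):

```shell
top=$(mktemp -d)
mkdir "$top/normal" "$top/with
newline"                                      # a directory name containing a newline
find "$top" -mindepth 1 -type d -print0 | sort -z > "$top/list0"
count=$(tr -cd '\0' < "$top/list0" | wc -c)   # each NUL terminates exactly one entry
echo "$count directories recorded"
```

The saved list can then be compared with `cmp` as before, since `cmp` is byte-oriented and doesn't care about the delimiter.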

Sleep and check for a matching file (Bash) until it gets created

I'm working on a shell script in Jenkins that must check for a file every 2 minutes until it gets generated on a remote server. Once the file is found, the job must succeed. But in my case the script keeps sleeping every 2 minutes and never finds the matching file. I know that the issue is with the wildcard. Is there any alternate way to fix this? My script:
while [ ! -f ${DONE_DIR}/issxxx*.xml ];
do
    sleep 120;
done;
sleep 120;
cat ${DONE_DIR}/isxxx*.xml;
You can try it like this:
while true;
do
    if [[ $(find ${DONE_DIR} -iname "issxxx*.xml") ]]; then
        break
    else
        sleep 120
    fi
done;
sleep 120;
cat ${DONE_DIR}/issxxx*.xml
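An alternative that avoids `find` entirely: let bash expand the glob into an array and test whether anything matched (`[ ! -f glob ]` breaks as soon as the glob matches zero or several files). A sketch where a `mktemp` directory stands in for the real DONE_DIR:

```shell
DONE_DIR=$(mktemp -d)             # stand-in for the real remote-mounted directory
touch "$DONE_DIR/issxxx1.xml"     # demo only: pretend the file arrived

shopt -s nullglob                 # make an unmatched glob expand to zero words
matches=( "$DONE_DIR"/issxxx*.xml )
if (( ${#matches[@]} > 0 )); then
  echo "found: ${matches[*]}"
fi
```

In the real loop you would `sleep 120` and re-expand the array until `${#matches[@]}` is non-zero.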

Recursive File Renaming script doesn't work from crontab

Crontab launches the script and the podcast updates correctly, but none of the file renaming (1st loop) or file moving (2nd loop) happens.
If I run the script from the command line, it works perfectly.
I have added the "echo" lines to troubleshoot; the $file variable is consistent whether run from the command line or crontab.
#/bin/sh
# Mad Money updates at 6:40 pm (timezone?) M-F
# At 6:30 pm CST it was ready to download
# http://podcast.cnbc.com/mmpodcast/lightninground.xml
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
echo "paths"
echo $PATH
podcast_folder=$"/home/zenon/podcasts/MAD_MONEY_W__JIM_CRAMER_-_Full_Episode"
episode_folder=$"/mnt/black-2tb-001/plex-server/shows/Mad-Money/Season-1"
hpodder update
sleep 1
hpodder download
sleep 1
cd ${podcast_folder}
for file in "$podcast_folder"/*.mp4; do
    echo "Processing ${file}"
    # "MadMoney-" Name
    name=${file:60:9}
    echo "podcast name is ${name}"
    # "04" Month
    month=${file:69:2}
    echo "month is ${month}"
    # "18" Day
    day=${file:71:2}
    echo "day is ${day}"
    # "13" yr
    yr=${file:73:2}
    echo "year is 20${yr}"
    title="${name}20${yr}.${month}.${day}.mp4"
    echo "file ${file}"
    echo "title ${title}"
    # cp ${file} ${title}
    mv ${file} ${title}
done
cd ${podcast_folder}
for file in "$podcast_folder"/*.mp4; do
    chown zenon:plex ${file}
    mv ${file} ${episode_folder}
done
# deletes any files older than 9 days
find ${episode_folder} -type f -mtime +9 -exec rm {} \;
exit
here is the debugging output from the script
cat cron.log
paths
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
1 podcast(s) to consider
Get: 4 MAD MONEY W/ JIM CRAMER - Full Episode
100% 1 B/s 0s
0 episode(s) to consider from 1 podcast(s)
0% 0 B/s 0s
Processing $/home/zenon/podcasts/MAD_MONEY_W__JIM_CRAMER_-_Full_Episode/*.mp4
You have a mistake in the first line: change #/bin/sh to #!/bin/sh. As written, the line is just a comment, so there is no valid shebang; when cron executes the file it falls back to plain sh, while your interactive shell falls back to bash, which is likely why it only works from the command line.
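The symptom fits because the renaming relies on bash-only substring expansion, which plain sh (e.g. dash) rejects as a bad substitution. A quick demonstration of the construct; the sample path is made up:

```shell
file="/home/zenon/podcasts/example-file-name.mp4"   # hypothetical path
name=${file:21:7}   # bash substring expansion; under plain sh this is a "Bad substitution" error
echo "$name"        # the 7 characters starting at offset 21
```

So with the fixed `#!/bin/bash` (or `#!/bin/sh` on systems where sh is bash) the same script behaves identically under cron and interactively.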
