Bash script if else execute once cronjob every minute - bash

I have a bash script in crontab that runs every minute.
In this script an SQL query checks a count. If the number is greater than a predefined number, I want to move and replace files.
This works absolutely fine; the problem is that, since the script runs every minute via crontab, the next run overwrites the files again.
Is there any logic I can add so that this part runs only once, while still letting cron fire every minute?
Here is the code:
#!/bin/bash
count=`mysql -B -u root -ppassword -e 'select count(*) from column' table | tail -n +2`
allowed="500"
if [ "$count" -ge "$allowed" ]
then
mv /netboot/var/www/html /usr/html/
mv /netboot/var/www/back /netboot/var/www/html
echo "Not Allowed - Disable Code goes here"
else
echo "all is good for now $count"
fi
exit 0
Your help is appreciated.

I have managed to fix this by creating another if statement within the parent if.
See below.
#!/bin/bash
count=`mysql -B -u root -ppassword -e 'select count(*) from column' table | tail -n +2`
allowed="500"
if [ "$count" -ge "$allowed" ]
then
if [ ! -d /usr/html ]   # only move if the swap has not been done yet
then
mv /netboot/var/www/html /usr/html/
mv /netboot/var/www/back /netboot/var/www/html
else
echo "already moved, nothing to do"
fi
echo "Not Allowed - Disable Code goes here"
else
echo "all is good for now $count"
fi
exit 0
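An alternative to the directory check (my own sketch, not from the thread) is a marker file: the action fires the first time the threshold is crossed, and every later cron run sees the marker and skips the moves. A minimal, self-contained sketch with the mysql query and the mv commands stubbed out:

```shell
#!/bin/bash
# Sketch: guard the one-time action with a marker file so cron can keep
# firing every minute without redoing the moves. In real use the marker
# would be a fixed path such as /var/tmp/site_disabled.marker (assumption).
marker="$(mktemp -u)"    # non-existing temp name, stands in for the fixed path
count=600                # stand-in for the real mysql count query
allowed=500

if [ "$count" -ge "$allowed" ] && [ ! -e "$marker" ]; then
    # mv /netboot/var/www/html /usr/html/
    # mv /netboot/var/www/back /netboot/var/www/html
    touch "$marker"      # remember that the swap has been done
    status="disabled"
else
    status="nothing to do"
fi
echo "$status"
```

A second run of the same script would take the else branch, because the marker now exists.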

Related

Performing a command after while loop ended

I'm trying to find a way to use the while loop to check for a certain condition, and if this condition is met, another command will be executed.
For example:
while [ -z "$(ls -A test-dir)" ];
do
echo "directory is empty, checking again in 5 seconds"
sleep 5
done
This loop will end once the directory has any files inside; I'm trying to find a proper way to execute another command when the loop has completed.
It turned out I simply needed to add the command after the while loop:
while [ -z "$(ls -A test-dir)" ];
do
sleep 2
echo "sleeping 2 seconds"
done
echo "reached the end"
The last command will be executed only after the loop is done.
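The pattern is easy to verify end to end with a temporary directory and a background touch simulating the file's arrival (the names below are made up for the sketch):

```shell
#!/bin/bash
# Demonstrate that the command placed after `done` runs exactly once the
# loop's condition fails, i.e. once the watched directory is no longer empty.
dir=$(mktemp -d)
( sleep 1; touch "$dir/ready" ) &    # simulate a file arriving later

while [ -z "$(ls -A "$dir")" ]; do
    sleep 0.2
done
result="reached the end"
echo "$result"
wait
rm -rf "$dir"
```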

How to display long script logs to one liner?

Let's say I have multiple scripts that need to be invoked sequentially for a job.
These scripts produce long, lengthy output in my bash script.
How can I avoid that output but still tell that the process is running?
Here is an example:
#!/bin/bash
echo "Script to prepare Final BUILD"
rm -vf module1.out
module1_build_script.sh #FIXME: This scripts outputs 10000 lines
#module1_build_script.sh &> /dev/null #Not interested as this makes difficult if the process hangs or running.
if [ ! -f ./out/module1.out ];then
echo "Module 1 build failed"
exit 1
fi
.
.
.
rm -vf module1.out
module4_build_script.sh # This scripts outputs 5000 lines
if [ ! -f ./out/module4.out ];then
echo "Module 4 build failed"
exit 4
fi
Now I am looking for something that gives the effect below: a single updating line with no scrolling, along the lines of
example: module1_build_script.sh | "magical code here" #FIXME:
with output like the following script produces:
user#bash#./myscript
#-------content of myscript ---------------
#!/bin/bash
i=0
while (( i < 10 )); do
echo -en "\r Process is running...$i"
sleep 0.5
(( i++ ))
done
#------------------------------------------
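One candidate for the "magical code" (an assumption on my part, not from the thread) is awk: it redraws a single status line on stderr while counting the lines it consumes, so the terminal never scrolls. Here seq 1 100 stands in for module1_build_script.sh:

```shell
#!/bin/bash
# Collapse a verbose command's output into one updating line: progress is
# redrawn on stderr with \r, and only the final line count reaches stdout.
lines=$(seq 1 100 | awk '
    { printf "\r Process is running... %d lines", NR > "/dev/stderr" }
    END { print NR }')
printf "\n" >&2
echo "build produced $lines lines"
```

When piping the real build script, note that $? afterwards is awk's exit status; in bash the build's own status is available in ${PIPESTATUS[0]}.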

sqlplus within a loop - unix

Is there a way to send multiple sqlplus commands within a loop, waiting for one to complete successfully before the next one starts?
Here is a sample of my code. I added the sleep 15 because the functions I execute take about 10-20 s to run. I want to get rid of that 15 s constant and have them run one after the other.
if [ "$#" -eq 1 ]; then
checkUser "$1"
while read line; do
sqlplus $user/$pass@$server $line
sleep 15
done < "$wrapperList"
fi
The instructions in a while loop are executed in sequence, so it is equivalent to chaining the commands with ;:
sqlplus $user/$pass@$server $line1
sqlplus $user/$pass@$server $line2
So you don't need the sleep 15 here: the sqlplus commands are not called in parallel, and the way you wrote it already runs them one after the other.
Note: it is even better to stop if the first command did not complete successfully, using && ("run only if the previous exit code is 0"):
sqlplus $user/$pass@$server $line1 && \
sqlplus $user/$pass@$server $line2
To have this in the while loop:
checkUser "$1"
while read -r line; do
sqlplus $user/$pass@$server $line
RET_CODE=$?   # check the return code, and break if not ok
if [ ${RET_CODE} -ne 0 ]; then
echo "aborted." ; break
fi
done < "$wrapperList"
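The abort-on-failure loop can be exercised standalone by swapping sqlplus for a stub (all names below are made up for the sketch): the loop stops at the first failing script.

```shell
#!/bin/bash
# Stub in place of sqlplus: run_sql fails when asked to run "bad.sql".
run_sql() { [ "$1" != "bad.sql" ]; }

processed=0
while read -r line; do
    if ! run_sql "$line"; then
        echo "aborted."
        break
    fi
    processed=$((processed + 1))
done <<'EOF'
a.sql
b.sql
bad.sql
c.sql
EOF
echo "ran $processed scripts before stopping"
```

a.sql and b.sql run, bad.sql fails, and c.sql is never attempted.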
On the other hand, when you want to run in parallel, the syntax is different, as in: Unix shell script run SQL scripts in parallel

How can I make my screen locker script work?

I've been having some problems with my screenlocker program. I spent a day trying to solve it but nothing worked, so I decided to write a script that locks my screen:
LOCKTIME=60
lastIdleTime=0
extra=0
while [ 1 ]; do
sound=$(pacmd list-sink-inputs | grep -c "state: RUNNING")
idleTime=$(($(xprintidle) / 1000))
lock=$(gnome-screensaver-command -q | grep -c " active")
if [[ $lock != 0 ]]; then
extra=$idleTime
else
if [[ $sound != 0 || $idleTime -lt $lastIdleTime ]]; then
extra=$idleTime
fi
if [[ $(($idleTime - $extra)) -gt $LOCKTIME ]]; then
gnome-screensaver-command -l
fi
fi
lastIdleTime=$idleTime
sleep 1
done
If I execute it manually, everything goes well. But I want to run it at startup, so I tried crontab and a desktop entry in the ~/.config/autostart folder. With crontab the program doesn't seem to execute, or it executes but cannot lock my screen. With the desktop entry it runs, but xprintidle doesn't update and gnome-screensaver-command -q | grep -c " active" returns 0 all the time, so after 60 seconds it keeps locking my screen every second.
I also wrote it in Python, and it doesn't work either. The only difference is that gnome-screensaver-command -q | grep -c " active" returns 1 all the time.
Is there a better way to execute it and keep it running (and working) on every startup?
Btw, I'm using Antergos with GNOME and GDM.
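No accepted answer appears in this excerpt, but a likely culprit (my assumption) is that cron jobs run without the X session environment, so xprintidle and gnome-screensaver-command cannot reach the display or the session D-Bus. A wrapper that exports those variables before starting the locker is one sketch; the display number and paths are typical single-user defaults, not verified for this setup:

```shell
#!/bin/bash
# Hypothetical wrapper for cron: supply the X/session environment that cron
# jobs lack, so xprintidle and gnome-screensaver-command can reach the
# display. :0, ~/.Xauthority and the uid-based bus path are assumptions.
export DISPLAY=:0
export XAUTHORITY="$HOME/.Xauthority"
export DBUS_SESSION_BUS_ADDRESS="unix:path=/run/user/$(id -u)/bus"
echo "environment prepared for DISPLAY=$DISPLAY"
# exec /path/to/locker.sh   # hypothetical: hand off to the while-loop script
```

The wrapper would then be started from cron (e.g. with an @reboot entry) or from the autostart desktop entry; a systemd user service would serve the same purpose.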

Stop Bash Script if Hive Fails

I have a bash script that loops through a folder and processes all *.hql files. Sometimes one of the Hive scripts fails (syntax, resource constraints, etc.), and instead of the bash script failing it continues on to the next .hql file.
Is there any way I can stop the bash script from processing the remaining files? Below is my sample:
for i in `ls ${layer}/*.hql`; do
echo "Processing $i ..."
hive ${hiveconf_all} -hiveconf DATE=${date} -f ${i} &
if [ $j -le 5 ]; then
j=$(( j+1 ))
else
wait
j=0
fi
done
I would check the exit status of the previous command and invoke the exit command to leave the loop:
(( $? != 0 )) && exit 1
Introduce the above line after the hive command and it should do the trick. (Note that because hive is started with &, $? only reflects its status after a wait; drop the & or wait first for this check to work.)
add
set -e
to the top of your script
Use this template for running parallel processes and wait for their completion. Add your date, layer, hiveconf_all and other variables:
#!/bin/bash
set -e -o pipefail   # pipefail so a failing hive is not masked by tee's exit status
#Run parallel processes and write their logs
log_dir=/tmp/my_script_logs
mkdir -p "${log_dir}"
for i in "${layer}"/*.hql; do
echo "Processing $i ..."
#Run hive in parallel and redirect to a per-script log file
hive ${hiveconf_all} -hiveconf DATE=${date} -f "${i}" 2>&1 | tee "${log_dir}/$(basename "${i}").log" &
done
#Now wait for all processes to complete
FAILED=0
for job in `jobs -p`
do
echo "job=$job"
wait $job || let "FAILED+=1"
done
if [ "$FAILED" != "0" ]; then
echo "Execution FAILED! ($FAILED)"
#Do something here, log or send message, etc
exit 1
fi
#All processes are completed successfully!
#Do something here
echo "Done successfully"
Then you will be able to inspect each process log individually.
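The wait-and-count part of the template can be verified with stub background jobs in place of hive (my stand-ins): two succeed, one fails, so FAILED ends at 1.

```shell
#!/bin/bash
# Stub background jobs standing in for the hive invocations.
( sleep 0.1; true )  &
( sleep 0.1; false ) &
( sleep 0.1; true )  &

FAILED=0
for job in $(jobs -p); do
    wait "$job" || FAILED=$((FAILED + 1))
done
echo "failed jobs: $FAILED"
```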
