I'm on FreeBSD 9.2 (I have to use this operating system). I want to run multiple scripts with the at command, but I want to prevent the same script from being scheduled to run at the same time twice.
For example: I have 3 script files: 1.sh, 2.sh, 3.sh
I have a job that executes 1.sh today at 16:20. When I run the at command again with the same time and the same script, the number of jobs in /var/at/jobs changes to 2. I want to prevent this duplicate, but the script 2.sh should still be able to run at the same time. Do you have any idea what I should do?
I don't know if I understood the problem correctly, but maybe the command lockf could help.
For example try this in one terminal:
$ lockf -t 0 /tmp/a.lock sleep 5
In another terminal run:
$ lockf -t 0 /tmp/a.lock echo "sleep finished"
In this example, until the sleep 5 command exits, any attempt to run another command under the same lock will produce something like:
lockf: /tmp/a.lock: already locked
A cron example:
15 4 * * * lockf -t 0 /tmp/poudriere.lock /usr/local/etc/poudriere.d/cron 12amd64 default
This prevents the script/app from running while the lock is held, so you can probably get an idea of how to use it with at.
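For instance, a minimal sketch tying it to the question (the lock file path is a placeholder): submit the script to at wrapped in lockf, so that even if the job gets queued twice in /var/at/jobs, the second instance exits immediately instead of running the script concurrently:
$ echo 'lockf -t 0 /tmp/1.sh.lock /path/to/1.sh' | at 16:20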
I have 3 services to be started on system reboot, in a sequence.
I tried the below in crontab -e but to no avail:
@reboot sleep 10 && bash /myfolder/hazelcast-x.x/bin/start.sh
@reboot sleep 20 && /myfolder/apache-activemq-x.x/bin/activemq start
@reboot sleep 30 && bash /myfolder/apache-tomcat-x.x/bin/startup.sh
In your cron command you can use && so that if the command on the left-hand side fails, the command on the right-hand side will not run. For example:
command 1 && command 2 && command 3
Also be aware that the unmodified version of bash /myfolder/hazelcast-x.x/bin/start.sh looks for a pid file; if one is found, it will not start. So you may want to clean up the pid file prior to running start.sh after a reboot. We do this to prevent more than one node from starting on a given server, since without partition groups that could lead to data loss.
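A minimal sketch of a crontab entry combining both points (the pid file name and location are assumptions; check what your start.sh actually writes):
@reboot sleep 10 && rm -f /myfolder/hazelcast-x.x/bin/hazelcast_instance.pid && bash /myfolder/hazelcast-x.x/bin/start.sh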
I have a requirement where I need to trigger 10 shell scripts at a time. I may have 200+ shell scripts to be executed.
e.g. if I trigger 10 jobs and two jobs complete, I need to trigger another 2 jobs, bringing the number of jobs currently executing back to 10.
I need your help and suggestions to cater to this requirement.
Yes, with GNU Parallel, like this:
parallel -j 10 < ListOfJobs.txt
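Here ListOfJobs.txt holds one command per line; hypothetical contents might look like:
./job_1.sh
./job_2.sh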
Or, if your jobs are called job_1.sh to job_200.sh:
parallel -j 10 job_{}.sh ::: {1..200}
Or, if your jobs have discontiguous, random names but are all shell scripts with a .sh suffix in one directory:
parallel -j 10 ::: *.sh
There is a very good overview in the GNU Parallel tutorial, and there are lots of questions and answers about it on Stack Overflow.
Simply run them as background jobs:
for i in {1..10}; { ./script.sh & }
Adding more jobs while fewer than 10 are running:
while true; do
  pids=($(jobs -pr))
  ((${#pids[@]}<10)) && ./script.sh &
done &> /dev/null
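As written, that loop spins in a tight busy-wait; a slightly gentler variant (a sketch, assuming polling once per second is acceptable) checks the job count, starts a script only when a slot is free, and sleeps between checks:
while true; do
  pids=($(jobs -pr))
  if ((${#pids[@]} < 10)); then
    ./script.sh &      # start another job when fewer than 10 are running
  fi
  sleep 1              # avoid spinning at full speed
done &> /dev/null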
There are different ways to handle this:
Launch them together as background tasks (1)
Launch them in parallel (1)
Use the crontab (2)
Use at (3)
Explanations:
(1) You can launch the processes exactly when you like (by running a command, clicking a button, or whatever event you choose)
(2) The processes will be launched at the same time, every (working) day, periodically.
(3) You choose a time when the processes will be launched together once.
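For (3), a minimal sketch with at (the 16:20 time and the script paths are placeholders; at runs the job through sh, so backgrounding both scripts and then waiting launches them together):
$ at 16:20 <<'EOF'
/path/to/1.sh &
/path/to/2.sh &
wait
EOF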
I have used the below to trigger 10 jobs at a time:
max_jobs_trigger=10
while mapfile -t -n ${max_jobs_trigger} ary && ((${#ary[@]})); do
  jobs_to_trigger=$(printf '%s\n' "${ary[@]}")
  # Trigger script in background
done
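A fuller sketch of the same idea (scripts.txt, a file listing one script path per line, is a placeholder name):
max_jobs_trigger=10
while mapfile -t -n ${max_jobs_trigger} ary && ((${#ary[@]})); do
  for script in "${ary[@]}"; do
    bash "$script" &   # trigger each script in the background
  done
  wait                 # let this batch finish before reading the next 10
done < scripts.txt
Note that this runs fixed batches of 10 and waits for each whole batch, rather than topping the pool back up as individual jobs complete.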
I am testing a bash script I hope to run as a cron job to scan a download log and perform labor-intensive conversions on image files. In order to run several conversions at once, the first script loops through the download log and sends the filename to the second script, which I set to run as a background process using &.
The script pair works well, but when the process is complete, I must press the enter key to return to a command prompt. This is a non-issue when I am running a test, but I am not sure if this behavior has ramifications when run as a cron job.
Will this be an issue? If so, is there a way to close the "terminal" running the first script from the crontab?
Here's a truncated form of my code:
Script 1 (to be launched by crontab):
for i in file1 file2 file3 etc
do
  bash /path/to/convert.sh "$i" &
done
exit 0
Script 2 (convert.sh)
fileName=${1?no file given}
jpegName=$(echo "$fileName" | sed 's/\.tif$/.jpg/')
convert "$fileName" "$jpegName"
exit 0
Thanks for any help/assurances you can give!
You don't need script 2; you can convert it into a function and put it inside script 1 (see the sketch below).
Another problem is that you are running convert.sh in an uncontrolled way: you cannot foresee how many background processes will be created, and this may lead to severe performance overhead.
Finally, if you cannot end the process in the normal way, you may choose to terminate it, again using cron, by issuing pkill script1.sh.
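A minimal sketch of folding convert.sh into script 1 as a function, with a simple cap on concurrent conversions (the cap of 4 and the file list are placeholders):
#!/bin/bash
convert_one() {
  fileName=${1?no file given}
  jpegName=${fileName%.tif}.jpg      # swap the .tif extension for .jpg
  convert "$fileName" "$jpegName"
}
for i in file1.tif file2.tif file3.tif
do
  convert_one "$i" &
  while (( $(jobs -pr | wc -l) >= 4 )); do
    sleep 1                          # wait until a conversion slot frees up
  done
done
wait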
I have a bash script which will be running a main command, let's say for one hour. I would like to execute another command after a certain time since the main command has been started (at t_x). Something like this:
main starts -------> main ends
|
|
at time t_x, second command is executed
At the moment I have something like this:
mpirun main_command & sleep 1m && second_command
and the problem is that after the second command is executed, the main command is killed. Can anyone help me? Thanks!
The first command is failing to lock the console, as another process is also using it. You will need to redirect the standard I/O pipelines:
0<&- mpirun main_command >/dev/null 2>/dev/null
If this still does not work, run the main command through a subshell:
sh -c 'mpirun main_command' & sleep 1m; second_command
You can use ; instead of &&, unless you need a failing exit code when someone interrupts the sleep.
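Putting both suggestions together (a sketch; mpirun main_command and second_command stand in for the real commands):
0<&- sh -c 'mpirun main_command' >/dev/null 2>&1 &
sleep 1m
second_command
wait   # optionally block until mpirun finishes before the script exits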
I would like to run n processes (in my case simulations) simultaneously, using bash.
Right now this is what I'm running:
for file in $ini/SAN*.ini
do
  echo "Running $file..."
  temp=$(basename "$file" .ini)
  mosrun -G opp_run -r 0 -u Cmdenv -n ..:../../src -l ../../src/inet SAN.ini > "$outputs/$temp.out"
done
Problem is, the loop only progresses to the next iteration after the simulation is done. Any suggestions? Thanks!
You should be able to run your command in the background by adding an & after it.
That should make them run in parallel, albeit in the background.
(Small side note: the processes will continue to run even if you abort the script, so you might want to add a trap to kill them if you hit e.g. ctrl-c while the script is running. Look at the bash manual. A sketch is below.)
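A minimal sketch combining the loop from the question with an &, a final wait, and a trap (the trap and wait lines are additions; everything else is the question's code):
trap 'kill $(jobs -pr) 2>/dev/null' INT TERM   # kill running simulations on ctrl-c
for file in $ini/SAN*.ini
do
  temp=$(basename "$file" .ini)
  mosrun -G opp_run -r 0 -u Cmdenv -n ..:../../src -l ../../src/inet SAN.ini > "$outputs/$temp.out" &
done
wait   # block until all background simulations have finished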