Run Multiple Shell Scripts From One Shell Script - shell

Here's what I'm trying to do. I have 4 shell scripts. Script 1 needs to run first, then 2, then 3, then 4, in that order. Script 1 needs to be running (and waiting in the background) for 2 to function properly, but 1 takes about 3 seconds to get ready for use. I tried ./1.sh & ./2.sh & ./3.sh & ./4.sh, but this results in a total mess, since 2 starts requesting things from 1 before 1 is ready. So, my question is: from one shell script, how do I start script 1, wait about 5 seconds, start script 2, wait about 5 seconds, and so on, without stopping any previous script (they all have to keep running in the background for any higher-numbered script to work)? Any suggestions would be much appreciated!

May I introduce you to the sleep command?
./1.sh & sleep 5
./2.sh & sleep 5
./3.sh & sleep 5
./4.sh

#!/bin/sh
./1.sh & sleep 5; ./2.sh & sleep 5; ./3.sh & sleep 5; ./4.sh
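If a blind sleep feels fragile, and the scripts can signal readiness somehow (say, by creating a file once initialized — an assumption, since the question doesn't say how 1.sh becomes "ready"), you can poll for that signal instead of guessing at a delay. A minimal sketch:

```shell
#!/bin/sh
# Hypothetical helper: wait until a file appears, checking every
# 0.1 seconds, up to roughly tries * 0.1 seconds in total.
# Returns nonzero if the file never shows up.
wait_for_file() {
    file=$1
    tries=${2:-50}
    while [ ! -e "$file" ] && [ "$tries" -gt 0 ]; do
        sleep 0.1
        tries=$((tries - 1))
    done
    [ -e "$file" ]
}

# Simulate script 1 creating its ready-file after a short delay
# (this background subshell stands in for ./1.sh).
rm -f /tmp/one.ready
( sleep 0.3; touch /tmp/one.ready ) &
wait_for_file /tmp/one.ready && echo "script 1 is ready"
```

The same helper call would then gate the start of 2.sh, 3.sh, and so on, and the wrapper moves on as soon as each script is actually ready rather than after a worst-case delay.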

Related

Bash - kill a command after a certain time [duplicate]

This question already has answers here:
Timeout a command in bash without unnecessary delay
(24 answers)
Closed 1 year ago.
In my bash script I run a command that launches another script. I repeat this command many times in a for loop, and I want to wait until each run is finished before starting the next. My bash script is as follows:
for k in $(seq 1 5)
do
sed_param='s/mu = .*/mu = '${mu}';/'
sed -i "$sed_param" brusselator.c
make brusselator.tst &
done
As far as I know the & at the end lets the script know to wait until the command is finished, but this isn't working. Is there some other way?
Furthermore, sometimes the command can take very long; in that case I would want to wait at most 5 seconds. But if the command finishes earlier I wouldn't want to wait the full 5 seconds. Is there some way to achieve this?
There is the timeout command. Run it in the foreground (without the trailing &) so the loop waits for each run:
timeout 5 make brusselator.tst
Maybe you would like to see also if it exited successfully, failed or was killed because it timed out.
timeout 5 make brusselator.tst && echo OK || echo Failed, status $?
If the command times out and --preserve-status is not set, the command exits with status 124. A different status would mean that make failed for some other reason before the timeout.
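To see the status convention in action, here is a self-contained check, with sleep standing in for a make run that takes too long:

```shell
#!/bin/sh
# timeout(1) is from GNU coreutils; it kills the command if it
# runs longer than the given duration and exits with status 124.
timeout 1 sleep 5
status=$?
if [ "$status" -eq 124 ]; then
    echo "timed out"
else
    echo "finished with status $status"
fi
```

Running this prints "timed out" after about one second, since the sleep never gets to finish.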

How to run multiple instances of command-line tool in bash script? + user input for script

I am trying to launch multiple instances of imagesnap simultaneously from a single bash script on a Mac. Also, it would be great to give (some of) the arguments by user input when running the script.
I have 4 webcams connected, and want to take series of images from each camera with a given interval. Being an absolute beginner with bash scripts, I don't know where to start searching. I have tested that 4 instances of imagesnap works nicely when running them manually from Terminal, but that's about it.
To summarise I'm looking to make a bash script that:
runs multiple instances of imagesnap;
takes user input for some of imagesnap's arguments;
ideally starts all the imagesnap instances at (almost) the same time.
--EDIT--
After thinking about this I have a vague idea of how this script could be organised using the ability to take interval images with imagesnap -t x.xx:
Run multiple scripts from within the main script
or
Use subshells to run multiple instances of imagesnap
Start each sub script or subshell in parallel if possible.
Since each instance of imagesnap will run until terminated, it would be great if they could all be stopped with a single command.
The following quick hack (saved as run-periodically.sh) might do the right thing:
#!/bin/bash
interval=5
start=$(date +%s)
while true; do
    # run four copies of the given command in the background
    for i in 1 2 3 4; do
        "$@" &
    done
    # wait for all background jobs to finish
    wait
    # figure out how long we have to sleep
    end=$(date +%s)
    delta=$((start + interval - end))
    # if it's positive, sleep for this amount of time
    if [ "$delta" -gt 0 ]; then
        sleep "$delta" || exit
    fi
    start=$((start + interval))
done
If you put this script somewhere appropriate and make it executable, you can run it like:
run-periodically.sh imagesnap arg1 arg2
but while testing, I ran with:
sh run-periodically.sh sh -c "date; sleep 2"
which will cause four copies of "start a shell that displays the date, then wait a couple of seconds" to run in parallel every $interval seconds. If you want to run different things in the different jobs, you might want to put them into this script explicitly, or into another script which this one calls…
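On the "stop them all with a single command" wish from the question: if the wrapper collects the PID of each background job, one kill covers every instance. A sketch, with sleep standing in for imagesnap:

```shell
#!/bin/sh
# Collect the PID of every background job so a single kill
# can stop all of them at once.
pids=""
for cam in 0 1 2 3; do
    sleep 100 &          # placeholder for an imagesnap invocation
    pids="$pids $!"
done
kill $pids               # intentionally unquoted: one PID per word
wait 2>/dev/null         # reap the killed jobs
echo "all stopped"
```

Alternatively, pressing Ctrl-C in the terminal running the wrapper signals the whole foreground process group, which usually takes the background jobs down with it.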

Start background process with ssh, run experiments script, then stop it

I am running client-server performance experiments on several remote machines. I am trying to write a script to automate the experiments. Here is what it looks like (in a simplified way) at the moment.
for t in 0 1 2 3 4 5 6 7 8 9; do
    cmd1="ssh user@${client1} runclient --threads=${t}"
    cmd2="ssh user@${client2} runclient --threads=${t}"
    $cmd1 &
    $cmd2 &
    wait
done
runclient connects to a server that I have started manually. It works fine, but I would like to automate starting and stopping the server as well. That means
Start the server in the background at the beginning of experiments
Run all the experiments
Stop the server at the end of experiments
I have found several suggestions but I am not sure which one is right for me. Some recommend nohup, but I am not sure how to use it, and I don't understand why I should redirect stdin, stdout, and stderr. There is also the "-f" option to ssh to start a background process. In that case, how can I stop it later?
Edit: in response to the comments, the server is part of the performance experiments. I start it in a similar way to the client.
ssh user@${server} runserver
The only difference is that I want to start the server once, run several experiments on the clients with different parameters, and then stop the server. I could try something like that
ssh user@${server} runserver &
for t in 0 1 2 3 4 5 6 7 8 9; do
    cmd1="ssh user@${client1} runclient --threads=${t}"
    cmd2="ssh user@${client2} runclient --threads=${t}"
    $cmd1 &
    $cmd2 &
    wait
done
But as the server does not stop, the script would never get past the first wait.
Track your PIDs and wait for them individually.
This also lets you track failures, as shown below:
ssh "user@${server}" runserver & main_pid=$!
for t in 0 1 2 3 4 5 6 7 8 9; do
    ssh "user@${client1}" "runclient --threads=${t}" & client1_pid=$!
    ssh "user@${client2}" "runclient --threads=${t}" & client2_pid=$!
    wait "$client1_pid" || echo "ERROR: $client1 exit status $? when run with $t threads"
    wait "$client2_pid" || echo "ERROR: $client2 exit status $? when run with $t threads"
done
kill "$main_pid"
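The per-PID wait pattern is easy to try locally; in this sketch, sleep and false stand in for the two ssh commands (one client succeeds, one fails):

```shell
#!/bin/sh
# Stand-ins for the remote clients: one succeeds, one fails.
sleep 0.1 & client1_pid=$!
false &     client2_pid=$!

# wait PID returns that job's exit status, so failures can be
# reported per job instead of being swallowed by a bare wait.
wait "$client1_pid" && echo "client1 ok"
wait "$client2_pid" || echo "client2 failed with status $?"
```

One caveat with the real version: kill "$main_pid" only terminates the local ssh process; whether runserver also exits depends on how it reacts to the connection closing.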

Starting several programs and stopping after one of them finishes [duplicate]

This question already has an answer here:
Kill background process when another process ends in Linux
(1 answer)
Closed 4 years ago.
So I'm starting 4 programs at the same time and I want 3 of them to run in a loop until the 4th program finishes.
loopProgram1 &
loopProgram2 &
loopProgram3 &
Program4
So I want loopPrograms 1, 2, and 3 to execute and then all of them to exit once Program4 is done. Is there any way I can do this?
Here is one in bash that also cleans up the background processes if you kill the main script. First, the looper:
$ cat loopProgram
while true # loops forever, echoing the parameter number set in main
do
    echo $1
    sleep 1
done
and the main:
$ cat main.bash
function finish {
    kill %1 %2 %3
}
trap finish EXIT
bash loopProgram 1 &
bash loopProgram 2 &
bash loopProgram 3 &
sleep 3 # this mimics your Program4
Run it:
$ bash main.bash
2
1
3
1
2
3
2
1
3
$
Since my loopers loop forever, the trap matters: if you instead put a plain kill at the bottom of main.bash after the sleep, ^C'ing the script would leave the loopers running in the background.
(I've got the flu; I hope the example is clear. I'll probably forget it after the next nap.)
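A variant of the same trap idea, sketched here with sleep standing in for the loopers and Program4: kill whatever background jobs exist via jobs -p, so you don't have to name %1 %2 %3 explicitly (and the count can change without editing the trap):

```shell
#!/bin/bash
# On exit (normal end, or ^C), kill every background job we started.
cleanup() {
    kill $(jobs -p) 2>/dev/null
}
trap cleanup EXIT

sleep 100 &    # stands in for loopProgram 1
sleep 100 &    # stands in for loopProgram 2
sleep 100 &    # stands in for loopProgram 3
sleep 0.2      # stands in for Program4
```

When the stand-in for Program4 finishes, the script exits, the EXIT trap fires, and all three background jobs are terminated.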

Need to write a script that runs two scripts, but needs to stop the first one before the 2nd runs [duplicate]

This question already has answers here:
Timeout a command in bash without unnecessary delay
(24 answers)
Closed 6 years ago.
This is a CentOS 6.x box, on it I have two things that I need to run one right after the other - a shell script and a .sql script.
I want to write a shell script that calls the first script, lets it run and then terminates it after a certain number of hours, and then calls the .sql script (they can't run simultaneously).
I'm unsure how to do the middle part, that is, terminating the first script after a certain time limit. Any suggestions?
script.sh &
sleep 4h && kill $!
script.sql
This will wait 4 hours, then kill the first script and run the second. It always waits 4 hours, even if the script exits early.
If you want to move on immediately, that's a little trickier.
script.sh &
pid=$!
sleep 4h && kill "$pid" 2> /dev/null &
wait "$pid"
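The same pattern scaled down to fractions of a second so it can be tried interactively, with one extra line the snippet above leaves out: killing the leftover timer once the script finishes early (otherwise the background sleep lingers for the full duration):

```shell
#!/bin/sh
sh -c 'sleep 0.2' &              # stands in for script.sh, finishes early
pid=$!
sleep 5 && kill "$pid" 2>/dev/null &   # watchdog: kill after the limit
watchdog=$!
wait "$pid"                      # returns as soon as the script exits
echo "script done with status $?"
kill "$watchdog" 2>/dev/null     # stop the timer so it doesn't linger
```

With GNU coreutils available, timeout 4h script.sh collapses the watchdog bookkeeping into a single command, as in the answer to the duplicate question above.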
