I am making a CI build program.
I am using three scripts:
the first is for the build, the second is for the asset build, and the last one runs the first and second scripts.
In the last script, I want to wait for the first and second scripts until they are done.
Here are the test scripts.
The first script
#!/bin/sh
sleep 1
echo test2
exit 0
The second script
#!/bin/sh
sleep 1
echo test3
exit 0
The last script
#!/bin/sh
open -a /Applications/Utilities/Terminal.app ./screenTest2.sh &
open -a /Applications/Utilities/Terminal.app ./screenTest3.sh &
wait
echo test
How can I wait for the new terminals (or the scripts) to finish? I'm new to OS X, so could you explain the solution simply?
Bash instructions are sequential: just remove the '&' character at the end of each line, and each instruction will execute one after the other. You can safely remove wait as well; it will no longer be necessary.
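For example, a minimal sketch of the last script with the two helper scripts run directly in the same shell (this assumes they do not actually need their own Terminal windows):
#!/bin/sh
# Each command runs only after the previous one has finished,
# so there is nothing left to wait for.
./screenTest2.sh
./screenTest3.sh
echo test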
I have a simple bash script which uses the inotifywait command and, based on the modification that is detected, runs an rsync command. This is done in an infinite loop.
The script is run in a "screen" session, but unfortunately it crashes from time to time (no error as to why).
I've been searching for a way to "monitor" that specific screen/script and restart it when it crashes, but I'm struggling to find a solution to that.
The script is run with "screen -AmdS script ./script.sh".
Script example:
#!/usr/bin/bash
while inotifywait --exclude "(.log)" -r -e modify,create,delete /home/backups/
do
    rsync -avz --update --rsh='ssh -pxxxxx' /home/backups/* user@target:/location/ --delete --force
done
So my question basically is: is there a way to monitor the 'screen' session and start a new one if it stops, or is there another way to keep this script running constantly and restart it (possibly without using screen)?
You can rerun your failing script in a loop until it succeeds. You can do this with:
screen -AmdS script bash -c 'until ./script.sh; do echo "Crashed with exit code $?. Restarting."; sleep 1; done'
Every time your script fails, this will print that your script crashed and what exit code it had, pause for 1 second, and then rerun your script. As soon as your script succeeds (i.e., ./script.sh terminates with a zero exit code), the loop will terminate.
Note that if your script never succeeds, this is an infinite loop.
Edit: attempt to be clearer
You could just issue
until ./script.sh ; do echo crashed ; sleep 1 ; done
in your terminal. That will restart the script whenever it exits with a non-zero status.
I am writing a bash script to execute 2 commands at a time in 2 different terminals, with the original terminal waiting for both terminals to finish and then continuing with the remaining script.
I am able to open a different terminal with the required command; however, the original terminal does not seem to wait for the second one to complete and close, and proceeds with the rest of the script.
#!/bin/bash
read -p "Hello"
read -p "Press enter to start sql installation"
for i in 1
do
xterm -hold -e mysql_secure_installation &
done
echo "completed installation"
Use the Bash wait command to cause the calling script to wait for a background process to complete. Your for loop implies that you may be launching multiple background processes in parallel, even though in your question there's only one. Without any arguments, wait will wait for all of them.
I wonder why you're launching the processes in xterm instead of directly.
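Either way, a minimal sketch of the script with wait added (keeping the asker's xterm call):
#!/bin/bash
read -p "Hello"
read -p "Press enter to start sql installation"
for i in 1
do
    xterm -hold -e mysql_secure_installation &
done
wait    # block here until every background xterm has exited
echo "completed installation"
Note that -hold keeps each xterm window open after its command finishes, so wait will not return until the windows are closed by hand; drop -hold if the script should continue as soon as mysql_secure_installation exits.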
When running a bash script with source script or . script from the command line, all the lines in the script are added to bash's "source buffer", and then the current command shell just continues. Stopping execution is impossible (apart from aborting the shell): Ctrl-C only interrupts the current command, but then the next command is executed.
Where is this buffer, and would it be possible to clear it?
Example script:
echo A
sleep 10
echo B
sleep 10
echo C
sleep 10
echo D
After having done "source script", is there any way to stop it executing any further after it has been 'submitted'?
There is - to the best of my knowledge - no such thing as a source buffer in bash, so there is nothing to erase. The source command just executes the commands found in its arguments in the current environment (i.e. not in a child process).
There is nothing in the handling of source that is in particular related to the handling of signals. Maybe your shell script is set up to ignore Control-C? I suggest that you run your script with -x in order to find the culprit.
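As a tiny illustration of the "current environment" point (the file name setfoo.sh and the variable FOO below are just placeholders):
echo 'FOO=bar' > setfoo.sh    # hypothetical one-line script
source setfoo.sh              # runs in the current shell, not in a child process
echo "$FOO"                   # prints: bar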
From the command line, typing cat waits for user input.
But in the following script, wait ignores the background process.
#!/bin/bash
cat &
wait
echo "After wait"
This script immediately blasts right past the wait command. How can I make wait actually wait for the cat command to finish? I've tried waiting for the specific PID or job number, but the effect is the same.
That's because cat exits right away: stdin is not inherited. Try this instead:
cat <&0 &
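Putting that fix back into the original script gives roughly:
#!/bin/bash
# <&0 hands the script's own stdin to cat, so the background copy
# keeps reading until it sees end-of-file (e.g. Ctrl-D from the terminal).
cat <&0 &
wait
echo "After wait"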
I have a bash script with a loop that calls a heavy calculation routine on every iteration. I use the results from each calculation as input to the next. I need bash to stop reading further script commands until each calculation is finished.
for i in $(cat calculation-list.txt)
do
./calculation
(other commands)
done
I know the sleep program, and I used to use it, but now the time the calculations take varies greatly.
Thanks for any help you can give.
P.S.:
"./calculation" is another program, and it opens a subprocess. The script then passes instantly to the next step, but I get an error in the calculation because the previous one has not finished yet.
If your calculation daemon will work with a precreated empty logfile, then the inotify-tools package might serve:
touch $logfile
inotifywait -qqe close $logfile & ipid=$!
./calculation
wait $ipid
(edit: stripped a stray semicolon)
if it closes the file just once.
If it's doing an open/write/close loop, perhaps you can mod the daemon process to wrap some other filesystem event around the execution?
#!/bin/sh
# Uglier, but handles logfile being closed multiple times before exit:
# Have the ./calculation start this shell script, perhaps by substituting
# this for the program it's starting
trap 'echo >closed-on-calculation-exit' 0 1 2 3 15
./real-calculation-daemon-program
Well, guys, I've solved my problem with a different approach. When the calculation is finished, a logfile is created. I then wrote a simple until loop with a sleep command. Although this is very ugly, it works for me and it's enough.
for i in $(cat calculation-list.txt)
do
(calculations routine)
until [[ -f $logfile ]]; do
sleep 60
done
(other commands)
done
Easy. Get the process ID (PID) via some awk magic and then use wait to wait for that PID to end. Here are the details on wait from the Advanced Bash-Scripting Guide:
Suspend script execution until all jobs running in background have
terminated, or until the job number or process ID specified as an
option terminates. Returns the exit status of waited-for command.
You may use the wait command to prevent a script from exiting before a
background job finishes executing (this would create a dreaded orphan
process).
And using it within your code should work like this:
for i in $(cat calculation-list.txt)
do
./calculation >/dev/null 2>&1 & CALCULATION_PID=(`jobs -l | awk '{print $2}'`);
wait ${CALCULATION_PID}
(other commands)
done
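A slightly simpler variant of the same idea, if you prefer, is bash's $! special parameter, which holds the PID of the most recently started background job:
for i in $(cat calculation-list.txt)
do
    ./calculation >/dev/null 2>&1 &
    CALCULATION_PID=$!          # PID of the backgrounded ./calculation
    wait ${CALCULATION_PID}
    # (other commands)
done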