Stop process as soon as a file contains "Success" [duplicate] - shell

This question already has answers here:
How do I use a file grep comparison inside a bash if/else statement?
(5 answers)
How to kill a background process created in a script
(2 answers)
Closed 5 years ago.
I'm trying to stop a long running process as soon as the file /status contains the string Success.
I tried this without success:
cat & while [ `grep -q Success /status` ]; do sleep 1; done; kill %1
cat is the long running process that needs to be stopped when /status contains Success.
Cheers
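A minimal sketch of one approach, assuming the goal is simply to poll the file: grep -q prints nothing and reports only through its exit status, so test that status directly rather than wrapping the call in [ ... ].

cat &                           # stand-in for the long running process
until grep -q Success /status   # exits the loop once /status contains Success
do
    sleep 1
done
kill %1                         # stop the background job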


Bash user input while doing other stuff in background [duplicate]

This question already has answers here:
How do you run multiple programs in parallel from a bash script?
(19 answers)
Closed 10 months ago.
I am trying to do a while loop while waiting for user input.
read x
while [ $x == 3 ]
do
    echo yay
done
does not do what I want. Of course it doesn't work, but I'm including it here so no one gets confused about what I'm trying to do.
Well, I figured it out.
loop(){ while true; do echo -n .; sleep 2; done; }
loop & read x
kill %1
This defines a function called loop, runs it in background, waits for user input, then stops the background process.

Bash executes commands out of sequence sometimes [duplicate]

This question already has answers here:
Why didn't the shell command execute in order as I expect?
(4 answers)
bash script order of execution
(2 answers)
Closed 3 years ago.
I am running multiple commands in bash in parallel and need the output delimited so that the receiving script can separate the values.
I attempted this in a few ways, but it seems that every echo is executed instantly, with the command output only arriving afterwards.
So I am trying to find a way to separate the outputs, with a separator preceding each one.
I actually use a curl request that may take 50-200 ms to respond, but for simplicity I will give an example here with the time command.
Here is rough example:
echo ">" && time &
echo ">" && time &
echo ">" && time &
wait
This produces >>> time time time
I am looking for a way to make it produce >time>time>time
I had some success calling other bash scripts with a trailing echo command instead of running the actual commands. That works most of the time, but inevitably things get mixed up because of timing.
I will post updates as I work on it. Thank you for the help.
Try this:
echo ">$(time)" &
$(time) &">
echo ">$(time)" &
echo ">$(time)" &
wait
That tells echo that it needs the output of the time command before it can do its thing, so each separator is printed together with its own output.
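Applied to the curl case mentioned in the question (the URLs below are placeholders), the same pattern would look like this; the command substitution finishes before echo runs, so each separator is written in the same echo call as the response it introduces:

echo ">$(curl -s https://example.com/a)" &
echo ">$(curl -s https://example.com/b)" &
echo ">$(curl -s https://example.com/c)" &
wait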

How to handle sub-command errors in bash script? [duplicate]

This question already has answers here:
Why does "local" sweep the return code of a command?
(2 answers)
Why is bash errexit not behaving as expected in function calls?
(4 answers)
Closed 4 years ago.
I want to know how best to exit a script when an error occurs within a sub-command - specifically, in an assignment (i.e., of the form MYVAR="$(...)").
The minimal example of my problem is the following bash script.
#!/bin/bash
set -e

fail() {
    echo "Some error" >&2
    exit 1
}

main() {
    local my_val="$(fail)"
    echo 'Success!'
}

main
This will output the following:
Some error
Success!
What I am trying to figure out is how best to detect and handle the failure which occurs so that the Success stage is never reached.
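One common workaround, sketched here under the assumption that plain set -e handling is acceptable, is to separate the local declaration from the assignment so that local no longer masks the exit status of the command substitution:

#!/bin/bash
set -e

fail() {
    echo "Some error" >&2
    exit 1
}

main() {
    local my_val
    my_val="$(fail)"    # the assignment now carries fail's non-zero status
    echo 'Success!'     # not reached: set -e aborts on the failed assignment
}

main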

Get running time of script in variable (bash) [duplicate]

This question already has answers here:
get values from 'time' command via bash script [duplicate]
(4 answers)
How to store a substring of the output of "time" function in bash script
(3 answers)
Closed 5 years ago.
I'm looking to capture the execution time of an rsync transfer and then store that time in a variable for a later step in my bash script.
I have tried the following:
ELAPSED=$(time $(rsync -azh source/ dest &> /dev/null)), to no avail. The elapsed time is printed to the screen, but it is not saved in the ELAPSED variable.
I have also tried ELAPSED=$(/usr/bin/time sh -c "rsync -azh source/ dest &> /dev/null"), but this stores the output of the sh command in the variable, not the timing.
Any insights on a better method or corrections to my attempts are appreciated in advance!
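One simple sketch, assuming whole-second resolution is enough, is to use bash's built-in SECONDS counter and skip parsing time's output altogether:

SECONDS=0
rsync -azh source/ dest &> /dev/null
ELAPSED=$SECONDS               # seconds elapsed since SECONDS was reset
echo "rsync took ${ELAPSED}s"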

Executing ssh command in a bash shell script within a loop [duplicate]

This question already has answers here:
in my bash loop over a list of some servers, if the ssh connects the bash script exits
(2 answers)
Closed 8 years ago.
I'm trying to execute an ssh command within a bash shell script that should perform the following:
1) ssh to host
2) execute command
3) print value of command
4) repeat steps 1-3
5) exit bash shell script
I have set up passwordless entry to the remote host and added the host key to the remote host.
I want to test the various states of the httpd process running on the remote host.
Within a text file, httpd_process.txt, I have:
/etc/init.d/httpd status (stop, start, restart)
I do the following in the script:
while read LINE
do
    echo "Httpd Request: $LINE"
    status=`$LINE`
    echo "Status: $status"
    sleep 5    # sleep so the next request is not sent immediately
done < /path_name/httpd_process.txt
exit 0
I assumed that each time through the loop another input string is read from the input text file and the request is made to the remote host.
However, what I experience is that after the first request the script terminates.
Am I correct to assume that the first request creates a child process, and that once that process completes my script completes, so the next iteration of the loop is never executed?
ssh is consuming stdin. Pass it -n to prevent this.
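Building on that, a sketch of the loop with -n added; remote_host is a placeholder, since the original loop does not show the ssh call itself:

while read -r LINE
do
    echo "Httpd Request: $LINE"
    status=$(ssh -n remote_host "$LINE")   # -n stops ssh from draining the loop's stdin
    echo "Status: $status"
    sleep 5
done < /path_name/httpd_process.txt
exit 0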
