How to monitor the stdout of a command with a timer? - bash

I'd like to know when an application hasn't printed a line to stdout for N seconds.
Here is a reproducible example:
#!/bin/bash
dmesg -w | {
    while IFS= read -t 3 -r line
    do
        echo "$line"
    done
    echo "NO NEW LINE"
}
echo "END"
I can see NO NEW LINE, but the pipe doesn't stop and bash doesn't continue: END is never displayed.
How can I exit from the code inside the braces?
Source: https://unix.stackexchange.com/questions/117501/in-bash-script-how-to-capture-stdout-line-by-line

How can I exit from the code inside the braces?
Not all commands exit when they receive SIGPIPE or can't write to their output, and even those that do won't exit until they actually attempt a write and notice the failure. Instead, run the command in the background. If the intention is not to wait on the process, in bash you could just use process substitution:
{
    while IFS= read -t 3 -r line; do
        printf "%s\n" "$line"
    done
    echo "end"
} < <(dmesg -w)
You could also use a coprocess, or just run the command in the background with a pipe and kill it when you are done with it.
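For the last option, a minimal sketch (not from the answer; the FIFO path is arbitrary): run dmesg -w in the background writing to a named pipe, read from the pipe with a timeout, and kill the producer once it goes quiet.
fifo=$(mktemp -u)                  # arbitrary path for the named pipe
mkfifo "$fifo"
dmesg -w > "$fifo" &               # producer runs in the background
producer=$!
while IFS= read -t 3 -r line; do
    printf '%s\n' "$line"
done < "$fifo"
echo "NO NEW LINE"
kill "$producer"                   # stop dmesg -w explicitly
rm -f "$fifo"
echo "END"
Here the loop exits on the read timeout, and the explicit kill ensures the background dmesg -w doesn't linger.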

Related

Use read builtin command to read from parent stdin while in a subshell

I have a script that launches a subshell/background command to read input and then does more work:
#!/bin/bash
(
    while true; do
        read -u 0 -r -e -p "test_rl> " line || break
        echo "line: ${line}"
    done
) &
sleep 3600 # more work
With the above I don't even get a prompt. If I exec 3>&0 prior to launching the subshell and then read from descriptor 3 (-u 3) then I at least get the prompt, but the read command still doesn't get any input that I type.
How do I get the read builtin to read correctly from the terminal (parent's stdin file descriptor)?
How do I get the read builtin to read correctly from the terminal (parent's stdin file descriptor)?
You might want to try this (explicitly redirecting from the parent's file descriptors; without an explicit redirection, a non-interactive shell gives a background job its stdin from /dev/null):
#!/bin/bash
(
    while true; do
        read -u 0 -r -e -p "test_rl> " line || break
        echo "line: ${line}"
    done
) <&0 >&1 &
sleep 3600 # more work
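An alternative sketch, not from the answer: read from the controlling terminal directly via /dev/tty, which works regardless of what the background job's stdin was redirected to.
#!/bin/bash
(
    # /dev/tty is the controlling terminal, so stdin redirections upstream don't matter
    while IFS= read -r -e -p "test_rl> " line; do
        echo "line: ${line}"
    done
) </dev/tty &
sleep 3600 # more work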

Fails to read lines from running process in bash

Using process substitution, we can get every line of output of a command.
# Echoes one number per second, using process substitution
while read line; do
    echo $line
done < <(for i in $(seq 1 10); do echo $i && sleep 1; done)
In the same way as above, I want to get the stdout output of the wpa_supplicant command, while discarding its stderr.
But nothing can be seen on screen!
while read line; do
    echo $line
done < <(wpa_supplicant -Dwext -iwlan1 -c${MY_CONFIG_FILE} 2> /dev/null)
I confirmed that typing the same command at the prompt shows its output normally.
$ wpa_supplicant -Dwext -iwlan1 -c${MY_CONFIG_FILE} 2> /dev/null
What is the mistake? Any help would be appreciated.
Finally I found the answer here!
The problem was simple: buffering. Using stdbuf (and piping), the original code can be modified as below.
stdbuf -oL wpa_supplicant -iwlan1 -Dwext -c${MY_CONFIG_FILE} | while read line; do
    echo "! $line"
done
stdbuf -oL makes the stream line-buffered, so I can get each line from the running process as soon as it is printed.
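If you prefer to keep the process-substitution form from the question (so the loop body runs in the current shell rather than in a pipeline subshell), a sketch combining it with stdbuf:
# Same fix, but keeping the loop in the current shell via process substitution
while IFS= read -r line; do
    echo "! $line"
done < <(stdbuf -oL wpa_supplicant -Dwext -iwlan1 -c${MY_CONFIG_FILE} 2> /dev/null)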

Read full stdin until EOF when stdin comes from `cat` bash

I'm trying to read the full stdin into a variable:
script.sh
#!/bin/bash
input=""
while read line
do
    echo "$line"
    input="$input""\n""$line"
done < /dev/stdin
echo "$input" > /tmp/test
When I run ls | ./script.sh, or most other commands, it works fine.
However, it doesn't work when I run cat | ./script.sh, enter my message, and then hit Ctrl-C to exit cat.
Any ideas?
I would stick to the one-liner
input=$(cat)
Of course, Ctrl-D should be used to signal end-of-file; Ctrl-C sends SIGINT to the whole foreground pipeline, killing your script along with cat, so /tmp/test is never written.
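For reference, a minimal sketch of script.sh rewritten around the one-liner, keeping the original /tmp/test output:
#!/bin/bash
input=$(cat)                  # read all of stdin until EOF
printf '%s\n' "$input" > /tmp/test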

"allowed" operations in bash read while loop

I have a file file.txt which contains two lines.
first line
second line
I am trying to loop over it in bash using the following loop:
while read -r LINE || [[ -n "$LINE" ]]; do
    # sed -i 'some command' somefile
    echo "echo something"
    echo "$LINE"
    sh call_other_script.sh
    if ! sh some_complex_script.sh ; then
        echo "operation failed"
    fi
done <file.txt
When some_complex_script.sh is called, only the first line is processed; when it is commented out, both lines are processed.
some_complex_script.sh does all kinds of things, like starting processes, sqlplus, starting WildFly, etc.:
./bin/call_some_script.sh | tee $SOME_LOGFILE &
wait
...
sqlplus $ORACLE_USER/$ORACLE_PWD@$DB <<EOF
whenever sqlerror exit 1;
whenever oserror exit 2;
INSERT INTO TABLE ....
COMMIT;
quit;
EOF
...
nohup $SERVER_DIR/bin/standalone.sh -c $WILDFLY_PROFILE -u 230.0.0.4 >/dev/null 2>&1 &
My question is whether there are operations that shouldn't be called from some_complex_script.sh inside the loop (it may well take 10 minutes to finish; is that a good idea at all?) and that could break the loop.
The script is called via Jenkins and the Publish over SSH plugin. When some_complex_script.sh is run on its own, there are no problems.
You should close or redirect stdin for the other commands you run, to stop them from reading from the file, e.g.:
sh call_other_script.sh </dev/null
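Applied to the loop from the question, a sketch with stdin redirected for both inner scripts, so neither can swallow the lines meant for read:
while read -r LINE || [[ -n "$LINE" ]]; do
    echo "$LINE"
    # /dev/null on stdin keeps the inner scripts from consuming file.txt
    sh call_other_script.sh </dev/null
    if ! sh some_complex_script.sh </dev/null; then
        echo "operation failed"
    fi
done <file.txt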

Reading STDOUT from Shell Script (EASY)

I'm sure this is SUPER easy. I am trying to read a string from stdout. I submit a job to another machine and then I want to be able to run bjobs, which checks whether the job has finished. I want to read its stdout, detect when the job has finished, and then move on.
This is what I have, and it isn't working, but I feel super close!
Waiting for stdout to read "No unfinished job found":
bjobs
IFS= read -r line
echo "$line"
while "$line" != "No unfinished job found"
do
    echo "$line"
    sleep 30s
    bjobs
    IFS= read -r line
done
Any help would be appreciated! This is one of my first shell scripts.
The thing you are missing is that read reads from its stdin, not its stdout. So you have to arrange for its stdin to correspond to the stdout of the command you want it to read from. The straightforward way to do that is to use a pipe (|).
For example:
$ bjobs | ( IFS= read -r line; echo "$line"; while [ "$line" != "No unfinished job found" ]; do echo "$line"; sleep 30s; IFS= read -r line; done )
The ( ... ) creates a subshell ...
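If the goal is to poll until the job finishes, a sketch that re-runs bjobs on each iteration (assuming "No unfinished job found" is the exact message your LSF installation prints):
while true; do
    status=$(bjobs 2>&1)              # bjobs may print the message on stderr
    echo "$status"
    [ "$status" = "No unfinished job found" ] && break
    sleep 30
done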
