Capture output of a timed out command using Ubuntu `timeout` - bash

My question:
This command produces no output into foo.txt:
$ cat foo.sh
#!/bin/sh
sh -c "echo foo; sleep 5; echo bar" | awk '/foo/ {print $1}'
$ timeout 2 ./foo.sh > foo.txt
If I don't redirect into foo.txt, I see foo print out immediately as expected.
On the other hand, the following produces "foo" in the file foo.txt as expected:
$ timeout 2 sh -c "echo foo; sleep 5; echo bar" > foo.txt
$ cat foo.txt
foo
Does anyone know why this may be happening and how best to resolve it? This is a toy example, but the actual script I'm running that led to this problem produces around 100 lines of output on the command line, yet likewise leaves foo.txt empty if it times out before terminating.

I found a solution to this. The key is to add fflush() inside the awk script; awk was buffering its output:
$ cat foo.sh
#!/bin/sh
sh -c "echo foo; sleep 5; echo bar" | awk '/foo/ {print $1; fflush()}'
$ timeout 2 ./foo.sh > foo.txt
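The underlying cause is stdio buffering: awk's stdout is typically line-buffered when connected to a terminal but block-buffered when redirected to a file or pipe, so "foo" sits in the buffer until timeout kills the process; fflush() pushes it out after each print. As an alternative sketch that leaves the awk program untouched, assuming GNU coreutils stdbuf is available and your awk uses C stdio (mawk, for example, may need its own -W interactive option instead):
#!/bin/sh
# force awk's stdout into line-buffered mode so each match is written
# out immediately, even when redirected to a file
sh -c "echo foo; sleep 5; echo bar" | stdbuf -oL awk '/foo/ {print $1}'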

In my experience, this is because the pipe "|" waits for "echo foo; sleep 5; echo bar" to run to completion. So awk only gets the output after 5 seconds, but timeout terminates the command after 2 seconds, so awk never receives the text.
Edit:
Maybe this helps: you can move the closing double quote to the end, so the whole pipeline runs inside the inner shell, like this (note that $1 must be escaped as \$1 so the script's own positional parameters don't swallow it):
$ cat foo.sh
#!/bin/sh
sh -c "echo foo; sleep 5; echo bar | awk '/foo/ {print $1;}'"
$ timeout 2 ./foo.sh > foo.txt
$ cat foo.txt
foo

Related

Parse file to .aliasrc

I want to transform a string given in this form:
xyx some commands
into this form:
alias xyx="some commands"
I tried different combinations in the terminal. It seems (I'm not sure) that it worked once, but never when I run it from a script. I've read somewhere that this is a variable problem.
Aliases for readability:
alias first="sed 's/\s.*//'"
alias rest="sed 's/\S*\s*//'"
cat f_in | tee -a >(one=$(first)) >(two=$(rest)) | tee >(awk '{print "alias "$1"=\""$2"\""}' > f_out )
I used awk this way to parse the output of "cat f_in" into "print". It doesn't work. Then I used "awk -v", but that still doesn't work either. How do I redirect the variables $one and $two into awk:
{one=$(first) === first | read -r one }?
Is this what you're trying to do:
$ echo 'xyx some commands' |
awk '{var=$1; sub(/^[^[:space:]]+[[:space:]]+/,""); printf "alias %s=\"%s\"\n", var, $0}'
alias xyx="some commands"
$ echo 'xyx some commands' |
sed 's/\([^[:space:]]*\)[[:space:]]*\(.*\)/alias \1="\2"/'
alias xyx="some commands"
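If you'd rather avoid awk and sed entirely, plain read can do the same split, since the last variable in a read receives the whole remainder of the line. A minimal pure-shell sketch, assuming input in f_in and output in f_out as in the question:
# split each line into its first word and the rest, then emit an alias
while read -r name rest; do
    printf 'alias %s="%s"\n' "$name" "$rest"
done < f_in > f_out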

Ignoring all but the (multi-line) results of the last query sent to a program

I have an executable that accepts queries from stdin and responds to them, reading until EOF. Additionally I have an input file and a special command, let's call those EXEC, FILE and CMD respectively.
What I need to do is:
Pass FILE to EXEC as input.
Disregard all the output corresponding to commands read from FILE (send it to /dev/null).
Pass CMD as the last command.
Fetch output for the last command and save it in a variable.
EXEC's output can be multiline for each query.
I know how to pass FILE + CMD into the EXEC:
echo ${CMD} | cat ${FILE} - | ${EXEC}
but I have no idea how to fetch only output resulting from CMD.
Is there a magical one-liner that does this?
After looking around I've found the following partial solution:
mkfifo mypipe
(tail -f mypipe) | ${EXEC} &
cat ${FILE} | while read line; do
    echo ${line} > mypipe
done
echo ${CMD} > mypipe
This allows me to redirect my input, but now the output gets printed to screen. I want to ignore all the output produced by EXEC in the while loop and get only what it prints for the last line.
I tried what first came into my mind, which is:
(tail -f mypipe) | ${EXEC} > somefile &
But it didn't work, the file was empty.
This is race-prone -- I'd suggest putting in a delay after the kill, or using an explicit sigil to determine when it's been received. That said:
#!/usr/bin/env bash
# route FD 4 to your output routine
exec 4> >(
    output=; trap 'output=1' USR1
    while IFS= read -r line; do
        [[ $output ]] && printf '%s\n' "$line"
    done
); out_pid=$!
# Capture the PID for the process substitution above; note that this requires a very
# new version of bash (4.4?)
[[ $out_pid ]] || { echo "ERROR: Your bash version is too old" >&2; exit 1; }
# Run your program in another process substitution, and close the parent's handle on FD 4
exec 3> >("$EXEC" >&4) 4>&-
# cat your file to FD 3...
cat "$file" >&3
# UGLY HACK: Wait to let your program finish flushing output from those commands
sleep 0.1
# notify the subshell writing output to disk that the ignored input is done...
kill -USR1 "$out_pid"
# UGLY HACK: Wait to let the subprocess actually receive the signal and set output=1
sleep 0.1
# ...and then write the command for which you actually want content logged.
echo "command" >&3
In validating this answer, I'm doing the following:
EXEC=stub_function
stub_function() {
    local count line
    count=0
    while IFS= read -r line; do
        (( ++count ))
        printf '%s: %s\n' "$count" "$line"
    done
}
cat >file <<EOF
do-not-log-my-output-1
do-not-log-my-output-2
do-not-log-my-output-3
EOF
file=file
export -f stub_function
export file EXEC
Output is only:
4: command
You could pipe it into sed:
var=$(YOUR COMMAND | sed '$!d')
This will put only the last line into the variable.
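For example, with printf standing in for your command:
$ var=$(printf '%s\n' one two three | sed '$!d')
$ echo "$var"
three
Note this captures a single line only; since EXEC's output can be multi-line per query, it helps only when the last response fits on one line.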
I think that your program EXEC does something special (opens a connection or remembers state). When that is not the case, you can use
${EXEC} < ${FILE} > /dev/null
myvar=$(echo ${CMD} | ${EXEC})
Or with normal commands:
# Do not use (printf "==%s==\n" 1 2 3 ; printf "oo%soo\n" 4 5 6) | cat
printf "==%s==\n" 1 2 3 | cat > /dev/null
myvar=$(printf "oo%soo\n" 4 5 6 | cat)
When you need to give all input to one process, perhaps you can think of a marker that you can filter on:
(printf "==%s==\n" 1 2 3 ; printf "%s\n" "marker"; printf "oo%soo\n" 4 5 6) | cat | sed '1,/marker/ d'
You should examine your EXEC to see what could serve as a marker. When it is running SQL, you might use something like
(cat ${FILE}; echo 'select "DamonMarker" from dual;' ; echo ${CMD} ) |
${EXEC} | sed '1,/DamonMarker/ d'
and write this in a var with
myvar=$( (cat ${FILE}; echo 'select "DamonMarker" from dual;' ; echo ${CMD} ) |
${EXEC} | sed '1,/DamonMarker/ d' )
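As a self-contained illustration of the marker idea, with plain cat standing in for ${EXEC} (so every input line, including the marker, is echoed back verbatim; the file and command names here are made up for the demo):
$ FILE=queries.txt; CMD='last query'
$ printf '%s\n' 'query 1' 'query 2' > "$FILE"
$ myvar=$( (cat "$FILE"; echo 'DamonMarker'; echo "$CMD") | cat | sed '1,/DamonMarker/ d' )
$ echo "$myvar"
last query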

Pipe stdout to command which itself needs to read from own stdin

I would like to get the stdout of one process into another process without using stdin, as that is used for another purpose.
In short I want to accomplish something like that:
echo "a" >&4
cat | grep -f /dev/fd/4
I got it running using a file as the source for file descriptor 4, but that is not what I want:
# Variant 1
cat file | grep -f /dev/fd/4 4<pattern
# Variant 2
exec 4<pattern
cat | grep -f /dev/fd/4
exec 4<&-
My best try is this, but it produces the following error message:
# Variant 3
cat | (
    echo "a" >&4
    grep -f /dev/fd/4
) <&4
Error message:
test.sh: line 5: 4: Bad file descriptor
What is the best way to accomplish that?
You don't need to use multiple streams to do this:
$ printf foo > pattern
$ printf '%s\n' foo bar | grep -f pattern
foo
If instead of a static file you want to use the output of a command as the input to -f you can use a process substitution:
$ printf '%s\n' foo bar | grep -f <(echo foo)
foo
For POSIX shells that lack process substitution (e.g. dash, ash, yash, etc.): if the command accepts string input (grep does), and the string containing the search targets isn't especially large (i.e. it doesn't exceed the length limit for the command line), there's always command substitution:
$ printf '%s\n' foo bar baz | grep $(echo foo)
foo
Or, if the input is multi-line, separating the quoted search items with '\n' works the same as grep's OR operator \|:
$ printf '%s\n' foo bar baz | grep "$(printf "%s\n" foo bar)"
foo
bar
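If you really do want a separate input stream in a POSIX shell, as in the original attempt, a named pipe can play that role; a minimal sketch, where mypipe is a scratch FIFO created just for the demo (the background writer blocks until grep opens the pipe for reading):
$ mkfifo mypipe
$ echo 'foo' > mypipe &
$ printf '%s\n' foo bar | grep -f mypipe
foo
$ rm mypipe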

writing file using cat with here-document

$ cat > out.txt <<EOF
> Hello world
> EOF
$
How do I do this in a single statement?
Something like 'echo' in the following statement:
$ for i in {1..5}; do echo "Hello world" > out_$i.txt; done
You can use a here-string, which is a shortcut for short here documents in bash.
cat <<< "Hello world" > out.txt
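The same here-string drops straight into the loop from the question, and in shells without here-strings a single printf avoids cat entirely:
$ for i in {1..5}; do cat <<< "Hello world" > out_$i.txt; done
$ printf 'Hello world\n' > out.txt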

tab delimit a file in bash

I have two files. I would like to join them by column and convert them from tab delimited to space delimited.
What is needed on top of
paste fileA fileB
to make that work?
Through awk,
awk 'FNR==NR{a[FNR]=$1; next} {print a[FNR]"\t"$2}' file1 file2
Example:
$ cat m
cat
dog
$ cat r
foo bar
bar foo
$ awk 'FNR==NR{a[FNR]=$1; next} {print a[FNR]"\t"$2}' m r
cat bar
dog foo
Talking about pure bash, something like this, reading the two files on separate descriptors:
exec 3<file1
exec 4<file2
# loop while both files still have lines; the last variable of each
# read soaks up the remainder of its line
while read -r -u 3 f1_w && read -r -u 4 f2_w1 f2_w2; do
    printf '%s\t%s\n' "$f1_w" "$f2_w2"
done
exec 3<&- 4<&-
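For comparison, the same join can be written with paste, pulling the second column of the second file out with cut; the process substitution requires bash, and cut assumes single-space delimiters (it does not merge repeated separators). With the m and r files from above, -d' ' also gives the space-delimited output the question asks for:
$ paste -d' ' m <(cut -d' ' -f2 r)
cat bar
dog foo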
