bash: parse output of command and store into variable

I have made a command which returns 'version:X'.
e.g.:
$>./mybox -v
$>version:2
I don't understand why this isn't working:
$>VERSION=$( /home/mybox -v | sed 's/.*version:\([0-9]*\).*/\1/')
$>echo $VERSION
$>
If I write this, it is OK:
$>VERSION=$( echo "version:2" | sed 's/.*version:\([0-9]*\).*/\1/')
$>echo $VERSION
$>2
Regards

It's pretty common for version/error/debugging information to be sent to stderr, not stdout. When running the command from a terminal, both will be printed, but only stdout will make it through the pipe to sed.
echo writes to stdout by default, which is why you're not having trouble there.
If the above is correct, you'll just need to redirect stderr (file descriptor 2) to stdout (file descriptor 1) before passing it along:
VERSION=$( /home/mybox -v 2>&1 | sed 's/.*version:\([0-9]*\).*/\1/')
# ^^^^
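If you want to confirm this before changing the command, discard stdout and see whether the version line still appears; if it does, it is coming from stderr. A quick check, reusing the question's hypothetical /home/mybox path:
/home/mybox -v > /dev/null    # if "version:2" still prints, it was on stderr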

Related

shell: send grep output to stderr and leave stdout intact

I have a program that outputs to stdout (actually it outputs to stderr, but I can easily redirect that to stdout with 2>&1 or the like).
I would like to run grep on the output of the program, and redirect all matches to stderr while leaving the unmatched lines on stdout (alternatively, I'd be happy with getting all lines, not just the unmatched ones, on stdout).
e.g.
$ myprogram() {
cat <<EOF
one line
a line with an error
another line
EOF
}
$ myprogram | greptostderr error >/dev/null
a line with an error
$ myprogram | greptostderr error 2>/dev/null
one line
another line
$
A trivial solution would be:
myprogram | tee logfile
grep error logfile 1>&2
rm logfile
However, I would rather get the matching lines on stderr when they occur, not when the program exits...
Eventually, I found this, which gave me a hint for a POSIX solution like so:
greptostderr() {
    while read LINE; do
        echo $LINE
        echo $LINE | grep -- "$@" 1>&2
    done
}
For whatever reason, this does not output anything (probably a buffering problem).
A somewhat ugly solution that seems to work goes like this:
greptostderr() {
    while read LINE; do
        echo $LINE
        echo $LINE | grep -- "$@" | tee /dev/stderr >/dev/null
    done
}
Are there any better ways to implement this?
Ideally I'm looking for a POSIX shell solution, but bash is fine as well...
I would use awk instead of grep, which gives you more flexibility in handling both matched and unmatched lines.
myprogram | awk -v p=error '{ print > ($0 ~ p ? "/dev/stderr" : "/dev/stdout")}'
Every line will be printed; the result of $0 ~ p determines whether the line is printed to standard error or standard output. (You may need to adjust the output file names based on your file system.)
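As an aside, if you want to stay in plain POSIX shell without spawning a grep process per line, a case pattern can do the matching in-process. A sketch, assuming the pattern is a fixed substring passed as the first argument (not a regex, unlike grep):
greptostderr() {
    while IFS= read -r line; do
        printf '%s\n' "$line"                        # every line goes to stdout
        case $line in
            *"$1"*) printf '%s\n' "$line" >&2 ;;     # matches are copied to stderr
        esac
    done
}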

Redirect both stdout and stderr to file, print stdout only [duplicate]

I have a large amount of text coming in stdout and stderr; I would like to log all of it in a file (in the same order), and print only what comes from stdout in the console for further processing (like grep).
Any combination of > file or &> file, even with | or |&, will permanently redirect the stream, and I cannot pipe it afterwards:
my_command > output.log | grep something # logs only stdout, prints only stderr
my_command &> output.log | grep something # logs everything in correct order, prints nothing
my_command > output.log |& grep something # logs everything in correct order, prints nothing
my_command &> output.log |& grep something # logs everything in correct order, prints nothing
Any use of tee will either:
- print what comes from stderr, then log everything that comes from stdout and print it out, so I lose the order of the text that comes in, or
- log both in the correct order if I use |& tee, but then I lose control over the streams, since everything now arrives on stdout.
Example:
my_command | tee output.log | grep something # logs only stdout, prints all of stderr then all of stdout
my_command |& tee output.log | grep something # logs everything, prints everything to stdout
my_command | tee output.log 3>&1 1>&2 2>&3 | tee -a output.log | grep something # logs only stdout, prints all of stderr then all of stdout
Now I'm all out of ideas.
This is what my test case looks like:
testFunction() {
    echo "output";
    1>&2 echo "error";
    echo "output-2";
    1>&2 echo "error-2";
    echo "output-3";
    1>&2 echo "error-3";
}
I would like my console output to look like:
output
output-2
output-3
And my output.log file to look like:
output
error
output-2
error-2
output-3
error-3
For more details: I'm filtering the output of mvn clean install with grep to keep only minimal information in the terminal, but I would also like a full log somewhere in case I need to investigate a stack trace or something. The Java test logs are sent to stderr, so I chose to discard them in my console output.
While not really a solution that uses redirection or anything of that sort, you might want to use annotate-output for this.
Assume that script.sh contains your function; then you can do:
$ annotate-output ./script.sh
13:17:15 I: Started ./script.sh
13:17:15 O: output
13:17:15 E: error
13:17:15 E: error-2
13:17:15 O: output-2
13:17:15 E: error-3
13:17:15 O: output-3
13:17:15 I: Finished with exitcode 0
So now it is easy to reprocess that information and send it to the files you want:
$ annotate-output ./script.sh \
| awk '{s=substr($0,13)}/ [OE]: /{print s> "logfile"}/ O: /{print s}'
output
output-2
output-3
$ cat logfile
output
error
error-2
output-2
error-3
output-3
Or any other combination of tee, sed, cut, ...
As per a comment from @CharlesDuffy:
Since stdout and stderr are processed in parallel, it can happen that some lines received on
stdout will show up before later-printed stderr lines (and vice-versa).
This is unfortunately very hard to fix with the current annotation strategy. A fix would
involve switching to PTRACE'ing the process. Giving nice a (much) higher priority over the
executed program could, however, cause this behaviour to show up less frequently.
source: man annotate-output
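For completeness, bash process substitution can also log both streams while keeping only stdout on the console. The same caveat applies: the log is written by two independent processes, so the interleaving order between stdout and stderr lines is not guaranteed. A sketch:
my_command 2> >(tee -a output.log > /dev/null) | tee -a output.log | grep something
Here stderr is appended to the log and discarded from the console, while stdout is appended to the log and passed on to grep.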

Another way to redirect output in Bash

In bash, when we want to read a file, we use the cat command:
cat file.txt
But if we don't want to use whitespace, we can type:
{cat,file.txt}
Is there a way to redirect output without using the symbols >, <, or &?
I mean, is there an equivalent to this command:
cat file.txt > /dev/null
Thanks.
Piping to tee will redirect the output to a file (like >) but also print it to stdout:
cat file.txt | tee outfile.txt
tee -a will append the output to the file (like >>) while also printing to stdout:
cat file.txt | tee -a outfile.txt
I don't understand why you'd want to, but you could do:
eval {cat,input} "$(echo _/dev/null | tr _ '\076')"   # \076 is octal for '>'
Apart from tee, you may use exec to redirect the output:
exec 3>&1        # duplicate file descriptor 1 (stdout) onto fd 3
exec 1>fileout   # redirect fd 1 (stdout) to a file
{cat,file}       # this goes to fileout, as the question requires
exec 1>&3        # restore fd 1 to the saved stdout
{cat,38682813.c} # this is printed to stdout
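Applied to the /dev/null example from the question (the > characters appear only in the one-time exec setup, never on the cat command itself):
exec 3>&1 1>/dev/null   # save stdout, then point it at /dev/null
{cat,file.txt}          # output is discarded
exec 1>&3 3>&-          # restore stdout and close the spare descriptor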

Why can't I redirect the output from sed to a file

I am trying to run the following command
./someprogram | tee /dev/tty | sed 's/^.\{2\}//' > output_file
But the file is always blank when I go to check it. If I remove > output_file from the end of the command, I am able to see the output from sed without any issues.
Is there any way that I can redirect the output from sed in this command to a file?
Remove output buffering from the sed command using the -u flag, and make sure that what you want to log isn't on stderr:
-u, --unbuffered
load minimal amounts of data from the input files and flush the output buffers more often
Final command:
./someprogram | tee /dev/tty | sed -u 's/^.\{2\}//' > output_file
This happens with streams (usually a program sending output to stdout during its whole lifetime).
sed, grep, and other commands buffer their output when it is not going to a terminal, and you have to explicitly disable that buffering to get output while the program is still running.
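If a filter has no unbuffered flag of its own, stdbuf from GNU coreutils can usually force line buffering instead; a sketch using the same pipeline:
./someprogram | tee /dev/tty | stdbuf -oL sed 's/^.\{2\}//' > output_file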
You've got a stderr & stdout problem. Check out "In the shell, what does 2>&1 mean?" on this topic. It should fix you right up.

Capture log4j output with grep

I know that log4j outputs to stderr by default.
I have been capturing the output of my application with the following command:
application_to_run 2> log ; cat log | grep FATAL
Is there a way to capture the output without the auxiliary file?
If you want both stdout and stderr, use:
( application_to_run 2>&1 ) | grep FATAL
If you want both stderr alone, you can use:
( application_to_run 2>&1 >/dev/null ) | grep FATAL
The first sends all output destined for file descriptor 2 (stderr) to file descriptor 1 (stdout), then pipes that through grep. The second does the same but also sends stdout to the bit bucket. This works because redirections are processed left to right: first stderr is duplicated onto the current stdout (the pipe), then stdout is redirected to /dev/null.
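To see why the order matters, reverse the two redirections and stderr ends up in the bit bucket as well:
( application_to_run >/dev/null 2>&1 ) | grep FATAL   # grep sees nothing: stderr was duplicated onto /dev/null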
If you are asking how to redirect stderr to stdout so you can use it in a pipe, there are two ways I know of:
$ command 2>&1 | ...
$ command |& ...   # bash 4+ shorthand for 2>&1 |
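If instead you want stdout to keep flowing to the terminal untouched and only run grep over stderr, bash process substitution is one option; a sketch:
application_to_run 2> >(grep FATAL >&2)   # stderr is filtered through grep; stdout passes through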
