Assign Valgrind's report to a variable in a bash script

I'm working on a test suite for programs written in C. For this I made a bash script in which I run the submitted programs on all available test cases and compare their output to the expected output.
As a last test case, I'd also like to check for memory leaks. The idea is to run Valgrind using the last available test case as input, and then assign Valgrind's output to a variable (discarding the program's output), which I would then search with grep for certain errors, in order to output a summary in case any errors or leaks were detected.
I've tried several things, but so far I'm unable to assign Valgrind's output to a variable.
Last thing I tried was:
TEST=$(valgrind ./a.out < "${infiles["$((len-1))"]}" >/dev/null)
I still get Valgrind's report displayed in the terminal and if I try to echo "$TEST" in the bash script, I get nothing.

valgrind writes its report to stderr, not stdout, but $(...) only captures stdout. So you have to redirect stderr to stdout:
TEST=$(valgrind ./a.out < "${infiles["$((len-1))"]}" 2>&1 >/dev/null)
The order matters here: 2>&1 first points stderr at the captured stdout, and only then is stdout itself sent to /dev/null, so the report is captured while the program's own output is discarded.
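The same capture pattern can be verified without Valgrind installed. In this sketch a stand-in command group writes to both streams (the "ERROR SUMMARY" line is just illustrative text, not real Valgrind output, and "definitely lost" is one of the strings Valgrind uses in its leak summary):

```shell
# Stand-in for valgrind: one line to stdout, one to stderr.
# 2>&1 >/dev/null captures stderr into the variable and discards stdout.
report=$( { echo "program output"; echo "ERROR SUMMARY: 0 errors" >&2; } 2>&1 >/dev/null )
echo "$report"

# The captured report can then be grepped for problems:
if echo "$report" | grep -q "definitely lost"; then
    echo "memory leaked"
fi
```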

Related

Error piping before output

Let's say you have a group of commands, and some give errors, as follows:
begin;
echo "Starting Test";
ls;
bad_command -xyz;
end
If you don't redirect the output or errors, the result is as expected, i.e.
Starting Test
foo.txt
someotherfile.png
someDir
Unknown command: 'bad_command'
However, if I pipe the whole block to TextEdit by changing the last line to end 2>&1 | open -f -a TextEdit, the error comes first in the file, and the order is messed up. This also happens when piping to other commands. Why does that happen, and how can I prevent it?
You are having this problem because stdout is fully buffered when it goes to a pipe, while stderr is unbuffered, so the stderr output can arrive first. The only way to solve this is to not use a pipe: redirect your output to a temporary file instead, and then work with that file for what you need to do.
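A minimal sketch of the temp-file approach, with an ls of a path assumed not to exist standing in for bad_command:

```shell
# Redirect the whole group to a temp file instead of a pipe, so stdout
# and stderr land in the file in the order they were written.
tmp=$(mktemp)
{ echo "Starting Test"; ls /no/such/dir_xyz; echo "done"; } > "$tmp" 2>&1
cat "$tmp"     # "Starting Test" precedes the ls error, as typed
rm -f "$tmp"
```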

Pipe direct tty output to sed

I have a script I use for building a program; I pipe its output to sed to highlight errors and such during the build.
This works great, but the problem is that at the end of this build script it starts an application which usually writes to the terminal, and stdout and stderr redirection doesn't seem to capture that. I'm not exactly sure how this output gets printed, and it's kind of complicated to figure out.
buildAndStartApp # everything outputs correctly
buildAndStartApp 2>&1 | colorize # Catches build output, but not server output
Is there any way to capture all terminal output? The "script" command catches everything, but I would like the output to still print to my terminal rather than redirecting to a file.
I found out script has a -c option which runs a command and all of the output is printed to stdout as well as to a file.
My command ended up being:
script -c "buildAndStartApp" /dev/null | colorize
First, when you use script, the output does still go to the terminal (as well as redirecting to the file). You could do something like this in a second window to see the colorized output live:
tail -f typescript | colorize
Second, if the output of a command is going to the terminal even though you have both stdout and stderr redirected, it's possible that the command is writing directly to /dev/tty, in which case something like script that uses a pseudo-terminal is the only thing that will work.
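A minimal sketch of the pseudo-terminal approach, using the util-linux script syntax found on Linux (BSD/macOS script takes its arguments differently):

```shell
# -q suppresses script's start/end banners, -c runs the given command
# under a fresh pseudo-terminal, and /dev/null discards the typescript
# file, since here we only want the output streamed to stdout.
script -q -c 'echo captured-on-pty' /dev/null
```

Because the command runs on a pty, even writes aimed directly at the terminal end up in script's output stream, which can then be piped to colorize.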

Bash stderr and stdout redirection failing

I have a FORTRAN program whose output I want to redirect to a file. I've done this before using
$ myprog.out >> out.txt 2>&1
and for some reason this is not working. I tested it with another simple test program
$ myprog.test >> out.txt 2>&1
and it works.
When I run myprog.out the output goes to the screen as usual, but redirecting it seems to fail! It was working and now seems to have stopped working. It's very strange: I commented out a few print statements that I no longer wanted, recompiled, and then the redirect stopped working.
There is clearly something different going on with my output, but how do I diagnose where it is going?
You probably need to flush your output. See for example this SO topic. How to do that depends on your compiler, I guess, because only the Fortran 2003 standard includes the flush() statement and the ability to determine the unit numbers that correspond to stdout/stderr.
However, in gfortran (for example) you can use the flush() intrinsic procedure with the equivalents of the Unix file descriptors: UNIT=5 for stdin, UNIT=6 for stdout and UNIT=0 for stderr.
PROGRAM main
  PRINT *, "Hello!"
  CALL flush(6)
  CALL flush(0)
END PROGRAM main
With >> you are appending the output of your program to out.txt every time you run it.
Can you just try scrolling to the end of out.txt and see if your output is there?
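The append behavior of >> is easy to see in isolation:

```shell
# >> opens the file in append mode, so each run adds to the end
# rather than overwriting; the newest output is at the bottom.
rm -f out.txt
echo "first run"  >> out.txt
echo "second run" >> out.txt
cat out.txt
rm -f out.txt
```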

Log invoked commands of make

Is there a way to log the commands make invokes to compile a program? I know of the parameters -n and -p, but either they don't resolve if-conditions and just print them out, or they don't work when the Makefile recursively calls make itself.
This
make SHELL="sh -x -e"
will cause the shell (which make invokes to evaluate shell constructs) to print information about what it's doing, letting you see how any conditionals in shell commands are being evaluated.
The -e is necessary to ensure that errors in a Makefile target will be properly detected and a non-zero process exit code will be returned.
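What SHELL="sh -x -e" buys you can be seen by running a recipe-style command through the same shell invocation by hand:

```shell
# -x traces each command (to stderr) as the shell executes it, so you
# can see which branch of a conditional actually ran; -e makes the
# shell stop at the first failing command, so make sees the error.
sh -x -e -c 'if true; then echo branch-taken; else echo branch-skipped; fi'
```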
You could try to log execve calls with strace
strace -f -e execve make ...
Make writes each command it executes to the console, so
make 2>&1 | tee build.log
will create a log file named build.log as a side effect which contains the same stuff written to the screen. (man tee for more details.)
2>&1 combines standard output and errors into one stream. If you didn't include that, regular output would go into the log file but errors would only go to the console. (make only writes to stderr when a command returns an error code.)
If you want to suppress console output entirely in favor of logging to a file, it's even simpler:
make > build.log 2>&1
(Note the order: stdout must be redirected to the file before stderr is duplicated onto it; with the order reversed, errors would still go to the console.)
Because these just capture console output they work just fine with recursive make.
You might find what you're looking for in the annotated build logs produced by SparkBuild. That includes the commands of every rule executed in the build, whether or not "@" was used to prevent make from printing the command line.
Your comment about if-conditions is a bit confusing though: are you talking about shell constructs, or make constructs? If you mean shell constructs, I don't think there's any way for you to get exactly what you're after except by using strace as others described. If you mean make constructs, then the output you see is the result of the resolved conditional expression.
Have you tried the -d (debug) parameter?
Note that you can control the amount of information with --debug instead. For instance, --debug=a is the same as -d, and --debug=b shows only basic info.

bash - redirecting of stdout and stderr does not catch all output

I am writing some testing scripts and want to catch all error output and write it to an error log as well as all regular output and write that to a separate log. I am using a command of the form
cmd > output.file 2> error.file
The command I am writing test scripts for can cause a segmentation fault. When the command segfaults, bash still prints out segmentation fault to the terminal.
I want this not to happen, or to have the message redirected along with standard error.
Is that possible? It must be bash's doing, because both of the program's output streams are being redirected.
The "Segmentation fault" message is printed by the invoking shell itself, not by your program, so the redirections on cmd can't catch it. You can capture it by running the command in a subshell and redirecting that shell's output:
bash -c 'cmd >output.file 2>error.file' >bash_output.file 2>&1
I don't think segfaults are part of your program's output from the shell's point of view, so use Expect for more reliable output:
http://en.wikipedia.org/wiki/Expect
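A sketch of the wrapping trick, using kill -SEGV on a shell process as a stand-in for a real segfaulting program (assumes bash, whose message wording may vary slightly across versions):

```shell
# The innermost bash plays the "program" and dies of SIGSEGV. Its own
# redirections (>out.txt 2>err.txt) cannot catch the "Segmentation fault"
# message, because that message is printed by the shell that *observes*
# the death, not by the program itself. Wrapping everything in another
# bash and redirecting that shell's stderr does catch it.
bash -c 'bash -c "kill -SEGV \$\$" >out.txt 2>err.txt' >shell_msg.txt 2>&1
cat shell_msg.txt   # contains the shell's "Segmentation fault" message
rm -f out.txt err.txt shell_msg.txt
```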