I'm working on a test suite for programs written in C. For this I made a bash script in which I run the submitted programs on all available test cases and compare their output to the expected output.
As a last test case, I'd also like to check for memory leaks. The idea is to run Valgrind using the last available test case as input, assign Valgrind's output to a variable (discarding the program's own output), and then grep that variable for certain errors in order to output a summary in case some errors or leaks were indeed detected.
I've tried several things, but so far I'm unable to assign Valgrind's output to a variable.
Last thing I tried was:
TEST=$(valgrind ./a.out < "${infiles["$((len-1))"]}" >/dev/null)
I still get Valgrind's report displayed in the terminal, and if I try to echo "$TEST" in the bash script, I get nothing.
valgrind is writing its output to stderr, not stdout, but $(...) only captures stdout. So you have to redirect stderr to stdout.
TEST=$(valgrind ./a.out < "${infiles["$((len-1))"]}" 2>&1 >/dev/null)
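From there you can grep the captured report for the summary lines you care about. A minimal sketch, assuming the usual wording of Valgrind's "LEAK SUMMARY" and "ERROR SUMMARY" lines (add --leak-check=full to the valgrind call if you also want per-leak details):
# $TEST was filled by the command above; report only when something is non-zero
if echo "$TEST" | grep -q -e "definitely lost: [1-9]" -e "ERROR SUMMARY: [1-9]"; then
    echo "Valgrind reported problems:"
    echo "$TEST" | grep -e "definitely lost:" -e "indirectly lost:" -e "ERROR SUMMARY:"
fi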
If I have this batch file:
ScriptA.bat
someprog.exe
And this one:
ScriptB.bat
CALL ScriptA.bat
And I execute a command like:
ScriptB.bat > test.log
The output from someprog.exe is not logged. It flows through to the console. How can I avoid having to explicitly pipe the output of someprog.exe to a file, and instead just capture that from a higher level?
(Note I ultimately want to do this with a great many scripts launching assorted exes from inside that nesting, and I can't edit them all to redirect the output of each and every sub-process they invoke).
I found the answer to this on another SO thread:
https://stackoverflow.com/a/11955380/3220983
As you'll see if you read the comments under the question, the problem I was encountering was that the messages I couldn't capture were not being piped to stdout or stderr at all! They were going straight to the console, via something akin to a CON redirect, from inside the specific executable I was trying to use.
The link I posted shows how to launch a PowerShell script from a batch script, which captures the entire console window contents, inclusive of CON output!
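For reference, the heart of that technique is a PowerShell script that reads the console screen buffer itself, rather than any redirected stream. This is only a rough sketch of the idea, not the exact script from the linked answer; the output file name is just a placeholder, and it captures from the top of the buffer down to the cursor (so output that has scrolled out of the buffer is lost):
# Snapshot the visible console buffer, from row 0 down to the current cursor row.
$raw   = $Host.UI.RawUI
$rect  = New-Object System.Management.Automation.Host.Rectangle 0, 0, ($raw.BufferSize.Width - 1), ($raw.CursorPosition.Y - 1)
$cells = $raw.GetBufferContents($rect)       # 2-D array of BufferCell
$lines = for ($row = 0; $row -le $cells.GetUpperBound(0); $row++) {
    $line = ''
    for ($col = 0; $col -le $cells.GetUpperBound(1); $col++) {
        $line += $cells[$row, $col].Character
    }
    $line.TrimEnd()                          # drop the padding spaces on each row
}
$lines | Set-Content console-dump.txt
Called from the batch script after the exes have run (e.g. powershell -File dump-console.ps1, with the script name again just illustrative), it runs in the same console window and so snapshots everything currently shown there, including output that went straight to CON.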
I want to invoke mpg123 from PHP (using exec) and monitor the diagnostic output generated by the program while it is running.
I have been searching the Internet and cannot find any way to see the redirected output of a command line program while it is running.
Instead, the output file is always written out AFTER the process finishes, but I need to access the output while it is still running, hence my question.
Testing with:
mpg123.exe http://148.251.184.14:8192/stream | tee.exe streaming.txt
... the file streaming.txt is always empty while the exe is running.
[Editor's note: and so it would be; mpg123 sends its diagnostic output to stderr.]
Also, I tested this:
mpg123.exe http://148.251.184.14:8192/stream > streaming.txt
... and still no luck, because again, file streaming.txt is always empty while mpg123 is still running.
[Editor's note: of course, for the same reason as above, the command should be:
mpg123.exe http://148.251.184.14:8192/stream 2> streaming.txt
But still you see nothing in file streaming.txt until the program terminates.
end note]
Is there a way to do this? It seems to be a hard nut to crack, or maybe not even possible...
Thank you for any help.
PS:
Using static binary from: https://mpg123.de/download/win64/1.25.10/
Tee.exe: https://sourceforge.net/projects/unxutils/files/unxutils/current/
You could, for example, get tail from GnuWin32 (it's in package coreutils). Then:
In one command prompt window run tail -F output-file. This will initially sit there because there is no output-file yet. Let it sit.
In another command prompt window run your-command > output-file.
In the first command prompt window tail will display the contents of output-file as it is generated.
Note 1: The program your-command may buffer its output, so that it is written in chunks. Some programs have options to minimize output buffering, for example sed -u or grep --line-buffered.
Note 2: tail works as fast as it can, but console output is quite slow on Windows. It is perfectly possible for a program to generate output much faster than tail can display it.
I have tested this procedure with dir /s C:\ > Ls-lR.txt and tail Ls-lR.txt.
The quirks of MPG123
The specific program which the querent wants to monitor is MPG123. This program:
Does not normally write to standard output, and it actually closes standard output unless it wants to write WAV data.
Writes diagnostic messages to standard error, but only if standard error is not redirected or the option -v is given.
So...
Open a command prompt window and type tail -F mpg123.out. Since there is no file named mpg123.out, tail will sit and wait. Let it wait.
C> tail -F MPG123.out
Open a second command prompt window, and run mpg123
Redirecting standard error to mpg123.out, and
With the option -v.
C> mpg123.exe 2>MPG123.out -v "\path\to\the\music\file.mp3"
In the first window, watch the diagnostic messages of MPG123.
I have decided to delete my original answer and post a new one, because although the old one was factually correct it didn't answer the question very well. Now that I understand what the OP is actually doing, I can answer this properly.
The issue is actually very simple. Most programs, especially command line programs, on most platforms contain logic to detect if stdout or stderr has been redirected to a file (> file) or a pipe (e.g. | tee). This logic is usually actually buried in the runtime library so programs get it for free, which is why they pretty much all do it, and I'm sure that's true of mpg123 which is a relatively simple beast. What I say below will apply to almost any program.
Now, what this logic does is to decide whether or not to buffer output to stdout / stderr (it may make a different decision for each one). If output is going directly to the console (or, in Unix, the terminal) then it is not buffered at all (or maybe just on a per-line basis). Everything is sent out pretty much as soon as the program generates it.
If, on the other hand, output is redirected then mpg123 detects this and writes the data out in chunks (often 4k chunks), and if the total amount of output generated while the program is running is smaller than the size of the buffer then you won't see anything in the output file or pipe until the program terminates, at which point the buffer is flushed and the file closed (so you see it then, as the OP noted).
Now, knowing all that, we can explain the behaviour that the OP observes when running mpg123. This is not in fact down to any intricate juggling that mpg123 might do with file handles, and the change in behaviour when you add in -v is just a side-effect. What you see is a direct result of the different buffering strategy used when the output is redirected.
So, using the binary linked to by the OP, this command:
mpg123 http://148.251.184.14:8192/stream
Generates the following output on the console straightaway (because nothing is buffered):
High Performance MPEG 1.0/2.0/2.5 Audio Player for Layers 1, 2 and 3
version 1.25.10; written and copyright by Michael Hipp and others
free software (LGPL) without any warranty but with best wishes
Directory: http://148.251.184.14:8192/
Playing MPEG stream 1 of 1: stream ...
ICY-NAME: Chroma Metal
ICY-URL: http://chromaradio.com
MPEG 1.0 L III cbr128 44100 j-s
ICY-META: StreamTitle='Avantasia - The Seven Angels';
It then goes on to play the stream through the sound card, which takes quite a while. The above information is written to stderr (mpg123 writes its diagnostic information to stderr).
This command, however, behaves differently, because the output is buffered (note the redirection of stderr):
mpg123 http://148.251.184.14:8192/stream 2>x.txt
As noted by the OP, this just creates a zero-length file while the stream is playing, because the total amount of diagnostic output fits in mpg123's internal buffer, so it just stays there until the program terminates, at which point the output duly turns up in the file for the reason given above.
And finally, this command, with the -v parameter added in:
mpg123 -v http://148.251.184.14:8192/stream 2>x.txt
does generate some output in x.txt while the program is running because the buffer fills up with the extra diagnostic information that the -v flag generates and at that point mpg123 has to write it to disk. The -v flag means verbose. That's where the extra output comes from.
Please note though that when you do this the data in the file is still always some way behind (because the next buffer-full is building up and won't be output until it's full), so while adding -v might get you what you want (or at least some of it), it hasn't changed the underlying problem. You can see this quite clearly if you run the above command in one console window and tail -F x.txt in another. When you do that, nothing shows up for the first 5 seconds or so. Then some (partial) output appears, and so it goes on.
So I hope that clears things up. Windows and Unix behave pretty much the same in this regard. I will edit the OP's question to make it a little less confusing. It's a bit untidy at the moment.
Perhaps the "tee" already on the machine could be used. I do not have your mpg123.exe executable, so I cannot test it.
powershell -NoProfile -Command "& mpg123.exe [StreamURL] | Tee-Object -FilePath .\streaming.txt"
Edit
Based on the information from @AlexP that mpg123.exe is writing to stderr, I would try:
powershell -NoProfile -Command "& mpg123.exe [StreamURL] 2>&1 | Tee-Object -FilePath .\streaming.txt"
I need to redirect output & error streams from one Windows process (GNU make.exe executing armcc toolchain) to some filter written on perl. The command I am running is:
Make Release 2>&1 | c:\cygwin\bin\perl ../tools/armfilt.pl
The compilation process prints some messages which should then be written to STDOUT after some modifications. But I encountered a problem: all the prints generated by make are actually postponed until the end of the make process, and only then are they shown to the user. So, my questions are:
Why does this happen? I have tried changing the second process's (perl.exe) priority from "Normal" to "Above normal", but it didn't help...
How to overcome this problem?
I think one possible workaround may be to send only the STDERR prints to perl (that is what I actually need), not STDOUT+STDERR. But I don't know how to do that in Windows.
The Microsoft explanation concerning pipe operator usage says:
The pipe operator (|) takes the output (by default, STDOUT) of one command and directs it into the input (by default, STDIN) of another command.
But how to change this default STDOUT piping is not explained. Is it possible at all?
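[Editor's note: cmd.exe does let you rearrange the streams before the pipe. The idiom usually suggested is to duplicate STDERR onto STDOUT (which at that point is the pipe) and then point the original STDOUT at NUL, so that only the error stream reaches the filter:
Make Release 2>&1 1>nul | c:\cygwin\bin\perl ../tools/armfilt.pl
This is untested here against the armcc toolchain, and any buffering done by make or the compilers once their output is no longer a console is a separate issue. end note]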
I want to put my program's output into a file. I keyed in the following:
./prog > log 2>&1
But there is nothing in the file "log". I am using the Ubuntu 11.10 and the default shell is bash.
Does anybody know the cause of this, and how I can debug it?
There are many possible causes:
The program reads its input from the log file while you try to redirect into it with truncation (see Why doesn't "sort file1 > file1" work?)
The output is buffered, so you don't see data in the file until the output buffer is flushed. You can call fflush manually, or output std::flush if using C++ I/O streams, etc.
The program is smart enough and disables output if the output stream is not a terminal.
You look at the wrong file (i.e. in another directory).
You try to dump file's contents incorrectly.
Your program outputs '\0' as the first character so the output appears to be empty, even though there is some data.
Name your own.
Your best bet is to run this application under a debugger (like gdb) or use strace or ptrace (or both) and see what the program is doing. I mean, really, output redirection has worked for the last 40 years or so, so the problem must be somewhere else.
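For example, a quick way to see what the program actually writes, and to which file descriptor, is something along these lines (the option set is just one reasonable choice):
strace -f -e trace=write -o prog.trace ./prog > log 2>&1
Then look in prog.trace: write(1, ...) calls mean data really is going to stdout, so check the buffering and terminal-detection points above; their absence means the program simply isn't producing output in this configuration.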