Does anyone happen to know how to direct STDOUT in Terminal to the cache? Sometimes I would like to copy text from STDOUT somewhere else, e.g. into my mail program, and it always seems a bit inconvenient to either copy the output manually or create a new temporary file.
Is there an easy way to do this?
Thanks a lot!
Alex
It's not clear exactly what you're asking. But if you're talking about capturing stdout to a file whilst still being able to see it on the console, then you can use tee (assuming you're using *nix):
./myApp | tee stdout.txt
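A quick sketch (the file name is made up) of what tee does:

```shell
# tee copies its stdin both to the named file and to its own stdout,
# so you see the output on screen and keep a copy at the same time
printf 'build ok\n' | tee stdout.txt
# (on macOS, something like `./myApp | pbcopy` would put the output
#  on the clipboard instead of into a file)
```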
Related
I know what they are, but I don't know when I should use them. Are they useful? I think so, but I'd like to hear in which situations a file descriptor could be useful. Thanks :D
The most obvious case which springs to mind is:
myProgram >myProgram.output_and_error 2>&1
which sends both standard output and error to the same file.
I've also used:
myProgram 2>&1 | less
which lets me page through the output and errors in sequence (rather than having errors land in "arbitrary" places in the output on the terminal).
Basically, any time when you need to get at an already existing file descriptor, you'll find yourself using this.
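A sketch of why the order of redirections matters here (each `N>&M` duplicates whatever fd M points at *right now*; the file names are made up):

```shell
# a helper that writes one line to each stream
emit() { echo out; echo err 1>&2; }

emit > both.txt 2>&1   # fd 1 -> both.txt first, THEN fd 2 duplicates fd 1:
                       # both lines land in both.txt
emit 2>&1 > only.txt   # fd 2 duplicates the *old* fd 1 (the terminal) first,
                       # so "err" still reaches the terminal;
                       # only "out" ends up in only.txt
```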
Is there a way to make stdout and stderr distinguishable in unix bash/zsh/whatever?
Maybe turn the stderr output red or something like that.
It's always a pain when you're trying to figure out why you can't parse the output of command x. This often drives me crazy. Then I do the 2> thing, but by that point 30 minutes are already gone...
-Timo
I usually just do
command | grep '.*'
I have set $GREP_COLORS to
ms=01;32:mc=01;32:sl=:cx=:fn=35:ln=32:bn=33:se=36
which means stdout is rendered green while stderr stays black (or white).
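Note this relies on grep's coloring being in effect (e.g. via an alias for `--color=auto`); forcing it makes the effect visible:

```shell
# GNU grep colors the matched text using the `ms` capability of GREP_COLORS;
# since '.*' matches the whole line, every stdout line comes out colored,
# while stderr bypasses the pipe entirely and stays plain
export GREP_COLORS='ms=01;32'
echo "a normal stdout line" | grep --color=always '.*'
```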
Here is my solution: a tiny little dylib from GitHub: https://github.com/sickill/stderred
You can use stderred to automagically colorize stderr. It's a shared library that intercepts certain stream functions, so works with any program that uses those functions to write to your terminal.
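Typical usage is just preloading the library before running your command (the install path and `some_noisy_command` here are assumptions; check the project README for your system):

```shell
export LD_PRELOAD="/usr/local/lib/libstderred.so${LD_PRELOAD:+:$LD_PRELOAD}"
some_noisy_command   # anything it writes to fd 2 now shows up in red
```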
I have a project which builds using make, and I want to add the possibility to analyze the overall state of its warning messages.
My idea is to change the make rules so that the stderr output of the compilation is duplicated into a separate file during a full rebuild. That is, each time make all is run, all output will be printed to the console and, in addition, the stderr output will be duplicated into a separate file.
This warning report file will be added to the repository, so that I can compare the warnings existing in the repository with my local warnings.
The question is how to DUPLICATE (not redirect) the stderr output into a separate file, i.e. how should I change the all target in the Makefile?
I know how to redirect stderr output (make all 2>warning_report.txt), but that is not what I need: warning messages should appear both in the main console output and in the warning file.
I use Windows 7 as my work environment, but I have never dealt with the Windows command line or batch files before.
Thanks in advance.
Edited:
In my case the final solution looks like this:
make all 3>&1 1>&2- 2>&3- | tee.bat warning_report.txt
The tee.bat script (written in JScript for Windows) I took from the link specified by PA (thanks).
As for the swapping, I took it from here and here.
I don't know about Windows, but you can do it using the tee command on Linux. tee redirects STDOUT to a file as well as to the console, so you can take advantage of it and see whether it solves your problem.
make all 2>&1 1>stdout.log | tee stderr.log
This redirects STDERR to STDOUT and STDOUT to stdout.log; all of the STDERR output is copied to stderr.log and echoed on the console as well.
But the solution is not complete yet: STDOUT is not printed on the console, only copied to stdout.log. Try playing around with the commands and you will get the full solution.
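For the record, one way to complete it (a sketch; `fake_make` and the file name are stand-ins for `make all` and the report file): swap the two streams, tee what is now stdout, and send tee's output back to stderr so both streams still reach the console:

```shell
# stand-in for `make all`: writes one line to each stream
fake_make() { echo "compiling main.c"; echo "warning: unused variable" 1>&2; }

# inside the braces: fd 2 -> the pipe, fd 1 -> fd 3 (the real stdout);
# tee saves the stderr stream to warning_report.txt and re-emits it on stderr
{ fake_make 2>&1 1>&3 | tee warning_report.txt 1>&2; } 3>&1
```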
I just re-read your question and decided I would try to answer it.
Here's a snippet made to export stderr and display it.
@echo off
if exist stderr.error del stderr.error
this_is_not_a_command 2>stderr.error
if exist stderr.error type stderr.error & del stderr.error
This would export stderr to a file and then display the contents of that file.
Hope that helps.
I have, say, two terminal sessions: pts/10 and pts/11. In pts/10, I want to capture the stdout of any process running in pts/11 and redirect it to a file. I know the output can be redirected from pts/11 itself (using >/dev/pts/10), but I don't want to do that. As I said, I want to 'capture' whatever pts/11 prints to stdout. Is there some utility to do that?
I don't think you can do that, unless you start something on pts/11 (either an output redirect, tee /dev/pts/10, or the script command).
If it were possible, it could essentially be used for hacking/snooping.
Imagine capturing passwords from pts/10 when a command like wget --user=someuser --password=plain_text_password is run on terminal pts/11. (EDIT: OK, that was stdin, not stdout.) Still, there would be serious security issues if it were possible.
Quick question, hopefully... I'm building an application with a fairly extensive log file. I'd like the ability to monitor at any time what a specific instance of my application is doing. I could open and close the log file a bunch of times, but it's kind of a pain. Optimally, as lines are written to the log file, they would be written to the console as well. So I'm hoping something along the lines of cat exists that will actually block and wait for more content to be available in the input file. Anyone have any ideas?
tail -f logfile
this will keep it open and 'follow' the new output.
An alternate answer for variety: If you're already looking at the log file with less, press capital F to get it to do the same thing tail -f does: wait for new content to be appended and show it.
Look at the tee utility:
http://www.devdaily.com/blog/post/linux-unix/use-unix-linux-tee-command-send-output-two-or-more-directions-a