Redirecting stdin to a file with the file content being reflected on the console - bash

Is there a way to redirect stdin to a file but at the same time reflect what's being read from the file on the console?
Update: I'm trying to redirect the contents of a file to the standard input of a program, but at the same time reflect the standard input and output of that program on the console. I've tried something like:
echo "$(cat inputfile)" | tee /dev/tty | ./program
which doesn't seem to be the right thing to do.

What you are doing seems fine to me. You can avoid the crazy stuff, though:
tee /dev/tty <inputfile | ./program
An unquoted echo $(cat inputfile) would incidentally squish whitespace (the quoted form you used only strips trailing newlines). I assume that was not intentional, but if squishing is what you genuinely want to accomplish, try
tr -s '\n\t' ' ' <inputfile | tee /dev/tty | ./program
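To make the effect concrete, here is a minimal sketch; tr stands in for ./program, which is an assumption, since the real program isn't shown:
printf 'hello\nworld\n' > inputfile
# tee echoes each input line to the terminal while also passing it down the pipe
tee /dev/tty < inputfile | tr 'a-z' 'A-Z'
# the terminal shows hello/world (from tee) interleaved with HELLO/WORLD (from tr)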

Related

shell: send grep output to stderr and leave stdout intact

I have a program that outputs to stdout (actually it outputs to stderr, but I can easily redirect that to stdout with 2>&1 or the like).
I would like to run grep on the output of the program and redirect all matches to stderr while leaving the unmatched lines on stdout (alternatively, I'd be happy with getting all lines, not just the unmatched ones, on stdout),
e.g.
$ myprogram() {
cat <<EOF
one line
a line with an error
another line
EOF
}
$ myprogram | greptostderr error >/dev/null
a line with an error
$ myprogram | greptostderr error 2>/dev/null
one line
another line
$
A trivial solution would be:
myprogram | tee logfile
grep error logfile 1>&2
rm logfile
However, I would rather get the matching lines on stderr when they occur, not when the program exits...
Eventually I found this, which gave me a hint toward a POSIX solution like so:
greptostderr() {
    while read LINE; do
        echo $LINE
        echo $LINE | grep -- "$@" 1>&2
    done
}
For whatever reason, this does not output anything (probably a buffering problem).
A somewhat ugly solution that seems to work goes like this:
greptostderr() {
    while read LINE; do
        echo $LINE
        echo $LINE | grep -- "$@" | tee /dev/stderr >/dev/null
    done
}
Are there any better ways to implement this?
Ideally I'm looking for a POSIX shell solution, but bash is fine as well...
I would use awk instead of grep, which gives you more flexibility in handling both matched and unmatched lines.
myprogram | awk -v p=error '{ print > ($0 ~ p ? "/dev/stderr" : "/dev/stdout")}'
Every line will be printed; the result of $0 ~ p determines whether the line is printed to standard error or standard output. (You may need to adjust the output file names based on your file system.)
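For example, applied to the myprogram function from the question, this reproduces the desired behavior:
$ myprogram | awk -v p=error '{ print > ($0 ~ p ? "/dev/stderr" : "/dev/stdout")}' >/dev/null
a line with an error
$ myprogram | awk -v p=error '{ print > ($0 ~ p ? "/dev/stderr" : "/dev/stdout")}' 2>/dev/null
one line
another line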

How to save the terminal screen output in a piping command

I have a few commands that I'm piping. The first command gives a big file output, while its output on the screen is only a very short statistical summary of it. The big file output is processed fine through the pipe, but I'd like to save the short screen output into a text file. My question is how to do that within the pipe?
So far I've tried using tee with the below:
&> someFile.txt
> someFile.txt
>> someFile.txt
All of them gave me the big file output, but I'd like only the short screen output.
Any ideas how to do that?
If you just want the output of command_to_refine_big_output on stdout and in a file called log in the current directory, this works:
command_with_big_output | command_to_refine_big_output | tee log
Note that this only writes stdout to the log file; if you want stderr too, you can do:
command_with_big_output | command_to_refine_big_output 2>&1 | tee log
or, if you want all output, errors included, of the complete chain:
command_with_big_output 2>&1 | command_to_refine_big_output 2>&1 | tee log
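As a concrete sketch (du and sort/tail are stand-ins for your commands, assuming /var/log is readable on your system): du produces the big output, tail refines it to a short summary, and tee saves that summary while still showing it on screen:
du -a /var/log 2>&1 | sort -n | tail -5 | tee log
# the five largest entries appear on the screen and are also written to ./log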

why can't I redirect the output from sed to a file

I am trying to run the following command
./someprogram | tee /dev/tty | sed 's/^.\{2\}//' > output_file
But the file is always blank when I go to check it. If I remove > output_file from the end of the command, I am able to see the output from sed without any issues.
Is there any way that I can redirect the output from sed in this command to a file?
Remove output buffering from the sed command using the -u flag, and make sure what you want to log isn't on stderr.
-u, --unbuffered
load minimal amounts of data from the input files and flush the output buffers more often
Final command:
./someprogram | tee /dev/tty | sed -u 's/^.\{2\}//' > output_file
This happens with streams (usually a program sending output to stdout during its whole lifetime).
sed, grep, and other commands do some buffering in those cases, and you have to explicitly disable it to get output while the program is still running.
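To observe the difference yourself, compare these two pipelines (a sketch; interrupt either with Ctrl-C):
# without -u, out.log stays empty for a long time: sed block-buffers its
# output when it is writing to something other than a terminal
while :; do echo tick; sleep 1; done | sed 's/^/x/' > out.log
# with -u, out.log grows by one line per second
while :; do echo tick; sleep 1; done | sed -u 's/^/x/' > out.log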
You've got a stderr & stdout problem. Check out "In the shell, what does 2>&1 mean?" on this topic. Should fix you right up.

Write output to file with tabs/text added in ksh script

I am writing a KornShell (ksh) script that is logging to a file. I am redirecting the output of one of my commands (scp) to the same file, but I would like to add a tab at the start of those lines in the log file if possible.
Is this possible to do?
EDIT: Also I should mention that the text I am redirecting is coming from stderr. My line currently looks like this:
scp -q ${wks}:${file_location} ${save_directory} >> ${script_log} 2>&1
Note: the below doesn't work for ksh (see this question for possible solutions).
You probably can do something like
my_command | sed 's/^/\t/' >> my.log
The idea is to process the output of the command with a stream editor like sed in some manner. In this case, a tab will be added at the beginning of every line. Consider:
$ echo -e 'Test\nfoobar' | sed 's/^/\t/'
	Test
	foobar
I haven't tested this in ksh, but a quick web search suggests that it should work.
Also note that some commands can write to both stdout and stderr; don't forget to handle that.
Edit: in response to the comment and the edit in the question, the adjusted command can look like
scp -q ${wks}:${file_location} ${save_directory} 2>&1 | \
sed 's/^/\t/' >> ${script_log}
or, if you want to get rid of stdout completely,
scp -q ${wks}:${file_location} ${save_directory} 2>&1 >/dev/null | \
sed 's/^/\t/' >> ${script_log}
The technique is described in this answer.
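One caveat: \t in the sed replacement is a GNU extension and may not work with every system's sed. A more portable variant (a sketch, equally untested in ksh) inserts a literal tab captured via printf:
tab=$(printf '\t')    # command substitution keeps the tab character
scp -q ${wks}:${file_location} ${save_directory} 2>&1 | sed "s/^/${tab}/" >> ${script_log}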

bash, nested commands and redirects

I am trying to track the CPU usage of a process using a command like this:
top -b -d 1 | grep myprocess.exe
Next, I would like to redirect this to a log file, e.g.
top -b -d 1 | grep myprocess.exe > output.log
Now, this does not actually work, because it thinks I am grepping myprocess.exe > output.log instead of myprocess.exe.
Does anybody know how I can get this redirect to work?
Now, this does not actually work because it thinks I am grepping myprocess.exe > output.log instead of myprocess.exe
Wrong, all should be fine. The first example executes the pipeline with stdout set to your terminal (thus you see the output, but nothing is written to the file). The second example executes the pipeline with stdout set to output.log (thus you don't see the output, but it goes right into your file).
If you want the output written to both, you need another process that gets your previous pipeline's stdout as stdin, and duplicates it. Like:
previous_pipeline | tee output.log
tee prints on stdout what it gets on stdin (so as far as stdout is concerned, everything is the same as before), but it additionally opens another file (given as a command-line argument) and writes a copy to it.
Try tee:
top -b -d 1 | grep myprocess.exe | tee output.log
If you want it to show no output:
top -b -d 1 | grep myprocess.exe | tee output.log > /dev/null
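One caveat with a long-running producer like top: grep block-buffers its output when writing to a pipe rather than a terminal, so output.log can lag noticeably behind. With GNU grep you can force line buffering:
top -b -d 1 | grep --line-buffered myprocess.exe | tee output.log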
