redirect stdin and stdout using tee and keep previous std - bash

How can I both write to a file and display to screen using pipe with tee?
This command almost does it; the problem is that it overwrites the file on every run, so a tail -f watching the file complains that the file was truncated.
ls -al | tee file.txt

The -a option of tee is what you are looking for:
-a, --append
append to the given FILEs, do not overwrite
so your line would be:
ls -al | tee -a file.txt
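The difference is easy to see with a throwaway file (the /tmp name below is just illustrative):

```shell
# the first tee creates/truncates the file; the second run appends instead
echo "first run"  | tee    /tmp/tee_append_demo.txt > /dev/null
echo "second run" | tee -a /tmp/tee_append_demo.txt > /dev/null
wc -l < /tmp/tee_append_demo.txt    # 2: both lines survived in the file
```

Without -a the second run would have truncated the file down to a single line, which is exactly what upsets tail -f.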

Related

Bash output from expect script to two different files

I am trying to output to two different files using tee. My first file will basically be tail -f /myfile, and my second output will be a subset of the first. I have seen suggestions online to use
tee >(proc1) >(proc2)
I have tried the above but both my files are blank.
Here is what i have so far:
myscript.sh
ssh root@server 'tail -f /my/dir/text.log' | tee >(/mydir/my.log) >(grep 'string' /mydir/my.log > /mydir/mysecond.log)
myexpect.sh
#!/usr/bin/expect -f
set pass password
spawn /my/dir/myexpect.sh
expect {
"key fingerprint" {send "yes\r"; exp_continue}
"assword: " {send "$pass\r"}
}
interact
In your script there are some problems with the usage of tee:
tee >(/mydir/my.log): can be replaced with tee /mydir/my.log — process substitution expects a command, not a file name, and tee already writes to stdout as well as to the named files, i.e. /mydir/my.log.
grep 'string' /mydir/my.log > /mydir/mysecond.log: as mentioned, tee also writes to stdout, so there is no need to grep the string from the file; you can grep from stdout directly, using a pipeline.
So the whole command should be modified as follows:
ssh root@server 'tail -f /my/dir/text.log | tee /mydir/my.log | grep --line-buffered "string" > /mydir/mysecond.log'
Edit:
For your further question:
The command hangs because tail -f keeps waiting for the file to grow. If you don't want the command to hang, remove the -f from tail.
Depending on whether tail is run with -f, you need different handling to let grep write the file:
For the plain tail case: grep can write the file successfully as-is.
For the tail -f case: pass --line-buffered so grep line-buffers its output and matches appear in the file as they arrive, instead of sitting in a block buffer.
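The tee-then-grep shape of the fixed command can be sketched locally, without ssh or tail -f (the /tmp paths below are stand-ins for the poster's log files):

```shell
# /tmp/text.log stands in for the remote log being tailed
printf 'noise\nstring one\nnoise\nstring two\n' > /tmp/text.log
# the full copy goes to my.log; grep filters the subset into mysecond.log
cat /tmp/text.log | tee /tmp/my.log | grep 'string' > /tmp/mysecond.log
wc -l < /tmp/mysecond.log    # 2: only the lines containing "string"
```

Each stage reads from the previous one's stdout, so the subset file never has to re-read the full copy from disk.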

how does `ls "*" 2>&1 | tee ls.txt` work?

I was finding a way to save all output to a file and print it.
And the command like the following does work perfectly!
ls "*" 2>&1 | tee ls.txt
But I think I don't understand it well.
And I tried ls "*" | tee ls.txt. It doesn't work: the error message is not saved into ls.txt.
I also tried ls "*" 1>&2 | tee ls.txt, which behaved strangely. What happened?
2>&1 says "redirect stderr (2) to wherever stdout (1) goes". 1 and 2 are the file descriptors of stdout and stderr respectively.
When you pipe ls output to tee, only stdout goes to tee (without 2>&1). Hence, the error messages are not saved into ls.txt.
You can actually use:
ls "*" |& tee ls.txt
to pipe both stdout and stderr to tee command.
ls '*' is actually trying to list a file literally named *, since the quotes prevent the shell from expanding the glob.
Your command:
ls '*' 2>&1 | tee out
works by first redirecting stderr (2) to stdout (1), then piping the combined stream to tee.
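The effect of 2>&1 on the pipe is easy to reproduce with a path that does not exist (the file names below are arbitrary):

```shell
# ls writes its complaint to stderr; 2>&1 folds it into the pipe,
# so tee can capture it in the file
ls /no/such/file 2>&1 | tee /tmp/ls_err_demo.txt > /dev/null
[ -s /tmp/ls_err_demo.txt ] && echo "error message was captured"
```

Drop the 2>&1 and the message bypasses the pipe entirely: it goes straight to the terminal, and the file ends up empty.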
As mentioned by l3x, you are redirecting "standard error: 2" to "standard output: 1".
Note that where the redirection appears among the arguments makes no difference: the shell performs redirections while setting up the command, before ls runs. So
ls "*" 2>&1 | tee ls.txt
and
ls 2>&1 "*" | tee ls.txt
are equivalent — in both forms stderr is pointed at the pipe before any error message is produced, and tee saves it into ls.txt.
I hope that this was helpful.

bash command to grep something on stderr and save the result in a file

I am running a program called stm. I want to save only those stderr messages that contain the text "ERROR" in a text file. I also want the messages on the console.
How do I do that in bash?
Use the following pipeline if only messages containing ERROR should be displayed on the console (stderr):
stm |& grep ERROR | tee -a /path/to/logfile
Use the following command if all messages should be displayed on the console (stderr):
stm |& tee /dev/stderr | grep ERROR >> /path/to/logfile
Edit: Versions without connecting standard output and standard error:
stm 2> >( grep --line-buffered ERROR | tee -a /path/to/logfile >&2 )
stm 2> >( tee /dev/stderr | grep --line-buffered ERROR >> /path/to/logfile )
This looks like a duplicate of How to pipe stderr, and not stdout?
Redirect stderr to "&1", which means "the same place where stdout is going".
Then redirect stdout to /dev/null. Then use a normal pipe.
$ date -g
date: invalid option -- 'g'
Try `date --help' for more information.
$
$ (echo invent ; date -g)
invent (stdout)
date: invalid option -- 'g' (stderr)
Try `date --help' for more information. (stderr)
$
$ (echo invent ; date -g) 2>&1 >/dev/null | grep inv
date: invalid option -- 'g'
$
To copy the output from the above command to a file, you can use a > redirection or tee. The tee command prints one copy of the output to the console and writes a second copy to the file.
$ stm 2>&1 >/dev/null | grep ERROR > errors.txt
or
$ stm 2>&1 >/dev/null | grep ERROR | tee errors.txt
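With a shell function standing in for stm (which isn't available here), the stdout-discarding variant looks like this — fake_stm and its messages are invented for the demonstration:

```shell
# fake_stm: one normal line on stdout, one ERROR line on stderr
fake_stm() { echo "progress 50%"; echo "ERROR: disk full" >&2; }

# 2>&1 first sends stderr into the pipe, then >/dev/null discards stdout,
# so grep only ever sees the stderr stream
fake_stm 2>&1 >/dev/null | grep ERROR > /tmp/errors.txt
cat /tmp/errors.txt    # ERROR: disk full
```

The ordering matters: 2>&1 copies the pipe destination first, and only afterwards is stdout itself pointed at /dev/null.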
Are you saying that you want both stderr and stdout to appear in the console, but only stderr (not stdout) that contains "ERROR" to be logged to a file? It is that last condition that makes it difficult to find an elegant solution. If that is what you are looking for, here is my very ugly solution:
touch stm.out stm.err
stm 1>stm.out 2>stm.err & tail -f stm.out & tail -f stm.err & \
wait `pgrep stm`; pkill tail; grep ERROR stm.err > error.log; rm stm.err stm.out
I warned you about it being ugly. You could hide it in a function, use mktemp to create the temporary filenames, etc. If you don't want to wait for stm to exit before logging the ERROR text to a file, you could add tail -f stm.err | grep ERROR > error.log & after the other tail commands, and remove the grep command from the last line.

bash output redirect prob

I want to count the number of lines output from a command in a bash script. i.e.
COUNT=$(ls | wc -l)
But I also want the script to output the original output from ls. How to get this done? (My actual command is not ls and it has side effects. So I can't run it twice.)
The tee(1) utility may be helpful:
$ ls | tee /dev/tty | wc -l
CHANGES
qpi.doc
qpi.lib
qpi.s
4
info coreutils "tee invocation" includes the following example, which illustrates more of tee(1)'s power:
wget -O - http://example.com/dvd.iso \
| tee >(sha1sum > dvd.sha1) \
>(md5sum > dvd.md5) \
> dvd.iso
That downloads the file once, sends output through two child processes (as started via bash(1) process substitution) and also tee(1)'s stdout, which is redirected to a file.
ls | tee tmpfile | first command
cat tmpfile | second command
Tee is a good way to do that, but you can do something simpler:
ls > __tmpfile
cat __tmpfile | wc -l
cat __tmpfile
rm __tmpfile
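The temp-file approach above, made concrete with mktemp instead of a fixed name (seq stands in for the side-effecting command):

```shell
tmp=$(mktemp)                     # safer than a hard-coded name like __tmpfile
seq 5 > "$tmp"                    # run the command exactly once, keep its output
count=$(( $(wc -l < "$tmp") ))    # second consumer: the line count
cat "$tmp"                        # replay the original output unchanged
rm -f "$tmp"
echo "count=$count"               # count=5
```

Because the output is materialized on disk, any number of consumers can read it without re-running the command.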

Print STDOUT in the middle of 2 Pipes in Solaris(bash)

I have the same issue. This command:
./somescript.sh > ../log/scriptlog.log
requires the output of a command to go to stdout, but inside the script there is
command | mailx -s "Subject" recipient@somedomain.tld
what I would like to do is something like :
command | tee > /dev/stdout | mailx -s "Subject" recipient#somedomain.tld
Where the output of the command goes to stdout( to be redirected into the ..log/scriptlog.log file )
and also into stdin for the mailx command.
Any way to do that?
tee already sends to stdout.
... | tee -a log/scriptlog.log | ...
exec 3>&1
command | tee /dev/fd/3 | mailx ...
or, using process substitution:
command | tee >(mailx ...)
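The fd-3 variant can be sketched with files standing in for the terminal and for mailx (all /tmp names are hypothetical):

```shell
exec 3> /tmp/scriptlog.log        # fd 3 plays the role of the script's stdout/log
# tee copies the stream to fd 3 while the pipe feeds the "mailer" (cat here)
echo "report body" | tee /dev/fd/3 | cat > /tmp/mail_body.txt
exec 3>&-                         # close fd 3 when done
```

Both destinations receive the same line: the saved fd keeps the log path open even though stdout itself is consumed by the pipeline.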
I'll try process substitution. To clarify, I have a cron'd shell script. The cron entry is similar to:
/usr/script/myscript.sh > /usr/log/myscript.log
inside the script is a line similar to:
command | mailx -s "Subject" recipient
Since stdout from 'command' is being piped into the mailx command, it does not appear in the log file 'myscript.log', but I want it to.
I tried capturing it into a variable but the line feeds appear to be lost that way. I could use a temporary file, but I was hoping for something more elegant.