When I go and type
./script.txt
It displays the output in the terminal, but if I want to display it on the screen and store it in a file at the same time, how do I do this? Because if I do
./script.txt >> example.txt
It will only store it.
try
./script.txt 2>&1 | tee -a example.txt
The 2>&1 redirects stderr into the stdout stream. Now both streams come through the pipe to tee, which writes a copy of its input to the file AND also passes a copy through to its stdout.
I hope this helps.
P.S. You're not really naming your scripts with a .txt extension, are you? ;-)
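As a quick, self-contained check (demo.sh and example.txt here are throwaway names, not from the question):

```shell
# A tiny script that writes one line to stdout and one to stderr
cat > demo.sh <<'EOF'
#!/bin/sh
echo "to stdout"
echo "to stderr" >&2
EOF
chmod +x demo.sh

# Merge stderr into stdout, then tee shows everything on the
# terminal while also appending it to example.txt
./demo.sh 2>&1 | tee -a example.txt
```

Both lines appear on the terminal and both end up in example.txt.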
I am using exim 4.92.3 on CentOS 7.8.
I wanted to capture all the output from the command used for testing alias resolution (exim -d -bt adres#domain |& tee exim-test.out), but only stdout was displayed on the terminal and written to the file. When I split the outputs with exim [...] 1>1.out 2>2.out, the streams are separated and recorded as expected. How do I send both stdout and stderr from exim to one file, and why is it behaving like this?
Thank you in advance for help.
why is it behaving like this?
This can only be answered if you specify which shell you are using; it may be one that doesn't offer |&.
How to send both stdout and stderr from exim to one file
2>&1 will work, i.e. exim -d -bt adres#domain 2>&1 | tee exim-test.out.
tee is changing the order of lines
You may be able to avoid the perceived reordering by prepending stdbuf -oL to the exim command.
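For example (stdbuf is from GNU coreutils; the exim invocation is the one from the question):

```shell
# -oL forces exim's stdout to be line-buffered, so its lines
# interleave with stderr in something closer to their real order
stdbuf -oL exim -d -bt adres#domain 2>&1 | tee exim-test.out
```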
It's easy to redirect standard output and standard error to the same file or to separate files. What if I want to do both at the same time? That is, I'd like three files as output: standard output and standard error mixed together in order, plus standard output and standard error each in a separate file. Maybe something involving the "tee" command?
Thanks!
Following the ideas in the comments, use tee to place stdout and stderr each into its own file, and also into a combined file:
rm -f both.log
some-command 2> >(tee err.log >>both.log) | tee out.log >> both.log
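A quick way to check the behaviour with a stand-in for some-command (the function below just emits one line on each stream; bash is required for the >(...) process substitution):

```shell
#!/bin/bash
rm -f both.log out.log err.log

# Stand-in for some-command: one line to stdout, one to stderr
some_command() {
    echo "out line"
    echo "err line" >&2
}

some_command 2> >(tee err.log >> both.log) | tee out.log >> both.log
sleep 1  # the process substitution runs asynchronously; give it time to finish writing
```

Afterwards out.log and err.log each hold one stream, and both.log holds both.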
I would like to write some filtered output of a given command (rsync) into a file but keep the complete unfiltered output on stdout (the screen/terminal).
I've tried some combinations of sed, tee & process substitution but cannot make this work.
Here's what I've got so far:
rsync -aAXz --stats -v src dest > >(sed '0,/^$/d' | tee -a "summary.log")
sed '0,/^$/d' deletes everything up to and including the first blank line, which leaves rsync's summary and removes the leading verbose output. This is working as expected and only prints the summary to summary.log.
Obviously it also deletes the verbose output from stdout since the tee command only receives the filtered sed output over the pipe.
How can I write to stdout before filtering with sed to see all the verbose output on the screen/terminal?
To write your complete output from stdout to the log file, do that first (using tee), and then deal with the terminal. Something like this:
rsync -aAXz --stats -v src dest | tee -a "summary.log" | sed '0,/^$/d'
That "splits" or duplicates the output stream: tee diverts one copy of the complete stream to the log file and sends another copy to its stdout, which becomes the input to sed.
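The same pipeline can be tried with printf standing in for rsync (the lines below just simulate verbose output, a blank line, then a summary):

```shell
# Simulated rsync output: two verbose lines, a blank line, a summary line
printf 'file1\nfile2\n\nsent 123 bytes\n' \
    | tee -a "summary.log" | sed '0,/^$/d'
```

summary.log receives the complete stream, while only the summary line reaches the terminal.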
Sorry for the title; I couldn't find the proper words to explain my problem.
Here's the code:
wlan_c=$(iwconfig | sed '/^\(w.*$\)/!d;s/ .*//' > ./wifi_iface)
wlan=$(<./wifi_iface)
echo "$wlan"
I get the following output:
lo no wireless extensions.
enp4s0 no wireless extensions.
wlp2s0
The last line is the result of executing echo "$wlan".
The previous lines come from iwconfig; they are the ones not matched by the sed filter.
The file ./wifi_iface also contains the info I need.
Everything works as intended.
I just want to get rid of the unwanted output before the wlp2s0 line.
How do I do that?
That output must be going to stderr rather than stdout. Redirect it to /dev/null:
iwconfig 2>/dev/null | sed '/^\(w.*$\)/!d;s/ .*//' > ./wifi_iface
There's no need to assign this to wlan_c. Since you're writing to the file, nothing will be written to stdout, so the assignment will always be empty.
So I have a Linux program that runs in a while(true) loop: it waits for user input, processes it, and prints the result to stdout.
I want to write a shell script that opens this program, feeds it lines from a txt file one at a time, and saves the program's output for each line to a file.
So I want to know if there is any command for:
- opening a program
- sending text to a process
- receiving output from that program
Many thanks.
It sounds like you want something like this:
cat file | while read line; do
    answer=$(echo "$line" | prog)
done
This will run a new instance of prog for each line. The line will be the standard input of prog and the output will be put in the variable answer for your script to further process.
Some people object to the "cat file |" as this creates a process where you don't really need one. You can also use file redirection by putting it after the done:
while read line; do
    answer=$(echo "$line" | prog)
done < file
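A complete, runnable version of that loop, with rev standing in for prog (input.txt and output.txt are placeholder names):

```shell
# One query per line
printf 'hello\nworld\n' > input.txt

while IFS= read -r line; do
    # rev stands in for prog; it simply reverses each line
    answer=$(echo "$line" | rev)
    echo "$answer" >> output.txt
done < input.txt
```

Using IFS= read -r instead of a bare read preserves leading whitespace and backslashes in the input lines.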
Have you looked at pipes and redirections? You can use pipes to feed the output of one program into another. You can use redirection to send the contents of files to programs, and/or write output to files.
I assume you want a script written in bash.
To run a program you just type its name.
To send text to a program you either pipe it in with | or redirect a file to its stdin with <.
To capture output you use > to redirect it to a file, or >> to append to the file instead of truncating it.
To achieve what you want in bash, you could write:
#!/bin/bash
cat input_file | xargs -I{} your_program {} >> output_file
This calls your_program once for each line of input_file and appends the results to output_file.