Write command output to file plus console from another console - shell

I am developing a script that I have to run from the command line (say T1). The script has to open another terminal (T2), and the output of this terminal (T2) has to be redirected to a file so that I can parse the file from the main script (T1). I know how to open a new terminal (T2) from the main terminal (T1):
gnome-terminal -e "ant" 2>&1
I also know how to send command output to a file plus the console by using tee:
ls | tee /home/xyz.txt
So I try to run T2 from T1 and redirect T2's output to xyz.txt by doing this:
gnome-terminal -e "ant" 2>&1 | tee /home/xyz.txt
However, xyz.txt doesn't get the output from T2.
So how do I get the output of T2 into xyz.txt from T1?

While this sounds very convoluted and looks like an XY problem, here's one way to do it (tested with xterm instead of gnome-terminal):
gnome-terminal -e "ant | tee $(tty) xyz.txt; read dummy"
The $(tty) expands in T1, so it names the terminal device where you start the new terminal, not the new terminal itself. The read is optional; it waits for ENTER so you can see what's on the new terminal's display before it closes.
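For clarity, here is the same command with the substitution written out by hand (assuming T1's device happens to be /dev/pts/0; the actual device will differ on your system):
gnome-terminal -e "ant | tee /dev/pts/0 /home/xyz.txt; read dummy"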

Related

How to capture the output of a bash command which prompts for a user's confirmation without blocking the output nor the command

I need to capture the output of a bash command which prompts for a user's confirmation without altering its flow.
I know of only two ways to capture a command's output:
- output=$(command)
- command > file
In both cases, the whole process is blocked without any output.
For instance, without --assume-yes:
output=$(apt purge 2>&1 some_package)
I cannot print the output back because the command is not done yet.
Any suggestion?
Edit 1: The user must be able to answer the prompt.
Edit 2: I used dash-o's answer to complete a bash script allowing a user to remove/purge all obsolete packages (which have no installation candidate) from any Debian/Ubuntu distribution.
To capture partial output from a command that is waiting for a prompt, one can use a tail on a temporary file, potentially with tee to keep the output flowing if needed. The downside of this approach is that stderr needs to be tied to stdout, making it hard to tell the two apart (if that is an issue).
#!/bin/bash
log=/path/to/log-file
echo > "$log"
(
    # Wait until the prompt string shows up in the log, then act on the partial output
    while ! grep -q -F 'continue?' "$log" ; do sleep 2 ; done
    output=$(<"$log")
    echo do-something "$output"
) &
# Run command with output to terminal
apt purge some_package 2>&1 | tee -a "$log"
# If output to terminal is not needed, replace the above command with
apt purge some_package > "$log" 2>&1
There is no generic way to tell (from a script) when exactly a program prompts for input. The above code looks for the prompt string ('continue?'), so this will have to be customized per command.
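If the same pattern is needed for several commands, the polling loop can be wrapped in a small helper (a sketch; wait_for_prompt is a made-up name, not part of the answer above):
# Hypothetical helper: block until the given prompt string appears in the log file.
wait_for_prompt() {
    local log=$1 prompt=$2
    while ! grep -q -F "$prompt" "$log"; do sleep 2; done
}
# Usage, mirroring the example above:
# wait_for_prompt "$log" 'continue?' && output=$(<"$log")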

How to save command output to a file in Windows and show output on terminal also?

I want to save the output of a command in a file as well as display it on the terminal.
For saving the output to a file I am using the following command:
ps -ef > D:\temp.txt
But this saves the output to the file and does not display it on the terminal. Which command should I use to meet both requirements?
Use MinGW, and then use the following command:
ps -ef 2>&1 | tee D:\temp.txt
This will work.
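If the file should grow across runs instead of being overwritten, tee's -a flag appends (a small variation on the command above):
ps -ef 2>&1 | tee -a D:\temp.txt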

Can't redirect command output to file

I've been writing a bash script to install PBIS Open. When I execute the command domainjoin-cli join $domain $join_account $password, I can see the output on the terminal. However, if I try to capture the terminal output and save it to a file, the file is empty.
I've tried adding <cmd> > output.txt
I've tried using
script output.txt
<cmd>
exit
I've searched for a day now but I can't seem to find a working solution.
There are two output streams, stdout and stderr. The output is probably coming out on the stderr stream. The > by itself will only capture stdout.
Try executing with
<cmd> &> filename
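&> is bash shorthand for sending both stdout and stderr to the same file; the portable long form, and a variant that also keeps the output on the terminal via tee, look like this (<cmd> is the placeholder from the question):
<cmd> > filename 2>&1
<cmd> 2>&1 | tee filename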

How do I automatically save the output of the last command I've run (every time)?

If I wanted to have the output of the last command stored in a file such as ~/.last_command.txt (overwriting output of previous command), how would I go about doing so in bash so that the output goes to both stdout and that file? I imagine it would involve piping to tee ~/.last_command.txt but I don't know what to pipe to that, and I definitely don't want to add that to every command I run manually.
Also, how could I extend this to save the output of the last n commands?
Under bash this seems to have the desired effect.
bind 'RETURN: "|tee ~/.last_command.txt\n"'
You can add it to your bashrc file to make it permanent.
I should point out it's not perfect. Just hit the Enter key on an empty line and you get:
matt@devpc:$ |tee ~/.last_command.txt
bash: syntax error near unexpected token `|'
So I think it needs a little more work.
This will break programs/features expecting a TTY, but...
exec 4>&1    # save the original stdout on file descriptor 4
# Before each prompt: restore stdout, rotate the previous command's capture into
# place, and start teeing the next command's output into the temp file.
PROMPT_COMMAND="exec 1>&4; exec > >(mv ~/.last_command{_tmp,}; tee ~/.last_command_tmp)"
If it is acceptable to record all output, this can be simplified:
exec > >(tee ~/.commands)
Overwrite for 1 command:
script -c ls ~/.last_command.txt
If you want more than 1 command:
$ script ~/.last_command.txt
$ command1
$ command2
$ command3
$ exit
If you want to save output during one login session, add script to your .bashrc.
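One way to do the .bashrc variant without script re-invoking itself from the shell it starts is a guard variable (a sketch; SCRIPT_LOGGING is a made-up name):
# In ~/.bashrc
if [ -z "$SCRIPT_LOGGING" ]; then
    export SCRIPT_LOGGING=1
    exec script ~/.last_command.txt
fi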
When starting a new session (after login, or after opening the terminal), you can start another "nested" shell, and redirect its output:
<...login...>
% bash | tee -a ~/.bash_output
% ls # this is the nested shell
% exit
% cat ~/.bash_output
% exit
Actually, you don't even have to enter a nested shell every time. You can simply replace your shell command in /etc/passwd from bash to bash | tee -a ~USERNAME/.bash_output.
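The shell field in /etc/passwd is executed directly rather than parsed by a shell, so in practice the pipeline usually has to live in a small wrapper script that is then set as the login shell (a sketch; the path is illustrative):
#!/bin/sh
# /usr/local/bin/bash-logged (hypothetical): run bash and tee its stdout to a log.
/bin/bash "$@" | tee -a "$HOME/.bash_output"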

Bash Redirect to a file

I am trying to redirect the output of a command to a file. The command I am using (zypper) downloads packages from the internet. The command I am using is:
zypper -x -n in geany >> log.txt
The command gradually prints output to the console when run on its own, but with the redirection above the output is written to the file all at once, after the command finishes executing. How do I redirect the output to the file as I get it on the terminal, rather than having it all written at the end?
Not with bash itself, but via the tee command:
zypper -x -n in geany | tee log.txt
&>>FILE COMMAND
will append the output (stdout and stderr) of COMMAND to FILE.
In your case
&>>log.txt zypper -x -n in geany
If you want to pipe a command through a filter, you must ensure that the command outputs to standard output (file descriptor 1) -- if it outputs to standard error (file descriptor 2), you have to redirect the 2 to 1 before the pipe. Take into account that only stdout is passed through a pipe.
So you have to do so:
2>&1 COMMAND | FILTER
If you want to grep the output and at the same time keep it in a log file, you have to duplicate it with tee and use a filter like ... | tee log-file | grep options
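Putting these pieces together for the original zypper command, a line like the following (illustrative; the grep pattern is just an example filter) keeps a full copy in log.txt as it is produced and shows only the filtered lines on the terminal:
zypper -x -n in geany 2>&1 | tee log.txt | grep -i error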
