Copy script output to file - bash

I have a series of bash scripts that echo a lot of data to stdout and occasionally to stderr. There is a main bash script which then imports and/or invokes many other bash scripts.
I'm trying to implement something that will capture the output from not just the parent script, but all child scripts as well. I need to capture both stdout and stderr so that any issues from compilation, etc... get captured in this log file.
I'm aware of tee and of course the normal stdout redirect >... but these don't seem to work without adding the redirection to every command in every script, both parent and children. There are several thousand lines in these scripts, so adding a redirect to each line would be impractical, even using sed.
I've seen suggestions such as: Redirect stderr and stdout in a Bash script
But these require editing every line in the scripts. Forcing my users to install screen is also impractical.
UPDATE:
Forgot to mention I want the output to still display on the console as well as write to the log. The scripts take several hours to run, and the user needs to know something is happening...

You can use yourmainscript 2>&1 | tee log, which will capture stdout and stderr from all imported/invoked scripts in yourmainscript while also showing it on screen.
Inside yourmainscript, you can get the same effect using:
echo "Redirecting the rest of script output to 'log'"
exec > >(tee log) 2>&1
rest of your code
To redirect just a certain section:
echo "Redirecting the next commands"
{
cmd1
cmd2
} > >(tee log) 2>&1
echo "Continuing as normal"

Related

Pipe direct tty output to sed

I have a script I use for building a program, and I pipe its output to sed to highlight errors and such during the build.
This works great, but the problem is that at the end of the build script it starts an application which writes to the terminal, and redirecting stdout and stderr doesn't seem to capture that output. I'm not exactly sure how this output gets printed, and it's kind of complicated to figure out.
buildAndStartApp # everything outputs correctly
buildAndStartApp 2>&1 | colorize # Catches build output, but not server output
Is there any way to capture all terminal output? The "script" command catches everything, but I would like the output to still print to my terminal rather than redirecting to a file.
I found out script has a -c option which runs a command and all of the output is printed to stdout as well as to a file.
My command ended up being:
script -c "buildAndStartApp" /dev/null | colorize
First, when you use script, the output does still go to the terminal (as well as being written to the file). You could do something like this in a second window to see the colorized output live:
tail -f typescript | colorize
Second, if the output of a command is going to the terminal even though you have both stdout and stderr redirected, it's possible that the command is writing directly to /dev/tty, in which case something like script that uses a pseudo-terminal is the only thing that will work.
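A small way to reproduce that behaviour, assuming the util-linux version of script (app.sh is an invented stand-in for the real application):
echo 'echo via-stdout; echo via-tty > /dev/tty' > app.sh
bash app.sh > /dev/null 2>&1               # "via-tty" still reaches the terminal
script -qc "bash app.sh" /dev/null | cat   # the pseudo-terminal captures both lines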

Does stdout get stored somewhere in the filesystem or in memory? [duplicate]

I know I can save the result of a command to a variable using last_output=$(my_cmd), but what I'd really want is for $last_output to get updated every time I run a command. Is there a variable, zsh module, or plugin that I could install?
I guess my question is: does stdout get permanently written somewhere (at least before the next command)? That way I could manipulate the results of the previous command without having to re-run it. This would be really useful for commands that take a long time to run.
If you run the following:
exec > >(tee save.txt)
# ... stuff here...
exec >/dev/tty
...then the stdout of everything run between the two commands will go both to the terminal and to save.txt.
You could, of course, write a shell function which does this for you:
with_saved_output() {
  "$@" \
    2> >(tee "$HOME/.last-command.err" >&2) \
    | tee "$HOME/.last-command.out"
}
...and then use it at will:
with_saved_output some-command-here
...and zsh almost certainly will provide a mechanism to wrap interactively-entered commands. (In bash, which I can speak to more directly, you could do the same thing with a DEBUG trap).
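For a taste of the bash side: a DEBUG trap fires before each command, so you can at least record what is about to run (capturing the output itself takes more machinery; the log path here is arbitrary):
trap 'printf "%s\n" "$BASH_COMMAND" >> "$HOME/.command-history.log"' DEBUG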
However, even though you can, you shouldn't do this: When you split stdout and stderr into two streams, information about the exact ordering of writes is lost, even if those streams are recombined later.
Thus, the output
O: this is written to stdout first
E: this is written to stderr second
could become:
E: this is written to stderr second
O: this is written to stdout first
when these streams are individually passed through tee subprocesses to have copies written to disk. There are also buffering concerns created, and differences in behavior caused by software which checks whether it's outputting to a TTY and changes its behavior (for instance, software which turns color-coded output on when writing directly to console, and off when writing to a file or pipeline).
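The TTY check is easy to observe with any color-capable tool, for example GNU ls:
ls --color=auto           # stdout is a terminal, so the listing is colorized
ls --color=auto | cat     # stdout is a pipe, so ls drops the color codes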
stdout is just a file handle that by default is connected to the console, but could be redirected.
yourcommand > save.txt
If you want to display the output on the console and save it to a file at the same time, you can pipe the output to tee, a command that writes everything it receives on stdin to stdout and to a file of your choice:
yourcommand | tee save.txt

bash assign output of fuser to a variable oddity

When using the following bash script to assign a variable to the output of fuser, it still prints part of the result (the part before the ':') to the screen.
Why isn't it suppressed? I suspect it has to do with the ':' char output by fuser. How do I fix this?
test=`fuser -f /home/whois_database_collection_v4/whoisdatacollector/logs/com_log_2013_02_15_12_40_43.log`
/home/whois_database_collection_v4/whoisdatacollector/logs/com_log_2013_02_15_12_40_43.log:
fuser sends the part of the output you see to stderr, and the rest to stdout (presumably to keep scripted usage simple while still looking friendly from the user's point of view). It will disappear if you add a 2> /dev/null redirection for stderr.
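So the fix is a single stderr redirection; a sketch with a shortened placeholder path:
# the 'filename:' banner goes to stderr; only the PIDs on stdout are captured
test=$(fuser -f /path/to/com_log.log 2>/dev/null)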

How to redirect the standard output of all the content within a shell script to a log file?

Let us consider a shell script containing many system commands, each of them writing some content to stdout or stderr.
Instead of performing redirection for each and every command separately, is there any way to redirect all the stdout and stderr generated from the shell script to a log file?
The obvious simple solution would be to use
./scriptfile.sh > foo.log
thus all stdout generated within the scriptfile goes to foo.log.
I guess, however, that your question is directed towards a solution that works from within the script.
You can (re)direct a file descriptor to a given file using the exec command (line 2 in the following snippet will redirect stdout to foo.log):
#!/bin/sh
exec 1>>foo.log
echo blue
echo blart
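If stderr should land in the same log, one extra redirection on the exec line is enough; a sketch along the same lines:
#!/bin/sh
exec >>foo.log 2>&1   # stdout and stderr both append to foo.log from here on
echo blue             # lands in foo.log
echo blart >&2        # so does this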

Cryptic Bash command ... purports to log the whole script's execution

I have some code handed down from someone long gone to another department. It purports to log everything to the $MBL location; however, it does not: it creates an empty file in the $MBL location :-(
exec > >(tee ${MBL}) 2>&1
I can tell that it takes stderr and sends it to stdout; I can tell that tee should output the result to stdout and to $MBL. However, I don't understand the exec > >() syntax.
Reading the bash(1) man page suggests that a fork happens....
Two things going on here: exec with only redirections redirects the shell's own file descriptors, and the >(command) syntax in bash and zsh creates a pipe and substitutes a /dev/fd/* reference to its input.
As written, this looks like it does what it is claimed to do. But there may be other redirections later in the script, or, if it's being run by a shell that doesn't support >(command), it will spit out an error and do nothing useful.
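One way to reproduce the empty-file symptom: run the script with a shell that lacks process substitution. A minimal sketch, with a made-up value for MBL:
#!/bin/bash
MBL=/tmp/run.log
exec > >(tee "${MBL}") 2>&1
echo "this line reaches both the terminal and ${MBL}"
# bash run.sh  -> works as advertised
# sh run.sh    -> where sh is dash, the exec line is a syntax error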
