How to redirect the standard output of all the content within a shell script to a log file?

Suppose there are many system commands inside a shell script, each of them writing some content to stdout or stderr.
Instead of performing redirection for each and every command separately, is there any way to redirect all the stdout and stderr generated by the shell script to a log file?

The obviously simple solution is to run
./scriptfile.sh > foo.log
so that all stdout generated within the script file goes to foo.log.
I guess, however, that your question is directed towards a solution that works from within the script.
You can (re)direct a file descriptor to a given file using the exec builtin (line 2 in the following snippet redirects stdout to foo.log):
#!/bin/sh
exec 1>>foo.log   # from here on, stdout (fd 1) appends to foo.log
echo blue
echo blart
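The question also asks about stderr; as a minimal sketch (assuming it is acceptable to send both streams to the same file), the same technique extends to fd 2:
#!/bin/sh
exec 1>>foo.log 2>&1   # stdout appends to foo.log; stderr then duplicates stdout
echo blue              # lands in foo.log
ls /nonexistent        # this error message lands in foo.log too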

Related

stdout and stderr are out-of-order when a bash script is run from Atom

I tried to write a little program that lists a nonexistent directory and then echoes done, in a .sh file:
#!/bin/bash
ls notexist
echo 'done'
But my console outputs done on the first line, before the error message from listing the nonexistent directory:
done
ls: notexist: No such file or directory
I don't think bash creates a thread automatically for each line of code, does it? I'm using terminal in macOS Big Sur.
Edit: I'm accessing terminal indirectly from the script package of the Atom text editor in macOS Big Sur. The error goes away if I run code directly in console via ./file.sh.
If we look at the source code to the Atom script plugin, the problem becomes clear:
It creates a BufferedProcess with separate stdout and stderr callbacks (using them, among other things, to determine whether any output has been written to each of these streams).
Implementing this requires stdout and stderr to be connected to separate FIFOs. Unlike a typical terminal, where both streams share a single file and writes are strictly ordered, there is no guarantee that content arriving on two separate FIFOs will be delivered to those callbacks in the same order it was written.
As a workaround, you can add exec 2>&1 to your script to put all content on stdout, or exec >&2 to put all content on stderr. Ideally, if the script plugin does not need to track the two streams separately, it would do this itself and register a callback only on the single stream to which all content has been redirected.
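For example, a minimal sketch of the first workaround, applied to the script from the question:
#!/bin/bash
exec 2>&1      # from here on, stderr is merged into stdout, so only one stream reaches the plugin
ls notexist    # the error message now arrives in order with the regular output
echo 'done'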

Assigning stdin to file also assigns stdout to that file

I executed the following bash script:
#!/usr/bin/env bash
exec 0<log
My understanding is that it permanently reassigns the stdin for this bash process to the log file. So, when I ran
ls > log
I expected that I was passing "ls" to stdin. Since I have not reassigned stdout, I was also expecting to see the result of the "ls" command in the terminal (where I normally see stdout). However, I saw the output of "ls" in the log file. Why is stdout in the log file and not the terminal?
ls writes to whatever file it is given for standard output. Without a redirection, that is whatever file it inherits from its parent (your script). With the redirection, you are explicitly providing the file log for standard output.
This is independent of whatever else the file log might be used for.
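As a short illustrative sketch of these mechanics (assuming a file named log already exists):
#!/usr/bin/env bash
exec 0<log    # the shell's fd 0 (stdin) now reads from 'log'
ls > log      # 'ls' gets its own fd 1 opened on 'log'; the shell's fd 0 is unaffected
cat           # with no arguments, cat reads the shell's stdin, i.e. the contents of 'log'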

How to get output redirect as parameter in bash?

I would like to know if it's possible to get the output redirection file name as a parameter in bash.
For example :
./myscript.sh parameter1 > outputfile
Is there a way to get "outputfile" as a parameter like $2? In my script I have to do a few operations on outputfile, but I don't know which file I have to update... The second problem is that this script is already running and used by several tasks, so I cannot change the user input...
Redirections are not parameters to the program. When a program's output is redirected, the shell opens the file and connects file descriptor 1 to it before running the program. The program then simply writes to fd 1 (aka stdout) and it goes to the file.
On Linux and similar systems you can use /dev/stdout, a symbolic link that resolves to whatever file the process currently has open as its standard output.
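As a hedged, Linux-specific sketch (it relies on /proc, which other systems may lack), a script can resolve where its own stdout points and reuse that path:
#!/usr/bin/env bash
# Resolve the file behind this script's fd 1; prints e.g. the redirect target's path.
outputfile=$(readlink -f /proc/$$/fd/1)
echo "writing additional data to: $outputfile" >&2   # report on stderr, not into the file
Note this only yields a usable path when stdout is redirected to a regular file; on an interactive terminal it resolves to a device like /dev/pts/0.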

Is it possible to capture output from a system command and redirect it?

What I would like to do is:
run a ruby script...
that executes a shell command
and redirects it to a named pipe accessible outside the script
from the system shell, read from that pipe
That is, can the Ruby script capture some command output and expose it in such a way that it can be connected to from outside the script?
I want to mention that the script cannot simply start and exit, since it's a REPL. The idea is that using the REPL you would be able to run a command and redirect its output elsewhere to consume it.
Using abort with an exit message will print the message to STDERR (and the script will fail with exit code 1). You can pass the shell command's output out of the script this way.
This is possibly not the only (or best) way, but it has worked for me in the past.
[edit]
You can also redirect the output to a file (using standard methods), and read that file outside the ruby script.
require 'open3'
# capture3 returns the command's stdout, stderr, and exit status.
stdout_str, stderr_str, status = Open3.capture3(commandline)
stdout_str.chomp   # the command's output, without the trailing newline
If you want the captured output itself, you can get it via stdout_str.chomp.
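Since the question specifically asks about a named pipe, here is a minimal shell sketch of that approach (the path /tmp/out.pipe is just an example; the Ruby REPL would play the writer's role):
mkfifo /tmp/out.pipe             # create the named pipe
some_command > /tmp/out.pipe &   # the writer blocks until a reader opens the pipe
# then, from another shell:
cat /tmp/out.pipe                # consumes the command's output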

Copy script output to file

I have a series of bash scripts that echo a lot of data to stdout and occasionally to stderr. There is a main bash script which then imports and/or invokes many other bash scripts.
I'm trying to implement something that will capture the output from not just the parent script, but all children scripts. I need to capture both stdout and stderr so that any issues from compilation, etc... get captured in this log file.
I'm aware of tee and of course the normal stdout redirect >... but these don't seem to work without adding them to each line of every script, both parent and children. There are several thousand lines in these scripts, so adding a redirect to each line would be impractical, even using sed.
I've seen suggestions such as: Redirect stderr and stdout in a Bash script
But these require editing every line in the scripts. Forcing my users to install screen is also impractical.
UPDATE:
Forgot to mention I want the output to still display on the console as well as write to the log. The scripts take several hours to run, and the user needs to know something is happening...
You can use yourmainscript 2>&1 | tee log, which will capture stdout and stderr from all imported/invoked scripts in yourmainscript while also showing it on screen.
Inside yourmainscript, you can get the same effect using:
echo "Redirecting the rest of script output to 'log'"
exec > >(tee log) 2>&1
# ... rest of your code ...
To redirect just a certain section:
echo "Redirecting the next commands"
{
cmd1
cmd2
} > >(tee log) 2>&1
echo "Continuing as normal"
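One caveat worth noting: tee truncates its output file by default, so if several separate sections of a script each redirect to the same log, later sections overwrite earlier ones. Using tee -a appends instead:
{
cmd1
cmd2
} > >(tee -a log) 2>&1   # -a appends to 'log' rather than truncating it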
