Can bash -v output be redirected?

starting bash with -v option produces a long output to the console
$ bash -v
source ~/Dropbox/bin/tim_functions.sh
#!/bin/bash
...several hundred more lines
I would like to capture the output to a file to make it easier to browse through, but I have tried bash -v 2>&1 > out_bash.txt and bash -v | tee out_bash.txt, and neither captures in a file the information shown on the terminal screen. It is as if the verbose output is neither stderr nor stdout. How can this be?
Can anyone suggest a way to capture the output of bash -v ?

bash -v 2>&1 > out_bash.txt
is not what you want. Redirections are processed left to right, so 2>&1 points stderr at the terminal (stdout's current target) before stdout is moved to the file. It should be
bash -v >out_bash.txt 2>&1
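A quick way to see the difference (a minimal sketch; ls of a nonexistent path just stands in for any command that writes to both streams):
ls . /nonexistent 2>&1 > out.txt   # the error message still appears on the terminal
ls . /nonexistent > out.txt 2>&1   # both the listing and the error end up in out.txt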

I poked around and found this: http://www.commandlinefu.com/commands/view/3310/run-a-bash-script-in-debug-mode-show-output-and-save-it-on-a-file
On the website they use
bash -x test.sh 2>&1 | tee out.test, but I tested it with
bash -v test.sh 2>&1 | tee out.test and it worked fine.
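For a quick self-contained check, something like this should confirm the capture works (test.sh here is just a throwaway script name):
printf '#!/bin/bash\necho "hello"\n' > test.sh
bash -v test.sh 2>&1 | tee out.test
cat out.test   # contains the echoed source lines as well as the script's output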

You can also use the exec builtin in the script to redirect all of its output:
#!/bin/bash
# Append both stdout and stderr to out.txt for the rest of the script
exec >> out.txt 2>> out.txt
set -x   # trace commands as they are executed
set -v   # echo script lines as they are read
echo "testing debug of shell scripts"
ls
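If you only need both streams interleaved in one file, a single redirection pair gives roughly the same effect (a sketch):
#!/bin/bash
exec >>out.txt 2>&1   # stdout appends to out.txt, stderr is duplicated onto it
set -xv               # -x traces commands, -v echoes lines as they are read
echo "testing debug of shell scripts"
ls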

After reading other helpful answers, I believe this issue has to do with how bash sends the verbose information to the tty, which is somehow different from stderr or stdout. It can be caught with the following workaround:
$ screen -L
$ bash -v
$ exit #from the bash session
$ exit #from the screen session
This results in a screenlog.0 file being generated containing the output.
The bash -v output of interest was on a mac running 10.7.3 (Lion) with
$ bash --version
GNU bash, version 3.2.48(1)-release (x86_64-apple-darwin11)
Copyright (C) 2007 Free Software Foundation, Inc.
Another Mac I tried, running 10.6.8, produced less verbose output, despite a similar .bashrc file.

You can use
bash -v 2>&1 | tee file.txt
or
bash -v 2>&1 | grep search_string

Have you tried wrapping your child bash in a subshell?
( bash -v ) > out_bash.txt 2>&1

Related

Bash. Parse error output without showing error

I want to get and parse the python (python2) version. This way works:
python2 -V 2>&1 | sed 's/.* \([0-9]\).\([0-9]\).*/\1\2/'
For some reason, python2 prints its version with the -V argument on its error output, which is why the following does nothing:
python2 -V | sed 's/.* \([0-9]\).\([0-9]\).*/\1\2/'
So stderr needs to be redirected to stdout with 2>&1 before it can be parsed. OK, but I'd like to avoid the error that is shown if the user launching this command does not have python2 installed. The desired on-screen output for a user who does not have python2 installed is nothing. How can I do that, given that I need the error output in order to parse the version?
I already have a working workaround: a conditional if statement that uses the hash command beforehand to check whether the python2 command is present, so python2 is never launched if it is missing. But this is just curiosity. Forget about python2; suppose it is any other command that has its stderr redirected to stdout. Is there a bash trick to parse its output without showing anything when there is an error?
Any ideas?
Print output only if the line starts with Python 2:
python2 -V 2>&1 | sed -n 's/^Python 2\.\([0-9]*\).*/2\1/p'
or,
command -v python2 >/dev/null && python2 -V 2>&1 | sed ...
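Put together, a sketch that stores the parsed version in a variable and stays silent when python2 is absent:
if command -v python2 >/dev/null 2>&1; then
    pyver=$(python2 -V 2>&1 | sed -n 's/^Python 2\.\([0-9]*\).*/2\1/p')
    echo "$pyver"
fi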
Include the next line in your script
command -v python2 >/dev/null 2>&1 || { echo "python2 not installed or in PATH"; exit 1; }
EDITED: Changed which into command

grep -v -f of empty file different between script and command line on OS X

In bash, in Terminal on my Mac, (but not in Linux), grep -v -f behaves differently depending on whether it's executed at the command line or in a script. From the command line:
$ touch empty-file #create an empty file
$ printf 'foo' | grep -v -f empty-file
foo
That's as expected. But when that line is in a script, it outputs nothing. Here's the script:
$ cat grep-v-in-script.sh
#!/usr/bin/env bash
printf 'foo\n' | grep -v -f empty-file
printf 'end of script\n'
When I execute that script:
$ ./grep-v-in-script.sh
end of script
If I run that same script in Linux it works as expected:
herdrick@some-linux-server:~$ ./grep-v-in-script.sh
foo
end of script
FWIW on my Mac if I change the 'grep -v -f' to 'grep -f', then it again outputs nothing, but this time that is expected.
Here's my bash version:
$ bash --version
GNU bash, version 3.2.57(1)-release (x86_64-apple-darwin17)
Copyright (C) 2007 Free Software Foundation, Inc.
The issue is simply an incompatibility between GNU grep and BSD grep. See the comment on the post by @that-other-guy.
My confusion was due to my having an alias set to use GNU grep. There is, otherwise, no difference between doing this at the command line and in a script.
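One quick way to spot this kind of mismatch is to check what grep resolves to in each context (a sketch):
type grep        # interactively this may report an alias, e.g. one pointing at GNU grep
grep --version   # BSD grep on macOS and GNU grep print different version strings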

How to refer to redirection file from within a bash script?

I'd like to write a bash script myscript such that issuing this command:
myscript > filename.txt
would return the name of the file that its output is being redirected to, filename.txt. Is this possible?
If you are running on Linux, check where /proc/self/fd/1 links to.
For example, the script can do the following:
#!/bin/bash
readlink /proc/self/fd/1
And then run it:
$ ./myscript > filename.txt
$ cat filename.txt
/tmp/filename.txt
Note that if you want to save the value of the output file to a variable or something, you can't use /proc/self since it will be different in the subshell, but you can still use $$:
outputfile=$(readlink /proc/$$/fd/1)
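For example, a small sketch of a script (Linux only, as above) that records its own output file in a variable:
#!/bin/bash
outputfile=$(readlink /proc/$$/fd/1)
echo "writing to: $outputfile" >&2   # report on stderr so the message still reaches the terminal
echo "some data"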
Using lsof:
outfile=$(lsof -p $$ | awk '/1w/{print $NF}')   # the 1w entry is fd 1 opened for writing
echo "$outfile"

Copy stderr and stdout to a file as well as the screen in ksh

I'm looking for a solution (similar to the bash code below) to copy both stdout and stderr to a file in addition to the screen within ksh on Solaris.
The following code works great in the bash shell:
#!/usr/bin/bash
# Clear the logfile
>logfile.txt
# Redirect all script output to a logfile as well as their normal locations
exec > >(tee -a logfile.txt)
exec 2> >(tee -a logfile.txt >&2)
date
ls -l /non-existent/path
For some reason this is throwing a syntax error on Solaris. I assume it's because I can't do process substitution, and I've seen some posts suggesting the use of mkfifo, but I've yet to come up with a working solution.
Does anyone know of a way that all output can be redirected to a file in addition to the default locations?
Which version of ksh are you using? The >() is not supported in ksh88, but is supported in ksh93 - the bash code should work unchanged (aside from the #! line) on ksh93.
If you are stuck with ksh88 (poor thing!) then you can emulate the bash/ksh93 behaviour using a named pipe:
#!/bin/ksh
# Clear the logfile
>logfile.txt
pipe1="/tmp/mypipe1.$$"
pipe2="/tmp/mypipe2.$$"
trap 'rm "$pipe1" "$pipe2"' EXIT
mkfifo "$pipe1"
mkfifo "$pipe2"
tee -a logfile.txt < "$pipe1" &
tee -a logfile.txt >&2 < "$pipe2" &
# Redirect all script output to a logfile as well as their normal locations
exec >"$pipe1"
exec 2>"$pipe2"
date
ls -l /non-existent/path
This version uses two separate pipes so that stderr could be redirected to a different file if desired.
How about this:
(some commands ...) 2>&1 | tee logfile.txt
Add -a to the tee command line for subsequent invocations to append rather than overwrite.
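For example, a second run that should append rather than overwrite (a sketch reusing the question's commands):
( date; ls -l /non-existent/path ) 2>&1 | tee -a logfile.txt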
In ksh, the following works very well for me:
LOG=log_file.$(date +%Y%m%d%H%M%S).txt
{
    ls
    date
    ... whatever command
} 2>&1 | tee -a "$LOG"

If redirecting STDERR to STDOUT and redirecting STDOUT to a file, why are STDERR messages not showing in the file?

I made a quick little script, test.sh, that looks like the following:
echo "StdErr" > /dev/stderr
echo "StdOut" > /dev/stdout
According to the answers to this SO question, and the Advanced Bash-Scripting Guide, the following should redirect stderr to stdout from the script above:
$ sh /tmp/test.sh 2>&1
And indeed it does:
$ sh /tmp/test.sh 2>&1 |tee file;
$ cat file
StdErr
StdOut
What I am wondering is: where does the stderr output go in the following case?
$ sh /tmp/test.sh > file 2>&1
$ cat file
StdOut
I am using GNU bash, version 4.0.24(2)-release.
Your script writes "StdErr" to the file. Then the next line reopens the same file with >, which truncates it, and writes "StdOut". You should use >> if you want to append to the file. Since you are redirecting, /dev/stderr and /dev/stdout both refer to a regular file, which is subject to truncation. Try your test again, but this time make file a fifo instead of a regular file.
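A sketch of the test script with append redirections, which avoids the truncation:
echo "StdErr" >> /dev/stderr
echo "StdOut" >> /dev/stdout
With that change, sh /tmp/test.sh > file 2>&1 leaves both lines in file.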
I'm seeing the output to stderr going to the file, as expected:
$> sh test.sh > file 2>&1
$> cat file
StdErr
StdOut
This is bash 3.2.48.
$> bash --version
GNU bash, version 3.2.48(1)-release (x86_64-apple-darwin11)
I have the same issue on RedHat with GNU bash, version 3.2.25(1)-release (x86_64-redhat-linux-gnu). On my Mac, both Bash and ZSH work as expected.
[13:31:05][user@machine: ~/src]$ bash test
1STD ERR
2STD ERR
3STD OUT
4STD OUT
[13:31:40][user@machine: ~/src]$ bash test >> log 2>&1
[13:31:48][user@machine: ~/src]$ cat log
4STD OUT
[13:31:52][user@machine: ~/src]$
[13:32:10][user@machine: ~/src]$ bash test > log 2>&1
[13:32:15][user@machine: ~/src]$ cat log
4STD OUT
[13:32:18][user@machine: ~/src]$
