redirect STDERR in tcsh from .aliases - tcsh

in tcsh I'm trying to redirect STDERR from a command from my .aliases file.
I found that I can redirect STDERR from the command line like this. . .
$ (xemacs > /dev/tty) >& /dev/null
. . . but when I put this in my .aliases file I get an alias loop. . .
$ cat .aliases
alias xemacs '(xemacs > /dev/tty ) >& /dev/null'
$ xemacs &
Alias loop.
$
. . . so I put a backslash before the command in .aliases, which allows the command to run. . .
$ cat .aliases
alias xemacs '(\xemacs > /dev/tty ) >& /dev/null'
$ xemacs &
[1] 17295
$
. . . but now I can't give the command any arguments:
$ xemacs foo.txt &
Badly placed ()'s.
[1] Done ( \xemacs > /dev/tty ) >& /dev/null
$
Can anyone offer any solutions? Thank you in advance!
UPDATE: I'm still curious whether it's possible to redirect STDERR in tcsh from .aliases, but as suggested here, I ended up with a shell script:
#!/bin/sh
# wrapper script to suppress messages sent to STDERR on launch
# from the command line.
/usr/bin/xemacs "$@" 2>/dev/null
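The "$@"-plus-2>/dev/null pattern can be sanity-checked in any POSIX shell with a stand-in function (the name "noisy" is made up for the demo):

```shell
# stand-in for a program that writes to both streams
noisy() { echo "out"; echo "warn" >&2; }

# stderr is discarded; stdout passes through untouched
noisy 2>/dev/null    # prints only: out
```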

I suspect this is a case where NOT using an alias is the best option - try using a shell script instead:
#!/bin/tcsh
(xemacs $* > /dev/tty ) >& /dev/null

Try
alias emacs '(\emacs \!* > /dev/tty) >& /dev/null'
The "Badly placed ()'s" message comes from where the argument ends up. Without the "\!*" in the alias definition, "emacs abc" becomes
(/usr/bin/emacs > /dev/tty) >& /dev/null abc
With the "\!*" included, "emacs abc" becomes
(/usr/bin/emacs abc > /dev/tty) >& /dev/null

Related

Bash use of zenity with console redirection

In an effort to create more manageable scripts that each write their own output to a single location (via 'exec > file'), is there a better solution than the one below for combining stdout redirection with zenity (which here relies on piped stdout)?
parent.sh:
#!/bin/bash
exec >> /var/log/parent.out
( true; sh child.sh ) | zenity --progress --pulsate --auto-close --text='Executing child.sh'
[[ "$?" != "0" ]] && exit 1
...
child.sh:
#!/bin/bash
exec >> /var/log/child.out
echo 'Now doing child.sh things..'
...
When doing something like-
sh child.sh | zenity --progress --pulsate --auto-close --text='Executing child.sh'
zenity never receives stdout from child.sh, since it is redirected from within child.sh itself. Even though it seems like a bit of a hack, is using a subshell containing 'true' plus the execution of child.sh acceptable? Or is there a better way to manage stdout?
I understand that 'tee' would work in this scenario, but I would rather not have to write out child.sh's logfile location every time I want to execute child.sh.
Your redirection exec > stdout.txt will lead to an error:
$ exec > stdout.txt
$ echo hello
$ cat stdout.txt
cat: stdout.txt: input file is output file
You need an intermediary file descriptor.
$ exec 3> stdout.txt
$ echo hello >&3
$ cat stdout.txt
hello
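The same intermediary-descriptor idea, with the descriptor explicitly closed when done (the temp-file path is a throwaway, not part of the original answer):

```shell
tmp=$(mktemp)        # throwaway file for the demo
exec 3> "$tmp"       # open fd 3 for writing
echo hello >&3       # goes through fd 3, leaving stdout free
exec 3>&-            # close fd 3 when finished
cat "$tmp"           # prints: hello
rm "$tmp"
```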

How to print shell script stdout/stderr to file/s and console

In a bash script I use the following syntax in order to print everything from the script to the files $file and $sec_file.
We are running the script on our Linux RHEL server - version 7.8.
exec > >(tee -a "$file" >>"$sec_file") 2>&1
After the bash script completes, both files contain the stdout/stderr of every line in the script.
Now we additionally want to print the stdout/stderr to the console, not only to the files.
I will appreciate any suggestion.
Example of the script:
# more /tmp/script.bash
#!/bin/bash
file=/tmp/file.txt
sec_file=/tmp/sec_file.txt
exec > >(tee -a "$file" >>"$sec_file") 2>&1
echo "hello world , we are very happy to stay here "
Example how to run the script:
/tmp/script.bash
<-- no output from the script -->
# more /tmp/file.txt
hello world , we are very happy to stay here
# more /tmp/sec_file.txt
hello world , we are very happy to stay here
Example of the expected output:
/tmp/script.bash
hello world , we are very happy to stay here
and
# more /tmp/file.txt
hello world , we are very happy to stay here
# more /tmp/sec_file.txt
hello world , we are very happy to stay here
I think the easiest fix is to just add multiple files as arguments to tee, like this:
% python3 -c 'import sys; print("to stdout"); print("to stderr", file=sys.stderr)' 2>&1 | tee -a /tmp/file.txt /tmp/file_sec.txt
to stdout
to stderr
% cat /tmp/file.txt
to stdout
to stderr
% cat /tmp/file_sec.txt
to stdout
to stderr
Your script would look like this then:
#!/bin/bash
file=/tmp/file.txt
sec_file=/tmp/sec_file.txt
exec > >(tee -a "$file" "$sec_file") 2>&1
echo "hello world , we are very happy to stay here "
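The multi-file behaviour of tee is easy to verify on its own (throwaway temp files here, not the script's real paths):

```shell
t1=$(mktemp); t2=$(mktemp)
echo "hello" | tee -a "$t1" "$t2"   # prints hello and appends it to both files
cat "$t1" "$t2"                     # hello appears in each file
rm "$t1" "$t2"
```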
I would suggest writing console-only output to a new file descriptor:
#!/bin/bash
file=file.txt
sec_file=sec_file.txt
exec 4>&1 > >(tee -a "$file" >>"$sec_file") 2>&1
echo "stdout"
echo "stderr" >&2
echo "to the console" >&4
Output:
me@pc:~/⟫ ./script.sh
to the console
me@pc:~/⟫ cat file.txt
stdout
stderr
me@pc:~/⟫ cat sec_file.txt
stdout
stderr
If you want, you can take this further and keep a separate descriptor for console stderr as well; duplicating fd 2 (rather than fd 1) makes ">&5" a true stderr channel:
exec 4>&1 5>&2 > >(tee -a "$file" >>"$sec_file") 2>&1
echo "stderr to console" >&5
Edit: Changed &3 to &4 as &3 is sometimes used for stdin.
But maybe this is the moment to rethink what you are doing: keep &1 as stdout and &2 as stderr, and use &4 and &5 to write to the files?
exec 4> >(tee -a "$file" >>"$sec_file") 5>&1
This does require you to add >&4 2>&5 to every line whose output should end up in your files.
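The core trick in these answers - saving the original stdout on a spare descriptor before redirecting fd 1 - can be isolated without tee (the paths here are throwaway temp files):

```shell
tmp=$(mktemp)
exec 4>&1 1>"$tmp"      # fd 4 keeps the console; fd 1 now points at the file
echo "to file"          # lands in $tmp
echo "to console" >&4   # still reaches the terminal
exec 1>&4 4>&-          # restore stdout and close fd 4
cat "$tmp"              # prints: to file
rm "$tmp"
```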

When this is executed: `(exec -l -a specialname /bin/bash -c 'echo $0' ) 2> error`, why does it output `^[7^[[r^[[999;999H^[[6n` to stderr?

When I run the bash test:
(exec -l -a specialname /bin/bash -c 'echo $0' ) 2> error
the run-builtins test fails. After some searching, I found that it outputs
^[7^[[r^[[999;999H^[[6n
to stderr, so I redirected it to a file, error.
If I cat the file, it outputs a blank line.
Opening it with vim, I found:
^[7^[[r^[[999;999H^[[6n
Why?
After a long search, I found that bash reads the /etc/profile file, which contains the following:
if [ -x /usr/bin/resize ]; then
/usr/bin/resize >/dev/null
fi
So bash executes the resize program. On my system it is provided by busybox, and the busybox source file console-tools/resize.c contains:
fprintf(stderr, ESC"7" ESC"[r" ESC"[999;999H" ESC"[6n");
so it writes that sequence to stderr.
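The sequence is easier to recognise once the escape bytes are made visible; cat -v renders the ESC byte as ^[, reproducing exactly the string found in the error file:

```shell
# emit the same bytes busybox resize writes: save cursor (ESC 7), reset the
# scroll region (ESC [r), jump to row 999 col 999, then query the cursor
# position (ESC [6n)
printf '\0337\033[r\033[999;999H\033[6n' | cat -v
# prints: ^[7^[[r^[[999;999H^[[6n
```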
To reproduce, run the command:
(exec -l -a specialname /bin/bash -c 'export PS1='test';echo ${PS1}') 2> err.log
then inspect the result with:
vi err.log

Why does > not redirect output to text file

I run the following command in shell:
sh myscript.sh > test.txt
The output is displayed on shell. I was expecting that the output would be put into test.txt.
The output displayed on the shell is not STDOUT - it's STDERR.
If you want both STDOUT and STDERR to be redirected to the log file, say:
sh myscript.sh > test.txt 2>&1
Since you've tagged the question bash, you could also say:
bash myscript.sh >& test.txt
The printed output may be standard-error output.
Using the following, you can also redirect standard error (file descriptor 2):
sh myscript.sh > test.txt 2>&1
In bash, you can also use the following forms:
sh myscript.sh &> test.txt # This is preferred according to bash(1).
sh myscript.sh >& test.txt
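The difference between > alone and > file 2>&1 is easy to demonstrate with a function that writes to both streams ("err" is a made-up name for the demo):

```shell
err() { echo "to stdout"; echo "to stderr" >&2; }
tmp=$(mktemp)

err > "$tmp"        # only stdout captured; "to stderr" still hits the terminal
err > "$tmp" 2>&1   # both streams captured (second call overwrites the file)

cat "$tmp"          # now shows both lines
rm "$tmp"
```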

Copy stderr and stdout to a file as well as the screen in ksh

I'm looking for a solution (similar to the bash code below) to copy both stdout and stderr to a file in addition to the screen within ksh on Solaris.
The following code works great in the bash shell:
#!/usr/bin/bash
# Clear the logfile
>logfile.txt
# Redirect all script output to a logfile as well as their normal locations
exec > >(tee -a logfile.txt)
exec 2> >(tee -a logfile.txt >&2)
date
ls -l /non-existent/path
For some reason this is throwing a syntax error on Solaris. I assume it's because I can't do process substitution, and I've seen some posts suggesting the use of mkfifo, but I've yet to come up with a working solution.
Does anyone know of a way that all output can be redirected to a file in addition to the default locations?
Which version of ksh are you using? The >() is not supported in ksh88, but is supported in ksh93 - the bash code should work unchanged (aside from the #! line) on ksh93.
If you are stuck with ksh88 (poor thing!) then you can emulate the bash/ksh93 behaviour using a named pipe:
#!/bin/ksh
# Clear the logfile
>logfile.txt
pipe1="/tmp/mypipe1.$$"
pipe2="/tmp/mypipe2.$$"
trap 'rm "$pipe1" "$pipe2"' EXIT
mkfifo "$pipe1"
mkfifo "$pipe2"
tee -a logfile.txt < "$pipe1" &
tee -a logfile.txt >&2 < "$pipe2" &
# Redirect all script output to a logfile as well as their normal locations
exec >"$pipe1"
exec 2>"$pipe2"
date
ls -l /non-existent/path
Using two separate pipes keeps stdout and stderr distinct, so the second tee could just as easily write stderr to a different file.
How about this:
(some commands ...) 2>&1 | tee logfile.txt
Add -a to the tee command line for subsequent invocations to append rather than overwrite.
In ksh, the following works very well for me
LOG=log_file.$(date +%Y%m%d%H%M%S).txt
{
ls
date
... whatever command
} 2>&1 | tee -a $LOG
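The brace-group form works the same way in bash, so it can be checked anywhere with a throwaway log file:

```shell
LOG=$(mktemp)
{
  echo "stdout line"
  echo "stderr line" >&2
} 2>&1 | tee -a "$LOG" > /dev/null

cat "$LOG"    # both lines were captured, in order
rm "$LOG"
```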
