make nohup write to a file other than nohup.out - bash

I've been using the command below so that nohup writes to nohup.out while tail prints the output on the terminal.
nohup train.py & tail -f nohup.out
However, I need nohup to use different file names.
When I try
nohup python train.py & tail -F vanila_v1.out
I'm getting the following error messages.
tail: cannot open 'vanila_v1.out' for reading: No such file or directory
nohup: ignoring input and appending output to 'nohup.out'
I also tried
nohup python train.py & tail -F nohup.out > vanila_v1.txt
Then it doesn't write any output to stdout.
How do I make nohup write to a file other than nohup.out? I don't mind writing two different files simultaneously, but to keep track of different processes, I need the names to be different.
Thanks.

You need to redirect STDOUT and STDERR for the nohup command, like:
$ nohup python train.py > vanila_v1.out 2>&1 & tail -F vanila_v1.out
At this point, the process will go into the background and you can use tail -f vanila_v1.out. That's one way to do it.
More information on STDOUT and STDERR redirection is available in the linked reference. There is also another question that uses the tee command rather than > to achieve the same in one go.
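For reference, here is a minimal sketch of that tee variant, assuming you still want the asker's filename vanila_v1.out:
nohup python train.py 2>&1 | tee vanila_v1.out &
One caveat: only python is protected by nohup here; tee itself can still receive SIGHUP when the terminal closes, so the redirect-plus-tail form above is more robust.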

Related

nohup bash myscript.sh > log.log yields empty file until the process is stopped manually

I have a python file that I run using a .sh file providing some args. When I run this file on the terminal using bash myscript.sh, it runs fine and prints progress on the console. When I use nohup, as in nohup bash myscript.sh > log.log, nothing gets saved to the log.log file, but the processes are running (it's a 4-GPU process and I can see the GPU usage in top and nvidia-smi). As soon as I kill the process, using either Ctrl+C or the kill command, all the output gets printed to the file at once, along with the keyboard-interrupt or process-killed message.
I have tried
nohup myscript.sh &> log.log
nohup myscript.sh &>> log.log
but the issue remains the same. What's the reason for such behaviour?
myscript.sh runs a python file somewhat like
python main.py --arg1 val1 --arg2 val2
I tried
python -u main.py
but it doesn't help. I know the script is working fine, as it occupies exactly the same amount of memory as it should.
Use stdbuf:
nohup stdbuf -oL bash myscript.sh > log.log
Your problem is related to output buffering. In general, non-interactive output to files tends to be block-buffered. The -oL changes output buffering to line mode. There is also the less efficient -o0, which changes it to unbuffered.
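stdbuf works by preloading a helper library through the environment, so the setting is inherited by child processes; you can also apply it to the inner command directly. A minimal sketch inside myscript.sh, reusing the asker's placeholder arguments:
stdbuf -oL python main.py --arg1 val1 --arg2 val2
Either way, you can then follow the log live with tail -F log.log.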

Telling nohup to write output in real-time

When using nohup, the output of a script is buffered and only gets dumped to the log file (nohup.out) after the script has finished executing. It would be really useful to see the script's output in something close to real time, to know how it is progressing.
Is there a way to make nohup write the output whenever it is produced by the script? Or, since such frequent file access operations are slow, to dump the output periodically during execution?
There's a special program for this: unbuffer! See http://linux.die.net/man/1/unbuffer
The idea is that your program's output routines recognize that stdout is not a terminal (isatty(stdout) == false), so they buffer output up to some maximum size. Using the unbuffer program as a wrapper for your program will "trick" it into writing output one line at a time, as it would if you ran the program directly in an interactive terminal.
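A minimal sketch, assuming unbuffer (shipped with the expect package) is installed and your script is named myscript.sh:
nohup unbuffer ./myscript.sh &
tail -f nohup.out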
What's the command you are executing? You could create a .bash file which internally redirects output to a file after each command (echo "blah" >> your-output.txt), so you can check that file during the nohup execution (you should run: nohup script.bash &).
Cheers
Please use stdbuf. It was added in GNU coreutils 7.5.
stdbuf -i0 -o0 -e0 cmd
Command with nohup like:
nohup stdbuf -i0 -o0 -e0 ./server >> log_server.log 2>&1

'tee' in makefile, can we copy stderr as well?

I want to record stderr and stdout to different files, while watching both outputs in the terminal.
So I use tee, and found a solution in this page.
But the sad thing is, it doesn't work when put into a makefile:
all:
	command > >(tee stdout.log) 2> >(tee stderr.log >&2)
It seems that make will use sh -c to execute this line, and sh doesn't understand this syntax: process substitution (>(...)) is a bash feature, not POSIX sh.
Can we have another solution for this?
In order to use this syntax in your Makefile you need to change the shell that make uses for running commands by setting the SHELL variable.
You can do this by invoking make as make SHELL=/bin/bash, or by putting SHELL := /bin/bash at the start of the Makefile.
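A minimal sketch of the resulting Makefile (command is a placeholder; recipe lines must start with a tab):
SHELL := /bin/bash

all:
	command > >(tee stdout.log) 2> >(tee stderr.log >&2)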
A brute-force way would be to not tee in the makefile but instead tail -f one of the files in the background:
$ tail -f stderr.log & tail -f stdout.log
[... ^C]
$ kill $!
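The Makefile rule can then stay plain sh, for example (same placeholder names as above):
all:
	command > stdout.log 2> stderr.log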

Send output errors of nohup to syslog

I'm attempting to write a bash script that uses nohup and passes errors to rsyslog.
I've tried this command with different variations of the log variable (see below), but I can't get the output passed to anything but a plain text file. I can't get it to pipe.
nohup imageprocessor.sh > "$LOG" &
Is it possible to pipe nohup output or do I need a different command.
A couple of variations of log that I have tried
LOG="|/usr/bin/logger -t workspaceworker -p LOCAL5.info &2"
or
LOG="|logtosyslog.sh"
or
LOG="logtosyslog.sh"
A way in bash to redirect output to syslog is:
exec > >(logger -t myscript)
stdout is then sent to the logger command
exec 2> >(logger -t myscript)
for stderr
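Put together in a script, a minimal sketch (myscript is an assumed tag, and the -p priority is optional):
#!/bin/bash
exec > >(logger -t myscript)                # stdout -> syslog
exec 2> >(logger -t myscript -p user.err)   # stderr -> syslog at err priority
echo "this line ends up in the syslog"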
Not directly. nohup will detach the child process, so piping the output of the nohup command isn't helpful. This is what you want:
nohup sh -c 'imageprocessor.sh | logger'
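Reusing the tag and priority from the question, a hedged sketch (local5.info must match your rsyslog configuration):
nohup sh -c 'imageprocessor.sh 2>&1 | logger -t workspaceworker -p local5.info' &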

Write STDOUT & STDERR to a logfile, also write STDERR to screen

I would like to run several commands, and capture all output to a logfile. I also want to print any errors to the screen (or optionally mail the output to someone).
Here's an example. The following command will run three commands, and will write all output (STDOUT and STDERR) into a single logfile.
{ command1 && command2 && command3 ; } > logfile.log 2>&1
Here is what I want to do with the output of these commands:
STDERR and STDOUT for all commands goes to a logfile, in case I need it later--- I usually won't look in here unless there are problems.
Print STDERR to the screen (or optionally, pipe to /bin/mail), so that any error stands out and doesn't get ignored.
It would be nice if the return codes were still usable, so that I could do some error handling. Maybe I want to send email if there was an error, like this:
{ command1 && command2 && command3 ; } > logfile.log 2>&1 || mailx -s "There was an error" stefanl#example.org
The problem I run into is that STDERR loses its identity during I/O redirection: 2>&1 converts STDERR into STDOUT, so the errors become indistinguishable from normal output, while 2> error.log captures them in a file where I won't see them.
Here are a couple juicier examples. Let's pretend that I am running some familiar build commands, but I don't want the entire build to stop just because of one error so I use the '--keep-going' flag.
{ ./configure && make --keep-going && make install ; } > build.log 2>&1
Or, here's a simple (And perhaps sloppy) build and deploy script, which will keep going in the event of an error.
{ ./configure && make --keep-going && make install && rsync -av /foo devhost:/foo ; } > build-and-deploy.log 2>&1
I think what I want involves some sort of Bash I/O Redirection, but I can't figure this out.
(./doit >> log) 2>&1 | tee -a log
This will take stdout and append it to the log file.
The stderr will then get converted to stdout, which is piped to tee, which appends it to the log (if you have Bash 4, you can replace 2>&1 | with |&) and sends it on to stdout, which will either appear on the tty or can be piped to another command.
I used append mode for both so that, regardless of the order in which the shell redirection and tee open the file, you won't blow away the original. That said, stderr and stdout may be interleaved in an unexpected way.
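With Bash 4, the same line reads:
(./doit >> log) |& tee -a log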
If your system has /dev/fd/* nodes you can do it as:
( exec 5>logfile.txt ; { command1 && command2 && command3 ;} 2>&1 >&5 | tee /dev/fd/5 )
This opens file descriptor 5 on your logfile, executes the commands with standard error directed to standard out and standard out directed to fd 5, and pipes stdout (which now contains only stderr) to tee, which duplicates its output to fd 5, the log file.
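To try it out, substitute commands that produce both streams; ls on a missing path just generates stderr. Only the error line reaches the screen, while logfile.txt receives both streams in order:
( exec 5>logfile.txt ; { echo out1 ; ls /nonexistent ; echo out2 ;} 2>&1 >&5 | tee /dev/fd/5 )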
Here is how to run one or more commands, capturing the standard output and error, in the order in which they are generated, to a logfile, and displaying only the standard error on any terminal screen you like. Works in bash on linux. Probably works in most other environments. I will use an example to show how it's done.
Preliminaries:
Open two windows (shells, tmux sessions, whatever)
I will demonstrate with some test files, so create the test files:
touch /tmp/foo /tmp/foo1 /tmp/foo2
in window1:
mkfifo /tmp/fifo
0</tmp/fifo cat - >/tmp/logfile
Then, in window2:
(ls -l /tmp/foo /tmp/nofile /tmp/foo1 /tmp/nofile /tmp/nofile; echo successful test; ls /tmp/nofile1111) 2>&1 1>/tmp/fifo | tee /tmp/fifo 1>/dev/pts/2
Where you replace /dev/pts/2 with whatever tty you want the stderr to display.
The reason for the various successful and unsuccessful commands in the subshell is simply to generate a mingled stream of output and error messages, so that you can verify the correct ordering in the log file. Once you understand how it works, replace the “ls” and “echo” commands with scripts or commands of your choosing.
With this method, the ordering of output and error is preserved, the syntax is simple and clean, and there is only a single reference to the output file. Plus, there is flexibility in putting the extra copy of stderr wherever you want.
Try:
command 2>&1 | tee output.txt
Additionally, you can direct stdout and stderr to different places:
command > stdout.txt 2> stderr.txt
command > stdout.txt |& program_for_stderr
So some combination of the above should work for you -- e.g. you could save stdout to a file, and stderr to both a file and piping to another program (with tee).
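For example, to keep stdout in its own file while sending stderr to both a file and the screen (filenames are just placeholders):
command > stdout.txt 2> >(tee stderr.txt >&2)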
Add this at the beginning of your script:
#!/bin/bash
set -e
outfile=logfile
exec > >(cat >> "$outfile")
exec 2> >(tee -a "$outfile" >&2)
# write your code here
STDOUT and STDERR will be written to $outfile; only STDERR will be seen on the console.
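Usage sketch, with yourscript.sh standing in for a script that starts with the lines above:
./yourscript.sh        # only STDERR appears on the terminal
cat logfile            # contains both STDOUT and STDERR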
