How to include nohup inside a bash script? - bash

I have a large script called mandacalc which I want to always run with the nohup command. If I call it from the command line as:
nohup mandacalc &
everything runs swiftly. But if I try to include nohup inside my command, so I don't need to type it every time I execute it, I get an error message.
So far I tried these options:
nohup (
command1
....
commandn
exit 0
)
and also:
nohup bash -c "
command1
....
commandn
exit 0
" # and also with single quotes.
So far I only get error messages complaining about the implementation of the nohup command, or about other quotes used inside the script.
cheers.

Try putting this at the beginning of your script:
#!/bin/bash
case "$1" in
    -d|--daemon)
        $0 < /dev/null &> /dev/null & disown
        exit 0
        ;;
    *)
        ;;
esac

# do stuff here
If you now start your script with -d or --daemon as an argument, it will restart itself detached from your current shell; the restarted copy is called without the flag, so it falls through to the normal branch and does the actual work.
You can still run your script "in the foreground" by starting it without this option.
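For example, assuming the snippet above sits at the top of mandacalc (the script name from the question) and the file is executable:
./mandacalc --daemon    # re-spawns itself detached and returns to the prompt immediately
./mandacalc             # normal foreground run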

Just put trap '' HUP at the beginning of your script.
Also, if it starts child processes (someCommand &), you will have to change them to nohup someCommand & for this to work properly. I have been researching this for a long time, and only the combination of these two (the trap and nohup) works on my specific script, where xterm closes too fast.
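A minimal sketch of that combination, with sleep standing in for the real commands:
#!/bin/bash
# Ignore SIGHUP so the script itself survives its terminal (e.g. xterm) closing.
trap '' HUP
# Long-running foreground work (sleep is just a stand-in here).
sleep 300 > main.log 2>&1
# As the answer notes, background children get their own nohup as well.
nohup sleep 300 > child.log 2>&1 &
wait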

Create an alias of the same name in your bash (or preferred shell) startup file:
alias mandacalc="nohup mandacalc &"
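After adding that line (e.g. to ~/.bashrc), reload the file so the alias is picked up; note that aliases are only expanded in interactive shells, so this does not affect other scripts that call mandacalc:
source ~/.bashrc   # or open a new terminal
mandacalc          # now expands to: nohup mandacalc &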

Why don't you just make a script containing nohup ./original_script ?

There is a nice answer here: http://compgroups.net/comp.unix.shell/can-a-script-nohup-itself/498135
#!/bin/bash
### make sure that the script is called with `nohup nice ...`
if [ "$1" != "calling_myself" ]
then
    # this script has *not* been called recursively by itself
    datestamp=$(date +%F | tr -d -)
    nohup_out=nohup-$datestamp.out
    nohup nice "$0" "calling_myself" "$@" > $nohup_out &
    sleep 1
    tail -f $nohup_out
    exit
else
    # this script has been called recursively by itself
    shift # remove the termination condition flag in $1
fi
### the rest of the script goes here
. . . . .

The best way to handle this is to wrap the commands in a single shell invocation:
nohup bash -c 'command1; command2; ...' &
nohup expects one command, and this way you are able to execute multiple commands with one nohup. (Note that $( command1, command2 ) is command substitution, not command grouping, so it would not do what you want here.)

Related

Run bash script in background by default

I know I can run my bash script in the background by using bash script.sh & disown or alternatively, by using nohup. However, I want the script to run in the background by default, so that when I run bash script.sh (or, after making it executable, ./script.sh), it goes to the background without any extra syntax. How can I achieve this?
Self-contained solution:
#!/bin/bash
# Re-spawn as a background process, if we haven't already.
if [[ "$1" != "-n" ]]; then
    nohup "$0" -n &
    exit $?
fi

# Rest of the script follows. This is just an example.
for i in {0..10}; do
    sleep 2
    echo $i
done
The if statement checks whether the -n flag has been passed. If not, it calls itself with nohup (to disassociate from the calling terminal, so closing it doesn't close the script) and & (to put the process in the background and return to the prompt). The parent then exits, leaving the background version to run. The background version is explicitly called with the -n flag, so it won't cause an infinite loop (which is hell to debug!).
The for loop is just an example. Use tail -f nohup.out to see the script's progress.
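A quick usage sketch, assuming the file is saved as script.sh:
chmod +x script.sh
./script.sh           # returns to the prompt immediately
tail -f nohup.out     # the numbers from the example loop appear every 2 seconds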
Note that I pieced this answer together with this and this but neither were succinct or complete enough to be a duplicate.
Simply write a wrapper that calls your actual script with nohup actualScript.sh &.
Wrapper script wrapper.sh
#! /bin/bash
nohup ./actualScript.sh &
Actual script in actualScript.sh
#! /bin/bash
for i in {0..10}
do
    sleep 10  # script is running; test with ps -eaf | grep actualScript
    echo $i
done
tail -f nohup.out
0
1
2
3
4
...
Adding to Heath Raftery's answer, what worked for me is a variation of what he suggested, such as this:
if [[ "$1" != "-n" ]]; then
    $0 -n & disown
    exit $?
fi
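Spelled out as a complete script, that variation looks roughly like this (the explicit log redirection is my addition, since disown, unlike nohup, does not redirect output for you):
#!/bin/bash
if [[ "$1" != "-n" ]]; then
    # Re-spawn in the background; disown removes the job from the shell's job table.
    "$0" -n > script.log 2>&1 & disown
    exit $?
fi
# Rest of the script follows. This is just an example.
for i in {0..10}; do
    sleep 2
    echo $i
done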

Bash - Hiding a command but not its output

I have a bash script (this_script.sh) that invokes multiple instances of another TCL script.
set -m
for vars in $( cat vars.txt );
do
exec tclsh8.5 the_script.tcl "$vars" &
done
while [ 1 ]; do fg 2> /dev/null; [ $? == 1 ] && break; done
The multi-threading portion was taken from Aleksandr's answer on: Forking / Multi-Threaded Processes | Bash.
The script works perfectly (still trying to figure out the last line). However, this line is always displayed: exec tclsh8.5 the_script.tcl "$vars"
How do I hide that line? I tried running the script as:
bash this_script.sh > /dev/null
But this hides the output of the invoked TCL scripts too (I need the output of the TCL scripts).
I tried adding /dev/null to the end of the statement within the for loop, but that did not work either. Basically, I am trying to hide the command but not the output.
You should use $! to get the PID of the background process just started, accumulate those in a variable, and then wait for each of those in turn in a second for loop.
set -m
pids=""
for vars in $( cat vars.txt ); do
    tclsh8.5 the_script.tcl "$vars" &
    pids="$pids $!"
done
for pid in $pids; do
    wait $pid
    # Ought to look at $? for failures, but there's no point in not reaping them all
done
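If you do want to act on failures, a variant of that second loop could record each non-zero exit status (this part is my sketch, not from the original answer):
failed=0
for pid in $pids; do
    wait $pid
    status=$?
    if [ $status -ne 0 ]; then
        echo "PID $pid exited with status $status" >&2
        failed=$((failed + 1))
    fi
done
echo "$failed job(s) failed" >&2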

Using nohup within a loop of a bash script

I have a bash script that contains a loop over a list of subdirectories. Inside the loop I cd into each subdirectory, run a command using nohup and then cd back out. In the following example I have replaced the executable by an echo command for simplicity.
#!/bin/bash
dList=("dname1" "dname2" "dname3")
for d in $dList; do
    cd $d
    nohup echo $d &
    cd ..
done
The above causes nohup to hang during the first loop with the following output:
$ ./script.sh
./dname1
$ nohup: appending output to `nohup.out'
The script does not continue through the loop and in order to type again on the command line one must press the enter key.
OK, this is normal nohup behaviour when one is using it on the shell, but obviously it doesn't work for my script. How can I get nohup to simply run and then gracefully allow the script to continue?
I have already (unsuccessfully) tried variations on the nohup command including
nohup echo $d < /dev/null &
but that didn't help.
Further, I tried including
trap "" HUP
at the top of the script too, but this did not help either.
Please help!
EDIT: As @anubhava correctly pointed out, my loop contained an error which was causing the script to only use the first entry in the array. Here is the corrected version.
#!/bin/bash
dList=("dname1" "dname2" "dname3")
for d in ${dList[@]}; do
    cd $d
    nohup echo $d &
    cd ..
done
So now the script achieves what I wanted. However, we still get the annoying output from nohup, which was part of my original question.
Problem is here:
for d in $dList; do
That runs the for loop only once, for the 1st element of the array.
To iterate over an array use:
for d in ${dList[@]}; do
Full working script:
dList=("dname1" "dname2" "dname3")
for d in "${dList[@]}"; do
    cd "$d"
    { nohup echo "$d" & cd -; } 2>/dev/null
done
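A variation (mine, not part of the answer) runs each iteration in a subshell, so there is no need to cd back, and gives nohup an explicit log file per directory so its notice ends up in the log rather than on the terminal:
dList=("dname1" "dname2" "dname3")
for d in "${dList[@]}"; do
    (
        # the cd only affects this subshell; if it fails, skip this directory
        cd "$d" || exit
        nohup echo "$d" > output.log 2>&1 &
    )
done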

Bash Script execute commands from a file but if cancel on to jump on next one

I'm trying to make a script to execute a set of commands from a file.
The file, for example, has a set of 3 commands, perl script-a, perl script-b, perl script-c, each command on a new line, and I made this script:
#!/bin/bash
for command in `cat file.txt`
do
    echo $command
    perl $command
done
The problem is that some scripts get stuck or take too long to finish, and I want to see their outputs. Is it possible to make the bash script, in case I send CTRL+C on the command currently being executed, jump to the next command in the txt file instead of cancelling the whole bash script?
Thank you
You can use trap 'continue' SIGINT to ignore Ctrl+c:
#!/bin/bash
# ignore & continue on Ctrl+c (SIGINT)
trap 'continue' SIGINT
while read command
do
    echo "$command"
    perl "$command"
done < file.txt
# Restore the default Ctrl+c behaviour
trap - SIGINT
Also you don't need to call cat to read a file's contents.
#!/bin/bash
for scr in $(cat file.txt)
do
    echo $scr
    # Only if you have a few lines in your file.txt:
    # execute the perl command in the background and save the output.
    # From your question it seems each of these scripts is independent.
    perl $scr &> ${scr}_perl_execution.out &
done
You can check each of the output files to see if the command is doing what you expect. If not, you can use kill to terminate that command.
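For instance (the output file names follow the ${scr}_perl_execution.out pattern above; the file name and PID below are just examples):
tail -f script-a_perl_execution.out   # watch one command's output as it runs
ps -ef | grep '[p]erl'                # find the PIDs of the running perl scripts
kill 12345                            # replace 12345 with the PID of a stuck job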

Run shell command from child shell

I have a Unix shell script, test.sh. Within the script, I would like to invoke another shell, execute the rest of the commands in the script from that child shell, and then exit.
To make it clear:
test.sh
#! /bin/bash
/bin/bash /* create child shell */
<shell-command1>
<shell-command2>
......
<shell-commandN>
exit 0
My intention is to run shell-command1 to shell-commandN from the child shell. Kindly tell me how to do this.
You can set it up in a group, like:
#!/bin/bash
(
    Command1
    Command2
    etc..
)

subshell() {
    echo "this is grouped in a function (which runs in the current shell)"
}
subshell
( and ) creates a subshell in which you run a group of commands; otherwise a simple function will do for grouping, although a function body runs in the current shell rather than in a subshell. Grouping with ( and ) is POSIX-compatible.
Update: If I understand your comment correctly, you want to use the -c option of bash, like:
/bin/bash -c "Command1 && Command2...." &
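If the list of commands is long, feeding the child shell a here-document keeps it readable (a sketch; the echo lines stand in for shell-command1 ... shell-commandN):
#!/bin/bash
# Everything between the two CHILD markers runs in a separate child bash process.
/bin/bash <<'CHILD'
echo "shell-command1 runs in child PID $$"
echo "shell-command2 runs in child PID $$"
CHILD
exit 0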
From http://tldp.org/LDP/abs/html/subshells.html here is an example:
#!/bin/bash
# subshell-test.sh
(
# Inside parentheses, and therefore a subshell . . .
while [ 1 ]   # Endless loop.
do
    echo "Subshell running . . ."
done
)
#  Script will run forever,
#+ or at least until terminated by a Ctrl-C.
exit $?  # End of script (but will never get here).
