When running a bash script with source script or . script from the command line, all the lines in the script seem to be added to some kind of "source buffer" in bash, and the current shell just keeps working through them. Stopping execution appears to be impossible (apart from aborting the shell): pressing Ctrl-C only interrupts the current command, and then the next command runs anyway.
Where is this buffer, and is there a way to clear it?
Example script:
echo A
sleep 10
echo B
sleep 10
echo C
sleep 10
echo D
After having run "source script", is there any way to stop it from executing any further once it has been 'submitted'?
There is, to the best of my knowledge, no such thing as a source buffer in bash, so there is nothing to erase. The source command simply executes the commands found in its argument in the current environment (i.e. not in a child process).
There is nothing in the handling of source that is specifically related to signal handling. Maybe your shell script is set up to ignore Control-C? I suggest running your script with -x in order to find the culprit.
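For example (a minimal sketch, assuming the script is a file named script in the current directory), you can enable tracing around the source call, or trace the file in a child shell with -x:
set -x            # print each command before it is executed
source ./script
set +x            # turn tracing off again
bash -x ./script  # alternative: trace it in a separate process instead of sourcing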
Related
I know there is a file called .bash_profile that executes code (a bash script) when you open a terminal.
And there is another file, .bash_logout, that executes code when you exit the terminal.
How would I execute a script when the terminal is killed?
(.bash_logout does not cover the case where the terminal is killed.)
How would I execute a script when the terminal is killed?
I interpret this as "execute a script when the terminal window is closed". To do so, add the following inside your .bashrc or .bash_profile:
trap '[ -t 0 ] || command to execute' EXIT
Of course you can replace command to execute with source ~/.bash_exit and put all the commands inside the file .bash_exit in your home directory.
The special EXIT trap is executed whenever the shell exits (e.g. by closing the terminal, but also by pressing Ctrl-D at the prompt, executing exit, and so on).
[ -t 0 ] checks whether stdin is connected to a terminal. Because of ||, the next command is executed only if that test fails, which it does when closing the terminal, but doesn't for other common ways of exiting bash (e.g. pressing Ctrl-D at the prompt or executing exit).
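Putting those pieces together, a minimal sketch (the file ~/.bash_exit and the log file below are only illustrative names) could look like this:
# in ~/.bashrc or ~/.bash_profile:
trap '[ -t 0 ] || source ~/.bash_exit' EXIT
# in ~/.bash_exit (runs only when stdin is no longer a terminal, e.g. the window was closed):
date >> ~/terminal_closed.log   # hypothetical example action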
Failed attempts (read only if you are trying to find an alternative)
In the terminals I have heard of, bash always receives a SIGHUP signal when the window is closed. Sometimes there are even two SIGHUPs; one from the terminal, and one from the kernel when the pty (pseudoterminal) is closed. However, sometimes both SIGHUPs are lost in interactive sessions, because bash's readline temporarily uses its own traps. Strangely enough, the SIGHUPs always seem to get caught when there is an EXIT trap; even if that EXIT trap does nothing.
However, I strongly advise against setting any trap on SIGHUP. Bash processes non-EXIT traps only after the current command finished. If you ran sh -c 'while true; do true; done' and closed the terminal, bash would continue to run in the background as if you had used disown or nohup.
I have a script (let's call it parent.sh) that makes two calls to a second script (child.sh), which runs a Java process. The child.sh scripts are run in the background by placing an & at the end of the line in parent.sh. However, when I run parent.sh, I need to press Ctrl+C to return to the terminal. What is the reason for this? Is it something to do with the fact that the child.sh processes are running under the parent.sh process, so parent.sh doesn't die until the children do?
parent.sh
#!/bin/bash
child.sh param1a param2a &
child.sh param1b param2b &
exit 0
child.sh
#!/bin/bash
java com.test.Main
echo "Main Process Stopped" | mail -s "WARNING-Main Process is down." user#email.com
As you can see, I don't want to run the Java process in the background, because I want to send a mail out when the process dies. Doing it as above works fine from a functional standpoint, but I would like to know how I can get it to return to the terminal after executing parent.sh.
What I ended up doing was changing parent.sh to the following:
#!/bin/bash
child.sh param1a param2a > startup.log &
child.sh param1b param2b > startup2.log &
exit 0
I would not have come to this solution without your suggestions and root cause analysis of the issue. Thanks!
And apologies for my inaccurate comment. (There was no input, I answered from memory and I remembered incorrectly.)
The following link from the Linux Documentation Project suggests adding a wait after your mail command in child.sh:
http://tldp.org/LDP/abs/html/x9644.html
Summary of the above document
Within a script, running a command in the background with an ampersand (&) may cause the script to hang until ENTER is hit. This seems to occur with commands that write to stdout. It can be a major annoyance.
[...]
As Walter Brameld IV explains it:
As far as I can tell, such scripts don't actually hang. It just seems that they do, because the background command writes text to the console after the prompt. The user gets the impression that the prompt was never displayed. Here's the sequence of events:
1. Script launches background command.
2. Script exits.
3. Shell displays the prompt.
4. Background command continues running and writing text to the console.
5. Background command finishes.
6. User doesn't see a prompt at the bottom of the output, thinks the script is hanging.
If you change child.sh to look like the following you shouldn't experience this annoyance:
#!/bin/bash
java com.test.Main
echo "Main Process Stopped" | mail -s "WARNING-Main Process is down." user#gmail.com
wait
Or as @SebastianStigler states in a comment to your question above:
Add a > /dev/null at the end of the line with mail. mail will otherwise try to start its interactive mode.
This will cause the mail command to write to /dev/null rather than stdout, which should also stop this annoyance.
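In other words (a minimal sketch of that suggestion, reusing the example address from above), the last line of child.sh would become:
echo "Main Process Stopped" | mail -s "WARNING-Main Process is down." user@gmail.com > /dev/null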
Hope this helps
The process was still linked to the controlling terminal because STDOUT needs somewhere to go. You solved that problem by redirecting to a file (> startup.log).
If you're not interested in the output, discard STDOUT completely (>/dev/null).
If you're not interested in errors either, discard both (&>/dev/null).
If you want the processes to keep running even after you log out of your terminal, use nohup — that effectively disconnects them from what you are doing and leaves them to quietly run in the background until you reboot your machine (or otherwise kill them).
nohup child.sh param1a param2a &>/dev/null &
If I want to call another batch script from within a batch script I could use
CALL File.bat
to pause the execution of the current batch file and wait for the CALLed script to complete.
I can use
START File.bat
if I want them to run simultaneously.
How do I achieve this behavior in a shell script?
If you want to wait:
#!/bin/bash
# do some stuff
/path/to/other/script
# do other stuff
To run it simultaneously (i.e. "in the background"):
#!/bin/bash
# do some stuff
/path/to/other/script &
# do other stuff, then optionally:
wait
# this will wait for all background jobs to finish
There are other ways, and there are things to consider about input and output redirection for the background process if you want to provide specific input and/or capture its output or errors, but that's the basics.
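For instance, a minimal sketch of that last point (input.txt and output.log are just illustrative names) might look like:
#!/bin/bash
# run the other script in the background with its own input and output
/path/to/other/script < input.txt > output.log 2>&1 &
bg_pid=$!            # remember the PID of that background job
# do other stuff
wait "$bg_pid"       # wait for that specific job to finish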
By "shell", I assume you mean *NIX sh.
To execute another script and wait for it to complete, do
sh file.sh
To start it in background, do
(sh file.sh) &
For bash (and other Bourne shell-compatible shells):
you don't need CALL; invoking another script or program executes it synchronously:
someprog.sh
you append & to a command to run it asynchronously; note that the program will be stopped if it attempts to read from stdin while it's in the background (see the sketch below):
someprog.sh &
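A minimal sketch of how to avoid that stop, using the hypothetical someprog.sh from above: redirect its stdin so a background read gets end-of-file instead of suspending the job.
someprog.sh < /dev/null &   # a read from stdin now returns EOF instead of stopping the job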
I'd like to run a script every time I close a Bash session.
I use XFCE and Terminal 0.4.5 (Xfce Terminal Emulator). I would like to run a script every time I close a tab in Terminal, including the last one (when I close Terminal).
Something like .bashrc but running at the end of every session.
.bash_logout doesn't work
You use trap (see man bash):
trap /u1/myuser/on_exit_script.sh EXIT
The command can be added to your .profile/.login.
This works whether you exit the shell normally (e.g. via the exit command) or simply kill the terminal window/tab, since the shell runs the EXIT trap either way. I just tested this by closing my PuTTY window.
My answer is similar to DVK's answer but you have to use a command or function, not a file.
$ man bash
[...]
trap [-lp] [[arg] sigspec ...]
The command arg is to be read and executed when the shell
receives signal(s) sigspec.
[...]
If a sigspec is EXIT (0) the command arg is executed on
exit from the shell.
So, you can add to your .bashrc something like the following code:
finish() {
# Your code here
}
trap finish EXIT
Write your script in ~/.bash_logout. It is executed by bash(1) when a login shell exits.
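A minimal sketch of such a file (the commands are only illustrations):
# ~/.bash_logout
clear                        # wipe the screen when the login shell exits
date >> ~/.logout_times      # hypothetical example: record the logout time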
If you close your session with "exit", you might be able to do something like
alias endbash="./runscript; exit"
and just exit by entering endbash. I'm not entirely sure this works, as I'm running Windows at the moment.
Edit: DVK has a better answer.