So, I started studying UNIX systems about a month ago, and now I have a basic question about job control.
How can I create a job that contains several processes, using only default bash commands?
Jobs are usually an interactive-shell concept, as there is usually a controlling terminal involved.
A shell script is executed in a non-interactive, non-login shell session, hence no job control by default.
You can force job control inside a script by setting:
set -m
inside the script.
From help set:
-m Job control is enabled.
echo | ping google.com &
was a fine example, because both of these processes work independently and only need the pipe symbol (|) to run together as one job in the background.
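Putting the answer together, here is a minimal sketch of a script that enables job control and runs a two-process pipeline as a single background job (the ping/grep pipeline is only illustrative):
#!/bin/bash
set -m                              # enable job control in this non-interactive shell
ping -c 5 google.com | grep icmp &  # one job made of two processes, running in the background
jobs                                # lists the pipeline as job [1]
wait                                # wait for the background job to finish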
I know that executing a command and adding & at the end creates a job and makes the command run in the background.
Now I want to create a job in a bash shell. I tried
#!/bin/bash
my-job &
# some other tasks
Then I executed jobs, but I got no output. However, ps aux does show my-job is running in the background.
I want to create a job inside a script, because in some cases I want to bring the job into the foreground.
Jobs are usually an interactive-shell concept, as there is usually a controlling terminal involved.
A shell script is executed in a non-interactive, non-login shell session, hence no job control by default.
You can force job control inside a script by setting:
set -m
inside the script.
From help set:
-m Job control is enabled.
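Applied to the script in the question, a minimal sketch looks like this (my-job stands for whatever command you are actually running; whether fg can take over still depends on the script having a controlling terminal):
#!/bin/bash
set -m        # enable job control
my-job &      # start the job in the background
jobs          # now reports my-job as job [1]
# some other tasks
fg %1         # bring my-job back into the foreground when needed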
I am using Elastic MapReduce from Amazon. I am sshing into the Hadoop master node and executing a script like:
$EMR_BIN/elastic-mapreduce --jobflow $JOBFLOW --ssh < hivescript.sh
It sshes me into the master node and runs the hive script. The hive script contains the following lines:
hive
add jar joda-time-1.6.jar;
add jar EmrHiveUtils-1.2.jar;
and some commands to create hive tables. The script runs fine and creates the hive tables and everything else, but then comes back to the prompt from which I ran the script. How do I leave it sshed into the Hadoop master node, at the hive prompt?
Consider using Expect; then you could do something along these lines and interact at the end:
/usr/bin/expect <<EOF
spawn ssh ... YourHost
expect "password"
send "password\n"
# replace javastuff with the commands you want to run in the remote session
send "javastuff\n"
interact
EOF
These are the most common answers I've seen (with the drawbacks I ran into with them):
1. Use expect
This is probably the most well-rounded solution for most people.
I cannot control whether expect is installed in my target environments
Just to try this out anyway, I put together a simple expect script to ssh to a remote machine, send a simple command, and turn control over to the user. There was a long delay before the prompt showed up, and after fiddling with it with little success I decided to move on for the time being.
Eventually I came back to this as the final solution after realizing I had violated one of the 3 virtues of a good programmer -- false impatience.
2. Use screen / tmux to start the shell, then inject commands from an external process (see the sketch after this list).
This works OK, but if the terminal window dies it leaves a screen/tmux instance hanging around. I could certainly try to come up with a way to re-attach to prior instances or kill them; screen (and probably tmux) can be made to die instead of auto-detaching, but I didn't fiddle with it.
3. If using gnome-terminal, use its -x or --command flag (I'm guessing xterm and others have similar options).
I'll go into more detail on the problems I had with this under #4.
4. Make a bash script with #!/bin/bash --init-file as the shebang; this will cause your script to execute, then leave an interactive shell running afterward.
This and #3 had issues with programs that require user interaction before the shell is presented. It worked fine with some programs (like ssh); with others (telnet, vxsim) a prompt appeared but no text was passed along to the program, only control characters like ^C.
5. Do something like this: xterm -e 'commands; here; exec bash'. This will cause it to create an interactive shell after your commands execute.
This is fine as long as the user doesn't attempt to interrupt with ^C before the last command executes.
Currently, the only thing I've found that gives me the behavior I need is to use cmdtool from the OpenWin project.
/usr/openwin/bin/cmdtool -I 'commands; here'
# or
/usr/openwin/bin/cmdtool -I 'commands; here' /bin/bash --norc
The resulting terminal injects the list of commands passed with -I into the program executed (no arguments means the default shell), so those commands show up in that shell's history.
What I don't like is that the terminal cmdtool provides feels so clunky ... but alas.
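For approach #2, a minimal sketch of injecting commands into a detached screen session (the session name work and the injected ssh command are purely illustrative):
screen -dmS work bash                        # start a detached session running a shell
screen -S work -X stuff $'ssh user@host\n'   # push a command into that shell's input
screen -r work                               # attach and interact normally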
I am on shared hosting and I'm trying to schedule a cron job to run every now and then. Via cPanel I scheduled my script for execution, but even though the cron job runs (according to my host's support), the script doesn't seem to do anything. The cron job command I set via cPanel is:
/bin/sh /home1/myusername/public_html/somefolder/cronjob2.sh
and cronjob2.sh is:
#!/bin/bash
/home1/myusername/public_html/somefolder/node_modules/forever/bin/forever stop 0
When I execute the following via SSH:
/home1/myusername/public_html/somefolder/cronjob2.sh
it stops the forever process as needed. From the cron job, it doesn't do anything.
How can I get this working?
EDIT:
So I've tried:
/bin/sh /home1/username/public_html/somefolder/cronjob2.sh >> /tmp/mylog 2>&1
and mylog entries say:
/usr/bin/env: node: No such file or directory
It seems that forever needs to run node, which cannot be found. How would I fix this?
EDIT 2:
I accepted an answer at superuser.com. Thank you all for the help:
https://superuser.com/questions/763261/simple-script-run-via-cronjob-doesnt-work-but-works-from-shell/763288#763288
For cron job lines in a crontab it is not required to specify the kind of shell (or of perl, etc.).
It is enough that your script contains a shebang line.
Therefore you should remove /bin/sh from your cron job line.
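For example, assuming the script is executable, the cron job can call it directly (in cPanel you would enter just the command part; the schedule shown here is only illustrative):
chmod +x /home1/myusername/public_html/somefolder/cronjob2.sh
# crontab line; the #!/bin/bash shebang picks the interpreter
*/5 * * * * /home1/myusername/public_html/somefolder/cronjob2.sh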
Another aspect that can cause your script to behave differently when started interactively and when started by the cron daemon is a possibly different environment, first of all the PATH variable. Therefore check whether your script can run in the very restricted environment that the cron daemon provides. You can determine your cron job environment experimentally by setting up a temporary cron job that executes the env command and writes its output to a file.
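A minimal sketch of such a temporary job (the output path is illustrative; remove the entry once you have the log):
* * * * * env > /tmp/cron_env.log 2>&1
Comparing that output with env from your SSH session will show, for instance, whether the directory containing node is missing from PATH, which would explain the /usr/bin/env: node error above.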
One more aspect: have you redirected STDOUT and STDERR of the cron job to a log file and read its contents to analyze the issue? You can do it as follows:
your_cron_job >/tmp/any_name.log 2>&1
According to what you wrote, when you run your script via SSH, you are using bash, because this line is the first of your script:
#!/bin/bash
However, in the crontab, you are forcing the use of sh instead of bash. Are you sure your script is fully compatible with sh? Otherwise, simply replace /bin/sh with /bin/bash in your cron command and test again.
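That is, a minimal change to the cron command from the question would be:
/bin/bash /home1/myusername/public_html/somefolder/cronjob2.sh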
I'm seeking advice regarding the best practice for starting (Java) programs from shell scripts.
Currently the practice within our firm seems to be a shell script which sets all the environment variables and launches the Java process (the fact that it is Java is not important in this case) in the background, similar to:
nohup $JAVA_CMD > $LOG_DIR/$LOG_FILE 2>&1 &
which is the last line of the script. We are launching a single process.
One doubt I have is that the return code of such a shell process is always 0, even when the program fails to start due to some Exception/Error. This makes it hard for monitoring tools; they can't rely on the shell exit code, for example.
I found out this can be fixed by waiting for the process to end like:
nohup $JAVA_CMD > $LOG_DIR/$LOG_FILE 2>&1 &
wait $!
But my understanding is that this makes the last & completely useless since running:
nohup $JAVA_CMD > $LOG_DIR/$LOG_FILE 2>&1
will have the same effect.
So my question is: what is the best practice of launching programs from shell? Does the running on background have some benefits I'm overlooking?
You should look into at and batch, and possibly cron. These are all tools to run commands, scripts, and job streams non-interactively. at runs a job and then emails the stderr output back to the user (the default behavior).
at -k now <<!
$JAVA_CMD > $LOG_DIR/$LOG_FILE 2>&1
!
The batch command will let you write a series of commands to a file and then execute the file as if it were stdin; you can also do this interactively.
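As a sketch, the same job submitted with batch, which runs it when the system load permits (the heredoc mirrors the at example above; batch < somefile works as well):
batch <<!
$JAVA_CMD > $LOG_DIR/$LOG_FILE 2>&1
!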
cron jobs (crontab) run at specified times and dates, like every Monday at 0200. This does not seem to fit your question.
Try this:
http://www.thegeekstuff.com/2010/06/at-atq-atrm-batch-command-examples/
There is an interactive shell console; I can get into it, run a specific set of commands inside the console, and exit from it.
Now I want to write a bash script that connects to the interactive shell console, runs my commands silently, and exits at the end without any interaction. In other words, I want everything automated in a non-interactive way. Any ideas how I can achieve this?
I am trying something like the following (say blabla shell is the interactive console here), but it always brings me into interactive mode :(
/usr/bin/blabla shell << EOF
do A,
do B,
do C
quit
EOF
A longer and more specific version of this question can be found here:
Configure flume in shell/bash script - avoid interactive flume shell console
Closing stdin should do the trick:
exec <&-
The expect command is your friend. It can emulate interactive communication with other commands, even in very sophisticated ways.
From man expect:
Expect is a program that "talks" to other interactive programs according to a script.
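As a minimal sketch for the blabla shell example above (the prompt string "> " is an assumption; adjust it to whatever the console actually prints):
/usr/bin/expect <<'EOF'
spawn /usr/bin/blabla shell
expect "> "        ;# wait for the console prompt (assumed)
send "do A\r"
expect "> "
send "do B\r"
expect "> "
send "do C\r"
expect "> "
send "quit\r"
expect eof
EOF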
You can try putting the commands you would type at the interactive prompt into a file, then running the command like:
command < file
Maybe the Secure SHell, ssh, does what you need. It requires that the "remote" machine is configured as an SSH server. I use it regularly to run commands on other hosts, such as:
ssh user@host command