Automate a Ruby command without it exiting - ruby

This hopefully should be an easy question to answer. I am attempting to have mumble-ruby run automatically. I have everything up and running, except that the simple script below runs and then immediately ends. In short:
Running this from a terminal, I get "Press enter to terminate script" and it works.
Running this via a cronjob runs the script but then ends it, calling cli.disconnect (I assume).
I want the below script to run automatically via a cronjob at a specified time and not end until the server shuts down.
#!/usr/bin/env ruby
require 'mumble-ruby'
cli = Mumble::Client.new('IP Address', Port, 'MusicBot', 'Password')
cli.connect
sleep(1)
cli.join_channel(5)
stream = cli.stream_raw_audio('/tmp/mumble.fifo')
stream.volume = 2.7
print 'Press enter to terminate script';
gets
cli.disconnect

Assuming you are on a Unix/Linux system, you can run it in a screen session. (This is a Unix command, not a scripting function.)
If you don't know what screen is, it's basically a "detachable" terminal session. You can open a screen session, run this script, and then detach from that screen session. That detached session will stay alive even after you log off, leaving your script running. (You can re-attach to that screen session later if you want to shut it down manually.)
screen is pretty neat, and every developer on Unix/Linux should be aware of it.
How to do this without reading any docs:
open a terminal session on the server that will run the script
run screen - you will now be in a new shell prompt in a new screen session
run your script
press ctrl-a, then d (the d by itself, without ctrl; the "d" is for "detach") to detach from the screen session while leaving it running; the full sequence is sketched below
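In command form, steps 2 through 4 look roughly like this (bot.rb is a hypothetical stand-in for your actual script):
screen          # open a new shell in a new screen session
ruby bot.rb     # run your script inside it
# press ctrl-a, then d, to detach and leave it running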
Now you're back in your first shell. Your script is still alive in your screen session. You can disconnect and the screen session will keep on trucking.
Do you want to get back into that screen and shut the app down manually? Easy! Run screen -r (for "reattach"). To kill the screen session, just reattach and exit the shell.
You can have multiple screen sessions running concurrently, too. (If there is more than one screen running, you'll need to provide an argument to screen -r.)
Check out some screen docs!
Here's a screen howto. Search "gnu screen howto" for many more.

Lots of ways to skin this cat... :)
My thought was to take your script (call it foo) and remove the last 3 lines. In your /etc/rc.d/rc.local file (NOTE: this applies to Ubuntu and Fedora; not sure what you're running, but it will have something similar) you'd add nohup /path_to_foo/foo > /dev/null 2>&1 & to the end of the file so that it runs in the background at boot. (Note the redirection order: > /dev/null 2>&1 sends both stdout and stderr to /dev/null, whereas the reversed order would leave stderr attached to the terminal.) You can also run that command right at a terminal if you just want to run it and have it running. You have to make sure that foo is made executable with chmod +x /path_to_foo/foo.
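For instance, the tail of the file would gain a single line (using the same hypothetical /path_to_foo/foo as above):
# at the end of /etc/rc.d/rc.local
nohup /path_to_foo/foo > /dev/null 2>&1 &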

Use an infinite loop. Try:
loop do
  sleep(3600)   # wake once an hour; keeps the process alive without eating CPU
end
You can use exit to terminate when you need to. Sleeping an hour at a time means the loop doesn't eat up processing time. An infinite loop placed before your disconnect call will prevent it from being run until the server shuts down.
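Once the script no longer exits on its own, the cron side is a single crontab entry; a sketch (the time and path here are placeholders, not from the original question):
# run the bot every day at 07:30
30 7 * * * /usr/bin/env ruby /path/to/musicbot.rb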

Related

bash ignores & for last command in loop

I just wrote my first bash script to start some redis instances on a development server. While it is mostly working, the last opened redis instance is blocking the active terminal – though I have the trailing & sign and the other started instances aren't blocking the terminal. How would I push them all to the background?
Here's the script:
#!/bin/bash
REDIS=(6379 6380 6381 6382 6383 6390 6391 6392 6393)
for i in "${REDIS[@]}"
do
    redis-server --port "$i" &
done
It sounds like your terminal is not actually blocked; your prompt just got overwritten. It's a purely cosmetic issue. Due to the way terminals work, bash doesn't know to redraw the prompt, so it looks like the command is in the foreground.
Run the script again, and blindly type ls and press Enter. You'll probably see that the shell responds as normal, even though you can't see the prompt.
You can alternatively just hit Enter to get bash to redraw the prompt.
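If the stray output bothers you anyway, you can keep the terminal quiet by giving each instance its own log file (the log names here are just an illustration):
for i in "${REDIS[@]}"
do
    redis-server --port "$i" > "redis-$i.log" 2>&1 &
done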

unix - running a shell script in the background and creating an output log

What's the best way to run this shell script when I need to create an output log and, at the same time, run it in the background? The catch is, I need to input a couple of parameters and then enter a password.
For example I execute the shell script like so:
-bash-4.3$ ./tst.sh param1 param2 >> tst.log
Password for user mas:
I need to pass in two parameters and am then prompted for a password:
./tst.sh <param1> <param2>
This works, but I have to keep the session open. I want it to go to the background (or something similar) so that it keeps running if my connection to the host fails.
If you want to run something that will survive if your connection fails you should run it in a screen or tmux session. You can use those to create sessions that you can disconnect from and reconnect to, and many other really cool things once you start really getting into them.
So if you ssh in and then run screen, you'll still be at a bash prompt, but if you run a command and then press ^a^d, you will detach from that session. Everything running inside screen will keep going, and you'll be able to reconnect with screen -x later. You can have many screen sessions at the same time, too; use screen -ls to see which are running, then screen -x <id> to reconnect to a particular one.
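For this specific case, the session might look something like this (the session name tst is arbitrary; tee is used here so you can still see the password prompt while logging):
screen -S tst                                 # start a named session
./tst.sh param1 param2 2>&1 | tee tst.log     # answer the password prompt; output is logged
# press ^a^d to detach; reconnect later with screen -x tst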

executing a script which runs even if i log off

So, I have a long-running script (on the order of a few days), say execute.sh, which I am planning to execute on a server on which I have a user account...
Now, I want to execute this script so that it keeps running even if I log off or disconnect from the server.
How do I do that?
Thanks
You have a couple of choices. The most basic would be to use nohup:
nohup ./execute.sh
nohup executes the command as a child process that ignores SIGHUP, so it keeps running even after you close the terminal. SIGHUP means "signal hangup" and is sent to a process when the terminal it is attached to gets closed.
The output of the process gets redirected to a file, by default nohup.out in the current directory.
You may also use bash's disown functionality. You can start a script in bash:
./execute.sh
Then press Ctrl+z to suspend it, type bg to resume it in the background, and then enter:
disown
The process will now run in the background, detached from the terminal. If you care about the script's output you may redirect it to a logfile:
./execute.sh > execute.log 2>&1
Another option would be to install screen on the remote machine, run the command in a screen session and detach from it. You'll find a lot of tutorials about this.
nohup (no hangup) it and run it in the background:
nohup execute.sh &
Output that normally would have gone to the screen (STDOUT) will go to a file called nohup.out.
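If you want the output somewhere more useful than nohup.out, the two ideas above combine naturally:
nohup ./execute.sh > execute.log 2>&1 &   # run detached, log stdout and stderr
tail -f execute.log                       # watch the log whenever you like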

how to send ssh job to background

I logged in to a remote server via ssh and started a php script. Apparently, it will take 17 hours to complete. Is there a way to break the connection but keep the script executing? I didn't make any output redirection, so I am seeing all the output.
Can you stop the process right now? If so, launch screen, start the process and detach screen using ctrl-a then ctrl-d. Use screen -r to retrieve the session later.
This should be available in most distros; failing that, a package will definitely be available for you.
Ctrl+z
will pause it. Then type
bg
to send it to the background. Write down the PID of the process for later usage ;)
EDIT: I forgot: after that, you have to execute
disown $PID
where $PID is the pid of your process (bash's disown also accepts a job spec such as %1). Once disowned, the process will not be killed when you close the terminal.
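Put together, the sequence looks roughly like this (%1 assumes the script is your only background job):
^Z            # suspend the foreground script
bg            # resume it in the background
disown %1     # drop it from the shell's job table so it survives logout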
You described that it's important to protect the script's continuation. Unfortunately I don't know whether you need to interact with the script, or whether the script was written by you.
The 'screen' command protects continuation: your connection may break, but screen preserves the pseudo-terminal, and you can reconnect to it later; see its man page.
If you don't need operator interaction with the script, you can simply put it in the background at the start and log its complete output to a log file. Simply use the command:
nohup /where/is/your.script.php > output.log 2>&1 &
> output.log redirects the output into the log file, 2>&1 merges the error stream into standard output (and thus into the log file), and the last & puts the command into the background. Note that nohup makes the process immune to the hangup signal that is sent when the terminal closes.
Now you can safely exit the ssh shell. Because your script ignores the hangup, it won't be killed; once your shell exits, it is reparented to the system init process, which is standard Unix-like behavior. You can monitor the complete output with:
tail -f output.log   # always interruptible with ^C; it is only watching
With this method you do not need ^Z, bg, and the other shell tricks for putting a command into the background.
Note that explicit redirection with nohup is preferred; otherwise nohup automatically redirects all output for you to a nohup.out file in the current directory.
You can use screen.

Run Ruby script in the background

I have a Ruby script that I need to have running all the time in my Linux box. I tried nohup ruby ruby.rb& but it seems it doesn't work.
How can I have the script running in background?
Have a look at screen which is a command-line utility. Start it with
screen
You will get a new shell running inside a detachable screen session. Start your script there with
ruby whatever.rb
And watch it run. Then hit Ctrl-A Ctrl-D and you should be back at your original shell. You can leave the ssh session now, and the script will continue running. At a later time, login to your box and type
screen -r
and you should be back to the detached shell.
If you use screen more than once, you will have to select the screen session by pid, which is not so comfortable. To simplify, you can do
screen -S worker
to start the session and
screen -r worker
to resume it.
Depending on your needs:
fork do
  Process.setsid   # start a new session, detaching the child from the controlling terminal
  sleep 5
  puts "In daemon"
end
puts "In control script"
In real life you will also have to reopen STDOUT/STDERR, e.g. onto a log file or /dev/null, so the daemon does not hold on to the terminal's streams. (Ruby 1.9+ also provides Process.daemon, which takes care of these steps for you.)
