I'm restarting the server from my local machine using the following command:
ssh -l root -p 22 $SERVER_HOST "cd $SERVER_DIR && nohup bin/restart &"
It's not working, and it prints nothing, so I don't know what the problem is. But if I remove nohup and &, it works. Why, and how can I make it work (and keep running in the background after the ssh session ends)?
The version without nohup works, but it blocks the shell (it also prints the output from the bin/restart script, unlike the version with nohup). I can't use it, though, as I need the server to keep running in the background.
ssh -l root -p $SERVER_PORT $SERVER_HOST "cd $SERVER_DIR && bin/restart"
If it matters, here is the content of the bin/restart script (it restarts a Ruby on Rails app):
. /root/.asdf/asdf.sh
killall -r ruby
RAILS_ENV=production bundle exec rails server
What worked for me is:
ssh -tt -l root -p 22 $SERVER_HOST 'cd $SERVER_DIR && nohup bin/restart & sleep 1'
Tested with OpenSSH 7.2p2 (client & server), bash 4.3, Linux 4.15.
-tt forces ssh to allocate a terminal, which is apparently important for nohup to work.
sleep 1 is not special; you just need something that forces the shell to context-switch. It could also be /bin/true, but that isn't as sure-fire.
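An alternative that several of the answers below also arrive at is to detach the remote command from ssh's streams, so ssh can exit as soon as the command is launched. This is only a sketch using the same variables as the question; restart.log is just an illustrative name:
ssh -l root -p 22 $SERVER_HOST "cd $SERVER_DIR && nohup bin/restart > restart.log 2>&1 < /dev/null &"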
Related
I have written some simple Python scripts and I would like to run them on my Raspberry Pi even when I am not logged in. So far, I can log in to the machine via ssh and run the script without a problem. As soon as I disconnect the ssh session, the script breaks or stops. So I was wondering if there is a way to keep the script running after the end of the ssh connection.
Here is the system I am using: a Raspberry Pi 3B+ with Ubuntu 22.04 LTS, and here is how I run my script:
ssh xx@xxx.xxx.xxx.xxx
cd myapp/
python3 runapp.py
You can use nohup to stop hangup signals affecting your process. Here are three different ways of doing it.
This is a "single shot" command, meaning you type it all in one go:
ssh SOMEHOST "cd SOMEWHERE && nohup python3 SOMESCRIPT &"
Or here, you log in and get a new prompt on the remote host, change directory, run some commands and then, at some point, exit the ssh session:
ssh SOMEHOST
cd SOMEWHERE
nohup python SOMESCRIPT &
exit
Or, this is another "single shot" where you won't get your prompt back until you type EOF:
ssh SOMEHOST <<EOF
cd SOMEWHERE
nohup python SOMESCRIPT &
EOF
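If nohup's output would otherwise have gone to your terminal (the second, interactive variant above), it is appended to nohup.out in the directory the script was started from. Either way, you can check later whether the script is still running; a rough sketch, reusing the placeholders above:
ssh SOMEHOST "pgrep -af SOMESCRIPT"              # is the script still running?
ssh SOMEHOST "tail -n 50 SOMEWHERE/nohup.out"    # recent output, if nohup.out was created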
if there is a way to keep the script running after the end of the ssh connection.
Just run it in the background.
python3 runapp.py &
You could send the logs to the system log:
python3 runapp.py | logger &
You could learn about screen and tmux virtual terminals, so that you can view the logs later. You could start a tmux session, run the command inside and detach the session.
You could set up a systemd service file for the program and run it "as a service".
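A minimal unit file sketch for that route (the runapp.service name, the /home/ubuntu/myapp path and the ubuntu user are assumptions to adapt), saved as /etc/systemd/system/runapp.service:
[Unit]
Description=runapp Python script

[Service]
User=ubuntu
WorkingDirectory=/home/ubuntu/myapp
ExecStart=/usr/bin/python3 /home/ubuntu/myapp/runapp.py
Restart=on-failure

[Install]
WantedBy=multi-user.target
Then enable and start it:
sudo systemctl daemon-reload
sudo systemctl enable --now runapp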
If atd is running, you can use at to schedule commands to be executed at a particular time.
Examples:
$ echo "python3 /path/to/runapp.py"|at 11:00
job 10 at Fri Jun 3 11:00:00 2022
$ echo "python3 /path/to/runapp.py" | at now
job 11 at Thu Jun 2 19:57:00 2022
# after minutes/hours/days/....
$ echo "python3 /path/to/runapp.py" | at now +5 minutes
$ echo "python3 /path/to/runapp.py" | at now +2 hours
$ ssh user@host "echo 'python3 /path/to/runapp.py'| at now"
Jobs created with at are executed only once.
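To review or cancel what you have queued, atq lists the pending jobs and atrm removes one by its job number (e.g. job 10 from the first example):
$ atq          # list pending jobs
$ atrm 10      # remove job 10 from the queue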
I am running a very simple script that will ssh into a remote Ubuntu instance, move around the directory structure, and execute a few things; then I want the prompt to stay on the remote machine. When the script ends, it ends back at the local prompt. How do I modify the script so that it finishes with the remote prompt?
local$ ssh -i xxx.pem ubuntu@xxx.ap-region.compute.amazonaws.com \
"cd virtualenv; ls -lh;"
There are two things that need to be added to your command line:
The bash command at the end starts the bash shell (you can start any other shell you want).
The -t switch makes sure the remote server allocates a TTY so your shell works as expected:
local$ ssh -t -i xxx.pem ubuntu@xxx.ap-region.compute.amazonaws.com \
"cd virtualenv; ls -lh; bash"
I am trying to execute a few commands via my first script but it's not working.
#!/bin/bash
#connect to server
echo "Connecting to the server..."
ssh -t root@IP '
#switch user to deploy
su - deploy
#switch path
echo "Switching the path"
cd /var/www/deploys/bin/app/config
#run deploy script
echo "Running deploy script"
/usr/local/bin/cap -S env=prod deploy
#restart apache
sudo /bin/systemctl restart httpd.service
bash -l
'
What is happening? I successfully connect to the server, the user is switched, and then I don't see anything happening. When I press Ctrl+C in the terminal, some output from the command that should be executed appears, but there are some errors.
Why don't I see everything that is happening in the terminal after launching the script? Am I doing it the wrong way?
BTW: when I connect manually and run the commands myself, everything works nicely.
Using CentOS 7.
A clean way to log in through ssh and execute a set of commands is:
ssh user@ip << EOF
#some commands
EOF
Here EOF acts as the delimiter for the command list.
The script can be modified as follows:
ssh -t root@IP << EOF
#switch user to deploy
su - deploy
#switch path
echo "Switching the path"
cd /var/www/deploys/bin/app/config
#run deploy script
echo "Running deploy script"
/usr/local/bin/cap -S env=prod deploy
#restart apache
sudo /bin/systemctl restart httpd.service
bash -l
EOF
This will execute the commands and close the connection thereafter.
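One caveat worth knowing with this form: with an unquoted EOF, the local shell expands variables and backticks inside the heredoc before anything is sent to the server. If you want expansion to happen on the remote side instead, quote the delimiter; a minimal sketch:
ssh -t root@IP << 'EOF'
echo "Deploying on $HOSTNAME"
EOF
Here $HOSTNAME is expanded by the remote shell because the delimiter is quoted.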
I'm attempting to write a Ruby script that runs a bash command over ssh to start a Resque worker for one of my apps.
The command that I generate from the params given in the console looks like this...
command = "ssh user@#{@ip} 'cd /path/to/app; bundle exec rake resque:work QUEUE=#{@queue}&'"
`#{command}`
The command is interpolated correctly and everything looks great. I'm asked to input the password for the ssh command and then nothing happens. I'm pretty sure my syntax is correct for making an ssh connection and running a line of code within that connection: ssh user@host 'execute command'
I've done a simpler command that only runs the mac say terminal command and that worked fine
command = "ssh user@#{@ip} 'say #{@queue}'"
`#{command}`
I'm running the rake task in the background because I have used that line inside an ssh session before, and it only keeps the worker alive if you run the process in the background.
Any thoughts? Thanks!
I figured it out.
It was an RVM thing. I needed to include . .bash_profile at the beginning of the scripts I wanted to run.
So...
"ssh -f hostname '. .bash_profile && cd /path/to/app && bundle exec rake resque:work QUEUE=queue'" is what I needed to make it work.
Thanks for the help @Casper
Ssh won't exit the session until all processes that were launched by the command argument have finished. It doesn't matter if you run them in the background with &.
To get around this problem just use the -f switch:
-f Requests ssh to go to background just before command execution. This is
useful if ssh is going to ask for passwords or passphrases, but the user
wants it in the background. This implies -n. The recommended way to start
X11 programs at a remote site is with something like ssh -f host xterm.
I.e.:
"ssh -f user@#{@ip} '... bundle exec rake resque:work QUEUE=#{@queue}'"
EDIT
In fact, looking more closely at the problem, it seems ssh is just waiting for the remote side to close stdin and stdout. You can test it easily like this:
This hangs:
ssh localhost 'sleep 10 &'
This does not hang:
ssh localhost 'sleep 10 </dev/null >/dev/null &'
So I assume the last version is actually pretty closely equivalent to running with -f.
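Putting those two observations together with the . .bash_profile fix above, the remote command can detach its own streams instead of relying on -f; SERVER_IP, myqueue and worker.log are placeholders standing in for the interpolated values and are only illustrative:
ssh user@SERVER_IP '. .bash_profile && cd /path/to/app && nohup bundle exec rake resque:work QUEUE=myqueue </dev/null >worker.log 2>&1 &'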
I'm currently trying to ssh into a remote machine and run a script, then leave the node with the script running. Below is my script. However, when it runs, the script is successfully run on the machine, but the ssh session hangs. What's the problem?
ssh -x $username@$node 'rm -rf statuslist
mkdir statuslist
chmod u+x ~/monitor/concat.sh
chmod u+x ~/monitor/script.sh
nohup ./monitor/concat.sh &
exit;'
There are some situations when you want to execute/start some scripts on a remote machine/server (which will terminate automatically) and disconnect from the server.
e.g. a script running on a box which, when executed:
takes a model and copies it to a remote server
creates a script for running a simulation with the model and push it to server
starts the script on the server and disconnect
The duty of the script thus started is to run the simulation on the server and, once it completes (which will take days), copy the results back to the client.
I would use the following command:
ssh remoteserver 'nohup /path/to/script </dev/null >nohup.out 2>&1 &'
@CKeven, you may put all those commands in one script, push it to the remote server, and launch it as follows:
echo '#!/bin/bash
rm -rf statuslist
mkdir statuslist
chmod u+x ~/monitor/concat.sh
chmod u+x ~/monitor/script.sh
nohup ./monitor/concat.sh &
' > script.sh
chmod u+x script.sh
rsync -azvp script.sh remotehost:/tmp
ssh remotehost '/tmp/script.sh </dev/null >nohup.out 2>&1 &'
Hope this works ;-)
Edit:
You can also use
ssh user@host 'screen -S SessionName -d -m "/path/to/executable"'
which creates a detached screen session and runs the target command within it.
What do you think about using screen for this? You could run screen via ssh to start the command (concat.sh) and then you'd be able to return to the screen session if you wanted to monitor it (could be handy, depending on what concat does).
To be more specific, try this:
ssh -t $username@$node screen -dm -S testing ./monitor/concat.sh
You should find that the prompt returns immediately, and that concat.sh is running on the remote machine. I'll explain some of the options:
ssh -t allocates a TTY. screen needs this.
screen -dm makes it start in "detached" mode. This is like "background" for your purposes.
-S testing gives your screen session a name. It is optional but recommended.
Now, once you've done this, you can go to the remote machine and run this:
screen -r testing
This will attach you to the screen session which contains your program. From there you can control it, kill it, see its output, and so on. Ctrl-A, then d will detach you from the screen session. screen -ls will list all running sessions.
It could be the standard input stream. Try ssh -n ... or ssh -f ....
For me, only this worked:
screen -dmS name sh my-script.sh
This, of course, depends on screen, and lets you attach later, if you ever want stdin or stdout. Screen will terminate itself when my-script.sh ends.
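If you prefer tmux, the equivalent detached one-liner (just a sketch; the session name is arbitrary) is:
tmux new-session -d -s name 'sh my-script.sh'
tmux attach -t name    # reattach later to see what it is doing
Like screen, the tmux session ends when my-script.sh exits.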
Below is a more general solution that took some effort to find, and it really works for me:
#!/usr/bin/bash
theScreenSessionName="test"
theTabNumber="1"
theStuff="date; hostname; cd /usr/local; pwd; /usr/local/bin/top"
echo "this is a test"
ssh -f user@server "/usr/local/bin/screen -x $theScreenSessionName -p $theTabNumber -X stuff \"
$theStuff
\""
It sends the $theStuff list of commands to tab number $theTabNumber of the screen session $theScreenSessionName, which was previously created on 'server' on behalf of 'user'.
Please be aware of a trailing whitespace after
-X stuff \"
that is sent to work around a glitch in the 'stuff' option. The whitespace and the $theStuff on the next line are followed by 'Enter' (^M) keystrokes. DON'T MISS 'EM!
The "this is a test" message is echoed in the initial terminal, and the $theStuff commands are actually executed inside the mentioned screen session/tab.
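If the session does not exist yet, it can be created in detached mode first; a sketch with the same names as above (note that screen numbers its windows from 0, so a tab "1" has to be created inside the session, e.g. with Ctrl-A c, before it can be targeted):
ssh user@server "/usr/local/bin/screen -dmS test"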