Run a job on a Raspberry Pi - bash

I have written some simple Python scripts and I would like to run them on my Raspberry Pi even when I am not logged in. So far, I can log in to the machine via ssh and run the script without a problem, but as soon as I disconnect the ssh session, the script breaks or stops. Is there a way to keep the script running after the ssh connection ends?
Here is the system I am using: a Raspberry Pi 3B+ with Ubuntu 22.04 LTS, and here is how I run my script:
ssh xx@xxx.xxx.xxx.xxx
cd myapp/
python3 runapp.py

You can use nohup to stop hangup signals affecting your process. Here are three different ways of doing it.
This is a "single shot" command, I mean you type it all in one go:
ssh SOMEHOST "cd SOMEWHERE && nohup python3 SOMESCRIPT &"
Or you log in, get a new prompt on the remote host, change directory, run some commands and then, at some point, exit the ssh session:
ssh SOMEHOST
cd SOMEWHERE
nohup python SOMESCRIPT &
exit
Or, this is another "single shot" where you won't get another prompt until you type EOF:
ssh SOMEHOST <<EOF
cd SOMEWHERE
nohup python SOMESCRIPT &
EOF
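One caveat: a backgrounded process whose output is still connected to the ssh session can keep ssh from returning. A common variant (the log file name app.log is my choice here) redirects everything explicitly:
ssh SOMEHOST "cd SOMEWHERE && nohup python3 SOMESCRIPT >app.log 2>&1 &"
With stdout and stderr redirected, nothing holds the connection open and the command returns immediately.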

if there is a way to keep the script running after the end of the ssh connection.
Just run it in the background.
python3 runapp.py &
You could send the output to the system log:
python3 runapp.py | logger &
You could learn about the screen and tmux virtual terminals, so that you can view the logs later. You could start a tmux session, run the command inside it and detach the session, as sketched below.
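A minimal detached tmux session (the session name myapp and the path are my assumptions) could look like this:
tmux new-session -d -s myapp 'cd ~/myapp && python3 runapp.py'
You can reattach later with tmux attach -t myapp and detach again with Ctrl-B, then d.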
You could set up a systemd service file for the program and run it "as a service"; a sketch follows.
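Assuming the app lives in /home/pi/myapp and should run as user pi (both assumptions), a unit file such as /etc/systemd/system/runapp.service might look like:

[Unit]
Description=My Python app
After=network.target

[Service]
User=pi
WorkingDirectory=/home/pi/myapp
ExecStart=/usr/bin/python3 runapp.py
Restart=on-failure

[Install]
WantedBy=multi-user.target

Enable and start it with sudo systemctl enable --now runapp.service; it then survives both ssh disconnects and reboots.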

If atd is running you can use at to schedule commands to be executed at a particular time.
Examples:
$ echo "python3 /path/to/runapp.py"|at 11:00
job 10 at Fri Jun 3 11:00:00 2022
$ echo "python3 /path/to/runapp.py" | at now
job 11 at Thu Jun 2 19:57:00 2022
# after minutes/hours/days/....
$ echo "python3 /path/to/runapp.py" | at now +5 minutes
$ echo "python3 /path/to/runapp.py" | at now +2 hours
$ ssh user@host "echo 'python3 /path/to/runapp.py' | at now"
Jobs created with at are executed only once.
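To inspect or cancel queued jobs, the atq and atrm commands from the same package can be used (the job number below is illustrative):
$ atq     # list pending jobs with their numbers
$ atrm 10 # remove job number 10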

Related

Why is my command not working with nohup and SSH?

I'm restarting the server from my local machine using the following command:
ssh -l root -p 22 $SERVER_HOST "cd $SERVER_DIR && nohup bin/restart &"
It doesn't work and prints nothing, so I don't know what the problem is. But if I remove nohup and &, it works. Why, and how can I make it work (and continue in the background after the ssh session ends)?
The version without nohup works but blocks the shell (it also prints the output of the bin/restart script, unlike the version with nohup). I can't use it, though, as I need the server to keep working in the background.
ssh -l root -p $SERVER_PORT $SERVER_HOST "cd $SERVER_DIR && bin/restart"
If that matter, the content of the bin/restart script (restarting Ruby on Rails app)
. /root/.asdf/asdf.sh
killall -r ruby
RAILS_ENV=production bundle exec rails server
What worked for me is:
ssh -tt -l root -p 22 $SERVER_HOST "cd $SERVER_DIR && nohup bin/restart & sleep 1"
Tested with OpenSSH 7.2p2 (client & server), bash 4.3, Linux 4.15.
-tt forces ssh to allocate a terminal, which is apparently important for nohup to work.
sleep 1 is not special; you just need something that forces the shell to context-switch. It could also be /bin/true, but that isn't as sure-fire.
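An alternative that often avoids the hang without forcing a TTY is to detach the remote command's streams, since ssh keeps the session open while the remote stdout/stderr still point at it. A sketch using the same variables (not tested against this exact setup):
ssh -l root -p 22 $SERVER_HOST "cd $SERVER_DIR && nohup bin/restart >/dev/null 2>&1 </dev/null &"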

Why a Jenkins shell script hangs when I run sudo pm2 ls

I confess I am a total newbie to Jenkins. I have Jenkins-tls installed on my Mac for experimentation.
I have a remote server that I testing with.
My Jenkins script is ultra simple:
ssh to the remote machine
sudo pm2 ls
The last command just hangs.
I run the same 2 commands from the command line and it all works perfectly.
FYI, I need sudo for pm2 since I need to be root to run pm2, without sudo, I get access denied.
Any thoughts?
I believe you are making the invalid assumption that Jenkins somehow "types" the following commands into the remote session's command shell after starting ssh. This is not what happens. Instead, Jenkins waits for the ssh command to finish, and only then executes the next command, sudo pm2 ls, locally. That never happens, because the interactive ssh session never terminates. You observe this as a "hang".
How to solve this?
If there's only a small number of commands, you can use ssh to run them directly:
ssh user@remote sudo pm2 ls
ssh user@remote command arg1 arg2
If this gets longer, why not place all commands in a remote script and just run it with
ssh user@remote /path/to/script
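A minimal sketch of such a remote script (the path and the assumption that pm2 is on root's PATH are mine):

#!/bin/bash
# /path/to/script on the remote host: list pm2 processes as root
sudo pm2 ls

Note that if sudo has to prompt for a password, you will need ssh -t user@remote /path/to/script so that a TTY is allocated for the prompt.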

Raspbian - Transmission torrents don't start after rebooting

I wrote a bash file and put it in cron to start my torrents (in the web interface) automatically after rebooting the system, but nothing happens.
The crontab -e entry:
@reboot bash /home/pi/torrent.sh >> /home/pi/torrent.log 2>&1
torrent.sh
echo
date
sudo service transmission-daemon start
sleep 10
sudo transmission-remote -t all -s
sleep 1
torrent.sh is executable and has all the necessary permissions.
P.S.: If I run the script from terminal my torrents start normally.
Hope you can help!

Run a script over ssh into an Ubuntu instance, do something, and when it exits, stay in Ubuntu

I am running a very simple script that will ssh into a remote Ubuntu instance, move around the directory structure and execute a few things; then I want the prompt to stay in Ubuntu. When the script ends, it ends back at the local prompt. How do I modify the script so that it finishes with the remote prompt?
local$ ssh -i xxx.pem ubuntu@xxx.ap-region.compute.amazonaws.com \
"cd virtualenv; ls -lh;"
Two things need to be added to your command line:
The bash command at the end starts the bash shell (you can start any other shell you want).
The -t switch makes sure the remote server allocates you a TTY, so your shell works as expected:
local$ ssh -t -i xxx.pem ubuntu@xxx.ap-region.compute.amazonaws.com \
"cd virtualenv; ls -lh; bash"

How to run a command in background using ssh and detach the session

I'm currently trying to ssh into a remote machine and run a script, then leave the node with the script running. Below is my script. However, when it runs, the script runs successfully on the machine, but the ssh session hangs. What's the problem?
ssh -x $username#$node 'rm -rf statuslist
mkdir statuslist
chmod u+x ~/monitor/concat.sh
chmod u+x ~/monitor/script.sh
nohup ./monitor/concat.sh &
exit;'
There are some situations when you want to execute/start some scripts on a remote machine/server (which will terminate automatically) and disconnect from the server.
e.g. a script running on a box which, when executed,
takes a model and copies it to a remote server
creates a script for running a simulation with the model and pushes it to the server
starts the script on the server and disconnects
The duty of the script thus started is to run the simulation on the server and, once completed (it will take days), copy the results back to the client.
I would use the following command:
ssh remoteserver 'nohup /path/to/script </dev/null >nohup.out 2>&1 &'
@CKeven, you may put all those commands in one script, push it to the remote server and initiate it as follows:
echo '#!/bin/bash
rm -rf statuslist
mkdir statuslist
chmod u+x ~/monitor/concat.sh
chmod u+x ~/monitor/script.sh
nohup ./monitor/concat.sh &
' > script.sh
chmod u+x script.sh
rsync -azvp script.sh remotehost:/tmp
ssh remotehost '/tmp/script.sh </dev/null >nohup.out 2>&1 &'
Hope this works ;-)
Edit:
You can also use
ssh user@host 'screen -S SessionName -d -m "/path/to/executable"'
which creates a detached screen session and runs the target command within it.
What do you think about using screen for this? You could run screen via ssh to start the command (concat.sh) and then you'd be able to return to the screen session if you wanted to monitor it (could be handy, depending on what concat does).
To be more specific, try this:
ssh -t $username@$node screen -dm -S testing ./monitor/concat.sh
You should find that the prompt returns immediately, and that concat.sh is running on the remote machine. I'll explain some of the options:
ssh -t makes a TTY. screen needs this.
screen -dm makes it start in "detached" mode. This is like "background" for your purposes.
-S testing gives your screen session a name. It is optional but recommended.
Now, once you've done this, you can go to the remote machine and run this:
screen -r testing
This will attach you to the screen session which contains your program. From there you can control it, kill it, see its output, and so on. Ctrl-A, then d will detach you from the screen session. screen -ls will list all running sessions.
It could be the standard input stream keeping the session open. Try ssh -n ... or ssh -f ....
For me, only this worked:
screen -dmS name sh my-script.sh
This, of course, depends on screen, and lets you attach later, if you ever want stdin or stdout. Screen will terminate itself when my-script.sh ends.
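If you also want to capture the output, screen's -L flag logs the session to screenlog.0 in the working directory (a variant of the same command, untested here):
screen -L -dmS name sh my-script.sh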
Below is a more involved solution that took some effort to find, and it really works for me:
#!/usr/bin/bash
theScreenSessionName="test"
theTabNumber="1"
theStuff="date; hostname; cd /usr/local; pwd; /usr/local/bin/top"
echo "this is a test"
ssh -f user@server "/usr/local/bin/screen -x $theScreenSessionName -p $theTabNumber -X stuff \"
$theStuff
\""
It sends the $theStuff list of commands to tab number $theTabNumber of the screen session $theScreenSessionName, created in advance on 'server' as 'user'.
Be aware of the trailing whitespace after
-X stuff \"
which is there to work around a glitch in the stuff option. The whitespace and the $theStuff on the next line are each followed by an 'Enter' (^M) keystroke. Don't miss them!
The "this is a test" message is echoed in the initial terminal, while the $theStuff commands are actually executed inside the mentioned screen tab.
