Bash script: execute commands after ssh

I am trying to execute a few commands via my first script but it's not working.
#!/bin/bash
#connect to server
echo "Connecting to the server..."
ssh -t root@IP '
#switch user to deploy
su - deploy
#switch path
echo "Switching the path"
cd /var/www/deploys/bin/app/config
#run deploy script
echo "Running deploy script"
/usr/local/bin/cap -S env=prod deploy
#restart apache
sudo /bin/systemctl restart httpd.service
bash -l
'
What is happening? I connect to the server successfully and the user is switched, but then I don't see anything happening. When I press Ctrl+C in the terminal, some output from the commands that should have been executed appears, along with some errors.
Why don't I see everything that is happening in the terminal after launching the script? Am I doing it the wrong way?
BTW: when I connect manually and run the commands myself, everything works nicely.
Using CentOS 7.

A clean way to log in through ssh and execute a set of commands is
ssh user@ip << EOF
#some commands
EOF
Here EOF acts as the delimiter for the command list.
The script can be modified as
ssh -t root@IP << EOF
#switch user to deploy
su - deploy
#switch path
echo "Switching the path"
cd /var/www/deploys/bin/app/config
#run deploy script
echo "Running deploy script"
/usr/local/bin/cap -S env=prod deploy
#restart apache
sudo /bin/systemctl restart httpd.service
bash -l
EOF
This will execute the commands and close the connection thereafter.
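One thing to watch out for in both versions: su - deploy starts a new shell, so the lines that follow it do not run the way you might expect (in the original script they only run after that shell exits, which is why nothing seems to happen until you interrupt it). A sketch that avoids this by handing the deploy steps to su -c, assuming the same users and paths as in the question:
ssh -t root@IP "
  # run the deploy steps as the deploy user, non-interactively
  su - deploy -c 'cd /var/www/deploys/bin/app/config && /usr/local/bin/cap -S env=prod deploy'
  # back as root: restart apache (sudo is not needed here, you are already root)
  systemctl restart httpd.service
"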

Related

Run a job on a Raspberry Pi

I have written some simple python scripts and I would like to run them on my raspberry pi even when I am not logged in. So far, I can log in to the machine via ssh and run the script without a problem. As soon as I have to disconnect the ssh session, I notice that the script breaks or stops. Thus, I was wondering if there is a way to keep the script running after the end of the ssh connection.
Here is the system I am using: a Raspberry Pi 3B+ with Ubuntu 22.04 LTS, and here is how I run my script:
ssh xx@xxx.xxx.xxx.xxx
cd myapp/
python3 runapp.py
You can use nohup to stop hangup signals affecting your process. Here are three different ways of doing it.
This is a "single shot" command, I mean you type it all in one go:
ssh SOMEHOST "cd SOMEWHERE && nohup python3 SOMESCRIPT &"
Or here, you log in, change directory and get a new prompt on the remote host, run some commands and then, at some point, exit the ssh session:
ssh SOMEHOST
cd SOMEWHERE
nohup python SOMESCRIPT &
exit
Or, this is another "single shot" where you won't get another prompt until you type EOF:
ssh SOMEHOST <<EOF
cd SOMEWHERE
nohup python SOMESCRIPT &
EOF
if there is a way to keep the script running after the end of the ssh connection.
Just run it in the background.
python3 runapp.py &
You could send the output to the system log.
python3 runapp.py | logger &
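If you use logger, giving the messages a tag (runapp here is just an example name) makes them easy to find again with journalctl:
# send stdout and stderr to the system log under a custom tag
python3 runapp.py 2>&1 | logger -t runapp &
# read the messages back later
journalctl -t runapp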
You could learn about the screen and tmux terminal multiplexers, so that you can view the logs later: start a tmux session, run the command inside it, and detach the session.
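For the tmux option, a minimal sketch (the session name myapp and the path are just examples):
# start a detached session that keeps running after you log out
tmux new-session -d -s myapp 'cd ~/myapp && python3 runapp.py'
# reattach later to look at the output
tmux attach -t myapp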
You could set up a systemd service file for the program and run it as a service.
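For the systemd option, a rough sketch of a unit, assuming the script lives in /home/pi/myapp and should run as user pi (adjust names and paths to your setup):
sudo tee /etc/systemd/system/runapp.service >/dev/null <<'EOF'
[Unit]
Description=runapp
After=network.target

[Service]
User=pi
WorkingDirectory=/home/pi/myapp
ExecStart=/usr/bin/python3 runapp.py
Restart=on-failure

[Install]
WantedBy=multi-user.target
EOF
sudo systemctl daemon-reload
sudo systemctl enable --now runapp.service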
If atd is running you can use at to schedule commands to be executed at a particular time.
Examples:
$ echo "python3 /path/to/runapp.py"|at 11:00
job 10 at Fri Jun 3 11:00:00 2022
$ echo "python3 /path/to/runapp.py" | at now
job 11 at Thu Jun 2 19:57:00 2022
# after minutes/hours/days/....
$ echo "python3 /path/to/runapp.py" | at now +5 minutes
$ echo "python3 /path/to/runapp.py" | at now +2 hours
$ ssh user@host "echo 'python3 /path/to/runapp.py' | at now"
Jobs created with at are executed only once.
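To inspect or cancel queued jobs, use atq and atrm with the job number that was printed when the job was created:
$ atq       # list queued jobs
$ atrm 10   # remove job number 10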

Why is my command not working with nohup and SSH?

I'm restarting the server from my local machine using the following command:
ssh -l root -p 22 $SERVER_HOST "cd $SERVER_DIR && nohup bin/restart &"
It's not working and prints nothing, so I don't know what the problem is. But if I remove nohup and & it works. Why, and how do I make it work (and keep running in the background after the ssh session ends)?
The version without nohup works, but it blocks the shell (it also prints output from the bin/restart script, unlike the version with nohup). I can't use it, though, as I need the server to keep working in the background.
ssh -l root -p $SERVER_PORT $SERVER_HOST "cd $SERVER_DIR && bin/restart"
If it matters, here is the content of the bin/restart script (it restarts a Ruby on Rails app):
. /root/.asdf/asdf.sh
killall -r ruby
RAILS_ENV=production bundle exec rails server
What worked for me is:
ssh -tt -l root -p 22 $SERVER_HOST "cd $SERVER_DIR && nohup bin/restart & sleep 1"
Tested with OpenSSH 7.2p2 (client & server), bash 4.3, Linux 4.15.
-tt forces ssh to allocate a terminal, which is apparently important for nohup to work.
sleep 1 is not special; you just need something that forces the shell to context-switch. It could also be /bin/true, but that isn't as sure-fire.
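Another approach that often works without forcing a terminal is to redirect the command's output and input so that ssh has nothing left to wait for; it can then return while bin/restart keeps running. A sketch using the same variables as above (untested on this particular setup; /tmp/restart.log is just an example path):
ssh -l root -p 22 $SERVER_HOST "cd $SERVER_DIR && nohup bin/restart >/tmp/restart.log 2>&1 </dev/null &"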

Running part of a bash script on a remote machine

I need to run some commands locally and then some commands on a remote machine, all from a single local bash script.
For simplicity just say I want to do the following and execute it on my local desktop machine.
#!/bin/bash
#upload some files to a remote machine
cd /tmp
./upload-files.sh
#run commands on remote machine
ssh myuser:mypass@somewhere.com
cd /tmp/uploads <--- these commands don't run in the ssh connection
./process-uploads.sh
exit
#run command locally again.
cd -
echo 'complete!'
Any ideas of how to do this?
You can use here-doc with ssh command:
#!/bin/bash
#upload some files to a remote machine
cd /tmp
./upload-files.sh
#run commands on remote machine
ssh -t -t myuser:mypass@somewhere.com <<EOF
cd /tmp/uploads
./process-uploads.sh
exit
EOF
#run command locally again.
cd -
echo 'complete!'
If you want to run only one command:
ssh myuser:mypass@somewhere.com 'cd /tmp/uploads; ./process-uploads.sh'
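If the remote part grows beyond a couple of commands, another common pattern is to keep it in its own local file and feed that file to a shell on the remote machine (remote-part.sh is a hypothetical file name):
# run the locally stored remote-part.sh on the remote machine
ssh myuser@somewhere.com 'bash -s' < remote-part.sh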

Executing a script running ssh commands in the background

I'm trying to execute this script on a remote server with requiretty enabled in the sudoers file.
#!/bin/bash
value=$(ssh -tt localhost sudo bash -c hostname)
echo $value
If I run the script using $ ./sample.sh & it stays stopped in the background. Only by using fg can I force the script to run. I think the problem is the missing tty for the output, but what can I do?
... what can I do?
You can run stty -tostop, so that background jobs are no longer stopped when they write to the terminal.
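A minimal sketch of how that looks in the calling shell:
stty -tostop   # allow background jobs to write to the terminal without being stopped
./sample.sh &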

Shell script - exiting before completion

I have made a short shell script which launches a VM, sleeps for some time to allow the VM to boot, and then mounts a share from the VM on the host computer:
#!/bin/bash
nohup VBoxManage startvm "Ubuntu server" --type headless &&
sleep 60 &&
sudo mount -t cifs //192.168.1.1/www /media/ubuntuserver/
The VM starts properly and the script sleeps, but no mount occurs and the script seems to just exit instead. What am I doing wrong?
Is your sudo mount working in non-interactive mode? Make sure this command is not asking for a password.
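A quick way to check is sudo's -n (non-interactive) flag, which fails instead of prompting when a password would be required; a small sketch:
sudo -n true && echo "sudo works without a password" || echo "sudo would ask for a password"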
Add some logging so that you know what output is being returned:
#!/bin/bash
nohup VBoxManage startvm "Ubuntu server" --type headless >> ~/script_log.txt 2>&1 &&
sleep 60 >> ~/script_log.txt 2>&1 &&
sudo mount -t cifs //192.168.1.1/www /media/ubuntuserver/ >> ~/script_log.txt 2>&1
Replace ~/script_log.txt with any suitable log file path.
