Exit from an SSH session but not script - bash

In my bash script, I do:
ssh me9@some_mad_server.com;
cd ~/apple;
echo "Before Exit"
exit
echo "After Exit"
I never see Before Exit or After Exit. I can understand why I may not see Before Exit, as my script at that stage is in another console. But I am confused whether the exit means my script ends, and hence why After Exit never gets logged.
Any help appreciated.

To execute a series of commands on a remote host, you need to pass them to ssh on the command line, not execute them after the ssh call. Like this:
ssh me9@some_mad_server.com '
cd ~/apple
echo "Before Exit"
'
echo "After Exit"
This uses a multiline string to pass multiple commands. An exit is implicit when the end of the string is reached.
Importantly, the commands in the quoted string are executed on the remote host, while the final echo is executed on the local machine. I've indented the remote commands for clarity.
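If you prefer to avoid quoting, the same commands can be fed to the remote shell through a quoted heredoc (a minimal sketch of an equivalent form):
ssh me9@some_mad_server.com bash -s <<'EOF'
cd ~/apple
echo "Before Exit"
EOF
echo "After Exit"
The quoted EOF delimiter keeps the local shell from expanding anything inside the block before it is sent to the remote side.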

You can use screen for this.
screen -d -m <command>
Use screen -r to reattach to that session later:
screen -r <screen ID>
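For example (a sketch; long_task.sh stands in for whatever command you want to keep running):
screen -d -m -S mytask ./long_task.sh   # start a detached session named "mytask"
screen -ls                              # list sessions and their IDs
screen -r mytask                        # reattach by name (or by ID)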

Related

Shell script on remote server terminates ssh session

I have a shell script written by a vendor that does a lot of low-level stuff under the hood, which I have no domain-specific knowledge of. The vendor gave me a manual describing how to execute this script manually on the CLI, and it works as expected when run that way.
Now I have written a script to automate this process, but the ssh session in my script terminates abruptly when the vendor script completes, and the remote commands after it within the ssh session never execute. Only the local commands outside the ssh session continue.
echo "LOCAL: start"
sshpass -p ${PSSWD} timeout 45 ssh -n -q -oConnectTimeout=10 -oStrictHostKeyChecking=no -oUserKnownHostsFile=/dev/null user@$ip '(
function executeFlash(){
echo "remote: executeFlash: start"
flash.sh
echo "remote: executeFlash: end"
}
echo "REMOTE: start"
cp flashrom /usr/sbin/
cp libftdi1.so.2 /usr/lib64/
executeFlash
echo "REMOTE: end"
)'
echo "LOCAL: end"
output is
LOCAL: start
REMOTE: start
remote: executeFlash: start
some logs showing the successful execution of flash.sh
--- missing remote command for "remote: executeFlash: end" and "REMOTE: end"
LOCAL: end
As shown above, the session seems to be terminated and the rest of the remote commands are never executed ("remote: executeFlash: end" and "REMOTE: end" are missing).
I have tried calling the script as a subprocess, like flash.sh & or ./flash.sh, but no luck. During manual execution on the CLI, I do not lose the ssh session, which is the expected behavior.
After removing "timeout 45" from the ssh command, the issue was resolved.
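That makes sense: the timeout command sends SIGTERM to the ssh client once the limit expires, tearing down the remote session mid-run, and flash.sh evidently takes longer than 45 seconds. If you still want a safety net rather than removing the limit entirely, a longer duration should work (a sketch, with 300 seconds as an assumed upper bound):
sshpass -p ${PSSWD} timeout 300 ssh -n -q -oConnectTimeout=10 -oStrictHostKeyChecking=no -oUserKnownHostsFile=/dev/null user@$ip '(
    # ...same remote commands as above...
)'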

Logging into server (ssh) with bash script

I want to log into a server based on the user's choice, so I wrote a bash script. I am a total newbie; this is my first bash script:
#!/bin/bash
echo -e "Where to log?\n 1. Server A\n 2. Server B"
read to_log
if [ $to_log -eq 1 ] ; then
echo `ssh user@ip -p 33`
fi
After executing this script I am able to enter a password, but after that nothing happens.
If someone could help me solve this problem, I would be grateful.
Thank you.
The problem with this script is the contents of the if statement. Replace:
echo `ssh user@ip -p 33`
with
ssh user@ip -p 33
and you should be good. Here is why:
Firstly, the use of back ticks is called "command substitution". Back ticks have been deprecated in favor of $().
Command substitution tells the shell to create a sub-shell, execute the enclosed command, and capture the output for assignment/use elsewhere in the script. For example:
name=$(whoami)
will run the command whoami, and assign the output to the variable name.
The enclosed command has to run to completion before the assignment can take place, and during that time the shell is capturing the output, so nothing is displayed on the screen.
In your script, the echo command will not display anything until the ssh command has completed (i.e. the sub-shell has exited), which never happens: ssh's output, including the remote prompt, is being captured rather than shown, so the user has no idea the session is actually open.
You have no need to capture the output of the ssh command, so there is no need to use command substitution. Just run the command as you would any other command in the script.
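Putting it together, a corrected version of the script might look like this (a sketch; the Server B branch and the host names are assumptions based on the menu in the question):
#!/bin/bash
echo -e "Where to log?\n 1. Server A\n 2. Server B"
read to_log
if [ "$to_log" -eq 1 ] ; then
    ssh user@ip -p 33          # runs interactively; output goes straight to the terminal
elif [ "$to_log" -eq 2 ] ; then
    ssh user@other_ip -p 33    # hypothetical second host for Server B
fi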

script file : command stopped

I made a simple script; the file name is sutest:
#!/bin/bash
cd ~/Downloads/redis-4.0.1/src
./redis-server
echo "uid is ${UID}"
echo "user is ${USER}"
echo "username is ${USERNAME}"
I ran the script: $ . sutest
But the script stops at ./redis-server, so I can't see the echo messages.
I want to be able to write this kind of script file. How can I do that?
I would appreciate your help.
Let me describe a more general case.
A myscript1 file starts a process like redis-server above.
Another myscript2 file starts a process like redis-server above.
Another myscript3 file starts a process like redis-server above.
How can I run all three script files simultaneously?
I want to do this job over an ssh connection.
To make matters worse, what if I can't use screen or tmux?
Add an & at the end of the line:
./redis-server &
The & runs the job in the background, so the script continues.
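If the script also needs to stop the server later, $! holds the PID of the most recent background job (a minimal sketch):
./redis-server &        # start the server in the background
redis_pid=$!            # remember its process ID
echo "uid is ${UID}"
# ... rest of the script ...
kill "$redis_pid"       # stop the server when finished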
Just do the echos first:
cd ~/Downloads/redis-4.0.1/src
echo "uid is ${UID}"
echo "user is ${USER}"
echo "username is ${USERNAME}"
exec ./redis-server
The use of exec is a small trick (which you can omit if you prefer): it replaces the shell script with redis-server, so the shell script is no longer running at all. Without exec, you end up with the shell script waiting around for redis-server to finish, which is unnecessary if the script will do nothing further.
If you don't like that for some reason, you can keep the original order:
cd ~/Downloads/redis-4.0.1/src
./redis-server & # run in background
echo "uid is ${UID}"
echo "user is ${USER}"
echo "username is ${USERNAME}"
wait # optional
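As for the more general question of running several such scripts at once over ssh without screen or tmux: nohup plus & is the usual fallback (a sketch; the script names are from the question, and the log file names are assumptions):
nohup ./myscript1 > script1.log 2>&1 &
nohup ./myscript2 > script2.log 2>&1 &
nohup ./myscript3 > script3.log 2>&1 &
# nohup keeps the jobs running after the ssh session ends;
# their output goes to the log files instead of the vanished terminal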

expect: launching scp after sftp

I could really use some help. I'm still pretty new to expect. I need to launch an scp command directly after I run sftp.
I got the first portion of this script working; my main concern is the bottom portion. I really need to launch a command after this one completes. I'd rather spawn another command than hack something up, like piping this through a sleep command and running it after 10 s or something weird.
Any suggestions are greatly appreciated!
spawn sftp user@host
expect "password: "
send "123\r"
expect "$ "
sleep 2
send "cd mydir\r"
expect "$ "
sleep 2
send "get somefile\r"
expect "$ "
sleep 2
send "bye\r"
expect "$ "
sleep 2
spawn scp somefile user2@host2:/home/user2/
sleep 2
So I figured out I can actually get this to launch the subprocess if I use exec instead of spawn; in other words:
exec scp somefile user2@host2:/home/user2/
The only problem? It prompts me for a password! This shouldn't happen; I already have the ssh keys installed on both systems. (In other words, if I run the scp command from the host I'm running this expect script on, it runs without prompting for a password.) The system I'm trying to scp to must be treating this newly spawned process as a new host, because it's not picking up my ssh key. Any ideas?
BTW, I apologize that I haven't actually posted a "working" script; I can't really do that without compromising the security of this server. I hope that doesn't detract from anyone's ability to assist me.
I think the problem lies with me not terminating the initially spawned process. I don't understand expect well enough to do it properly. If I try "close" or "eof", it simply kills the entire script, which I don't want to do just yet (because I still need to scp the file to the second host).
Ensure that your SSH private key is loaded into an agent, and that the environment variables pointing to that agent are active in the session where you're calling scp.
[[ $SSH_AUTH_SOCK ]] || { # if no agent already running...
eval "$(ssh-agent -s)" # ...then start one...
ssh-add /path/to/your/ssh/key # ...load your key...
started_ssh_agent=1 # and flag that we started it ourselves
}
# ...put your script here...
[[ $started_ssh_agent ]] && { # if we started the agent ourselves...
eval "$(ssh-agent -s -k)" # ...then clean up nicely when done.
}
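For example, if the expect script from the question were saved as transfer.exp (a hypothetical filename), the "put your script here" placeholder would simply become:
expect transfer.exp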
As an aside, I'd strongly suggest replacing the code given in the question with something like the following:
lftp -u user,123 -e 'get /mydir/somefile -o localfile' sftp://host </dev/null
lftp scp://user2@host2 -e 'put localfile -o /home/user2/somefile' </dev/null
Each connection handled in one line, and no silliness messing around with expect.

Bash Script Quits After Exiting SSH

I'm trying to write a Bash script that logs into two different Linux-based power strips (Ubiquiti Mpower Pros) and turns two different lights off (one on each strip). To do this I log in to the first strip, change the appropriate file to 0 (thus turning off the light), and exit, repeating the same process on the next power strip. However, after I exit the first SSH connection, the script stops working. Could someone please suggest a fix? My only idea would be to wrap this script in a Python program. Here's my code:
#!/bin/bash
ssh User@192.168.0.100
echo "0" > /proc/power/relay1
exit
# hits the enter key
cat <(echo "") | <command>
ssh User@192.168.0.103
echo "logged in"
echo "0" > /proc/power/relay1
exit
cat <(echo "") | <command>
ssh as an app BLOCKS while it's running; the echo and exit are executed by the local shell, not by the remote machine. So you are doing:
ssh to the remote machine
exit the remote shell
echo locally
exit locally
and boom, your script is dead. If that echo/exit is supposed to run on the remote system, then you should be doing:
ssh user@host command
              ^^^^^^^---executed on the remote machine
e.g.
ssh foo@bar 'echo ... ; exit'
The commands you're apparently trying to run through ssh are actually being executed locally. You can just pass the command you want to run to ssh and it will execute it on the remote host (without needing an explicit exit):
ssh User@192.168.0.100 'echo "0" > /proc/power/relay1'
will do that, and similarly for the other ssh command.
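Putting it together, the whole script reduces to two ssh invocations (a sketch using the hosts from the question):
#!/bin/bash
ssh User@192.168.0.100 'echo "0" > /proc/power/relay1'   # first power strip
ssh User@192.168.0.103 'echo "0" > /proc/power/relay1'   # second power strip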
