Shell script on remote server terminates ssh session - bash

I have a shell script, written by a vendor, that does a lot of low-level work under the hood which I have no domain-specific knowledge of. The vendor provided a manual describing how to execute this script manually on the CLI, and it works as expected when run that way.
Now I am writing a script to automate this process, but the ssh session in my script terminates abruptly once the vendor script completes: the remote commands that follow it within the ssh session never execute, and only the local commands outside the ssh session continue.
echo "LOCAL: start"
sshpass -p ${PSSWD} timeout 45 ssh -n -q -oConnectTimeout=10 -oStrictHostKeyChecking=no -oUserKnownHostsFile=/dev/null user@$ip '(
function executeFlash(){
echo "remote: executeFlash: start"
flash.sh
echo "remote: executeFlash: end"
}
echo "REMOTE: start"
cp flashrom /usr/sbin/
cp libftdi1.so.2 /usr/lib64/
executeFlash
echo "REMOTE: end"
)'
echo "LOCAL: end"
The output is:
LOCAL: start
REMOTE: start
remote: executeFlash: start
some logs showing the successful execution of flash.sh
--- missing remote command for "remote: executeFlash: end" and "REMOTE: end"
LOCAL: end
As shown above, the session appears to be terminated and the rest of the remote commands are never executed ("remote: executeFlash: end" and "REMOTE: end").
I have tried calling the script as a subprocess, e.g. flash.sh & or ./flash.sh, but with no luck. During manual execution on the CLI I do not lose the ssh session, which is the expected behavior.

After removing "timeout 45" from the ssh command, the issue was resolved.
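If an overall time limit is still wanted, one option is to move the timeout onto the long-running remote command rather than wrapping the whole ssh session, so ssh itself is never killed mid-session. A minimal sketch, assuming timeout is available on the remote host and that the 120-second limit is just a placeholder:
sshpass -p ${PSSWD} ssh -n -q -oConnectTimeout=10 -oStrictHostKeyChecking=no -oUserKnownHostsFile=/dev/null user@$ip '(
echo "REMOTE: start"
cp flashrom /usr/sbin/
cp libftdi1.so.2 /usr/lib64/
timeout 120 flash.sh   # the timeout applies to flash.sh only, not to the ssh session
echo "REMOTE: end"
)'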

Related

Run remote process asynchronously on bash and get its remote pid, stdout+stderr in a file and exit code

I want to run a remote process asynchronously and get its remote pid, its output (stdout + stderr) saved into a file or a variable (I need it for further processing), and its exit code.
The remote pid is needed while the remote process is running, not after it's done.
Also, multiple processes with the same name run on the remote machine, so any solution that relies on the process name won't work for me.
What I've got so far:
export SSH="ssh -o ServerAliveInterval=100 $user@$remote_ip"
and "my_test" is the binary I want to run.
To get the remote pid and output I tried:
$SSH "./my_test > my_test_output & echo \$! > pid_file"
remote_pid=$($SSH "cat pid_file")
# run some remote application which needs the remote pid (send signals to my_test)
$SSH "./some_tester $remote_pid"
# now wait for my_test to end and get its exit code
$SSH "wait $remote_pid; echo $?"
bash: wait: pid 71033 is not a child of this shell
The $SSH command returns after echoing the remote pid to pid_file, since there are no file descriptors connected to this ssh socket (https://unix.stackexchange.com/a/30433/316062).
Is there a way to somehow get my_test exit code?
OK, my solution is:
# the part which runs my_test on the remote machine and records its pid and exit code
$SSH << 'EOF' &
./my_test > my_test_output &
remote_pid=$!
echo $remote_pid > pid_file
wait $remote_pid
echo $? > exit_code
EOF
local_pid=$!
# since we run the previous ssh command asynchronously, we need to make sure
# pid_file has already been created before we try to read it
sleep 2
remote_pid=$($SSH "cat pid_file")
# now we can run remote task which needs the remote test pid
$SSH "./some_tester $remote_pid"
wait $local_pid
echo "my_test is done!"
exit_code=$($SSH "cat exit_code")
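A side note on the fixed sleep 2 above: if the timing ever becomes a problem, the read of pid_file can be made less fragile by polling until the file is non-empty. A small sketch, assuming the same $SSH variable and file names as in the solution:
# wait up to ~10 seconds for pid_file to appear on the remote machine
for i in $(seq 1 10); do
    $SSH "test -s pid_file" && break
    sleep 1
done
remote_pid=$($SSH "cat pid_file")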

expect: launching scp after sftp

I could really use some help. I'm still pretty new with expect. I need to launch an scp command directly after I run sftp.
I got the first portion of this script working; my main concern is the bottom portion. I really need to launch a command after this one completes. I'd rather be able to spawn another command than hack something up like piping this into a sleep command and running it after 10 seconds or something weird.
Any suggestions are greatly appreciated!
spawn sftp user@host
expect "password: "
send "123\r"
expect "$ "
sleep 2
send "cd mydir\r"
expect "$ "
sleep 2
send "get somefile\r"
expect "$ "
sleep 2
send "bye\r"
expect "$ "
sleep 2
spawn scp somefile user2@host2:/home/user2/
sleep 2
So I figured out I can actually get this to launch the subprocess if I use "exec" instead of spawn. In other words:
exec scp somefile user2@host2:/home/user2/
The only problem? It prompts me for a password! This shouldn't happen; I already have the ssh keys installed on both systems. (In other words, if I run the scp command from the host I'm running this expect script on, it runs without prompting me for a password.) The system I'm trying to scp to must be treating this newly spawned process as a new host, because it's not picking up my ssh key. Any ideas?
BTW, I apologize that I haven't actually posted a "working" script; I can't really do that without compromising the security of this server. I hope that doesn't detract from anyone's ability to assist me.
I think the problem lies with me not terminating the initially spawned process. I don't understand expect well enough to do it properly. If I try "close" or "eof", it simply kills the entire script, which I don't want to do just yet (because I still need to scp the file to the second host).
Ensure that your SSH private key is loaded into an agent, and that the environment variables pointing to that agent are active in the session where you're calling scp.
[[ $SSH_AUTH_SOCK ]] || {            # if no agent already running...
    eval "$(ssh-agent -s)"           # ...then start one...
    ssh-add /path/to/your/ssh/key    # ...load your key...
    started_ssh_agent=1              # and flag that we started it ourselves
}

# ...put your script here...

[[ $started_ssh_agent ]] && {        # if we started the agent ourselves...
    eval "$(ssh-agent -s -k)"        # ...then clean up nicely when done.
}
As an aside, I'd strongly suggest replacing the code given in the question with something like the following:
lftp -u user,123 -e 'get /mydir/somefile -o localfile' sftp://host </dev/null
lftp -e 'put localfile -o /home/user2/somefile' sftp://user2@host2 </dev/null
Each connection handled in one line, and no silliness messing around with expect.

bash script run job after screen command in one script

This is what I want to do:
#!/bin/bash
# start the tunnel
ssh tunnel@hostA -L 6000:hostB:22 -N
# this is the problem: I need to move on to the next step once the tunnel is up
# main process that I must run through the tunnel
sftp -oPort=6000 user@localhost:/home/you <<GETME
lcd /home/me/temp
get *.tar
bye
GETME
echo " Job Done"
# I hope this can be added:
# kill the tunnel after sftp is done
Right now I use two PuTTY sessions to run sftp to hostB; I think this can probably be done in one single script.
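For what it's worth, a minimal sketch of the flow described above: the tunnel is backgrounded with -f and given a control socket so it can be shut down cleanly after the transfer. The hosts, port, and paths are taken from the question; the control socket path /tmp/tunnel.sock is an assumption:
#!/bin/bash
# start the tunnel in the background, with a control socket for later shutdown
ssh -f -N -M -S /tmp/tunnel.sock -L 6000:hostB:22 tunnel@hostA
# main job that must run through the tunnel
sftp -oPort=6000 user@localhost:/home/you <<GETME
lcd /home/me/temp
get *.tar
bye
GETME
echo " Job Done"
# kill the tunnel now that sftp is done
ssh -S /tmp/tunnel.sock -O exit tunnel@hostA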

Bash Script Quits After Exiting SSH

I'm trying to write a Bash script that logs into two different Linux-based power strips (Ubiquiti Mpower Pros) and turns two different lights off (one on each strip). To do this I log into the first strip, change the appropriate file to 0 (thus turning off the light), and exit, repeating the same process on the next power strip. However, after I exit the first SSH connection, the script stops working. Could someone please suggest a fix? My only idea would be to wrap this script in a Python program. Here's my code:
#!/bin/bash
ssh User@192.168.0.100
echo "0" > /proc/power/relay1
exit
# hits the enter key
cat <(echo "") | <command>
ssh User@192.168.0.103
echo "logged in"
echo "0" > /proc/power/relay1
exit
cat <(echo "") | <command>
ssh, as an app, BLOCKS while it's running; the echo and exit are executed by the local shell, not by the remote machine. So you are doing:
ssh to remote machine
exit remote shell
echo locally
exit locally
and boom, your script is dead. If that echo/exit is supposed to be run on the remote system, then you should be doing:
ssh user#host command
^^^^^---executed on the remote machine
e.g.
ssh foo#bar 'echo ... ; exit'
The commands you're apparently trying to run through ssh are actually being executed locally. You can just pass the command you want to run to ssh and it will execute it (without needing an explicit exit):
ssh User@192.168.0.110 'echo "0" > /proc/power/relay1'
will do that, and similar for the other ssh command
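Putting the two calls together, a minimal sketch of the whole script, assuming the two addresses from the question and the same relay path on both strips:
#!/bin/bash
# turn off the light on the first power strip
ssh User@192.168.0.100 'echo "0" > /proc/power/relay1'
# turn off the light on the second power strip
ssh User@192.168.0.103 'echo "0" > /proc/power/relay1'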

Exit from an SSH session but not script

In my bash script, I do:
ssh me9@some_mad_server.com;
cd ~/apple;
echo "Before Exit"
exit
echo "After Exit"
I never see Before Exit or After Exit. I can understand why I may not see Before Exit, as at that stage my script is in another console. But I am confused about whether the exit means my script ends, and hence why After Exit never gets logged.
Any help appreciated.
To execute a series of commands on a remote host, you need to pass them to ssh on the command line, not execute them after the ssh call. Like this:
ssh me9@some_mad_server.com '
cd ~/apple
echo "Before Exit"
'
echo "After Exit"
This uses a multiline string to pass multiple commands. An exit is implicit when the end of the string is reached.
Importantly, the commands in the quoted string are executed on the remote host, while the final echo is executed on the local server. I've indented the remote commands for clarity.
You can use screen for this.
screen -d -m <command>
Use screen -r to attach to that screen again
screen -r <screen ID>
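If you go the screen route, a named session makes reattaching easier; a small sketch, where the session name and the command are placeholders:
# start a detached, named screen session running some long task
screen -d -m -S mysession ./long_task.sh
# later, reattach to that session by name
screen -r mysession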
