Execute a read command - bash

I have a little script that I use to send bash commands to several web servers under a load balancer. I'm able to send the command successfully, but I also want to execute it locally.
#!/bin/bash
echo "Type commands to be sent to web servers 1-8. Use ctrl+c to exit."
function getCommand() {
    read thisCmd
    echo "Sending '$thisCmd'..."
    if [ ! -z "$thisCmd" ]; then
        # Run command locally
        echo "From web1"
        cd ~
        command $thisCmd
        # Send to remotes
        for i in {2..8}
        do
            echo "From web$i..."
            ssh "web$i" "$thisCmd"
        done
    fi
    echo Done
    getCommand
}
getCommand
But this results in:
user@web1:~$ ./sshAll.sh
Type commands to be sent to web servers 1-8. Use ctrl+c to exit.
cd html; pwd
Sending 'cd html; pwd'...
From web1
./sshAll.sh: line 11: cd: html;: No such file or directory
From web2...
/home/user/html
How do I get this working?

When you expand a variable as a command like this:
$thisCmd
or like this:
command $thisCmd
Bash performs word splitting but does not interpret metacharacters, so ; and the like are treated as part of an argument (here, html;) rather than as a command separator.
So one basic solution to that is to use eval:
eval "$thisCmd"
But eval is a little dangerous. Then again, it is no more dangerous than what you already send to the remote servers: ssh hands the string to the remote shell, which executes it exactly the way eval does.
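For illustration, here is a minimal sketch of the loop with eval in place of command. Note two changes of my own, not from the original: the recursion is swapped for a while loop so the call stack doesn't grow, and ssh gets -n so it can't consume the loop's input:
#!/bin/bash
echo "Type commands to be sent to web servers 1-8. Use ctrl+c to exit."
while true; do
    read -r thisCmd
    echo "Sending '$thisCmd'..."
    if [ -n "$thisCmd" ]; then
        echo "From web1"
        cd ~
        eval "$thisCmd"                # local shell now parses ; | && etc.
        for i in {2..8}; do
            echo "From web$i..."
            ssh -n "web$i" "$thisCmd"  # -n: don't let ssh read the loop's stdin
        done
    fi
    echo Done
done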

Related

Capture Shell Script Output and Do a String Match to execute the next command?

I have a shell script that runs the following two commands, connecting to a remote server and putting files via SFTP; let's call it "execute.sh":
sftp -b /usr/local/outbox/send.sh username@example.com
mv /usr/local/outbox/DD* /usr/local/outbox/completed/
Then in my "send.sh" I have the following commands to be executed on the remote server:
cd ExampleFolder/outbox
put Files_*
bye
Now my problem is: if the first command, "sftp -b", fails due to a remote connection error or some network problem, the files are still moved into the "completed" folder, which is incorrect. I want the next command, "mv", to run only if the "sftp" command connects successfully.
Can we do this by enhancing this shell script, or with some workaround?
My shell is Bash.
Simply insert && between the two commands:
sftp -b /usr/local/outbox/send.sh username@example.com && \
    mv /usr/local/outbox/DD* /usr/local/outbox/completed/
If the first fails, the second one will not run.
Alternatively, you can check the exit code of the first command explicitly. The exit code of the last command is always saved in $?, and it is 0 if the command succeeded:
sftp -b /usr/local/outbox/send.sh username@example.com
if [ $? -eq 0 ]
then
    mv /usr/local/outbox/DD* /usr/local/outbox/completed/
fi
If you really wanted to capture the output of the first command, you could run it in $(...) and store the value in a variable:
sftpOutput="$(sftp -b /usr/local/outbox/send.sh username@example.com)"
and then use this variable in further checks, e.g. match it against a pattern in the next if.
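For example, a minimal sketch (the Fetching/Uploading progress strings are an assumption; the exact text sftp prints depends on your version and batch file):
sftpOutput="$(sftp -b /usr/local/outbox/send.sh username@example.com)"
case "$sftpOutput" in
    *Fetching*|*Uploading*)   # transfer progress lines were seen in the output
        mv /usr/local/outbox/DD* /usr/local/outbox/completed/ ;;
    *)
        echo "No transfer confirmed; leaving files in the outbox" >&2 ;;
esac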

script file: command stopped

I made a simple script. The file name is sutest.
#!/bin/bash
cd ~/Downloads/redis-4.0.1/src
./redis-server
echo "uid is ${UID}"
echo "user is ${USER}"
echo "username is ${USERNAME}"
I ran the script: $ . sutest
But the script stops at ./redis-server, so I can't see the echo messages.
I want to write this kind of script file. How can I do that? I would appreciate your help.
Let's take a more general case: myscript1 starts a process like redis-server above, myscript2 starts another such process, and myscript3 yet another.
How can I run the three script files simultaneously? I want to do this over an ssh connection, and, to make matters worse, what if I can't use screen or tmux?
Add an '&' at the end of the line:
./redis-server &
The '&' runs the job in the background, so the script continues.
Just do the echos first:
cd ~/Downloads/redis-4.0.1/src
echo "uid is ${UID}"
echo "user is ${USER}"
echo "username is ${USERNAME}"
exec ./redis-server
The use of exec is a small trick (which you can omit if you prefer): it replaces the shell script with redis-server, so the shell script is no longer running at all. Without exec, you end up with the shell script waiting around for redis-server to finish, which is unnecessary if the script will do nothing further.
If you don't like that for some reason, you can keep the original order:
cd ~/Downloads/redis-4.0.1/src
./redis-server & # run in background
echo "uid is ${UID}"
echo "user is ${USER}"
echo "username is ${USERNAME}"
wait # optional
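The follow-up asks how to run three such scripts at once over ssh without screen or tmux; here is a minimal sketch using nohup (myscript1-3 are the hypothetical names from the question):
nohup ./myscript1 > myscript1.log 2>&1 &
nohup ./myscript2 > myscript2.log 2>&1 &
nohup ./myscript3 > myscript3.log 2>&1 &
# nohup detaches each job from the terminal, so all three keep running
# even after the ssh session ends; their output goes to the log files.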

expect: launching scp after sftp

I could really use some help. I'm still pretty new to expect. I need to launch a scp command directly after I run sftp.
I got the first portion of this script working; my main concern is the bottom portion. I really need to launch a command after this one completes. I'd rather spawn another command than hack something up, like piping this to a sleep command and running it after 10 s or something weird.
Any suggestions are greatly appreciated!
spawn sftp user@host
expect "password: "
send "123\r"
expect "$ "
sleep 2
send "cd mydir\r"
expect "$ "
sleep 2
send "get somefile\r"
expect "$ "
sleep 2
send "bye\r"
expect "$ "
sleep 2
spawn scp somefile user2@host2:/home/user2/
sleep 2
So I figured out I can actually get this to launch the subprocess if I use "exec" instead of spawn; in other words:
exec scp somefile user2@host2:/home/user2/
The only problem? It prompts me for a password! This shouldn't happen; I already have the ssh keys installed on both systems. (In other words, if I run the scp command directly from the host I'm running this expect script on, it runs without prompting me for a password.) The system I'm trying to scp to must be treating this newly spawned process as a new host, because it's not picking up my ssh key. Any ideas?
BTW, I apologize that I haven't posted a "working" script; I can't really do that without compromising the security of this server. I hope that doesn't detract from anyone's ability to assist me.
I think the problem lies with me not terminating the initially spawned process. I don't understand expect enough to do it properly. If I try "close" or "eof", it simply kills the entire script, which I don't want to do just yet (because I still need to scp the file to the second host).
Ensure that your SSH private key is loaded into an agent, and that the environment variables pointing to that agent are active in the session where you're calling scp.
[[ $SSH_AUTH_SOCK ]] || { # if no agent already running...
eval "$(ssh-agent -s)" # ...then start one...
ssh-add /path/to/your/ssh/key # ...load your key...
started_ssh_agent=1 # and flag that we started it ourselves
}
# ...put your script here...
[[ $started_ssh_agent ]] && { # if we started the agent ourselves...
eval "$(ssh-agent -s -k)" # ...then clean up nicely when done.
}
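You can verify the key is actually loaded before running the script:
ssh-add -l   # lists fingerprints of the keys the agent currently holds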
As an aside, I'd strongly suggest replacing the code given in the question with something like the following:
lftp -u user,123 -e 'get /mydir/somefile -o localfile' sftp://host </dev/null
lftp scp://user2@host2 -e 'put localfile -o /home/user2/somefile' </dev/null
Each connection is handled in one line, with no messing around with expect.

Exit from an SSH session but not script

In my bash script, I do:
ssh me9@some_mad_server.com;
cd ~/apple;
echo "Before Exit"
exit
echo "After Exit"
I never see Before Exit or After Exit. I can understand why I may not see Before Exit, since at that stage my script is in another console. But I am confused about whether the exit means my script ends, and hence why After Exit never gets logged.
Any help appreciated.
To execute a series of commands on a remote host, you need to pass them to ssh on the command line, not execute them after the ssh call. Like this:
ssh me9@some_mad_server.com '
cd ~/apple
echo "Before Exit"
'
echo "After Exit"
This uses a multiline string to pass multiple commands. An exit is implicit when the end of the string is reached.
Importantly, the commands in the quoted string are executed on the remote host, while the final echo is executed on the local server. I've indented the remote commands for clarity.
You can use screen for this.
screen -d -m <command>
Use screen -r to reattach to that screen later:
screen -r <screen ID>
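For example (long_running_job.sh and the session ID are hypothetical placeholders):
screen -d -m ./long_running_job.sh   # start the job in a detached screen
screen -ls                           # list sessions and their IDs
screen -r 12345                      # reattach, using the ID screen -ls printed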

How can I get the return code for a command which ran via ssh and not the ssh command itself?

I am trying to write a bash script in which one of the first steps is to check for the presence of a file in a remote host and whether it is executable or not. This sounds like a job for the test command, but I'd have to prefix it with ssh. Something like this:
$(ssh user@box 'test -x /path/thefile >/dev/null 2>&1; $?')
Except the above doesn't seem to work to tell me whether the remote file exists and is executable. How would I do this?
When you use the $() syntax, you are getting the output of the command, and not the result code. You can just do:
if ssh user@box 'test -x /path/thefile'; then
    echo "It exists"
fi
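If you want the numeric code itself rather than just a success test, read $? immediately after the ssh call; a minimal sketch:
ssh user@box 'test -x /path/thefile'
rc=$?   # ssh exits with the remote command's status (255 if ssh itself failed)
if [ "$rc" -eq 0 ]; then
    echo "It exists and is executable"
fi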
