Bash: Variable from file output

I use a script on my personal server to set up a few things, and I'm a bit lazy.
So instead of running every command by hand, I use this script:
#!/bin/bash
sshservers=(IP IP IP IP)
for sshserver in "${sshservers[@]}"
do
ssh root@$sshserver 'Foo Bar'
done
echo "done"
exit
So it takes the servers from the sshservers= variable.
But I want it to take the IPs from a file called "servers.txt".
However, when I use
sshservers=$(cat servers.txt)
it doesn't work; I have to insert the IPs directly in the script.
How can I fill the variable from a text file?
Thanks in advance!
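A minimal sketch of one way to do that, assuming servers.txt holds one IP per line (mapfile, also called readarray, needs bash 4+):
#!/bin/bash
# Read one IP per line from servers.txt into a proper array.
mapfile -t sshservers < servers.txt
for sshserver in "${sshservers[@]}"
do
ssh root@"$sshserver" 'Foo Bar'
done
echo "done"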

Related

Bash - SSH to remote server and retrieve data

I have a bash script that needs to connect to another server for parts of its execution. I have tried many of the standard instructions and syntaxes for executing ssh commands, but with little progress.
On the remote server, I need to source a shell script that sets several environment variables for some software. One of these variables is then used in a filepath that points to an executable, which has an option '-lprojects' that lists the projects for the software on that server.
I have verified multiple times that running the commands on the server itself works. My issue is when I try to run the same commands over SSH. If I use the environment variable in the filepath, the variable turns out to be null, giving a file/directory not found error. If I hard-code the filepath to the executable, it gives me an error saying that the shell script is not sourced (which I assume it needs for other functions and APIs before the executable exposes its -lprojects option).
Here is roughly what the code looks like:
ssh remote.server 'source /filepath/remotescript.sh'
filelist=$(ssh remote.server $REMOTEVARIABLE'/bin/executable -lprojects')
echo ${filelist[@]}
for file in $filelist
do
echo $file
ssh SERVER2 awk 'something' /filepath/"$file"/somefile.txt | sed 'something' >> filepath/values.csv;
done
As you can see, I then also need to loop through the -lprojects output from remote.server, run some awk and sed on the files to extract the wanted text (this works), and then write that back to the values.csv file on the client (local server). In general there will be several servers I have to do this for, all writing to the same .csv file, but for simplicity you can regard this as a single-remote-server case, since it is vital that I get it working for at least one to begin with.
Note that I also tried something like:
ssh remote.server << EOF
'source /filepath/remotescript.sh'
filelist=$(ssh remote.server $REMOTEVARIABLE'/bin/executable -lprojects')
EOF
But with similar results. I also tried placing the single quotes in the filelist assignment both before and after the remote variable, and so on.
How do I go about properly doing this?
To access the environment variable, you must source the script that defines it within the same SSH call as the one where you use it; otherwise you're running your commands in two different, unrelated shells:
filelist=$(ssh remote.server 'source /filepath/remotescript.sh; $REMOTEVARIABLE/bin/executable -lprojects')
Assuming the executable outputs one file name per line, you can use readarray to read the names into an array:
readarray -t filelist < <(ssh remote.server '
source /filepath/remotescript.sh
$REMOTEVARIABLE/bin/executable -lprojects
'
)
echo "${filelist[@]}"
for file in "${filelist[@]}"
do
echo "$file"
ssh SERVER2 awk 'something' /filepath/"$file"/somefile.txt | sed 'something' >> filepath/values.csv
done

Declare variable on unix server

I am trying to log in to one of the remote servers (Box1) and read a file on that remote server (Box1).
That file contains the details of another server (Box2); based on those details I have to come back to the local server and ssh to the other server (Box2) for some data crunching, and so on...
ssh box1.com << EOF
if [[ ! -f /home/rakesh/tomar.log ]]
then
echo "LOG file not found"
else
echo " LOG file present"
export server_node1= `cat /home/rakesh/tomar.log`
fi
EOF
ssh box2.com << EOF
if [[ ! -f /home/rakesh/tomar.log ]]
then
echo "LOG file not found"
else
echo " LOG file present"
export server_node2= `cat /home/rakesh/tomar.log`
fi
EOF
but I am not getting the values of "server_node1" and "server_node2" on the local machine.
Any help would be appreciated.
Just like bash -c 'export foo=bar' cannot declare a variable in the calling shell where you typed this, an ssh command cannot declare a variable in the calling shell. You will have to refactor so that the calling shell receives the information and knows what to do with it.
I agree with the comment that storing a log file in a variable is probably not a sane, or at least elegant, thing to do, but the easy way to do what you are attempting is to put the ssh inside the assignment.
server_node1=$(ssh box1.com cat tomar.log)
server_node2=$(ssh box2.com cat tomar.log)
A few notes and amplifications:
The remote shell will run in your home directory, so I took it out (on the assumption that /home/rt9419 is your home directory, obviously).
In case of an error in the cat command, the exit code of ssh will be the error code from cat, and the error message on standard error will be visible on your standard error, so the echo seemed quite superfluous. (If you want a custom message, variable=$(ssh whatever) || echo "Custom message" >&2 would do that. Note the redirection to standard error; it doesn't seem to matter here, but it's good form.)
If you really wanted to, you could run an arbitrarily complex command in the ssh; as outlined above, it didn't seem necessary here, but you could do assignment=$(ssh remote 'if [[ things ]]; then for variable in $(complex commands to drive a loop); do : etc etc; done; fi; more </dev/null; exit "$variable"') or whatever.
As further comments on your original attempt,
The backticks in the here document in your attempt would be evaluated by your local shell before the ssh command even ran. There are separate questions about how to fix that; see e.g. "How to have both local and remote variables inside an SSH command", but in short, unless you absolutely require the local shell to be able to modify the commands you send, you should probably put them in single quotes, like I did in the silly complex ssh example above.
The function of export is to make variables visible to child processes. There is no way to affect the environment of a parent process (short of having it cooperate and/or coordinate the change, as in the code above). As an example to illustrate the difference, if you set PERL5LIB to a directory with Perl libraries, but fail to export it, the Perl process you start will not see the variable; it is only visible to the current shell. When you export it, any Perl process you start as a child of this shell will also see this variable and the value you assigned. In other words, you export variables which are not private to the current shell (and don't export private ones; aside from making sure they are private, this saves the amount of memory which needs to be copied between processes), but that still only makes them visible to children, by the design of the U*x process architecture.
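A tiny illustration of the difference (the variable names here are just examples):
FOO=only_in_this_shell
export BAR=visible_to_children
bash -c 'echo "FOO=$FOO BAR=$BAR"'   # prints: FOO= BAR=visible_to_children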
You should get the file back from box1 and box2 with scp:
scp box1.com:/home/rt9419/tomar.log ~/tomar1.log
#then you can cat!
export server_node1=`cat ~/tomar1.log`
Likewise with box2:
scp box2.com:/home/rt9419/tomar.log ~/tomar2.log
#then you can cat!
export server_node2=`cat ~/tomar2.log`
There are several possibilities. In your case, you could create a file on the remote system (in bash syntax) containing the assignments of these variables, for example
echo "export server_node2='$(</home/rt9419/tomar.log)'" >>export_settings
(which makes me wonder why you want the whole content of your logfile stored in a variable, but that is another question), then transfer this file to your host (for example with scp) and source it from within your bash script.
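A hedged sketch of that round trip, reusing the hostname and path from above (the local file name /tmp/export_settings is just an example, and it assumes the log content contains no quote characters):
# On the remote side, write the assignment into a file...
ssh box2.com 'echo "export server_node2=\"$(cat /home/rt9419/tomar.log)\"" > ~/export_settings'
# ...copy it back, and source it locally.
scp box2.com:export_settings /tmp/export_settings
source /tmp/export_settings
echo "$server_node2"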

Want to read variable value from remote file

In one of my bash scripts I want to read and use the value of a variable from another script which is on a remote machine.
How should I go about resolving this? Any related info would be helpful.
Thanks in advance!
How about this (which is code I cannot currently test myself):
text=$(ssh yourname@yourmachine 'grep uploadRate= /root/yourscript')
It assumes that the value of the variable is contained in one line. The variable text now contains your variable assignment, presumably something like
uploadRate=1MB/s
There are several ways to convert the text/code into a real variable assignment in your current script, like evaluating the string or using grep. I would recommend
uploadRate=${text#*=}
to just remove the part up to and including the =.
Edit: One more caveat to mention is that this only works if the original assignment does not contain variable references itself like in
uploadRate=1000*${kB}/s
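Putting the pieces together, a hedged end-to-end sketch (hostname, path, and variable name are taken from the example above):
text=$(ssh yourname@yourmachine 'grep "^uploadRate=" /root/yourscript')
uploadRate=${text#*=}
echo "remote uploadRate is $uploadRate"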
ssh user@machine 'command'
will print the standard output of the remote command.
I can suggest at least two ways:
1) You can simply copy the output file from the remote server to your system with the scp command; it would work for you. Then your script on your machine should read that file as an argument.
Script on your machine:
read -t 50 -p "Waiting for argument: " arg
It waits for the argument from the remote machine.
Then you can
sshpass -p<password> scp user@host:/Path/to/file /path/to/script/
What you need to do:
You should tell the script on your machine that the output of the scp command is the argument ($arg).
2) Run a script from your machine:
#!/bin/bash
script='
#Your commands
'
sshpass -p<password> ssh user@host "$script"
And there are other ways as well to run a script that does something on a remote machine; for example, the sketch below.
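A hedged variant of the second approach without sshpass is to feed a local script to the remote shell over standard input (the path and host are placeholders):
# Run a script that only exists locally on the remote host, reading it over stdin.
ssh user@host 'bash -s' < /path/to/local_script.sh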

how to run scripts within a telnet session

I want to connect to a remote host using telnet
there is no username/password verification
just
telnet remotehost
then I need to input some commands for initialization
and then I need to repeat the following commands:
cmd argument
argument is read from a local file; in this file there are many lines, and each line is an argument
and after running one "cmd argument", the remote host will output some results
it may output a line with the string "OK"
or output many lines, one of which contains the string "ERROR"
and I need to do something according to the results.
basically, the script is like:
initialization_cmd #some initial comands
while read line
do
cmd $line
#here the remote host will output results, how can I put the results into a variable?
# here I want to judge the results, like
if $results contain "OK";then
echo $line >>good_result_log
else
echo $line >> bad_result_log
fi
done < local_file
the good_result_log and bad_result_log are local files
is it possible or not? thanks!
This won't work as echo will output to the stdout of the tty and not to the stdin of the telnet process.
I would suggest writing an expect script for this task. Perhaps you could adapt something like the sketch below.
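A rough, untested sketch of what that might look like, with the expect code embedded in a bash script (remotehost, initialization_cmd, and the file names are taken from the question; the OK/ERROR patterns may need adjusting to the real output):
#!/bin/bash
# Drive the telnet session with expect and sort arguments into good/bad logs.
expect <<'EOF'
set good [open "good_result_log" a]
set bad  [open "bad_result_log" a]
set cmds [open "local_file" r]
spawn telnet remotehost
send "initialization_cmd\r"
while {[gets $cmds line] >= 0} {
    send "cmd $line\r"
    expect {
        "OK"    { puts $good $line }
        "ERROR" { puts $bad $line }
        timeout { puts $bad $line }
    }
}
close $cmds
close $good
close $bad
EOF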
This question was asked in at least four different forums at the same time. Don't know what kind of points this kind of entrepreneurship earns, but here are links to answers:
linux forums
unix.com
superuser.com

Auto SSH and execute script

I have roughly 12 computers that each have the same script on them. This script merely pings all the other machines and prints out whether each machine is "reachable" or "unreachable". However, it is inefficient to log in to each machine manually using ssh to execute this script.
Suppose I'm logged into node 1. Is there any way for me to log in to nodes 2-12 automatically using SSH, execute the ping script, pipe the results to a file, log out, and proceed to the next machine? Some kind of bash shell script?
I'm afraid I'm at a loss here since I haven't had experience with shell-scripting before.
Since the script is on the other machines, you can just have ssh run the command for you there:
ssh $hostname my_script >> results_file
When you specify a command like that, it's executed instead of the login shell.
I'll leave it up to you to figure out how to loop over hostnames!
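For example, a hedged sketch of that loop (the hostnames and the results file are placeholders, and it assumes key-based authentication is already set up):
#!/bin/bash
# Run my_script on every node and collect the output locally.
for hostname in node{2..12}
do
ssh "$hostname" my_script >> results_file
done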
One trick you'll need to use is setting up pre-authorized keys for each host. Then you can run a script on one host, running something like 'ssh hostname command > log.hostname'
This script might be what you are looking for: It allows you to execute one command (which can be your script) on multiple remote machines via ssh. It's a simple script with bash source available, so you should be able to customize it to your needs:
http://www.heinzi.at/projects/upgradebest.sh/
Yes, you can.
You actually need 2 small scripts, as follows:
remote_ssh.sh (which takes the name of the machine as its first argument; the rest of the arguments are the command you want to execute, with its own arguments)
Example: remote_ssh.sh node5 "echo hello world"
remote_ssh.sh looks like this:
#!/bin/bash
ALL_ARG=$@                       # all arguments as one string
FST_ARG=$1                       # the target machine
REST_ARG=${ALL_ARG##$FST_ARG}    # strip the machine name, leaving the command to run
echo "Executing REMOTE COMMAND ON $FST_ARG"
/usr/bin/ssh $FST_ARG bash execute_ssh_command.sh $FST_ARG $(pwd) $REST_ARG
execute_ssh_command.sh looks like this:
#!/bin/bash
ALL_ARG=$@                       # all arguments as one string
FST_ARG=$1                       # the calling machine's name
DIR_ARG=$2                       # the directory to run in
REM_ARG="$1 $2"
REST_ARG=${ALL_ARG##$REM_ARG}    # strip the first two arguments, leaving the command
cd $DIR_ARG
$REST_ARG
Of course, you have to have these 2 scripts in your path on all your nodes (maybe ~/bin/).
Hope that it's helpful
