Want to read variable value from remote file - bash

In one of my bash scripts I want to read and use a variable value from another script that lives on a remote machine.
How should I go about this? Any related info would be helpful.
Thanks in advance!

How about this (which is code I cannot currently test myself):
text=$(ssh yourname@yourmachine 'grep uploadRate= /root/yourscript')
It assumes that the value of the variable is contained in one line. The variable text now contains your variable assignment, presumably something like
uploadRate=1MB/s
There are several ways to convert the text/code into a real variable assignment in your current script, like evaluating the string or using grep. I would recommend
uploadRate=${text#*=}
to just remove the part up to and including the =.
Edit: One more caveat to mention is that this only works if the original assignment does not contain variable references itself like in
uploadRate=1000*${kB}/s
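Putting it together, a minimal sketch using the placeholder hostname and paths from above:
#!/bin/bash
# grab the assignment line from the remote script
text=$(ssh yourname@yourmachine 'grep uploadRate= /root/yourscript')
# strip everything up to and including the '=' to keep only the value
uploadRate=${text#*=}
echo "remote uploadRate is: $uploadRate"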

ssh user@machine 'command'
will print the standard output of the remote command.
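For example, to capture that output in a local variable (uptime is just an illustrative command):
remote_uptime=$(ssh user@machine 'uptime')
echo "$remote_uptime"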

I can suggest at least two ways:
1) You can simply copy the output to a file from the remote server to your system with the scp command... it would work for you. Then your script on your machine should read that file as an argument...
Script on your machine:
read -t 50 -p "Waiting for argument: " $1
It waits for the output from the remote machine. Then you can run:
sshpass -p<password> scp user@host:/Path/to/file /path/to/script/
What you need to do:
You should tell the script on your machine that the output from the scp command is the argument ($1).
2) Run a script from your machine:
#!/bin/bash
script='
#Your commands
'
sshpass -p<password> ssh user@host "$script"
There are also other ways to run a script that does something on the remote machine.
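A minimal sketch of both approaches, assuming sshpass is installed and that user, host, the password and the file paths are placeholders:
#!/bin/bash
# 1) copy the remote file to the local machine, then read a value out of it
sshpass -p 'password' scp user@host:/Path/to/file /tmp/remote_output
value=$(cat /tmp/remote_output)
echo "value from the remote file: $value"

# 2) run a small script on the remote machine and capture its output locally
script='
hostname
date
'
output=$(sshpass -p 'password' ssh user@host "$script")
echo "$output"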

Related

Bash - SSH to remote server and retrieve data

I have a bash script that needs to connect to another server for parts of its execution. I have tried many of the standard instructions and syntaxes for executing ssh commands, but with little progress.
On the remote server, I need to source a shell script that contains several env parameters for some software. One of these parameters is then used in a filepath to point to an executable, which has an option ' -lprojects ' that can list the projects for the software on that server.
I have verified that running the commands on the server itself works multiple times. My issue is when I try to run the same commands over SSH. If I use the approach where I use the env variable for the filepath, it shows that the variable is null in the filepath, giving a file/directory not found error. If I hard-code the filepath to point to the executable, it gives me an error saying that the shell script is not sourced (which I assume it needs for other functions and APIs for the executable to reveal its -lprojects option).
Here is roughly what the code looks like:
ssh remote.server 'source /filepath/remotescript.sh'
filelist=$(ssh remote.server $REMOTEVARIABLE'/bin/executable -lprojects')
echo ${filelist[@]}
for file in $filelist
do
echo $file
ssh SERVER2 awk 'something' /filepath/"$file"/somefile.txt | sed 'something' >> filepath/values.csv;
done
As you can see, I then also need to loop through the contents of the -lprojects output in the remote.server, do some awk and sed on the files to extract the wanted text (this works), but then I need to write that back to the client (local server) values.csv file. This is more generic, as there will be several servers I have to do this for, but all of them have to write to the same .csv file. For simplicity, you can just regard this as a one remote server case, since it is vital I get it working for at least one now in the beginning.
Note that I also tried something like:
ssh remote.server << EOF
'source /filepath/remotescript.sh'
filelist=$(ssh remote.server $REMOTEVARIABLE'/bin/executable -lprojects')
EOF
But with similar results. I also tried placing the single quotes in the filelist line both before and after the remote variable, etc.
How do I go about properly doing this?
To access the environment variable, you must source the script that defines the environment within the same SSH call as the one where you are using it; otherwise you're running your commands in two different, unrelated shells:
filelist=$(ssh remote.server 'source /filepath/remotescript.sh; $REMOTEVARIABLE/bin/executable -lprojects')
Assuming the executable outputs one file name per line, you can use readarray to achieve the effect:
readarray -t filelist < <(ssh remote.server '
source /filepath/remotescript.sh
$REMOTEVARIABLE/bin/executable -lprojects
'
)
echo "${filelist[@]}"
for file in "${filelist[@]}"
do
echo "$file"
ssh SERVER2 awk 'something' /filepath/"$file"/somefile.txt | sed 'something' >> filepath/values.csv;
done
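Since several servers eventually have to append to the same values.csv, one way to extend this is an outer loop over the server names (remote.server and other.server are placeholders):
for server in remote.server other.server
do
readarray -t filelist < <(ssh "$server" '
source /filepath/remotescript.sh
$REMOTEVARIABLE/bin/executable -lprojects
')
for file in "${filelist[@]}"
do
ssh SERVER2 awk 'something' /filepath/"$file"/somefile.txt | sed 'something' >> filepath/values.csv
done
done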

Is there any possibility to store variable from remote server output inside script?

I'm aware that we can store command output as a variable in a local script, as below:
variable=$(command)
In the same way, can we store remote command output as a variable? Any suggestion would be highly appreciated. For example:
#!/bin/bash
ssh remote@hostname << EOF
variable=$(command1)
variable=$(command2)
variable=$(command3)
EOF
Try,
Variable=`ssh remote@hostname "command1" </dev/null`
The stdout of a remotely executed command/script can be directly assigned to a local variable this way.
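For example, to capture the output of several remote commands in separate local variables (command1/command2/command3 are the placeholders from the question; </dev/null keeps ssh from swallowing the script's stdin):
var1=$(ssh remote@hostname "command1" </dev/null)
var2=$(ssh remote@hostname "command2" </dev/null)
var3=$(ssh remote@hostname "command3" </dev/null)
echo "$var1 $var2 $var3"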

Bash: Variable from file output

I use a script on my personal server to set up a few things, and I'm a bit lazy.
So instead of doing every command by hand etc. I have this script:
#!/bin/bash
sshservers=(IP IP IP IP)
for sshserver in "${sshservers[@]}"
do
ssh root@$sshserver 'Foo Bar'
done
echo "done"
exit
So it takes the servers from the variable in sshservers=.
But I want it to take the IPs from a file called "servers.txt".
Though when using
sshservers=$(cat servers.txt)
this doesn't work; I have to manually insert the IPs directly in the script.
How can I use the variable from a text file?
Thanks in advance!
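One way that should work is to read the file into an array, e.g. with mapfile (a sketch, assuming servers.txt holds one IP per line):
#!/bin/bash
# read one IP per line from servers.txt into the sshservers array
mapfile -t sshservers < servers.txt
for sshserver in "${sshservers[@]}"
do
ssh root@"$sshserver" 'Foo Bar'
done
echo "done"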

Capture output of remote command in variable inside of a shell script

I have a script I want to run on a remote host via ssh. It checks whether a certain process is running and should try to kill it if it exists. Now, my code looks like this:
ssh my_prod_env << ENDSSH
...
pid=$(pgrep -f "node my_app.js")
echo $pid
# kill process with $pid
...
exit
ENDSSH
The problem lies here: I cannot capture output of pgrep command in variable. I tried with $(), backticks, pipe then read and maybe other approaches, but all without success.
I would like to do it all in one ssh session.
Now I am thinking the output of the command goes to an output stream I cannot access in my script. I might be wrong, though.
Either way, help will be appreciated.
Ok, after you provided more info in the comments about what you want, I believe this is the correct answer to your question:
ssh my_prod_env -t 'pgrep -f "node my_app.js"'
This will call the command and leave you logged on to the server.
This is what fixes the thing - "escaping" the ENDSSH tag:
ssh my_prod_env << \ENDSSH
...
# capture output of remote commands in remote variables
...
ENDSSH
Problem was that my vars were local and I was trying to capture output of remote commands in them.
This question/answer helped me realize what is going on: How to assign local variable with a remote command result in bash script?
So, my question could be marked as duplicate or something similar, I guess.
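For reference, a minimal sketch of both working variants (the process name comes from the question; the kill step is illustrative):
# variant 1: escaped heredoc delimiter, so $(...) and $pid are expanded remotely
ssh my_prod_env << \ENDSSH
pid=$(pgrep -f "node my_app.js")
[ -n "$pid" ] && kill "$pid"
ENDSSH

# variant 2: capture the remote output in a local variable, then act on it locally
pid=$(ssh my_prod_env 'pgrep -f "node my_app.js"')
echo "$pid"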

Auto SSH and execute script

I have roughly 12 computers that each have the same script on them. This script merely pings all the other machines and prints out whether each machine is "reachable" or "unreachable". However, it is inefficient to log in to each machine manually using ssh to execute this script.
Suppose I'm logged into node 1. Is there any way for me to log in to nodes 2-12 automatically using SSH, execute the ping script, pipe the results to a file, log out and proceed to the next machine? Some kind of bash shell script?
I'm afraid I'm at a loss here since I haven't had experience with shell-scripting before.
Since the script is on the other machines, you can just have ssh run the command for you there:
ssh $hostname my_script >> results_file
When you specify a command like that, it's executed instead of the login shell.
I'll leave it up to you to figure out how to loop over hostnames!
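A minimal sketch of such a loop (hostnames, script name and output file are placeholders):
#!/bin/bash
for hostname in node2 node3 node4    # ...and so on through node12
do
ssh "$hostname" my_script >> results_file
done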
One trick you'll need to use is setting up pre-authorized keys for each host. Then you can run a script on one host, running something like 'ssh hostname command > log.hostname'
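Setting up those pre-authorized keys usually amounts to something like this (run once from node 1; user and hostnames are placeholders):
ssh-keygen -t rsa                      # generate a key pair if you don't already have one
ssh-copy-id user@node2                 # copy your public key to each remote host
ssh user@node2 'echo key-based login works'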
This script might be what you are looking for: It allows you to execute one command (which can be your script) on multiple remote machines via ssh. It's a simple script with bash source available, so you should be able to customize it to your needs:
http://www.heinzi.at/projects/upgradebest.sh/
Yes you can.
You actually need 2 small scripts, as follows:
remote_ssh.sh (which takes as its first argument the name of the machine; the rest of the arguments are the script that you want to execute, with its own arguments)
Example: remote_ssh.sh node5 "echo hello world"
remote_ssh.sh is as follows:
#!/bin/bash
# all arguments; the first one is the target machine, the rest is the command to run
ALL_ARG=$@
FST_ARG=$1
REST_ARG=${ALL_ARG##$FST_ARG}
echo "Executing REMOTE COMMAND ON $FST_ARG"
/usr/bin/ssh $FST_ARG bash execute_ssh_command.sh $FST_ARG $(pwd) $REST_ARG
execute_ssh_command.sh is as follows:
#!/bin/bash
# first arg: machine name, second arg: working directory, rest: the command itself
ALL_ARG=$@
FST_ARG=$1
DIR_ARG=$2
REM_ARG="$1 $2"
REST_ARG=${ALL_ARG##$REM_ARG}
cd $DIR_ARG
$REST_ARG
Of course you have to put these 2 scripts in the path on all your nodes (maybe ~/bin/).
Hope that it's helpful.
