bash ssh script not outputting results to file

I'm writing a simple script at work and I can't figure out why it will not output the results to another file.
/tmp/system stores the node list:
#!/bin/bash
$results=restest.txt
for i in $(cat /tmp/system); do
ping -c 1 $i
if [ $i = 0 ]; then
ssh $i ps -ef | grep ops | echo > $results
fi
done

echo does not print its stdin, only its arguments (or a newline if there are no arguments). So
echo file.txt
only prints 'file.txt', not the content of the file. Therefore your code only writes a newline to the file. You could use cat to copy stdin to stdout, but here even that is unnecessary, since you can pipe the grep output directly to the file:
ssh $i ps -ef | grep ops > $results
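To see the difference, a quick demo (the file name is arbitrary and the file need not exist):

```shell
printf 'from stdin\n' | echo file.txt   # echo ignores stdin: prints "file.txt"
printf 'from stdin\n' | cat             # cat copies stdin: prints "from stdin"
```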

First, thank you for editing your code (it's clearer like this :)
I have a few pieces of advice:
1- When you want to store a value in a variable, don't use the "$" symbol; that symbol is used to get the variable's value.
ex:
MYVAR=3
echo MYVAR    # this will print MYVAR
echo $MYVAR   # this will print 3
2- Always quote your values, especially if the value is coming from another command.
3- To fix your script you need to quote the command executed on the remote machine, then redirect the output to your file.
ex:
ssh user@MachineA "ls > file.txt"   # file.txt is created on MachineA
ssh user@MachineA "ls" > file.txt   # file.txt is created on YOUR machine
So you can simply replace your last line with:
ssh $i "ps -ef | grep ops" > $results
Also, in your test, compare the exit status ($?) with a numeric operator such as -ne rather than comparing $i (see the bash classic test).
good luck
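To illustrate the -ne advice, a small sketch using false in place of the real ping command:

```shell
false                     # stands in for a command that fails
status=$?
if [ "$status" -ne 0 ]; then
    echo "command failed with status $status"
fi
# prints: command failed with status 1
```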

There are several errors. First, don't iterate over files like this. Second, i is the name of the host, not the exit status of ping. Third, the echo is unnecessary (and does not read from standard input). Lastly, $ is only used for expanding a parameter, not assigning to it.
#!/bin/bash
results=restest.txt
while read -r host; do
if ping -c 1 "$host"; then
# -n keeps ssh from consuming the loop's stdin; >> appends so earlier hosts' output is kept
ssh -n "$host" ps -ef | grep ops >> "$results"
fi
done < /tmp/system

Related

Why does ssh exit the while loop when executing a remote script in a Linux shell [duplicate]

This question already has answers here:
ssh breaks out of while-loop in bash [duplicate]
(2 answers)
While loop stops reading after the first line in Bash
(5 answers)
ssh in bash script exits loop [duplicate]
(1 answer)
Closed 2 years ago.
In my project, I need to find the user processes on each node.
I have a file: jobIdUser. The content of this file has two columns, like:
395163 chem-yupy
395164 chem-yupy
395165 phy-xiel
395710 mae-chent
Now I have a script appRecord.sh, with a while loop in it.
The while loop code is like:
cat $workDir/jobIdUser | while read LINE
do
jobUser=`echo $LINE | awk '{print $2}'`
jobId=`echo $LINE | awk '{print $1}'`
jobOnNodes=`/usr/bin/jobToNode $jobId | xargs`
echo $timeStr" "$jobId" "$jobUser" "$jobOnNodes >> $workDir/tmpRecFile
#20200702, it is needed to find out user process on nodes at this time here
designatedNode=`echo $jobOnNodes | awk '{print $NF}'`
echo $jobOnNodes
echo $designatedNode
ssh $designatedNode sh $workDir/nodeProInfo.sh ##Here code will exit while loop
echo $timeStr" "$jobId" "$jobUser" "$jobOnNodes >> $workDir/$recordFile
done
The code of nodeProInfo.sh is like:
#!/bin/bash
source /etc/profile
workDir=/work/ccse-xiezy/appRecord
hostName=`hostname`
jobInfo=`ps axo user:15,comm | grep -Ev "libstor|UID|ganglia|root|gdm|postfix|USER|rpc|polkitd|dbus|chrony|libstoragemgmt|para-test|ssh|ccse-x|lsf|lsbatch" | tail -n 1`
echo $hostName" "$jobInfo >> $workDir/proRes
Now when I run the script with sh appRecord.sh, it goes wrong: it exits after the first iteration of the while loop:
[cc#login04 appRecord]$ sh appRecord.sh
r03n56 r04n09 r04n15
r04n15
[cc#login04 appRecord]$
I don't know why it exits at the remote ssh step; who can help me?
UPDATE:
I have another script which runs OK. Its jobIdUser has content like:
r01n23 xxx-ser
r92n12 yyn-ser
and the while loop is:
cat $workDir/jobIdUser | while read LINE
do
.............
ssh $NODE pkill -u -9 $USER
.............
done
cat $workDir/jobIdUser | while read LINE
do
...
ssh $designatedNode sh $workDir/nodeProInfo.sh ##Here code will exit while loop
...
done
ssh (and every other command running inside the loop) inherits its standard input from the standard input of the while loop. By default, ssh reads its standard
input and passes the data to the standard input of the remote process. This means that ssh is consuming the data being piped into the loop, preventing the while statement from reading it.
You need to prevent ssh from reading its standard input. You can do this one of two ways. First of all, the -n flag prevents ssh from reading its standard input:
ssh -n $designatedNode sh $workDir/nodeProInfo.sh
Or you can redirect ssh's standard input from another source:
ssh $designatedNode sh $workDir/nodeProInfo.sh < /dev/null
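The effect is easy to reproduce without ssh at all; here cat stands in for any command that drains stdin:

```shell
# without protection: the inner cat swallows lines b and c
printf 'a\nb\nc\n' | while read -r line; do
    echo "loop saw: $line"
    cat > /dev/null
done
# prints only: loop saw: a

# with stdin redirected from /dev/null, all three lines survive
printf 'a\nb\nc\n' | while read -r line; do
    echo "loop saw: $line"
    cat > /dev/null < /dev/null
done
# prints: loop saw: a, loop saw: b, loop saw: c
```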

While loop not looping through ssh connections from host list [duplicate]

I have the following shell script. The purpose is to loop through each line of the target file (whose path is the input parameter to the script) and do work against each line. Now, it seems to work only with the very first line of the target file and stops after that line is processed. Is there anything wrong with my script?
#!/bin/bash
# SCRIPT: do.sh
# PURPOSE: loop thru the targets
FILENAME=$1
count=0
echo "proceed with $FILENAME"
while read LINE; do
let count++
echo "$count $LINE"
sh ./do_work.sh $LINE
done < $FILENAME
echo "\ntotal $count targets"
In do_work.sh, I run a couple of ssh commands.
The problem is that do_work.sh runs ssh commands and by default ssh reads from stdin which is your input file. As a result, you only see the first line processed, because the command consumes the rest of the file and your while loop terminates.
This happens not just for ssh, but for any command that reads stdin, including mplayer, ffmpeg, HandBrakeCLI, httpie, brew install, and more.
To prevent this, pass the -n option to your ssh command to make it read from /dev/null instead of stdin. Other commands have similar flags, or you can universally use < /dev/null.
A very simple and robust workaround is to change the file descriptor from which the read command receives input.
This is accomplished by two modifications: the -u argument to read, and the redirection operator for < $FILENAME.
In BASH, the default file descriptor values (i.e. values for -u in read) are:
0 = stdin
1 = stdout
2 = stderr
So just choose some other unused file descriptor, like 9 just for fun.
Thus, the following would be the workaround:
while read -u 9 LINE; do
let count++
echo "$count $LINE"
sh ./do_work.sh $LINE
done 9< $FILENAME
Notice the two modifications:
read becomes read -u 9
< $FILENAME becomes 9< $FILENAME
As a best practice, I do this for all while loops I write in BASH.
If you have nested loops using read, use a different file descriptor for each one (9,8,7,...).
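A self-contained sketch of the fd-9 trick; cat plays the role of the stdin-hungry ssh command, and the input file is a throwaway temp file. Note read -u is a bashism, so this requires bash:

```shell
tmpfile=$(mktemp)
printf 'one\ntwo\nthree\n' > "$tmpfile"

# the loop reads from fd 9, so a command that drains fd 0
# no longer starves it
printf 'unrelated stdin\n' | while read -u 9 -r line; do
    echo "got: $line"
    cat > /dev/null
done 9< "$tmpfile"
# prints: got: one, got: two, got: three

rm -f "$tmpfile"
```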
More generally, a workaround which isn't specific to ssh is to redirect standard input for any command which might otherwise consume the while loop's input.
while read -r line; do
((count++))
echo "$count $line"
sh ./do_work.sh "$line" </dev/null
done < "$filename"
The addition of </dev/null is the crucial point here, though the corrected quoting is also somewhat important for robustness; see also When to wrap quotes around a shell variable?. You will want to use read -r unless you specifically require the slightly odd legacy behavior you get for backslashes in the input without -r. Finally, avoid upper case for your private variables.
Another workaround of sorts which is somewhat specific to ssh is to make sure any ssh command has its standard input tied up, e.g. by changing
ssh otherhost some commands here
to instead read the commands from a here document, which conveniently (for this particular scenario) ties up the standard input of ssh for the commands:
ssh otherhost <<'____HERE'
some commands here
____HERE
Note that the ssh -n option conflicts with a here document, since both redirect ssh's standard input, and it gets in the way of checking the exit status of ssh when piping its output to another program.
So in that situation, using /dev/null as stdin explicitly is preferred.
#!/bin/bash
while read ONELINE ; do
ssh ubuntu@host_xyz </dev/null <<EOF 2>&1 | filter_pgm
echo "Hi, $ONELINE. You come here often?"
process_response_pgm
EOF
if [ ${PIPESTATUS[0]} -ne 0 ] ; then
echo "aborting loop"
exit ${PIPESTATUS[0]}
fi
done < input_list.txt
This was happening to me because I had set -e, and a grep in the loop was returning no output (grep exits with a non-zero status when it matches nothing).

Ubuntu bash script output correct value - A script calls another script

I have a script that calls another script, but the output from the script seems to be wrong. Below is my code:
print_output.sh
#!/bin/bash
echo $1
total=$(eval $1 | awk '{sum+=$1} END {print sum}')
echo "Total is $total"
try.sh
#!/bin/bash
./print_output.sh "ssh $1 'cd /somelocation/logs; cat data-$1.txt' | grep ..."
Usage is: ./try.sh ukdry-01
I expect the output to be:
uk
along with "Total is 11" in this case. But the output appears as "cat data.txt".
Where am I going wrong? I expect the input to be "ukdry-01". The "cat data.txt" is a command that should be evaluated (i.e. eval $1) by print_output.sh and executed within the script.
data.txt contains for example
1 AUTH
2 AND
8 BOOLEAN
The idea is that the total is printed. The script will ssh to a hostname supplied by the user, i.e. ukdry-01, cat data-ukdry-01.txt, then log out and perform a grep on the data found. The filtered data, all integers, is fed into print_output.sh, which prints the total of column one. The usage has to be the script name, i.e. ./try.sh ukdry-01, which internally calls ./print_output.sh and then the ssh commands etc.
Below is what had worked for me:
try.sh
#!/bin/bash
./print_output.sh "$@" "ssh $1 'cd /somelocation/logs; cat data-$1.txt' | grep ..."
Usage is: ./try.sh ukdry-01
The "$@" copies the positional parameters, so I can make use of ukdry-01 as $1.
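Forwarding positional parameters can be sketched with hypothetical functions standing in for the two scripts (inner plays print_output.sh, outer plays try.sh):

```shell
inner() { echo "first arg: $1"; }
outer() { inner "$@" "ssh ... (command string)"; }

outer ukdry-01
# prints: first arg: ukdry-01
```

The forwarded arguments land before the command string, so the caller's first argument stays reachable as $1 in the inner script.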

After running shell script, variables lose their value

I have done a simple shell script:
#!/bin/sh
file_name=$1
state=`cat "$file_name" | grep "port protocol" | awk '{print $4}'`
echo $state
reason=`cat "$file_name" | grep "port protocol"`
echo $reason
This outputs the values $state and $reason.
However, when I run..
echo $state
..in the console, it does not output anything. It seems like the variable loses its value. Is this normal behaviour or should I add something to the script?
Thanks!
Assuming that you're running your script like ./script.sh or sh script.sh, then this is the expected behaviour. Child processes cannot change the environment of their parent. The fact that you're running a shell script from a shell doesn't change this rule.
What you can do is source the script instead of executing it, to set those variables in the local environment:
. script.sh
This effectively runs the lines of the script in your current shell, so the variables will be set there.
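A minimal demonstration of the difference (the script is a throwaway temp file; the variable name is arbitrary):

```shell
tmp=$(mktemp)
echo 'state=open' > "$tmp"

sh "$tmp"                           # runs in a child process
echo "after running: '${state:-}'"  # prints: after running: ''

. "$tmp"                            # runs in the current shell
echo "after sourcing: '$state'"     # prints: after sourcing: 'open'

rm -f "$tmp"
```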
While I've got your attention, I'd recommend making the following changes to your script:
#!/bin/sh
file_name=$1
state=$(awk '/port protocol/ {print $4}' "$file_name")
echo "$state"
reason=$(grep "port protocol" "$file_name")
echo "$reason"
In short, quote your variables, avoid useless calls to cat and use pattern matching in awk rather than piping grep to it.
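With some made-up sample lines, the awk pattern replaces the whole cat | grep | awk pipeline in one process:

```shell
tmp=$(mktemp)
printf 'tcp port protocol filtered\nudp other line ignored\n' > "$tmp"

# match the pattern and extract field 4 in a single awk invocation
awk '/port protocol/ {print $4}' "$tmp"   # prints: filtered

rm -f "$tmp"
```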

BASH : Feeding file to while loop

I have the following code with a while loop in my script:
TMP_FILE=`mktemp`
some_script.sh | grep aa > $TMP_FILE
while read i
do
echo $i
number=`ssh somehost cat somefile | grep 11 `
echo $number
done < $TMP_FILE
Contents of TMP_FILE looks like :
hostname1 AB_CDEF_JH10
hostname2 BC_DEF_JK19
...
In this case, the script works correctly for only one loop pass, picking up the first line from TMP_FILE. After that, the script exits. Any idea why it does not process the other lines except the first one?
Try passing the -n option to ssh to prevent it from reading from stdin.
By default, ssh reads from stdin (which is your file, in this case) and forwards it to the stdin of the command running on the remote host. As a result, your whole file gets consumed by ssh and the loop only executes once!
