Shell script can read file line by line but not perform actions for each line - shell

I'm trying to run this command over multiple machines
sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@x.x.x.x "mkdir test"
The IPs are stored in the following .txt file
$ cat ips.txt
10.0.2.15
10.0.2.5
I created a bash script that reads this file line by line. If I run it with an echo:
#!/bin/bash
input="ips.txt"
while IFS= read -r line
do
echo "$line"
#sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@$line "mkdir test"
done < "$input"
It prints every line:
$ ./variables.sh
10.0.2.15
10.0.2.5
This makes me understand that the script is working as intended. However, when I replace the echo line with the command I want to run for each line:
#!/bin/bash
input="ips.txt"
while IFS= read -r line
do
#echo "$line"
sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@$line "mkdir test"
done < "$input"
It only performs the action for the first IP on the file, then stops. Why?

Managed to solve this by using a for instead of a while. Script ended up looking like this:
for file in $(cat ips.txt)
do
sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@$file "mkdir test"
done

While your for loop works as a solution, it doesn't explain the behavior.
You can find the explanation here: ssh breaks out of while-loop in bash
In short:
The while loop keeps reading from the same file descriptor that is set up in the loop header ($input in your case).
ssh (and sshpass) also reads from stdin, which inside your loop is redirected from $input. That is what makes the problem hard to spot: we don't expect ssh to read our input.
You can reproduce the same strange behavior with commands like "ffmpeg" or "mplayer" in a while loop: both read from stdin (for keyboard controls) while running, so they consume everything left on the file descriptor.
Here is another good, illustrative example:
#!/bin/bash
{
echo first
for ((i=0; i < 16384; i++)); do echo "testing"; done
echo "second"
} > test_file
while IFS= read -r line
do
echo "Read $line"
cat | uptime > /dev/null
done < test_file
The first part writes the line "first", then 16384 lines of "testing", then the line "second" into test_file.
The 16384 "testing" lines add up to about 128 KB.
In the second part, "cat | uptime" consumes exactly that 128 KB from the loop's input, so the script prints only:
Read first
Read second
As a solution you can use a for loop, as you did.
Or use "ssh -n".
Or juggle file descriptors - you can find an example in the link above.
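A minimal, self-contained sketch of the difference, with `cat >/dev/null` standing in for ssh/sshpass (it drains stdin the same way) and the ips.txt file created on the fly:

```shell
#!/bin/bash
# "cat >/dev/null" stands in for ssh/sshpass: it drains stdin the same way.
printf '10.0.2.15\n10.0.2.5\n' > ips.txt

# Buggy: the inner command shares the loop's stdin and eats the remaining IPs.
buggy() {
  local count=0
  while IFS= read -r line; do
    cat >/dev/null            # swallows the rest of ips.txt
    count=$((count + 1))
  done < ips.txt
  echo "$count"
}

# Fixed: the list is read via fd 3, which the inner command never touches.
# (ssh -n achieves the same effect by detaching ssh's own stdin.)
fixed() {
  local count=0
  while IFS= read -r line <&3; do
    cat >/dev/null            # reads the function's stdin, not fd 3
    count=$((count + 1))
  done 3< ips.txt
  echo "$count"
}

echo "buggy iterations: $(buggy)"            # 1 - loop stops after the first IP
echo "fixed iterations: $(fixed </dev/null)" # 2 - every IP is processed
rm -f ips.txt
```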

Related

Issue with echo output to file in bash

I am running into an issue trying to echo some strings into files inside a shell script.
I am calling the shell script through pyspark/spark's rdd.pipe() command, and I checked to make sure the input is coming through by echoing each line in the shell script.
Here is the shell script code:
#!/bin/sh
while read -r one; do
read -r two
read -r three
read -r four
read -r five
read -r six
read -r seven
read -r eight
echo -e "$one\n$two\n$three\n$four\n" >> 1.txt
echo -e "$five\n$six\n$seven\n$eight\n" >> 2.txt
done
I ran the echo command WITHOUT piping to a file and that showed up in the output back in my spark program. The input to the shell script is just strings. Does anyone have ideas why 1.txt and 2.txt aren't being written to?

Run bash script from file with input

I want to make a simple bash script that makes a for loop over a file with commands and execute those commands, and finishes when an error happens. I have something like this
#!/bin/bash
while IFS= read -r line; do
echo $line
output=$(eval $line)
if [ $? -eq 0 ]
then
echo ok
else
echo $output
break
fi
echo
done < summary.txt
The problem is that the first command I'm trying to make is a sudo command, so I have to put the password. I tried putting it in the command like
sudo -S <<< password <<< Y command
with no luck. I've checked that the command works if I run it directly instead of reading it from the file (i.e. not as a string). The thing is that without the loop the script would be long and wouldn't make much sense.
Thanks
echo <password> | sudo -S <your command>
From man sudo
-S, --stdin
Write the prompt to the standard error and read the password from the standard input instead of using the terminal
device. The password must be followed by a newline character.
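Putting it together, here is a runnable sketch of the command-runner loop. The commands in summary.txt are placeholders; a real line needing root would be written as `echo <password> | sudo -S <command>`:

```shell
#!/bin/bash
# Sketch of the command-runner loop: execute each line, stop on first failure.
# The lines below are placeholder commands; a real summary.txt line needing
# root would read:  echo 'password' | sudo -S <command>
printf 'true\nfalse\necho never reached\n' > summary.txt
ran=0
while IFS= read -r line; do
  echo "running: $line"
  if output=$(eval "$line" 2>&1); then
    echo ok
    ran=$((ran + 1))
  else
    echo "failed: $output"   # show the failing command's output, then stop
    break
  fi
done < summary.txt
rm -f summary.txt
```

Here the loop runs `true` (succeeds), then `false` (fails), and breaks before the third line is ever executed.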

bash while loop breaks after the first line

I have a simple script with a while loop, but I cannot understand why it breaks after the first line from the $vault_list variable:
#!/bin/bash
tech_login="$1"
vault_list=$(docker exec -i tmgnt_vault_1 vault list secret/${tech_login}-terminals | sed 1,2d)
while IFS= read -r terminal
do
echo "line is $terminal"
key_values=$(docker exec -i tmgnt_vault_1 vault read secret/${tech_login}-terminals/$terminal )
done <<< "$vault_list"
If I remove the $key_values line from the while loop, it returns all values in echo "line is $terminal".
Can anyone point out what the problem with the while loop is? I assume it could be a problem with output, but I'm not sure.
Hopefully this will help others.
ssh might be the command that is eating stdin.
It was for me.
e.g. ssh inside a while loop was causing the loop to exit after first iteration.
LIST="cid1 10.10.0.1 host1
cid2 10.10.0.2 host1
cid3 10.10.0.3 host2"
# this while loop exits after first iteration
# ssh has eaten rest of stdin
echo "$LIST" |while read -r cid cip chost; do
echo $cid;
PSINFO=$(ssh $chost docker exec -i $cid "ps -e -orss=,pid=,args=,cmd=" |grep java );
echo PSINFO=$PSINFO;
done;
SOLVED by directing ssh to take stdin from /dev/null using </dev/null:
# this while loop keeps on running
# ssh directed to take stdin from /dev/null
echo "$LIST" |while read -r cid cip chost; do
echo $cid;
PSINFO=$(ssh $chost docker exec -i $cid "ps -e -orss=,pid=,args=,cmd=" </dev/null |grep java );
echo PSINFO=$PSINFO;
done;
With a hint from @choroba I found the right syntax for $key_values:
key_values=$(docker exec -i tmgnt_vault_1 vault read secret/${tech_login}-terminals/$terminal <<<$terminal)
I needed to pass the $terminal variable explicitly to the docker command, which can be done with a here-string, `<<<`.
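A self-contained sketch of why the here-string helps, with `cat` standing in for the stdin-hungry `docker exec -i ... vault read`:

```shell
#!/bin/bash
# "cat" stands in for "docker exec -i ... vault read" - both read stdin.
vault_list=$'term1\nterm2\nterm3'
processed=()
while IFS= read -r terminal; do
  # The here-string gives cat its own stdin, so it cannot drain $vault_list.
  key_values=$(cat <<<"$terminal")
  processed+=("$key_values")
done <<< "$vault_list"
echo "processed ${#processed[@]} terminals"   # 3, not 1
```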

Shell Script - While Loop / File Reading not working

I have a requirement which should address following points.
I have a file which contains list of IP addresses,I want to read line by line.
For each IP I need to push following commands using SSH (all are Mikrotik devices)
/ radius add service=login address=172.16.0.1 secret=aaaa
/ user aaa set use-radius=yes
Following is my code.
#!/bin/bash
filename="branch"
while IFS= read line; do
echo ${line//}
line1=${line//}
ok='@'
line3=$ok$line1
sshpass -p abc123 ssh -o StrictHostKeyChecking=no admin$line3 / radius add service=login address=172.16.0.1 secret=aaaa
sleep 3
sshpass -p abc123 ssh -o StrictHostKeyChecking=no admin$line3 / user aaa set use-radius=yes
sleep 3
echo $line3
echo $line
done <"$filename"
Branch text file:
192.168.100.1
192.168.101.2
192.168.200.1
Issue: Whatever changes I make, the while loop only runs once.
Troubleshooting/Observations:
Without the SSH commands, the while loop reads the file "branch" just fine.
The problem is that a program in the loop also reads data on standard input. This will consume the 2nd and subsequent lines of what's in "$filename".
On the next iteration of the loop, there's nothing left to read and the loop terminates.
The solution is to identify the command reading stdin, probably sshpass, and change it so that it leaves stdin alone. The answer by Cyrus shows one way to do that for ssh. If that doesn't work, try
sshpass [options and arguments here] < /dev/null
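A quick sketch of the `< /dev/null` fix, with `cat >/dev/null` standing in for sshpass and the branch file created on the fly:

```shell
#!/bin/bash
# "cat >/dev/null" stands in for sshpass/ssh: without </dev/null it would
# drain the rest of the branch file and end the loop after one iteration.
printf '192.168.100.1\n192.168.101.2\n192.168.200.1\n' > branch
count=0
while IFS= read -r line; do
  cat >/dev/null </dev/null   # stdin silenced: the branch file stays untouched
  count=$((count + 1))
done < branch
rm -f branch
echo "iterations: $count"   # 3
```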
Another solution is to replace the while with a for loop. This works as long as the branch file only contains IP addresses:
for ip in $(cat branch); do
echo $ip
...
sshpass ...
done

not able to use ssh within a shell script

I have this shell script which sshes to another server, finds a few specific files (.seq files older than 50 minutes) and writes their names to another file.
#! /usr/bin/bash
while read line
do
#echo $line
if [[ $line =~ ^# ]];
then
#echo $line;
continue;
else
serverIP=`echo $line|cut -d',' -f1`
userID=`echo $line|cut -d',' -f2`
fi
done < sftp.conf
sshpass -p red32hat ssh $userID@$serverIP
cd ./perl
for files in `find -name "*.seq" -mmin +50`
do
#sshpass -p red32hat scp *.seq root@rinacac-test:/root/perl
echo $files>>abcde.txt
done
exit;
#EOF
Now the problem is that when I run it, it neither writes to the abcde.txt file nor exits from the remote server. When I manually execute the exit command, it exits saying "perl: no such file or directory", even though I have a perl subdirectory in my home directory.
The other thing is that when I run the for loop portion of the script on the second server (by logging into it directly), it works fine and writes to abcde.txt. Please help.
ssh takes commands either on standard input or as the last parameter. You can therefore do this (very dynamic but tricky to get the expansions right):
ssh user@host <<EOF
some
custom
commands
EOF
or this (less dynamic but can take simple parameters without escaping):
scp my_script.sh user@host:
ssh user@host './my_script.sh'
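One detail worth noting with the heredoc form: quoting the delimiter (`<<'EOF'`) stops local expansion, so variables are evaluated on the remote side instead. A local sketch, with `bash` standing in for `ssh user@host` (both read commands from stdin):

```shell
#!/bin/bash
# "bash" stands in for "ssh user@host": both read commands from stdin.
greeting="local value"

# Unquoted delimiter: $greeting expands here, before the script is sent.
sent_expanded=$(bash <<EOF
echo "$greeting"
EOF
)

# Quoted delimiter: the text is sent verbatim; the receiving shell expands it,
# and it has no $greeting of its own (the variable is not exported).
sent_verbatim=$(bash <<'EOF'
echo "${greeting:-unset on remote}"
EOF
)

echo "$sent_expanded"   # local value
echo "$sent_verbatim"   # unset on remote
```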
