Shell Script - While Loop / File Reading not working - bash

I have a requirement which should address the following points.
I have a file which contains a list of IP addresses, and I want to read it line by line.
For each IP I need to push the following commands using SSH (all are Mikrotik devices):
/ radius add service=login address=172.16.0.1 secret=aaaa
/ user aaa set use-radius=yes
The following is my code:
#!/bin/bash
filename="branch"
while IFS= read line; do
echo ${line//}
line1=${line//}
ok='@'
line3=$ok$line1
sshpass -p abc123 ssh -o StrictHostKeyChecking=no admin$line3 / radius add service=login address=172.16.0.1 secret=aaaa
sleep 3
sshpass -p abc123 ssh -o StrictHostKeyChecking=no admin$line3 / user aaa set use-radius=yes
sleep 3
echo $line3
echo $line
done <"$filename"
Branch text file:
192.168.100.1
192.168.101.2
192.168.200.1
Issue: Whatever changes I make, the while loop only runs once.
Troubleshooting/Observations:
Without the SSH commands, if I run the while loop just to read the file "branch", it works fine.

The problem is that a program in the loop also reads from standard input. This consumes the second and subsequent lines of what's in "$filename".
On the next iteration of the loop, there's nothing left to read and the loop terminates.
The solution is to identify the command reading stdin, probably sshpass, and change it so that it leaves stdin alone. The answer by Cyrus shows one way to do that for ssh. If that doesn't work, try
sshpass [options and arguments here] < /dev/null
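For instance, here is a minimal sketch of the loop from the question with stdin redirected away from sshpass (same password and Mikrotik commands as above; not tested against a real device):

#!/bin/bash
filename="branch"
while IFS= read -r line; do
    # </dev/null keeps sshpass/ssh from swallowing the rest of "$filename"
    sshpass -p abc123 ssh -o StrictHostKeyChecking=no "admin@$line" '/ radius add service=login address=172.16.0.1 secret=aaaa' < /dev/null
    sleep 3
    sshpass -p abc123 ssh -o StrictHostKeyChecking=no "admin@$line" '/ user aaa set use-radius=yes' < /dev/null
    sleep 3
done < "$filename"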
Another solution is to replace the while with a for loop. This works as long as the branch file only contains IP addresses:
for ip in $(cat branch); do
echo $ip
...
sshpass ...
done

Related

how to run bash script interactively from url? [duplicate]

I have a simple Bash script that takes some input and prints a few lines out using that input.
fortinetTest.sh
read -p "Enter SSC IP: $ip " ip && ip=${ip:-1.1.1.1}
printf "\n"
#check IP validation
if [[ $ip =~ ^[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
echo "SSC IP: $ip"
printf "\n"
else
echo "Enter a valid SSC IP address. Ex. 1.1.1.1"
exit
fi
I uploaded it to my server, then tried to run it via curl.
I am not sure why the input prompt never kicks in when I use cURL/wget.
Am I missing anything?
With the curl ... | bash form, bash's stdin is reading the script, so stdin is not available for the read command.
Try using a Process Substitution to invoke the remote script like a local file:
bash <( curl -s ... )
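For the script in this question, using the localhost/test.sh URL that appears in the example further down (any URL serving the script would do), that looks like:

bash <(curl -s localhost/test.sh)
# or, fetching with wget instead of curl:
bash <(wget -qO- localhost/test.sh)

Because bash now reads the script through the process-substitution file descriptor rather than through stdin, stdin stays attached to the terminal and read -p can prompt normally.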
Your issue can be reproduced simply by running the script like below:
$ cat test.sh | bash
Enter a valid SSC IP address. Ex. 1.1.1.1
This is because the bash you launch with a pipe does not get a TTY; when you do a read -p, it reads from stdin, which in this case is the content of test.sh. So the issue is not with curl; the issue is that read is not reading from the tty.
So the fix is to make sure you read it from the tty:
read < /dev/tty -p "Enter SSC IP: $ip " ip && ip=${ip:-1.1.1.1}
printf "\n"
#check IP validation
if [[ $ip =~ ^[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
echo "SSC IP: $ip"
printf "\n"
else
echo "Enter a valid SSC IP address. Ex. 1.1.1.1"
exit
fi
Once you do that, even curl will start working:
vagrant@vagrant:/var/www/html$ curl -s localhost/test.sh | bash
Enter SSC IP: 2.2.2.2
SSC IP: 2.2.2.2
I personally prefer the source <(curl -s localhost/test.sh) option. While it is similar to bash ..., the one significant difference is how processes are handled.
bash will result in a new process being spun up, and that process will invoke commands from the script.
source, on the other hand, will use the current process to invoke commands from the script.
In some cases that can play a key role, although I admit it is not very often.
To demonstrate, do the following:
### Open Two Terminals
# In the first terminal run:
echo "sleep 5" > ./myTest.sh
bash ./myTest.sh
# Switch to the second terminal and run:
ps -efjh
## Repeat the same with _source_ command
# In the first terminal run:
source ./myTest.sh
# Switch to the second terminal and run:
ps -efjh
Results should look similar to this:
Before execution: just the main shell.
Running bash: main shell + two subprocesses.
Running source: main shell + one subprocess.
UPDATE:
Difference in variable usage between bash and source:
The source command will use your current environment, meaning that upon execution all changes and variable declarations made by the script will be available in your prompt.
bash, on the other hand, will run as a different process; therefore, all variables will be discarded when that process exits.
I think everyone will agree that there are benefits and drawbacks to each method. You just have to decide which one is better for your use case.
## Test for variables declared by the script:
echo "test_var3='Some Other Value'" > ./myTest3.sh
bash ./myTest3.sh
echo $test_var3
source ./myTest3.sh
echo $test_var3
## Test for usability of current environment variables:
test_var="Some Value" # Setting a variable
echo "echo $test_var" > myTest2.sh # Creating a test script
chmod +x ./myTest2.sh # Adding execute permission
## Executing:
. myTest2.sh
bash ./myTest2.sh
source ./myTest2.sh
./myTest2.sh
## All of the above results should print the variable.
I hope this helps.

Shell script can read file line by line but not perform actions for each line

I'm trying to run this command over multiple machines
sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@x.x.x.x "mkdir test"
The IPs are stored in the following .txt file
$ cat ips.txt
10.0.2.15
10.0.2.5
I created a bash script that reads this file line by line. If I run it with an echo:
#!/bin/bash
input="ips.txt"
while IFS= read -r line
do
echo "$line"
#sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@$line "mkdir test"
done < "$input"
It prints every line:
$ ./variables.sh
10.0.2.15
10.0.2.5
This makes me understand that the script is working as intended. However, when I replace the echo line with the command I want to run for each line:
#!/bin/bash
input="ips.txt"
while IFS= read -r line
do
#echo "$line"
sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@$line "mkdir test"
done < "$input"
It only performs the action for the first IP on the file, then stops. Why?
Managed to solve this by using a for instead of a while. Script ended up looking like this:
for file in $(cat ips.txt)
do
sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@$file "mkdir test"
done
While your example is a workaround that works, it doesn't explain the cause.
You can find the explanation here: ssh breaks out of while-loop in bash
In short:
The "while" loop keeps reading from the same file descriptor that is defined in the loop header ($input in your case).
ssh (or sshpass) reads data from stdin, which in your case is that same file descriptor attached to $input. This is what hides the problem, as we didn't expect "ssh" to read the data.
To understand the problem, you can have the same strange experience with, for example, commands like "ffmpeg" or "mplayer" in a while loop. mplayer and ffmpeg read the keyboard (stdin) while they are running, so they will consume everything on the file descriptor.
Another good and funny example:
#!/bin/bash
{
echo first
for ((i=0; i < 16384; i++)); do echo "testing"; done
echo "second"
} > test_file
while IFS= read -r line
do
echo "Read $line"
cat | uptime > /dev/null
done < test_file
In the first part we write the first line: first,
then 16384 lines of: testing,
then the last line: second.
The 16384 "testing" lines are equal to a 128 KB buffer.
In the second part, the command "cat | uptime" consumes exactly that 128 KB buffer, so our script gives:
Read first
Read second
As a solution, we could use a "for" loop, as you did.
Or use "ssh -n".
Or play with another file descriptor; you can find an example in the link that I gave, and a short sketch below.
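As an illustration, a minimal sketch of that file-descriptor approach applied to the ips.txt loop from this question (same sshpass command as above):

#!/bin/bash
input="ips.txt"
# read takes its lines from FD 3, so sshpass/ssh are free to read FD 0 (stdin)
while IFS= read -r -u 3 line
do
    sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' "nico@$line" "mkdir test"
done 3< "$input"

The same effect can be had by keeping done < "$input" and simply adding -n to the ssh invocation.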

bash while read loop stops after the first line (after corrections based on a StackExchange post) [duplicate]

The eventual goal is to have my bash script execute a command on multiple servers. I almost have it set up. My SSH authentication is working, but this simple while loop is killing me. When I execute the while loop, reading my file for host names, it works fine when I run a
ssh $HOST "uname -a"
but when I attempt to run another ssh command,
ssh $HOST "oslevel -s"
the while loop ends early! I can't figure it out. Why would the while read do loop run perfectly fine with the first command, but not when the second is added?
I have a simple text file called hosts.list that has 4 hostnames, one per line.
$ cat hosts.list
pcced1bip04
pcced1bit04
pcced1bo02
pcced1bo04
$ cat getinfo.bash
#!/bin/bash
set -x
while read HOST
do
echo $HOST
ssh $HOST "uname -a"
#ssh $HOST "oslevel -s"
echo ""
done < hosts.list
When it runs, it works fine. It goes through the file, line by line and gets the results of "uname -a". So everything is fine, right? (Sorry, but I turned on set -x).
$ ./getinfo.bash
+ read HOST
+ echo pcced1bip04
pcced1bip04
+ ssh pcced1bip04 'uname -a'
AIX pcced1bip04 1 6 0001431BD400
+ echo ''
+ read HOST
+ echo pcced1bit04
pcced1bit04
+ ssh pcced1bit04 'uname -a'
AIX pcced1bit04 1 6 0001431BD400
+ echo ''
+ read HOST
+ echo pcced1bo02
pcced1bo02
+ ssh pcced1bo02 'uname -a'
AIX pcced1bo02 1 6 0009FE2AD400
+ echo ''
+ read HOST
+ echo pcced1bo04
pcced1bo04
+ ssh pcced1bo04 'uname -a'
AIX pcced1bo04 1 6 0009FE2AD400
+ echo ''
+ read HOST
$
The problem occurs when I enable the line [ssh $HOST "oslevel -s"]. When I do, the script only reads the first line of the file, and then stops. Why won't it go onto the other lines?
$ ./getinfo.bash
+ read HOST
+ echo pcced1bip04
pcced1bip04
+ ssh pcced1bip04 'uname -a'
AIX pcced1bip04 1 6 0001431BD400
+ ssh pcced1bip04 'oslevel -s'
6100-06-02-1044
+ echo ''
+ read HOST
$
If I had a problem with my script, why would it be working perfectly fine with just the [ssh $HOST "uname -a"] in the while loop?
If you run commands which read from stdin (such as ssh) inside a loop, you need to ensure that either:
Your loop isn't iterating over stdin, or
Your command has had its stdin redirected;
...otherwise, the command can consume input intended for the loop, causing it to end.
The former:
while read -u 5 -r hostname; do
ssh "$hostname" ...
done 5<file
...which, using bash 4.1 or newer, can be rewritten with automatic file descriptor assignment like so:
while read -u "$file_fd" -r hostname; do
ssh "$hostname" ...
done {file_fd}<file
The latter:
while read -r hostname; do
ssh "$hostname" ... </dev/null
done <file
...can also, for ssh alone, be approximated with the -n parameter (which likewise redirects stdin from /dev/null):
while read -r hostname; do
ssh -n "$hostname"
done <file
Assign to an array before the loop, so that you are not using stdin for your loop variables. The ssh inside the loop can then use stdin without interfering with your loop.
readarray a < hosts.list
for HOST in "${a[@]}"; do
ssh $HOST "uname -a"
#...other stuff in loop
done
As specified in the solution here: use the -n option for ssh, or open the file on a different file descriptor:
while read -u 4 HOST
do
echo $HOST
ssh $HOST "uname -a"
ssh $HOST "oslevel -s"
echo ""
done 4< hosts.list
maybe with python XD
#!/usr/bin/env python3
import sys
import queue
from subprocess import call

logfile = sys.argv[1]
q = queue.Queue()
with open(logfile) as data:
    datalines = (line.rstrip('\r\n') for line in data)
    for line in datalines:
        q.put(line)
while not q.empty():
    host = q.get()
    print("++++++ " + host + " ++++++")
    call(["ssh", host, "uname -a"])
    call(["ssh", host, "oslevel -s"])
    print("++++++++++++++++++++++++++")

Failed to run scripts on multiple remote hosts by ssh

I wrote a deployAll.sh, which reads ip_host.list line by line and then adds a group on all of the remote hosts.
When I run: sh deployAll.sh
The results:
Group is added to 172.25.30.11
But not the expected results:
Group is added to 172.25.30.11
Group is added to 172.25.30.12
Group is added to 172.25.30.13
Why does it just execute the first one? Please help, thanks a lot!
deployAll.sh
#!/bin/bash
function deployAll()
{
while read line;do
IFS=';' read -ra ipandhost<<< "$line"
ssh "${ipandhost[0]}" "groupadd -g 1011 test"
printf "Group is added to ${ipandhost[0]}\n"
done < ip_host.list
}
deployAll
ip_host.list
172.25.30.11;test-30-11
172.25.30.12;test-30-12
172.25.30.13;test-30-13
That's a frequent problem caused by the special behavior of ssh, which sucks up stdin, starving the loop (i.e. while read line; do ...; done).
Please see Bash FAQ 89, which discusses this subject in detail.
I also just answered (and solved) a similar question regarding ffmpeg, which shows the same behavior as ssh in this case. Here: When reading a file line by line, I only get to execute ffmpeg on the first line.
TL;DR:
There are three main options:
Using ssh's -n option. Quoted from man ssh:
-n  Redirects stdin from /dev/null (actually, prevents reading from stdin). This must be used when ssh is run in the background. A common trick is to use this to run X11 programs on a remote machine. For example, ssh -n shadows.cs.hut.fi emacs & will start an emacs on shadows.cs.hut.fi, and the X11 connection will be automatically forwarded over an encrypted channel. The ssh program will be put in the background. (This does not work if ssh needs to ask for a password or passphrase; see also the -f option.)
Adding a </dev/null at the end of ssh's line (i.e. ssh ... </dev/null) will fix the issue and make ssh behave as expected.
Let read read from a File Descriptor which is unlikely to be used by a random program:
while IFS= read -r line <&3; do
# Here read is reading from FD 3, to which 'ip_host.list' is redirected.
done 3<ip_host.list
Without the ssh command (which wouldn't make sense on my network), I get the expected output, so I suspect that the ssh command is swallowing the remaining standard input. You should use the -n flag to prevent ssh from reading from stdin (equivalent to redirecting stdin from /dev/null):
ssh -n "${ipandhost[0]}" "groupadd -g 1011 test"
or
ssh "${ipandhost[0]}" "groupadd -g 1011 test" < /dev/null
See also How to keep script from swallowing all of stdin?
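Applied to the function from the question, a minimal sketch of the fixed deployAll.sh (the only functional change is the added -n, plus some quoting cleanups) might look like:

#!/bin/bash
function deployAll()
{
    while read -r line; do
        IFS=';' read -ra ipandhost <<< "$line"
        # -n stops ssh from reading the rest of ip_host.list through stdin
        ssh -n "${ipandhost[0]}" "groupadd -g 1011 test"
        printf "Group is added to %s\n" "${ipandhost[0]}"
    done < ip_host.list
}
deployAll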
My solution was to generate SSH keys with the ssh-keygen command and replace the existing public key file (if any), after which the installation resumed.

Bash script to pass commands remotely via SSH

I'm just starting out with bash and am trying to write a script to search specific files on a server remotely, based on (a) a device name and (b) a string. My goal is to get all output containing 'string' for the specified device. When I try it, the script below just hangs. However, when I run the command directly on the server (grep -i "router1" /var/log/router.log | grep -i "UPDOWN"), it works. Any ideas?
#!/bin/bash
#
read -p "Enter username: " user
read -p "Enter device name: " dev
read -p "Enter string: " str
while read /home/user1/syslogs
do
ssh "$user"#server1234 'grep -i "$dev" /var/log/"$syslogs" 2> /dev/null | grep -i "$str"'
done
You seem to be mis-using the read command. You don't specify the file to read from as an argument; read always reads from standard input. It's not clear what you want to do with the value you read from the file as a result, but you want something like this:
read -p "Enter username: " user
read -p "Enter device name: " dev
read -p "Enter string: " str
while read fileName; do
# Also: I'm borrowing sputnick's solution to the nested quote problem.
ssh $user@server1234 <<EOF
grep -i "$dev" /var/log/$fileName 2>/dev/null | grep -i "$str"
EOF
done < /home/user1/syslogs
The message Pseudo-terminal will not be allocated because stdin is not a terminal appears because the stdin of the remote host's shell is being redirected from a here document while no command is specified for the remote host to execute. That is, the remote host first assumes it will need to allocate a pseudo-terminal for an interactive login session because of the missing command (see the synopsis of the ssh man page: ssh ... [user@]hostname [command]), but then realizes that the stdin of its shell is not a terminal, since it is redirected from a here document. The result is that the remote host refuses to allocate a pseudo-terminal.
The solution in the given case would be to just specify a shell as a command for the remote host to execute the commands given in the here document.
As an alternative to specifying a shell as a command, the remote host can be told in advance that there is no need to allocate a pseudo-terminal, using the -T switch.
The -t switch, on the other hand, would be necessary only if a specified command expects an interactive login shell session on the remote host (such as top or vim).
- ssh $user@server1234 <<EOF ...
+ ssh $user@server1234 /bin/sh <<EOF ...
+ ssh -T $user@server1234 <<EOF ...
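For completeness, a minimal sketch combining both answers, i.e. the loop reading /home/user1/syslogs and the -T/here-document form (not tested against a real server1234):

#!/bin/bash
read -p "Enter username: " user
read -p "Enter device name: " dev
read -p "Enter string: " str

while read -r syslog; do
    # ssh's stdin is the here document, so it does not eat the loop's input;
    # -T tells the remote side that no pseudo-terminal is needed
    ssh -T "$user@server1234" <<EOF
grep -i "$dev" /var/log/$syslog 2>/dev/null | grep -i "$str"
EOF
done < /home/user1/syslogs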
