How to execute a bash command with arguments from array elements?

I am writing a simple bash script on Linux to ping some hosts, e.g.:
ping -c 1 google.com
ping -c 1 amazon.com
...
In my approach, I am loading the hosts from a separate file into an array and then loop over the elements. The problem occurs when calling the ping command with elements from the array. Executing the following script gives me an error message.
#!/bin/bash
IFS=$'\n' read -d '' -r -a hosts < hostnames.txt
for host in "${hosts[@]}"
do
ping -c 1 "$host"
done
I guess there is something wrong with the syntax, but I couldn't figure it out yet.

Your hostnames.txt was generated on a Windows machine?
Your hostnames have trailing \r characters, so the lookups fail.
Try this:
cp hostnames.txt hostnames.txt.bkp
dos2unix hostnames.txt
And then run your script again.
If you don't have dos2unix installed and don't want to install it ... maybe you have tr already available. In that case this should do the trick, too:
tr -d '\r' < hostnames.txt.bkp > hostnames.txt
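If you'd rather not modify the file at all, here is a minimal sketch (not from the original answer) that strips a trailing carriage return from each line as it is read:
#!/bin/bash
while IFS= read -r host
do
host=${host%$'\r'}   # drop a trailing carriage return, if any
ping -c 1 "$host"
done < hostnames.txt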

Related

Shell script can read file line by line but not perform actions for each line

I'm trying to run this command over multiple machines
sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@x.x.x.x "mkdir test"
The IPs are stored in the following .txt file
$ cat ips.txt
10.0.2.15
10.0.2.5
I created a bash script that reads this file line by line. If I run it with an echo:
#!/bin/bash
input="ips.txt"
while IFS= read -r line
do
echo "$line"
#sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@$line "mkdir test"
done < "$input"
It prints every line:
$ ./variables.sh
10.0.2.15
10.0.2.5
This makes me understand that the script is working as intended. However, when I replace the echo line with the command I want to run for each line:
#!/bin/bash
input="ips.txt"
while IFS= read -r line
do
#echo "$line"
sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@$line "mkdir test"
done < "$input"
It only performs the action for the first IP on the file, then stops. Why?
Managed to solve this by using a for instead of a while. Script ended up looking like this:
for file in $(cat ips.txt)
do
sshpass -p 'nico' ssh -o 'StrictHostKeyChecking=no' nico@$file "mkdir test"
done
While your example is a solution that works, it's not the explanation.
You can find the explanation here: ssh breaks out of while-loop in bash
In short:
The "while" loop keeps reading from the same file descriptor that is defined in the loop header ($input in your case).
ssh (or sshpass) also reads data from stdin, which in your case is that same file descriptor. That is what hides the problem: we don't expect "ssh" to consume the data.
To see the problem from another angle, you can get the same strange behaviour with commands like "ffmpeg" or "mplayer" in a while loop. mplayer and ffmpeg read the keyboard (stdin) while they run, so they consume everything left on the file descriptor.
Another good and funny example:
#!/bin/bash
{
echo first
for ((i=0; i < 16384; i++)); do echo "testing"; done
echo "second"
} > test_file
while IFS= read -r line
do
echo "Read $line"
cat | uptime > /dev/null
done < test_file
In the first part we write one line "first",
then 16384 lines of "testing",
then a last line "second".
The 16384 "testing" lines add up to exactly a 128 KB buffer.
In the second part, the command "cat | uptime" consumes exactly that 128 KB buffer, so our script prints only:
Read first
Read second
As a solution, as you did, we could use a "for" loop.
Or use "ssh -n".
Or play with file descriptors - you can find an example in the link I gave.
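For example, here is a sketch of the loop from the question using ssh -n (an assumption on my side: sshpass supplies the password through its own pty rather than stdin, so -n does not interfere with it):
#!/bin/bash
input="ips.txt"
while IFS= read -r line
do
# -n makes ssh read stdin from /dev/null instead of consuming the loop's input
sshpass -p 'nico' ssh -n -o 'StrictHostKeyChecking=no' nico@"$line" "mkdir test"
done < "$input"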

Bash script placing quotes around command

I have a Bash script that gets an IP to use as part of an SSH tunnel, but when I run the script the SSH tunnel fails. Using set -x I can see that it places the arguments to the SSH command in single quotes, and manually running that line results in the same error.
The Script:
ssh -N -L 9000:${ip_array[$2]}:443 ssh-server
The first argument is used elsewhere in the script for something else, which is why the second is used here. ssh-server is an alias in my SSH config for the server I am tunneling through.
The output I get is:
ssh -N -L '9000:"172.0.0.1":443' ssh-server
Could this be because the script to fetch the IP returns strings to the array?
You can try removing the double quotes first:
ip=$(echo "${ip_array[$2]}" | sed "s/\"//g")
ssh -N -L 9000:${ip}:443 ssh-server
Or just use shell parameter expansion to remove the quotes:
ssh -N -L 9000:${ip_array[$2]//\"/}:443 ssh-server
That lone escaped double quote may mess up your editor's syntax highlighting.
Get rid of the quotes by piping it through the tr command:
ssh -N -L 9000:$( echo ${ip_array[$2]} | tr -d '"' ):443 ssh-server
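For illustration (not part of any answer above), here is the quote-stripping expansion on a sample value:
$ ip='"172.0.0.1"'
$ echo "${ip//\"/}"
172.0.0.1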

eval printf works from command line but not in script

When I run the following command in a terminal it works, but not from a script:
eval $(printf "ssh foo -f -N "; \
for port in $(cat ~/bar.json | grep '_port' | grep -o '[0-9]\+'); do \
printf "-L $port:127.0.0.1:$port ";\
done)
The error I get tells me that the printf usage is wrong, as if the -L argument inside the quotes were an argument to printf itself.
I was wondering why that is the case. Am I missing something obvious?
__
Context (in case my issue is an XY problem): I want to start and connect to a jupyter kernel running on a remote computer. To do so I wrote a small script that
sends a command over ssh telling the remote machine to start the kernel
copies via scp a configuration file that I can use to connect to the kernel from my local computer
reads the configuration file and opens appropriate ssh tunnels between local and remote
For those not familiar with jupyter, a configuration file (bar.json) looks more or less like the following:
{
"shell_port": 35932,
"iopub_port": 37145,
"stdin_port": 42704,
"control_port": 39329,
"hb_port": 39253,
"ip": "127.0.0.1",
"key": "4cd3e12f-321bcb113c204eca3a0723d9",
"transport": "tcp",
"signature_scheme": "hmac-sha256",
"kernel_name": ""
}
And so, in my command above, the printf statement builds an ssh command with all five -L port forwardings so my local computer can connect to the remote, and eval should run that command. Here's the full script:
#!/usr/bin/env bash
# Tell remote to start a jupyter kernel.
ssh foo -t 'python -m ipykernel_launcher -f ~/bar.json' &
# Wait a bit for the remote kernel to launch and write conf. file
sleep 5
# Copy the conf. file from remote to local.
scp foo:~/bar.json ~/bar.json
# Parse the conf. file and open ssh tunnels.
eval $(printf "ssh foo -f -N "; \
for port in $(cat ~/bar.json | grep '_port' | grep -o '[0-9]\+'); do \
printf "-L $port:127.0.0.1:$port ";\
done)
Finally, jupyter console --existing ~/bar.json connects to the remote kernel.
As @that other guy says, bash's printf builtin barfs on printf "-L ...". It thinks you're passing it a -L option. You can fix it by adding --:
printf -- "-L $port:127.0.0.1:$port "
Let's make that:
printf -- '-L %s:127.0.0.1:%s ' "$port" "$port"
But since we're here, we can do a lot better. First, let's not process JSON with basic shell tools. We don't want to rely on it being formatted a certain way. We can use jq, a lightweight and flexible command-line JSON processor.
$ jq -r 'to_entries | map(select(.key | test(".*_port"))) | .[].value' bar.json
35932
37145
42704
39329
39253
Here we use to_entries to convert each field to a key-value pair. Then we select entries where the .key matches the regex .*_port. Finally we extract the corresponding .values.
We can get rid of eval by constructing the ssh command in an array. It's always good to avoid eval when possible.
#!/bin/bash
readarray -t ports < <(jq -r 'to_entries | map(select(.key | test(".*_port"))) | .[].value' bar.json)
ssh=(ssh foo -f -N)
for port in "${ports[@]}"; do ssh+=(-L "$port:127.0.0.1:$port"); done
"${ssh[#]}"

How to run for loop inside heredoc while accessing remote machine

Here is my script, in which I use local variables inside a heredoc that runs on a remote machine. The problem is that the loop inside the heredoc only ever picks up the first value of each variable. The loop itself runs fine inside the heredoc, just with the same values every time.
#!/bin/bash
prod_web=($(cat /tmp/webip.txt));
new_prod_app_private_ip=($(cat /tmp/ip.txt));
no_n=($(cat /tmp/serial.txt));
ssh -t -o StrictHostKeyChecking=no ubuntu@${prod_web[0]} -p 2345 -v << EOF
set -xv
for (( x = 0; x < '${#no_n[@]}'; x++ ))
do
sudo su
echo '${no_n[x]}'
echo '${new_prod_app_private_ip[x]}'
curl -fIkSs https://'${new_prod_app_private_ip[x]}':9002 | head -n 1
done
EOF
So, my ip.txt file contains values like:
10.0.1.0
10.0.2.0
10.0.3.0
My serial.txt file:
9
10
11
So, on the remote machine my loop runs only for the first IP (from /tmp/ip.txt), three times. I want it to run for all three IPs. My remote IP is in the file /tmp/webip.txt.
Got stuck for a long time, any help is appreciated. Is there any other solution that I can go with?
There are two environments: your local machine and the remote machine. You need to think about how to transfer data/variables/state between them.
If you set something on your local machine (i.e. prod_web=($(cat /tmp/webip.txt));) and then just ssh to the remote host (i.e. ssh user@host 'echo "${prod_web[@]}"'), the variable will not be visible/exported to the remote machine. You can:
scp the files {ip,serial}.txt, execute the whole script on the remote machine, then clean up, i.e. remove the {ip,serial}.txt files from the remote machine
pass the files {ip,serial}.txt somehow merged/joined/pasted to the stdin of ssh and then read that stdin on the remote machine
create all the commands to run on your local machine and then pass the pre-prepared commands to the remote machine, like ssh .... "$(for ...; do echo curl ...; done)"
I would go with the second option, as I like passing everything through pipes and don't like cleaning up after myself - removing temporary files when something fails can be a mess.
My script would probably look like this:
#!/bin/bash
set -euo pipefail
read -r host _ <webip.txt
paste serial.txt ip.txt | ssh -t -o StrictHostKeyChecking=no -p 2345 -v ubuntu@"$host" '#!/bin/bash
set -euo pipefail
while read -r no_n ip; do
for ((i = 0; i < no_n; ++i)); do
printf "%s\n" "$no_n"
printf "%s\n" "$ip"
curl -fIkSs https://"$ip":9002 | head -n 1
done
done
'
As the remote script becomes larger and less quoting friendly, I would save it into a separate remote_scripts.sh and feed it to the remote shell, e.g. ssh ... bash -s < remote_scripts.sh.
I don't get what you are trying to do with that sudo su, which 100% does not do what you want.
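If root really is needed for one of the commands, the usual approach is to prefix that single command with sudo instead, e.g. (hypothetical, adapting the curl line from the script above):
sudo curl -fIkSs https://"$ip":9002 | head -n 1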
If the no_n magic number is the number of times to execute that curl, and you have xargs and don't really care about errors, you can just do a magic and confusing one-liner:
#!/bin/bash
set -euo pipefail
read -r host _ <webip.txt
paste serial.txt ip.txt | ssh -t -o StrictHostKeyChecking=no -p 2345 -v ubuntu@"$host" 'xargs -n2 -- sh -c "seq 0 \"\$1\" | xargs -n1 -- sh -c \"curl -fIkSs https://\\\"\\\$1\\\":9002 | head -n 1\" -- \"\$2\"" --'
Preparing all the commands to run locally may actually be more readable and may save some nasty quoting. But this really depends on how big serial.txt and ip.txt are and how big the commands to execute on the remote machine are, as you want to minimize the number of bytes transferred between the machines.
Here the commands to run are constructed on the local machine (i.e. "$(...)" is passed to ssh) and executed on the remote machine:
# semi-readable script, not as fast and no xargs
ssh -t -o StrictHostKeyChecking=no -p 2345 -v ubuntu@"$host" "$(paste serial.txt ip.txt | while read -r serial ip; do
seq 0 "$serial" | while read -r _; do
echo "curl -fIkSs \"https://$ip:9002\" | head -n 1"
done
done)"
A here-document is just text fed to the command's stdin; the local shell does not run its lines as commands, so:
$ cat <<EOF
> echo 1
> EOF
echo 1
but you can use command substitution $( ... ):
$ cat <<EOF
> $(echo 1)
> EOF
1
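One more point that is relevant to the original question (not part of this answer): local variables and arrays are expanded inside the here-document before anything is sent to the remote machine, unless the delimiter is quoted:
$ name=world
$ cat <<EOF
> hello $name
> EOF
hello world
$ cat <<'EOF'
> hello $name
> EOF
hello $name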

how to ssh a loop over several commands

I am new to ssh, so forgive me if my questions are trivial. I need to make a remote computer execute a set of commands several times, so I was thinking about making a loop using ssh. The problem is I don't know whether I should save those commands in a file and loop over that file, or whether I can somehow store them on the remote side and just call them. Also, if I make a loop like this:
i= 10
while i!= 0
execute command.text file ???
i--
How do I tell it to execute the file?
First, just try running the commands you want in a shell on the remote machine.
You will find plenty of info on the internet about loops in shell/bash/csh/whatever shell you use.
For instance, assuming bash runs on the remote host (from: http://www.bashoneliners.com/):
$ for ((i=1; i<=10; ++i)); do echo $i; done
Once you have that working, simply pass the statement to the ssh command from the machine on which you want to trigger the action:
$ ssh user@remotehost 'for ((i=1; i<=10; ++i)); do echo $i; done'
You can write a simple script that executes the needed commands, and pass it to ssh.
For example:
script.sh, which will iterate over your set of commands 10 times:
for i in $(seq 10)
do
command1
command2
command3
done
and pass it to the remote server for execution:
$ ssh $SERVERNAME < script.sh
If you have this command.text file in which you have written all the commands, one per line (you can edit it with vi or vim), you don't even need a loop; you can simply do:
cat command.text | awk '{print "ssh user@remotehost "$0" "}' | sh -x
For example if command.text contains:
ls -lart
cd /tmp
uname -a
This will run every command written in command.text over ssh as user@remotehost.
