BASH, passing multi-item variable to function

I have a function:
function checkConn()
{
  RET=0
  echo "in function: ${2}"
  echo
  for i in `cat ${1}`
  do
    for p in ${2}
    do
      if nc -dzw 2 ${i} ${p} 2>&1 >/dev/null
and so on.
In the "main" body of the script, I have the following:
PORTS='22 161 162 1521'
checkConn ${FILE} ${PORTS}
FILE is the name of a file which contains a list of IPs.
When I pass PORTS to the function, only the 1st item gets passed. I also tried it with double-quotes.
I put that "echo" statement to confirm. It only shows the first item in PORTS, which is 22.
How can I pass all the ports to this function, and then loop through each one?

Best practice is to pass the list of ports as individual arguments, each with their own argv entry -- like so:
checkConn() {
  file=$1; shift ## read filename from $1, then remove it from the argument list
  while IFS= read -r address; do
    for port; do ## we shifted off the filename so this only iterates over ports
      if nc -dzw 2 "$address" "$port" </dev/null >/dev/null 2>&1; then
        echo "$address $port OPEN"
      else
        echo "$address $port CLOSED"
      fi
    done
  done <"$file"
}
file=addrs.txt
ports=( 22 161 162 1521 )
checkConn "$file" "${ports[@]}"
Notes:
The function keyword is not actually required to define a function; it makes your code incompatible with POSIX sh, but provides no benefit over the portable syntax. Avoid it.
The while IFS= read -r idiom is described in detail in BashFAQ #1. Also see Don't Read Lines With For.
Arrays, as used in the ports=( ... ) and "${ports[@]}" syntax, are described in the BashGuide.
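As a quick illustration of why "for port;" with no "in" list walks the remaining arguments, here is a minimal sketch (the function and argument names are made up for the demo):
demo() {
  first=$1; shift              # consume the first argument
  for port; do                 # equivalent to: for port in "$@"; do
    printf 'port=%s\n' "$port"
  done
}
demo ignored 22 161 162        # prints port=22, port=161, port=162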

There are multiple syntax violations and outdated constructs here; you probably need something like:
function checkConn() {
  # define variables as local unless you are using them beyond the scope
  # of the function, and always lower-case your variable names
  local ret=0
  printf "%s\n" "$2"
  # Split the string into an array so that it can be accessed
  # element-wise inside the loop. The -a option of read stores the
  # elements read into the array
  read -ra portList <<<"$2"
  # Input redirection from the file named by argument $1;
  # the lines are read one at a time
  while IFS= read -r line; do
    # Array values iterated in a for-loop; do the action
    # for each port number
    for port in "${portList[@]}"; do
      if nc -dzw 2 "$line" "$port" >/dev/null 2>&1; then
        printf "%s %s\n" "$line" "$port"
        # The rest of your code
      fi
    done
  done < "$1"
}
and call the function as
ports='22 161 162 1521'
filename="file"
checkConn "$filename" "$ports"

Can I make a shell function in as a pipeline conditionally "disappear", without using cat?

I have a bash script that produces some text from a pipe of commands. Based on a command line option I want to do some validation on the output. For a contrived example...
CHECK_OUTPUT=$1
...
check_output()
{
  if [[ "$CHECK_OUTPUT" != "--check" ]]; then
    # Don't check the output. Passthrough and return.
    cat
    return 0
  fi
  # Check each line exists in the fs root
  while read line; do
    if [[ ! -e "/$line" ]]; then
      echo "Error: /$line does not exist"
      return 1
    fi
    echo "$line"
  done
  return 0
}
ls /usr | grep '^b' | check_output
[EDIT] better example: https://stackoverflow.com/a/52539364/1888983
This is really useful, particularly if I have multiple functions that can become passthroughs. Yes, I could move the CHECK_OUTPUT conditional and create a pipe with or without check_output, but I'd need to write lines for each combination of functions. If there are better ways to dynamically build a pipe, I'd like to know.
The problem is the "useless use of cat". Can this be avoided and make check_output like it wasn't in the pipe at all?
Yes, you can do this -- by making your function a wrapper that conditionally injects a pipeline element, instead of being an unconditional pipeline element itself. For example:
maybe_checked() {
  if [[ $CHECK_OUTPUT != "--check" ]]; then
    "$@" # just run our arguments as a command, as if we weren't here
  else
    # run our arguments in a process substitution, reading from stdout of same.
    # ...some changes from the original code:
    # IFS= stops leading or trailing whitespace from being stripped
    # read -r prevents backslashes from being processed
    local line # avoid modifying $line outside our function
    while IFS= read -r line; do
      [[ -e "/$line" ]] || { echo "Error: /$line does not exist" >&2; return 1; }
      printf '%s\n' "$line" # see https://unix.stackexchange.com/questions/65803
    done < <("$@")
  fi
}
ls /usr | maybe_checked grep '^b'
Caveat of the above code: if the pipefail option is set, you'll want to check the exit status of the process substitution to have complete parity with the behavior that would otherwise be the case. In bash version 4.3 or later (IIRC), $! is modified by process substitutions to contain the relevant PID, which can be waited for to retrieve exit status.
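For illustration, a hedged sketch of that status check (assuming bash 4.4 or later, where wait can take the process substitution's PID from $!); the function name here is made up:
maybe_checked_strict() {
  if [[ $CHECK_OUTPUT != "--check" ]]; then
    "$@"
  else
    local line
    while IFS= read -r line; do
      [[ -e "/$line" ]] || { echo "Error: /$line does not exist" >&2; return 1; }
      printf '%s\n' "$line"
    done < <("$@")
    wait "$!" || return   # propagate the wrapped command's exit status
  fi
}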
That said, this is also a use case wherein using cat is acceptable, and I'm saying this as a card-carrying member of the UUOC crowd. :)
Adopting the examples from John Kugelman's answers on the linked question:
maybe_sort() {
  if (( sort )); then
    "$@" | sort
  else
    "$@"
  fi
}
maybe_limit() {
  if [[ -n $limit ]]; then
    "$@" | head -n "$limit"
  else
    "$@"
  fi
}
printf '%s\n' "${haikus[@]}" | maybe_limit maybe_sort sed -e 's/^[ \t]*//'

Can't add a new element to an array in bash [duplicate]

In the following program, if I set the variable $foo to the value 1 inside the first if statement, it works in the sense that its value is remembered after the if statement. However, when I set the same variable to the value 2 inside an if which is inside a while statement, it's forgotten after the while loop. It's behaving like I'm using some sort of copy of the variable $foo inside the while loop and I am modifying only that particular copy. Here's a complete test program:
#!/bin/bash
set -e
set -u
foo=0
bar="hello"
if [[ "$bar" == "hello" ]]
then
  foo=1
  echo "Setting \$foo to 1: $foo"
fi
echo "Variable \$foo after if statement: $foo"
lines="first line\nsecond line\nthird line"
echo -e $lines | while read line
do
  if [[ "$line" == "second line" ]]
  then
    foo=2
    echo "Variable \$foo updated to $foo inside if inside while loop"
  fi
  echo "Value of \$foo in while loop body: $foo"
done
echo "Variable \$foo after while loop: $foo"
# Output:
# $ ./testbash.sh
# Setting $foo to 1: 1
# Variable $foo after if statement: 1
# Value of $foo in while loop body: 1
# Variable $foo updated to 2 inside if inside while loop
# Value of $foo in while loop body: 2
# Value of $foo in while loop body: 2
# Variable $foo after while loop: 1
# bash --version
# GNU bash, version 4.1.10(4)-release (i686-pc-cygwin)
echo -e $lines | while read line
...
done
The while loop is executed in a subshell. So any changes you make to the variable will not be available once the subshell exits.
Instead you can use a here string to re-write the while loop to be in the main shell process; only echo -e $lines will run in a subshell:
while read line
do
  if [[ "$line" == "second line" ]]
  then
    foo=2
    echo "Variable \$foo updated to $foo inside if inside while loop"
  fi
  echo "Value of \$foo in while loop body: $foo"
done <<< "$(echo -e "$lines")"
You can get rid of the rather ugly echo in the here-string above by expanding the backslash sequences immediately when assigning lines. The $'...' form of quoting can be used there:
lines=$'first line\nsecond line\nthird line'
while read line; do
...
done <<< "$lines"
UPDATED#2
Explanation is in Blue Moons's answer.
Alternative solutions:
Eliminate echo
while read line; do
...
done <<EOT
first line
second line
third line
EOT
Add the echo inside the here-is-the-document
while read line; do
...
done <<EOT
$(echo -e $lines)
EOT
Run echo in background:
coproc echo -e $lines
while read -u ${COPROC[0]} line; do
...
done
Redirect to a file handle explicitly (Mind the space in < <!):
exec 3< <(echo -e $lines)
while read -u 3 line; do
...
done
Or just redirect to the stdin:
while read line; do
...
done < <(echo -e $lines)
And one for chepner (eliminating echo):
arr=("first line" "second line" "third line");
for((i=0;i<${#arr[*]};++i)) { line=${arr[i]};
...
}
Variable $lines can be converted to an array without starting a new subshell. The characters \ and n have to be converted to some character (e.g. a real newline character) and the IFS (Internal Field Separator) variable used to split the string into array elements. This can be done like this:
lines="first line\nsecond line\nthird line"
echo "$lines"
OIFS="$IFS"
IFS=$'\n' arr=(${lines//\\n/$'\n'}) # Conversion
IFS="$OIFS"
echo "${arr[#]}", Length: ${#arr[*]}
set|grep ^arr
Result is
first line\nsecond line\nthird line
first line second line third line, Length: 3
arr=([0]="first line" [1]="second line" [2]="third line")
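On the same theme (getting the lines into the current shell without piping into the loop), bash 4's readarray is a shorter route; a minimal sketch, using a process substitution so the array is set in the parent shell:
lines="first line\nsecond line\nthird line"
readarray -t arr < <(echo -e "$lines")   # the child only produces text; arr lands here
printf '%s\n' "${arr[@]}"                # three separate lines
echo "Length: ${#arr[@]}"                # 3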
You are asking this bash FAQ. The answer also describes the general case of variables set in subshells created by pipes:
E4) If I pipe the output of a command into read variable, why
doesn't the output show up in $variable when the read command finishes?
This has to do with the parent-child relationship between Unix
processes. It affects all commands run in pipelines, not just
simple calls to read. For example, piping a command's output
into a while loop that repeatedly calls read will result in
the same behavior.
Each element of a pipeline, even a builtin or shell function,
runs in a separate process, a child of the shell running the
pipeline. A subprocess cannot affect its parent's environment.
When the read command sets the variable to the input, that
variable is set only in the subshell, not the parent shell. When
the subshell exits, the value of the variable is lost.
Many pipelines that end with read variable can be converted
into command substitutions, which will capture the output of
a specified command. The output can then be assigned to a
variable:
grep ^gnu /usr/lib/news/active | wc -l | read ngroup
can be converted into
ngroup=$(grep ^gnu /usr/lib/news/active | wc -l)
This does not, unfortunately, work to split the text among
multiple variables, as read does when given multiple variable
arguments. If you need to do this, you can either use the
command substitution above to read the output into a variable
and chop up the variable using the bash pattern removal
expansion operators or use some variant of the following
approach.
Say /usr/local/bin/ipaddr is the following shell script:
#! /bin/sh
host `hostname` | awk '/address/ {print $NF}'
Instead of using
/usr/local/bin/ipaddr | read A B C D
to break the local machine's IP address into separate octets, use
OIFS="$IFS"
IFS=.
set -- $(/usr/local/bin/ipaddr)
IFS="$OIFS"
A="$1" B="$2" C="$3" D="$4"
Beware, however, that this will change the shell's positional
parameters. If you need them, you should save them before doing
this.
This is the general approach -- in most cases you will not need to
set $IFS to a different value.
Some other user-supplied alternatives include:
read A B C D << HERE
$(IFS=.; echo $(/usr/local/bin/ipaddr))
HERE
and, where process substitution is available,
read A B C D < <(IFS=.; echo $(/usr/local/bin/ipaddr))
Hmmm... I would almost swear that this worked for the original Bourne shell, but don't have access to a running copy just now to check.
There is, however, a very trivial workaround to the problem.
Change the first line of the script from:
#!/bin/bash
to
#!/bin/ksh
Et voila! A read at the end of a pipeline works just fine, assuming you have the Korn shell installed.
This is an interesting question that touches on a very basic concept in the Bourne shell and subshells. Here I provide a solution different from the previous solutions, by doing some kind of filtering. I will give an example that may be useful in real life: a fragment for checking that downloaded files conform to a known checksum. The checksum file looks like the following (showing just 3 lines):
49174 36326 dna_align_feature.txt.gz
54757 1 dna.txt.gz
55409 9971 exon_transcript.txt.gz
The shell script:
#!/bin/sh
.....
failcnt=0 # this variable is only valid in the parent shell
# variable xx captures all the output from the while loop
xx=$(cat ${checkfile} | while read -r line; do
  num1=$(echo $line | awk '{print $1}')
  num2=$(echo $line | awk '{print $2}')
  fname=$(echo $line | awk '{print $3}')
  if [ -f "$fname" ]; then
    res=$(sum $fname)
    filegood=$(sum $fname | awk -v na=$num1 -v nb=$num2 -v fn=$fname '{ if (na == $1 && nb == $2) { print "TRUE"; } else { print "FALSE"; }}')
    if [ "$filegood" = "FALSE" ]; then
      failcnt=$(expr $failcnt + 1) # only in subshell
      echo "$fname BAD $failcnt"
    fi
  fi
done | tail -1) # I am only interested in the final result
# you can capture a whole bunch of text and do further filtering
failcnt=${xx#* BAD } # I am only interested in the number
# this variable is in the parent shell
echo failcnt $failcnt
if [ $failcnt -gt 0 ]; then
  echo $failcnt files failed
else
  echo download successful
fi
The parent and subshell communicate through the echo command. You can pick some easy-to-parse text for the parent shell. This method does not break your normal way of thinking; you just have to do some post-processing. You can use grep, sed, awk, and more to do so.
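A stripped-down sketch of that pattern: the subshell's only channel back to the parent is the text it writes, so emit one parseable line and recover the number afterwards (sample data invented):
xx=$(printf '%s\n' a b c | while read -r w; do
  cnt=$((cnt + 1))
  echo "seen BAD $cnt"    # re-emitted each round; tail -1 keeps the last
done | tail -1)
failcnt=${xx#* BAD }
echo "$failcnt"           # 3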
I store the value in a file within the loop, and read from it outside. (Note that > 2 redirects to a file named 2 in the current directory, not to stderr; to write to stderr you would use >&2.)
Here var i is initially set and read inside the loop as 1.
# reading lines of content from 2 files concatenated
# inside loop: write value of var i to the file (before iteration)
# outside: read var i from the file; it has the last iterative value
f=/tmp/file1
g=/tmp/file2
i=1
cat $f $g | \
while read -r s;
do
  echo $s > /dev/null; # some work
  echo $i > 2
  let i++
done;
read -r i < 2
echo $i
Or use the heredoc method to reduce the amount of code in a subshell.
Note the iterative i value can be read outside the while loop.
i=1
while read -r s;
do
  echo $s > /dev/null
  let i++
done <<EOT
$(cat $f $g)
EOT
let i--
echo $i
How about a very simple method:
+ call your while loop in a function
  - set your value inside (nonsense, but shows the example)
  - return your value inside
+ capture your value outside
+ set outside
+ display outside
#!/bin/bash
# set -e
# set -u
# No idea why you need this, not using here
foo=0
bar="hello"
if [[ "$bar" == "hello" ]]
then
  foo=1
  echo "Setting \$foo to $foo"
fi
echo "Variable \$foo after if statement: $foo"
lines="first line\nsecond line\nthird line"
function my_while_loop
{
  echo -e $lines | while read line
  do
    if [[ "$line" == "second line" ]]
    then
      foo=2;
      echo "Variable \$foo updated to $foo inside if inside while loop"
      return 2;
    fi
    # Code below won't be executed since we returned from the function in the 'if' statement
    # We already reported the $foo var being set to 2 anyway
    echo "Value of \$foo in while loop body: $foo"
  done
}
my_while_loop; foo="$?"
echo "Variable \$foo after while loop: $foo"
Output:
Setting $foo to 1
Variable $foo after if statement: 1
Value of $foo in while loop body: 1
Variable $foo after while loop: 2
bash --version
GNU bash, version 3.2.51(1)-release (x86_64-apple-darwin13)
Copyright (C) 2007 Free Software Foundation, Inc.
Though this is an old question that has been asked several times, here's what I ended up with after hours of fiddling with here strings: the only option that worked for me is to store the value in a file during the while loop's subshell and then retrieve it afterwards. Simple.
Use an echo statement to store and a cat statement to retrieve. The bash user must own the directory or have read-write access to it.
# write to file
echo "1" > foo.txt
while condition; do
  if (condition); then
    # write again to file
    echo "2" > foo.txt
  fi
done
# read from file
echo "Value of \$foo in while loop body: $(cat foo.txt)"

Shell - write a file into shell variables

I have a file in this format:
a;b;c
e;d;f
How can I use the shell to read the file's details into variables?
I would have 6 variables to store the data.
Here is more detailed information:
I have written a script:
#!/bin/sh
unset ret
ret=0
if [ "$#" -ne 3 ]; then
  logger -p err "Usage: $0 eth_name rule_file table_name"
  ret=1
  exit $ret
fi
OFS=$IFS   # store field separator
IFS=";"    # define field separator
eth_name=$1   # ethernet device name
rule_file=$2  # input file name
table_name=$3 # lookup table name
logger -p notice "$0 $eth_name $rule_file $table_name"
unset a    # reference to line array
unset i j  # index
unset m n  # dimension
### read route configuration
i=0
while read line
do
  a=A$i
  unset $a
  declare -a $a='($line)'
  i=$((i+1))
done < $rule_file
# store number of lines
m=$i
# function for applying a route
add_route()
{
  if [ "source" = "$1" ]; then
    src_address=$(ifconfig $eth_name | sed -n 's/.*inet addr:\([0-9.]\+\)\s.*/\1/p')
    ip rule add from $src_address lookup $table_name
    ret=$?
    logger -p notice "ip rule add from $src_address lookup $table_name $ret"
  elif [ "default" = "$1" ]; then
    ip route add default via $2 table $table_name
    ret=$?
    logger -p notice "ip route add default via $2 table $table_name $ret"
  else
    ipaddress_range=$1
    gateway_ipaddress=$2
    ip route add $ipaddress_range via $gateway_ipaddress dev $eth_name table $table_name
    ret=$?
    logger -p notice "ip route add $ipaddress_range via $gateway_ipaddress dev $eth_name table $table_name $ret"
  fi
}
### apply route configuration
for ((i=0; i < $m; i++))
do
  a=A$i
  # get line size
  # double escape '\\' for sub shell '``' and 'echo'
  p0=`eval echo \\${$a[0]}`
  p1=`eval echo \\${$a[1]}`
  add_route $p0 $p1
done
IFS=$OFS
The rule file's format is as follows:
source;
default_route;172.20.5.192/26
default_gateway;172.20.5.254
172.17.23.64/26;172.20.5.254
172.31.252.0/24;172.20.5.254
172.31.254.0/24;172.20.5.254
10.217.1.0/24;172.20.5.254
10.217.2.0/24;172.20.5.254
This script works normally under bash, but my Linux system no longer has bash, so the script no longer works. How should I change the script to make it run?
The function of this script is very simple: write every line into the Linux system's ip rule and ip route tables. It needs 3 arguments to run.
You can easily achieve it with read.
#!/bin/bash
while IFS=";" read -r var1 var2 var3; do
command
done <file
exit 0
where:
IFS is your delimiter
varN are your vars, you can use any name you want
N.B. If you need stdin you need to use a file descriptor:
#!/bin/bash
exec 3<file
while IFS=";" read -r var1 var2 var3 <&3; do
command
done
exec 3>&-
exit 0
Further reading here.
N.B. #2: The read command is not the fastest solution; past roughly 5k lines it typically loses about 100 ms compared to other tools (e.g. awk).
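A quick hedged demo of that loop against the sample data from the question (/tmp/sample is a made-up path):
printf '%s\n' 'a;b;c' 'e;d;f' > /tmp/sample
while IFS=";" read -r var1 var2 var3; do
  printf 'var1=%s var2=%s var3=%s\n' "$var1" "$var2" "$var3"
done < /tmp/sample
# var1=a var2=b var3=c
# var1=e var2=d var3=f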
The following parses your original sample text and saves fields to sequentially numbered variables, stripping values out of lines using parameter expansion, and doesn't care how many fields you have per line.
#!/bin/sh
i=0
while read line; do
  while [ -n "$line" ]; do
    eval this_$((i=i+1))="\${line%%;*}"
    last="$line"
    line="${line#*;}"
    [ "$last" = "$line" ] && break
  done
done < input.txt
I've tested this successfully with both bash and FreeBSD's /bin/sh (which is based on ash). (Note that FreeBSD's /bin/sh doesn't seem to like arithmetic expressions like $((i++)), but both shells I tested are fine with the notation above.)
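For instance, if input.txt held the two sample lines from the question (a;b;c and e;d;f), the loop would leave behind numbered variables like these (a hypothetical check):
echo "$this_1 $this_2 $this_3"   # a b c
echo "$this_4 $this_5 $this_6"   # e d f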
But from the look of the script you updated your question with, this isn't what you need. It seems that you have input data with:
one record per line, and
two fields, separated by semicolons.
But I wonder if you even need to store things in variables. It seems to me you'd be looking more for the following type of structure.
#!/bin/sh
...
while IFS=";" read range gateway; do
case "$range" in
source)
: Note that $gateway is blank
;;
default_route)
: do something
;;
default_gateway)
: do something else
;;
[0-9]*)
ip route add "$range" via "$gateway" dev "$eth_name" table "$table_name"
;;
*)
printf 'ERROR: unknown range in rule file: %s\n' "$range" >&2
;;
esac
done < $rule_file
Additional input validation wouldn't hurt.

Read a config file in BASH without using "source"

I'm attempting to read a config file that is formatted as follows:
USER = username
TARGET = arrows
I realize that if I got rid of the spaces, I could simply source the config file, but for security reasons I'm trying to avoid that. I know there is a way to read the config file line by line. I think the process is something like:
Read lines into an array
Filter out all of the lines that start with #
search for the variable names in the array
After that I'm lost. Any and all help would be greatly appreciated. I've tried something like this with no success:
cat ~/1 | grep '^[^#].*' | while read one two; do
  echo $two
done
I pulled that from a forum post I found, just not sure how to modify it to fit my needs since I'm so new to shell scripting.
http://www.linuxquestions.org/questions/programming-9/bash-shell-program-read-a-configuration-file-276852/
Would it be possible to automatically assign a variable by looping through both arrays?
for (( i = 0 ; i < ${#VALUE[@]} ; i++ ))
do
  "${NAME[i]}"=VALUE[i]
done
echo $USER
Such that calling $USER would output "username"? The above code isn't working but I know the solution is something similar to that.
The following script iterates over each line in your input file (vars in my case) and does a pattern match against =. If the equal sign is found, it will use parameter expansion to parse out the variable name from the value. It then stores each part in its own array, name and value respectively.
#!/bin/bash
i=0
while read line; do
  if [[ "$line" =~ ^[^#]*= ]]; then
    name[i]=${line%% =*}
    value[i]=${line#*= }
    ((i++))
  fi
done < vars
echo "total array elements: ${#name[#]}"
echo "name[0]: ${name[0]}"
echo "value[0]: ${value[0]}"
echo "name[1]: ${name[1]}"
echo "value[1]: ${value[1]}"
echo "name array: ${name[#]}"
echo "value array: ${value[#]}"
Input
$ cat vars
sdf
USER = username
TARGET = arrows
asdf
as23
Output
$ ./varscript
total array elements: 2
name[0]: USER
value[0]: username
name[1]: TARGET
value[1]: arrows
name array: USER TARGET
value array: username arrows
First, USER is a shell environment variable, so it might be better if you used something else. Using lowercase or mixed case variable names is a way to avoid name collisions.
#!/bin/bash
configfile="/path/to/file"
shopt -s extglob
while IFS='= ' read lhs rhs
do
  if [[ $lhs != *( )#* ]]
  then
    # you can test for variables to accept or other conditions here
    declare $lhs=$rhs
  fi
done < "$configfile"
This sets the vars in your file to the value associated with it.
echo "Username: $USER, Target: $TARGET"
would output
Username: username, Target: arrows
Another way to do this using keys and values is with an associative array:
Add this line before the while loop:
declare -A settings
Remove the declare line inside the while loop and replace it with:
settings[$lhs]=$rhs
Then:
# set keys
user=USER
target=TARGET
# access values
echo "Username: ${settings[$user]}, Target: ${settings[$target]}"
would output
Username: username, Target: arrows
I have a script which only takes a very limited number of settings and processes them one at a time, so I've adapted SiegeX's answer to whitelist the settings I care about and act on them as it encounters them.
I've also removed the requirement for spaces around the =, in favour of ignoring any that exist, using the trim function from another answer.
function trim()
{
  local var=$1;
  var="${var#"${var%%[![:space:]]*}"}"; # remove leading whitespace characters
  var="${var%"${var##*[![:space:]]}"}"; # remove trailing whitespace characters
  echo -n "$var";
}
while read line; do
  if [[ "$line" =~ ^[^#]*= ]]; then
    setting_name=$(trim "${line%%=*}");
    setting_value=$(trim "${line#*=}");
    case "$setting_name" in
      max_foos)
        prune_foos $setting_value;
        ;;
      max_bars)
        prune_bars $setting_value;
        ;;
      *)
        echo "Unrecognised setting: $setting_name";
        ;;
    esac;
  fi
done <"$config_file";
Thanks SiegeX. I think the later updates you mentioned are not reflected at this URL.
I had to edit the regex to remove the quotes to get it working. With quotes, the returned array is empty.
i=0
while read line; do
  if [[ "$line" =~ ^[^#]*= ]]; then
    name[i]=${line%% =*}
    value[i]=${line##*= }
    ((i++))
  fi
done < vars
A still better version is:
i=0
while read line; do
  if [[ "$line" =~ ^[^#]*= ]]; then
    name[i]=`echo $line | cut -d'=' -f 1`
    value[i]=`echo $line | cut -d'=' -f 2`
    ((i++))
  fi
done < vars
The first version has issues if there is no space before and after "=" in the config file. Also, if the value is missing, I see that the name and value are populated the same. The second version has none of these problems, and in addition it trims out unwanted leading and trailing spaces.
This version reads values that can contain =; the earlier versions split at the first occurrence of =.
i=0
while read line; do
  if [[ "$line" =~ ^[^#]*= ]]; then
    name[i]=`echo $line | cut -d'=' -f 1`
    value[i]=`echo $line | cut -d'=' -f 2-`
    ((i++))
  fi
done < vars

How to parse $QUERY_STRING from a bash CGI script?

I have a bash script that is being used in a CGI. The CGI sets the $QUERY_STRING environment variable by reading everything after the ? in the URL. For example, http://example.com?a=123&b=456&c=ok sets QUERY_STRING=a=123&b=456&c=ok.
Somewhere I found the following ugliness:
b=$(echo "$QUERY_STRING" | sed -n 's/^.*b=\([^&]*\).*$/\1/p' | sed "s/%20/ /g")
which will set $b to whatever was found in $QUERY_STRING for b. However, my script has grown to have over ten input parameters. Is there an easier way to automatically convert the parameters in $QUERY_STRING into environment variables usable by bash?
Maybe I'll just use a for loop of some sort, but it'd be even better if the script was smart enough to automatically detect each parameter and maybe build an array that looks something like this:
${parm[a]}=123
${parm[b]}=456
${parm[c]}=ok
How could I write code to do that?
Try this:
saveIFS=$IFS
IFS='=&'
parm=($QUERY_STRING)
IFS=$saveIFS
Now you have this:
parm[0]=a
parm[1]=123
parm[2]=b
parm[3]=456
parm[4]=c
parm[5]=ok
In Bash 4, which has associative arrays, you can do this (using the array created above):
declare -A array
for ((i=0; i<${#parm[@]}; i+=2))
do
  array[${parm[i]}]=${parm[i+1]}
done
which will give you this:
array[a]=123
array[b]=456
array[c]=ok
Edit:
To use indirection in Bash 2 and later (using the parm array created above):
for ((i=0; i<${#parm[@]}; i+=2))
do
  declare var_${parm[i]}=${parm[i+1]}
done
Then you will have:
var_a=123
var_b=456
var_c=ok
You can access these directly:
echo $var_a
or indirectly:
for p in a b c
do
  name="var$p"
  echo ${!name}
done
If possible, it's better to avoid indirection since it can make code messy and be a source of bugs.
you can break $QUERY down using IFS. For example, setting it to &
$ QUERY="a=123&b=456&c=ok"
$ echo $QUERY
a=123&b=456&c=ok
$ IFS="&"
$ set -- $QUERY
$ echo $1
a=123
$ echo $2
b=456
$ echo $3
c=ok
$ array=($@)
$ for i in "${array[@]}"; do IFS="=" ; set -- $i; echo $1 $2; done
a 123
b 456
c ok
And you can save to a hash/dictionary in Bash 4+
$ declare -A hash
$ for i in "${array[#]}"; do IFS="=" ; set -- $i; hash[$1]=$2; done
$ echo ${hash["b"]}
456
Please don't use the evil eval junk.
Here's how you can reliably parse the string and get an associative array:
declare -A param
while IFS='=' read -r -d '&' key value && [[ -n "$key" ]]; do
  param["$key"]=$value
done <<<"${QUERY_STRING}&"
If you don't like the key check, you could do this instead:
declare -A param
while IFS='=' read -r -d '&' key value; do
  param["$key"]=$value
done <<<"${QUERY_STRING:+"${QUERY_STRING}&"}"
Listing all the keys and values from the array:
for key in "${!param[#]}"; do
echo "$key: ${param[$key]}"
done
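As a quick sanity check of the loop above, with the query string from the question (run in bash 4+):
QUERY_STRING='a=123&b=456&c=ok'
declare -A param
while IFS='=' read -r -d '&' key value && [[ -n "$key" ]]; do
  param["$key"]=$value
done <<<"${QUERY_STRING}&"
echo "${param[b]}"   # 456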
I packaged the sed command up into another script:
$cat getvar.sh
s='s/^.*'${1}'=\([^&]*\).*$/\1/p'
echo $QUERY_STRING | sed -n $s | sed "s/%20/ /g"
and I call it from my main cgi as:
id=`./getvar.sh id`
ds=`./getvar.sh ds`
dt=`./getvar.sh dt`
...etc, etc - you get idea.
works for me even with a very basic busybox appliance (my PVR in this case).
To convert the contents of QUERY_STRING into bash variables, use the following command:
eval $(echo ${QUERY_STRING//&/;})
The inner step, echo ${QUERY_STRING//&/;}, substitutes all ampersands with semicolons producing a=123;b=456;c=ok which the eval then evaluates into the current shell.
The result can then be used as bash variables.
echo $a
echo $b
echo $c
The assumptions are:
values will never contain '&'
values will never contain ';'
QUERY_STRING will never contain malicious code
While the accepted answer is probably the most beautiful one, there might be cases where security is super-important, and it needs to be clearly visible from your script as well.
In such a case, I wouldn't use bash for the task in the first place, but if it must be done for some reason, it might be better to avoid the newer array and dictionary features, because you can't be sure exactly how they are escaped.
In this case, the good old primitive solutions might work:
QS="${QUERY_STRING}"
while [ "${QS}" != "" ]
do
nameval="${QS%%&*}"
QS="${QS#$nameval}"
QS="${QS#&}"
name="${nameval%%=*}"
val="${nameval#$name}"
val="${nameval#=}"
# and here we have $name and $val as names and values
# ...
done
This iterates over the name-value pairs of the QUERY_STRING, and there is no way to circumvent it with any tricky escape sequence: the double quote is a very strong thing in bash, and apart from a single variable name substitution, which is fully controlled by us, nothing can be tricked.
Furthermore, you can inject your own processing code into "# ...". This enables you to allow only your own, well-defined (and, ideally, short) list of the allowed variable names. Needless to say, LD_PRELOAD shouldn't be one of them. ;-)
Furthermore, no variable will be exported, and only QS, nameval, name and val are used.
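As a hedged sketch of such a whitelist inside the "# ..." section (the names id and lang are invented; note printf -v is bash-specific, while the loop itself is plain sh):
case "$name" in
  id|lang)
    printf -v "qs_$name" '%s' "$val"   # assigns qs_id / qs_lang without eval
    ;;
  *)
    : # anything not whitelisted is silently ignored
    ;;
esac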
Following the correct answer, I've made some changes myself to support array variables, as in this other question. I also added a decode function, whose author I cannot find to give some credit.
Code appears somewhat messy, but it works. Changes and other recommendations would be greatly appreciated.
function cgi_decodevar() {
  [ $# -ne 1 ] && return
  local v t h
  # replace all + with whitespace and append %%
  t="${1//+/ }%%"
  while [ ${#t} -gt 0 -a "${t}" != "%" ]; do
    v="${v}${t%%\%*}" # digest up to the first %
    t="${t#*%}"       # remove digested part
    # decode if there is anything to decode and if not at end of string
    if [ ${#t} -gt 0 -a "${t}" != "%" ]; then
      h=${t:0:2}   # save first two chars
      t="${t:2}"   # remove these
      v="${v}"`echo -e \\\\x${h}` # convert hex to special char
    fi
  done
  # return decoded string
  echo "${v}"
  return
}
saveIFS=$IFS
IFS='=&'
VARS=($QUERY_STRING)
IFS=$saveIFS
for ((i=0; i<${#VARS[@]}; i+=2))
do
  curr="$(cgi_decodevar ${VARS[i]})"
  next="$(cgi_decodevar ${VARS[i+2]})"
  prev="$(cgi_decodevar ${VARS[i-2]})"
  value="$(cgi_decodevar ${VARS[i+1]})"
  array=${curr%"[]"}
  if [ "$curr" == "$next" ] && [ "$curr" != "$prev" ] ;then
    j=0
    declare var_${array}[$j]="$value"
  elif [ $i -gt 1 ] && [ "$curr" == "$prev" ]; then
    j=$((j + 1))
    declare var_${array}[$j]="$value"
  else
    declare var_$curr="$value"
  fi
done
I would simply replace the & with ;. It will become something like:
a=123;b=456;c=ok
So now you need just evaluate and read your vars:
eval `echo "${QUERY_STRING}"|tr '&' ';'`
echo $a
echo $b
echo $c
A nice way to handle CGI query strings is to use Haserl which acts as a wrapper around your Bash cgi script, and offers convenient and secure query string parsing.
To bring this up to date, if you have a recent Bash version then you can achieve this with regular expressions:
q="$QUERY_STRING"
re1='^(\w+=\w+)&?'
re2='^(\w+)=(\w+)$'
declare -A params
while [[ $q =~ $re1 ]]; do
  q=${q##*${BASH_REMATCH[0]}}
  [[ ${BASH_REMATCH[1]} =~ $re2 ]] && params+=([${BASH_REMATCH[1]}]=${BASH_REMATCH[2]})
done
If you don't want to use associative arrays then just change the penultimate line to do what you want. For each iteration of the loop the parameter is in ${BASH_REMATCH[1]} and its value is in ${BASH_REMATCH[2]}.
Here is the same thing as a function, in a short test script that iterates over the array and outputs the query string's parameters and their values:
#!/bin/bash
QUERY_STRING='foo=hello&bar=there&baz=freddy'
get_query_string() {
  local q="$QUERY_STRING"
  local re1='^(\w+=\w+)&?'
  local re2='^(\w+)=(\w+)$'
  while [[ $q =~ $re1 ]]; do
    q=${q##*${BASH_REMATCH[0]}}
    [[ ${BASH_REMATCH[1]} =~ $re2 ]] && eval "$1+=([${BASH_REMATCH[1]}]=${BASH_REMATCH[2]})"
  done
}
declare -A params
get_query_string params
for k in "${!params[@]}"
do
  v="${params[$k]}"
  echo "$k : $v"
done
Note the parameters end up in the array in reverse order (it's associative so that shouldn't matter).
why not this
$ echo "${QUERY_STRING}"
name=carlo&last=lanza&city=pfungen-CH
$ saveIFS=$IFS
$ IFS='&'
$ eval $QUERY_STRING
$ IFS=$saveIFS
now you have this
name = carlo
last = lanza
city = pfungen-CH
$ echo "name is ${name}"
name is carlo
$ echo "last is ${last}"
last is lanza
$ echo "city is ${city}"
city is pfungen-CH
@giacecco
To include a hyphen in the regex, you could change the two lines as follows in the answer from @starfry.
Change these two lines:
local re1='^(\w+=\w+)&?'
local re2='^(\w+)=(\w+)$'
To these two lines:
local re1='^(\w+=(\w+|-|)+)&?'
local re2='^(\w+)=((\w+|-|)+)$'
For all those who couldn't get it working with the posted answers (like me),
this guy figured it out.
Can't upvote his post unfortunately...
Let me repost the code here real quick:
#!/bin/sh
if [ "$REQUEST_METHOD" = "POST" ]; then
  if [ "$CONTENT_LENGTH" -gt 0 ]; then
    read -n $CONTENT_LENGTH POST_DATA <&0
  fi
fi
#echo "$POST_DATA" > data.bin
IFS='=&'
set -- $POST_DATA
#2- Value1
#4- Value2
#6- Value3
#8- Value4
echo $2 $4 $6 $8
echo "Content-type: text/html"
echo ""
echo "<html><head><title>Saved</title></head><body>"
echo "Data received: $POST_DATA"
echo "</body></html>"
Hope this is of help for anybody.
Cheers
Actually I liked bolt's answer, so I made a version which works with Busybox as well (ash in Busybox does not support here string).
This code will accept key1 and key2 parameters, all others will be ignored.
while IFS= read -r -d '&' KEYVAL && [[ -n "$KEYVAL" ]]; do
  case ${KEYVAL%=*} in
    key1) KEY1=${KEYVAL#*=} ;;
    key2) KEY2=${KEYVAL#*=} ;;
  esac
done <<END
$(echo "${QUERY_STRING}&")
END
One can use bash-cgi.sh, which processes:
the query string into the $QUERY_STRING_GET key and value array;
the post request data (x-www-form-urlencoded) into the $QUERY_STRING_POST key and value array;
the cookie data into the $HTTP_COOKIES key and value array.
It requires bash version 4.0 or higher (to define the key and value arrays above).
All processing is done by bash only (i.e. in a single process), without any external dependencies or additional process invocations.
It has:
a check on the maximum length of data that can be passed to its input and processed as query string and cookies;
a redirect() procedure to produce a redirect to itself with the extension changed to .html (useful for one-page sites);
an http_header_tail() procedure to output the last two lines of the HTTP(S) response header;
a sanitizer for the $REMOTE_ADDR value against possible injections;
a parser and evaluator for escaped UTF-8 symbols embedded in the values passed to $QUERY_STRING_GET, $QUERY_STRING_POST and $HTTP_COOKIES;
a sanitizer for the $QUERY_STRING_GET, $QUERY_STRING_POST and $HTTP_COOKIES values against possible SQL injections (escaping like the mysql_real_escape_string php function does, plus escaping of # and $).
It is available here:
https://github.com/VladimirBelousov/fancy_scripts
This works in dash, using a for-in loop:
IFS='&'
for f in $query_string; do
  value=${f##*=}
  key=${f%%=*}
  # if you need an environment variable -> eval "qs_$key=$value"
done
