Here is the simplest distilled version of my problem. The complexity that remains is there for good reasons, even if not self-evident here. Additionally, the script is internal and has no chance of executing malicious code, so the eval is perfectly fine; I don't need to hear about how evil it is... ;) Assume for the moment that the pipes and colons in the key string are required delimiters.
key="target|platform|repo:revision"
hashname="hashmap"
declare -A $hashname
eval $hashname[$key]=0
echo $(eval $hashname[$key])
Of course the last two lines have problems because the eval is acting on the pipes and colons inside the $key variable. My question is: how can I protect a string like this from the eval? And I need the eval because I am referring to the hashmap's name rather than the hashmap itself. Thanks in advance.
==================================================================================
OK, I am actually beginning to think that my problem is not with the pipes and colons or the eval, so let me paste the real code:
function print_hashmap()
{
local hashname=$1
for key in `eval echo \$\{\!$hashname[@]\}`; do
local deref="$hashname[$key]"
local value=${!deref}
echo "key=${key} value=${value}"
done
}
function create_permutations2()
{
local hashname=$1 ; shift
local builds=$1 ; shift
local items=$1 ; shift
local separators=$@
echo "create_permutations(hashname=${hashname}, builds=${builds}, items=${items}, separators=${separators})"
if [ NULL != "${builds}" ]; then
for build in `echo $builds | tr ',' ' '`; do
local target="${build%%:*}"
local platforms="${build##*:}"
for platform in `echo $platforms | tr '|' ' '`; do
local key=
local ref=
if [ NULL != "${items}" ]; then
if [ NULL == "${separators}" ]; then separators=' '; fi
for separator in $separators; do
items=`echo $items | tr $separator ' '`
done
for item in $items; do
key="${target}|${platform}|${item}"
ref="$hashname[$key]"
declare "$ref"=0
done
else
key="${target}|${platform}"
ref="$hashname[$key]"
declare "$ref"=0
fi
done
done
fi
echo "created the following permutations:"
print_hashmap $hashname
}
builds="k:r|s|w,l:r|s|w"
repos="c:master,m:master"
hashname="hashmap"
declare -A $hashname
create_permutations2 $hashname $builds $repos ","
print_hashmap $hashname
I recently modified my code to conform with FatalError's suggestion to use a ref instead of eval, and the error is the same: syntax error in expression (error token near :master)
Maybe double quotes around $key?
eval $hashname["$key"]=0
echo $(eval $hashname["$key"]=0)
Try:
eval $hashname'[$key]'=0
eval result=$hashname'[$key]'
This passes the $ to eval rather than expanding the parameter first.
I won't lecture about the evilness of eval; however, in this case it at least adds a little extra complexity. The easiest way I can think of, since it's evaluating the string result, would be to add some literal single quotes:
eval $hashname["'"$key"'"]=0
That would guard against everything except for single quotes in $key. However, you can accomplish the same thing with probably fewer headaches. Here's my updated script to illustrate:
key="target|platform|repo:revision"
hashname="hashmap"
declare -A $hashname
echo "With eval:"
eval $hashname["'"$key"'"]=0
eval echo \${$hashname["'"$key"'"]}
echo
echo "Without eval:"
ref="$hashname[$key]"
echo ${!ref}
declare "$ref"="1"
echo ${!ref}
Note I changed your original eval line because it didn't make sense to me -- it was trying to execute the result rather than print it. In the latter part you can use indirection to access the value and then declare to assign to it. Then you don't have to worry about these chars being interpreted.
This produces the result:
With eval:
0
Without eval:
0
1
I'm not 100% sure why that doesn't work, but this seems to:
builds="k:r|s|w,l:r|s|w"
repos="c:master,m:master"
hashname="hashmap"
declare -A $hashname
function print_hashmap()
{
local hashname=$1
for key in `eval echo \$\{\!$hashname[@]\}`; do
local deref="$hashname[$key]"
local value=${!deref}
echo "key=${key} value=${value}"
done
}
function create_permutations2()
{
local hashname=$1 ; shift
local builds=$1 ; shift
local items=$1 ; shift
local separators=$@
echo "create_permutations(hashname=${hashname}, builds=${builds}, items=${items}, separators=${separators})"
if [ NULL != "${builds}" ]; then
for build in `echo $builds | tr ',' ' '`; do
local target="${build%%:*}"
local platforms="${build##*:}"
for platform in `echo $platforms | tr '|' ' '`; do
local key=
if [ NULL != "${items}" ]; then
if [ NULL == "${separators}" ]; then separators=' '; fi
for separator in $separators; do
items=`echo $items | tr $separator ' '`
done
for item in $items; do
key="${target}|${platform}|${item}"
ref="$hashname[$key]"
eval $hashname[\'$key\']=0
done
else
key="${target}|${platform}"
eval $hashname[\'$key\']=0
fi
done
done
fi
echo "created the following permutations:"
print_hashmap $hashname
}
create_permutations2 $hashname $builds $repos ","
print_hashmap $hashname
For this small bit of the question: "My question is how can I protect a string like this from the eval?", I'll just mention bash's "parameter transformation" (see the man page), ${parameter@operator}. In particular, the Q operator expands to a quoted form that can safely be reused as input.
% echo ${key}
target|platform|repo:revision
% echo ${key@Q}
'target|platform|repo:revision'
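Applied to the original problem, a small sketch (this assumes bash 4.4 or newer, where the @Q transformation is available) might look like:
key="target|platform|repo:revision"
hashname="hashmap"
declare -A $hashname
# ${key@Q} expands to a single-quoted form, so the pipes and colons
# reach eval as literal characters inside the subscript
eval "$hashname[${key@Q}]=0"
eval "echo \${$hashname[${key@Q}]}"   # prints 0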
==================================================================================
I've created a shell script that takes the SSH address as the first argument and then one or more arguments after it. The script works with two arguments or fewer (the address and a single argument), but it does not work properly beyond that. Below is my code to give a better idea.
ssh.sh $1 << EOF
$(typeset -f sr_single "$@")
if [ "$#" -eq 2 ]; then
echo $2
sr_single $2
elif [ "$#" -lt 2 ]; then
echo "Needs at least two arguments: serial number and argument(s)"
else
echo "${#:2}"
for i in "${#:2}"; do
echo "'" $i "'"
sr_single $i
done
fi
EOF
Below is what it returns if I call the script with "sr.sh test@ssh.com -l":
-l
And below is what it returns when I call "sr.sh test@ssh.com -l -v":
-l -v
' '
My question is: how is this script not getting the second argument and the ones after it here, when it seems to work properly in the rest of the program? Thanks.
To do this with less hair loss, define all your code in functions, like so:
rmt_main() {
if (( $# == 2 )); then
printf 'Exactly two arguments received; $2 is: %q\n' "$2" >&2
elif (( $# < 2 )); then
echo 'Error: Needs at least two arguments' >&2
else
otherfunc "${#:2}"
fi
}
otherfunc() {
echo "Otherfunc called with arguments:" >&2
printf '%q\n' "$#"
}
...after which, calling can look like:
# generate an eval-safe string containing your arguments
printf -v args_str '%q ' "$@"
# explicitly invoke bash, so we don't need to worry about whether our escaping is POSIX-y
ssh "$1" 'bash -s' <<EOF
$(typeset -f rmt_main otherfunc) # emit function definitions
rmt_main $args_str # and call rmt_main with the eval-safe argument list
EOF
Note that the only contents we're expanding inside the heredoc are generated by the local shell in a form guaranteed to be correctly escaped to parse as code (by the remote shell). We are not under any circumstances expanding data (like command-line arguments) into the heredoc in unescaped form, and all our functions use completely conventional quoting (which is to say that all parameter expansions inside the function definitions are quoted).
I'm writing a POSIX-compliant script in dash, so I am having to get creative with using fake arrays.
Contents of fake_array.sh
fake_array_job() {
array="$1"
job_name="$2"
comma_count="$(echo "$array" | grep -o -F ',' | wc -l)"
if [ "$comma_count" -lt '1' ]; then
echo 'You gave a fake array to fake_array_job that does not contain at least one comma. Exiting...'
exit
fi
array_count="$(( comma_count + 1 ))"
position=1
while [ "$position" -le "$array_count" ]; do
item="$(echo "$array" | cut -d ',' -f "$position")"
"$job_name" || exit
position="$(( position + 1 ))"
done
}
Contents of script.sh
#!/bin/sh
. fake_array.sh
job_to_do() {
echo "$item"
}
fake_array_job 'goat,pig,sheep' 'job_to_do'
second_job() {
echo "$item"
}
fake_array_job 'apple,orange' 'second_job'
I am aware that it may seem silly to use a unique name for each job I pass to fake_array_job, but I like that I have to type it twice because it helps to reduce human error.
I keep reading that it is a bad idea to use a variable as a command. Does my use of "$job_name" to run a function have any negative implications as it concerns stability, security or efficiency?
(Read to the end for a good suggestion by Charles Duffy. I'm too lazy to completely rewrite my answer to mention it earlier...)
You can iterate over the "array" using simple parameter expansions without requiring multiple elements in the array.
fake_array_job() {
args=${1%,}, # Ensure the array ends with a comma
job_name=$2
while [ -n "$args" ]; do
item=${args%%,*}
"$job_name" || exit
args=${args#*,}
done
}
One problem with the above is that it assures the array is comma-terminated by assuming that foo,bar, is not a comma-delimited array with an empty last element. A better (though uglier) solution is to use read to break up the array.
fake_array_job () {
args=$1
job_name=$2
rest=$args
while [ -n "$rest" ]; do
IFS=, read -r item rest <<EOF
$rest
EOF
"$job_name" || exit
done
}
(You can use <<-EOF and make sure the here doc is indented with tabs, but it's hard to convey that here, so I'll just leave the ugly version.)
There's also Charles Duffy's good suggestion of using case to pattern match on the array to see if there are any commas left or not:
while [ -n "$args" ]; do
case $var in
*,*) next=${args%%,*}; var=${args#*,}; "$cmd" "$next";;
*) "$cmd" "$var"; break;;
esac;
done
In my script I need to expand an interval, e.g.:
input: 1,5-7
to get something like the following:
output: 1,5,6,7
I've found other solutions here, but they involve Python and I can't use it in my script.
Solution with Just Bash 4 Builtins
You can use Bash range expansions. Assuming you've already parsed your input, you can perform a series of successive operations to transform your range into a comma-separated series. For example:
value1=1
value2='5-7'
value2=${value2/-/..}
value2=`eval echo {$value2}`
echo "input: $value1,${value2// /,}"
All the usual caveats about the dangers of eval apply, and you'd definitely be better off solving this problem in Perl, Ruby, Python, or AWK. If you can't or won't, then you should at least consider including some pipeline tools like tr or sed in your conversions to avoid the need for eval.
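For instance, a rough eval-free sketch of the same expansion (assuming GNU seq, whose -s option sets the separator) could be:
value1=1
value2='5-7'
# seq expands the numeric range and joins it with commas directly
expanded=$(seq -s ',' "${value2%-*}" "${value2#*-}")
echo "input: $value1,$expanded"   # input: 1,5,6,7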
Try something like this:
#!/bin/bash
for f in ${1//,/ }; do
if [[ $f =~ - ]]; then
a+=( $(seq ${f%-*} 1 ${f#*-}) )
else
a+=( $f )
fi
done
a=${a[*]}
a=${a// /,}
echo $a
Edit: As @Maxim_united mentioned in the comments, appending might be preferable to re-creating the array over and over again.
This should work with multiple ranges too.
#! /bin/bash
input="1,5-7,13-18,22"
result_str=""
for num in $(tr ',' ' ' <<< "$input"); do
if [[ "$num" == *-* ]]; then
res=$(seq -s ',' $(sed -n 's#\([0-9]\+\)-\([0-9]\+\).*#\1 \2#p' <<< "$num"))
else
res="$num"
fi
result_str="$result_str,$res"
done
echo ${result_str:1}
Will produce the following output:
1,5,6,7,13,14,15,16,17,18,22
expand_commas()
{
local arg
local st en i
set -- ${1//,/ }
for arg
do
case $arg in
[0-9]*-[0-9]*)
st=${arg%-*}
en=${arg#*-}
for ((i = st; i <= en; i++))
do
echo $i
done
;;
*)
echo $arg
;;
esac
done
}
Usage:
result=$(expand_commas arg)
eg:
result=$(expand_commas 1,5-7,9-12,3)
echo $result
You'll have to turn the separated words back into commas, of course.
It's a bit fragile with bad inputs but it's entirely in bash.
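For example, one small sketch of that last step, relying on unquoted expansion to collapse the newlines into single spaces first, might be:
result=$(expand_commas 1,5-7,9-12,3)
result=$(echo $result)       # unquoted: newlines collapse to single spaces
echo "${result// /,}"        # 1,5,6,7,9,10,11,12,3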
Here's my stab at it:
input=1,5-7,10,17-20
IFS=, read -a chunks <<< "$input"
output=()
for chunk in "${chunks[#]}"
do
IFS=- read -a args <<< "$chunk"
if (( ${#args[@]} == 1 )) # single number
then
output+=(${args[*]})
else # range
output+=($(seq "${args[@]}"))
fi
done
joined=$(sed -e 's/ /,/g' <<< "${output[*]}")
echo $joined
Basically split on commas, then interpret each piece. Then join back together with commas at the end.
A generic bash solution using the sequence expression {x..y}:
#!/bin/bash
function doIt() {
local inp="${#/,/ }"
declare -a args=( $(echo ${inp/-/..}) )
local item
local sep
for item in "${args[#]}"
do
case ${item} in
*..*) eval "for i in {${item}} ; do echo -n \${sep}\${i}; sep=, ; done";;
*) echo -n ${sep}${item};;
esac
sep=,
done
}
doIt "1,5-7"
Should work with any input following the sample in the question. Also with multiple occurrences of x-y
Use only bash builtins
Using ideas from both @Ansgar Wiechers and @CodeGnome:
input="1,5-7,13-18,22"
for s in ${input//,/ }
do
if [[ $s =~ - ]]
then
a+=( $(eval echo {${s//-/..}}) )
else
a+=( $s )
fi
done
oldIFS=$IFS; IFS=$','; echo "${a[*]}"; IFS=$oldIFS
Works in Bash 3
Considering all the other answers, I came up with this solution, which does not use any sub-shells (but one call to eval for brace expansion) or separate processes:
# range list is assumed to be in $1 (e.g. 1-3,5,9-13)
# convert $1 to an array of ranges ("1-3" "5" "9-13")
IFS=,
local range=($1)
unset IFS
list=() # initialize result list
local r
for r in "${range[#]}"; do
if [[ $r == *-* ]]; then
# if the range is of the form "x-y",
# * convert to a brace expression "{x..y}",
# * using eval, this gets expanded to "x" "x+1" … "y" and
# * append this to the list array
eval list+=( {${r/-/..}} )
else
# otherwise, it is a simple number and can be appended to the array
list+=($r)
fi
done
# test output
echo ${list[@]}
I'm having a problem with a shell script (POSIX shell under HP-UX, FWIW). I have a function called print_arg into which I'm passing the name of a parameter as $1. Given the name of the parameter, I then want to print the name and the value of that parameter. However, I keep getting an error. Here's an example of what I'm trying to do:
#!/usr/bin/sh
function print_arg
{
# $1 holds the name of the argument to be shown
arg=$1
# The following line errors off with
# ./test_print.sh[9]: argval=${"$arg"}: The specified substitution is not valid for this command.
argval=${"$arg"}
if [[ $argval != '' ]] ; then
printf "ftp_func: $arg='$argval'\n"
fi
}
COMMAND="XYZ"
print_arg "COMMAND"
I've tried re-writing the offending line every way I can think of. I've consulted the local oracles. I've checked the online "BASH Scripting Guide". And I sharpened up the ol' wavy-bladed knife and scrubbed the altar until it gleamed, but then I discovered that our local supply of virgins has been cut down to, like, nothin'. Drat!
Any advice regarding how to get the value of a parameter whose name is passed into a function as a parameter will be received appreciatively.
You could use eval, though using direct indirection as suggested by SiegeX is probably nicer if you can use bash.
#!/bin/sh
foo=bar
print_arg () {
arg=$1
eval argval=\"\$$arg\"
echo "$argval"
}
print_arg foo
In bash (but not in other sh implementations), indirection is done by: ${!arg}
Input
foo=bar
bar=baz
echo $foo
echo ${!foo}
Output
bar
baz
This worked surprisingly well:
#!/bin/sh
foo=bar
print_arg () {
local line name value
set | \
while read line; do
name=${line%=*} value=${line#*=\'}
if [ "$name" = "$1" ]; then
echo ${value%\'}
fi
done
}
print_arg foo
It has all the POSIX clunkiness; in Bash it would be much shorter, but then again, you won't need it there because you have ${!}. This (in case it proves solid) would have the advantage of using only builtins and no eval. If I were to construct this function using an external command, it would have to be sed, which would obviate the need for the read loop and the substitutions. Mind that asking for indirection in POSIX without eval has to be paid for with clunkiness! So don't beat me!
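A rough sketch of that sed variant, under the same assumption the read loop makes (each variable appears in set's output on one line as name='value'), might be:
print_arg () {
    # print the value of the variable whose name is passed as $1
    set | sed -n "s/^$1='\(.*\)'\$/\1/p"
}
print_arg foo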
Even though the answer's already accepted, here's another method for those who need to preserve newlines and special characters like Escape ( \033 ): Storing the variable in base64.
You need: bc, wc, echo, tail, tr, uuencode, uudecode
Example
#!/bin/sh
#====== Definition =======#
varA="a
b
c"
# uuencode the variable
varB="`echo "$varA" | uuencode -m -`"
# Skip the first line of the uuencode output.
varB="`NUM=\`(echo "$varB"|wc -l|tr -d "\n"; echo -1)|bc \`; echo "$varB" | tail -n $NUM)`"
#====== Access =======#
namevar1=varB
namevar2=varA
echo simple eval:
eval "echo \$$namevar2"
echo simple echo:
echo $varB
echo precise echo:
echo "$varB"
echo echo of base64
eval "echo \$$namevar1"
echo echo of base64 - with updated newlines
eval "echo \$$namevar1 | tr ' ' '\n'"
echo echo of un-based, using sh instead of eval (but could be made with eval, too)
export $namevar1
sh -c "(echo 'begin-base64 644 -'; echo \$$namevar1 | tr ' ' '\n' )|uudecode"
Result
simple eval:
a b c
simple echo:
YQpiCmMK ====
precise echo:
YQpiCmMK
====
echo of base64
YQpiCmMK ====
echo of base64 - with updated newlines
YQpiCmMK
====
echo of un-based, using sh instead of eval (but could be made with eval, too)
a
b
c
Alternative
You could also use the set command and parse its output; with that, you don't need to treat the variable in a special way before it's accessed.
A safer solution with eval:
v=1
valid_var_name='[[:alpha:]_][[:alnum:]_]*$'
print_arg() {
local arg=$1
if ! expr "$arg" : "$valid_var_name" >/dev/null; then
echo "$0: invalid variable name ($arg)" >&2
exit 1
fi
local argval
eval argval=\$$arg
echo "$argval"
}
print_arg v
print_arg 'v; echo test'
Inspired by the following answer.
I have a bash script that is being used in a CGI. The CGI sets the $QUERY_STRING environment variable by reading everything after the ? in the URL. For example, http://example.com?a=123&b=456&c=ok sets QUERY_STRING=a=123&b=456&c=ok.
Somewhere I found the following ugliness:
b=$(echo "$QUERY_STRING" | sed -n 's/^.*b=\([^&]*\).*$/\1/p' | sed "s/%20/ /g")
which will set $b to whatever was found in $QUERY_STRING for b. However, my script has grown to have over ten input parameters. Is there an easier way to automatically convert the parameters in $QUERY_STRING into environment variables usable by bash?
Maybe I'll just use a for loop of some sort, but it'd be even better if the script was smart enough to automatically detect each parameter and maybe build an array that looks something like this:
${parm[a]}=123
${parm[b]}=456
${parm[c]}=ok
How could I write code to do that?
Try this:
saveIFS=$IFS
IFS='=&'
parm=($QUERY_STRING)
IFS=$saveIFS
Now you have this:
parm[0]=a
parm[1]=123
parm[2]=b
parm[3]=456
parm[4]=c
parm[5]=ok
In Bash 4, which has associative arrays, you can do this (using the array created above):
declare -A array
for ((i=0; i<${#parm[#]}; i+=2))
do
array[${parm[i]}]=${parm[i+1]}
done
which will give you this:
array[a]=123
array[b]=456
array[c]=ok
Edit:
To use indirection in Bash 2 and later (using the parm array created above):
for ((i=0; i<${#parm[#]}; i+=2))
do
declare var_${parm[i]}=${parm[i+1]}
done
Then you will have:
var_a=123
var_b=456
var_c=ok
You can access these directly:
echo $var_a
or indirectly:
for p in a b c
do
name="var$p"
echo ${!name}
done
If possible, it's better to avoid indirection since it can make code messy and be a source of bugs.
You can break $QUERY down using IFS, for example by setting it to &:
$ QUERY="a=123&b=456&c=ok"
$ echo $QUERY
a=123&b=456&c=ok
$ IFS="&"
$ set -- $QUERY
$ echo $1
a=123
$ echo $2
b=456
$ echo $3
c=ok
$ array=($@)
$ for i in "${array[#]}"; do IFS="=" ; set -- $i; echo $1 $2; done
a 123
b 456
c ok
And you can save to a hash/dictionary in Bash 4+
$ declare -A hash
$ for i in "${array[#]}"; do IFS="=" ; set -- $i; hash[$1]=$2; done
$ echo ${hash["b"]}
456
Please don't use the evil eval junk.
Here's how you can reliably parse the string and get an associative array:
declare -A param
while IFS='=' read -r -d '&' key value && [[ -n "$key" ]]; do
param["$key"]=$value
done <<<"${QUERY_STRING}&"
If you don't like the key check, you could do this instead:
declare -A param
while IFS='=' read -r -d '&' key value; do
param["$key"]=$value
done <<<"${QUERY_STRING:+"${QUERY_STRING}&"}"
Listing all the keys and values from the array:
for key in "${!param[#]}"; do
echo "$key: ${param[$key]}"
done
I packaged the sed command up into another script:
$ cat getvar.sh
s='s/^.*'${1}'=\([^&]*\).*$/\1/p'
echo $QUERY_STRING | sed -n $s | sed "s/%20/ /g"
and I call it from my main cgi as:
id=`./getvar.sh id`
ds=`./getvar.sh ds`
dt=`./getvar.sh dt`
...etc, etc - you get the idea.
Works for me even with a very basic busybox appliance (my PVR in this case).
To convert the contents of QUERY_STRING into bash variables, use the following command:
eval $(echo ${QUERY_STRING//&/;})
The inner step, echo ${QUERY_STRING//&/;}, substitutes all ampersands with semicolons producing a=123;b=456;c=ok which the eval then evaluates into the current shell.
The result can then be used as bash variables.
echo $a
echo $b
echo $c
The assumptions are:
values will never contain '&'
values will never contain ';'
QUERY_STRING will never contain malicious code
While the accepted answer is probably the most beautiful one, there might be cases where security is super-important, and it also needs to be clearly visible in your script.
In such a case, I first wouldn't use bash for the task, but if it must be done for some reason, it might be better to avoid these new array/dictionary features, because you can't be sure exactly how they are escaped.
In this case, the good old primitive solutions might work:
QS="${QUERY_STRING}"
while [ "${QS}" != "" ]
do
nameval="${QS%%&*}"
QS="${QS#$nameval}"
QS="${QS#&}"
name="${nameval%%=*}"
val="${nameval#$name}"
val="${nameval#=}"
# and here we have $name and $val as names and values
# ...
done
This iterates over the name-value pairs of the QUERY_STRING, and there is no way to circumvent it with any tricky escape sequence: the double quote is a very strong thing in bash, and apart from a single variable-name substitution, which is fully controlled by us, nothing can be tricked.
Furthermore, you can inject your own processing code into "# ...". This enables you to allow only your own, well-defined (and, ideally, short) list of the allowed variable names. Needless to say, LD_PRELOAD shouldn't be one of them. ;-)
Furthermore, no variable will be exported, and exclusively QS, nameval, name and val are used.
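For instance, a minimal sketch of such a whitelist at the "# ..." point (the names a, b and c here are only placeholders for whatever parameters your script actually expects) could be:
case "$name" in
    a) a=$val ;;
    b) b=$val ;;
    c) c=$val ;;
    *) echo "ignoring unexpected parameter: $name" >&2 ;;
esac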
Following the correct answer, I've made some changes myself to support array variables, as in this other question. I also added a decode function whose author I could not find to give credit to.
The code looks somewhat messy, but it works. Changes and other recommendations would be greatly appreciated.
function cgi_decodevar() {
[ $# -ne 1 ] && return
local v t h
# replace all + with whitespace and append %%
t="${1//+/ }%%"
while [ ${#t} -gt 0 -a "${t}" != "%" ]; do
v="${v}${t%%\%*}" # digest up to the first %
t="${t#*%}" # remove digested part
# decode if there is anything to decode and if not at end of string
if [ ${#t} -gt 0 -a "${t}" != "%" ]; then
h=${t:0:2} # save first two chars
t="${t:2}" # remove these
v="${v}"`echo -e \\\\x${h}` # convert hex to special char
fi
done
# return decoded string
echo "${v}"
return
}
saveIFS=$IFS
IFS='=&'
VARS=($QUERY_STRING)
IFS=$saveIFS
for ((i=0; i<${#VARS[#]}; i+=2))
do
curr="$(cgi_decodevar ${VARS[i]})"
next="$(cgi_decodevar ${VARS[i+2]})"
prev="$(cgi_decodevar ${VARS[i-2]})"
value="$(cgi_decodevar ${VARS[i+1]})"
array=${curr%"[]"}
if [ "$curr" == "$next" ] && [ "$curr" != "$prev" ] ;then
j=0
declare var_${array}[$j]="$value"
elif [ $i -gt 1 ] && [ "$curr" == "$prev" ]; then
j=$((j + 1))
declare var_${array}[$j]="$value"
else
declare var_$curr="$value"
fi
done
I would simply replace the & with ;. It then becomes something like:
a=123;b=456;c=ok
So now you need just evaluate and read your vars:
eval `echo "${QUERY_STRING}"|tr '&' ';'`
echo $a
echo $b
echo $c
A nice way to handle CGI query strings is to use Haserl which acts as a wrapper around your Bash cgi script, and offers convenient and secure query string parsing.
To bring this up to date, if you have a recent Bash version then you can achieve this with regular expressions:
q="$QUERY_STRING"
re1='^(\w+=\w+)&?'
re2='^(\w+)=(\w+)$'
declare -A params
while [[ $q =~ $re1 ]]; do
q=${q##*${BASH_REMATCH[0]}}
[[ ${BASH_REMATCH[1]} =~ $re2 ]] && params+=([${BASH_REMATCH[1]}]=${BASH_REMATCH[2]})
done
If you don't want to use associative arrays then just change the penultimate line to do what you want. For each iteration of the loop the parameter is in ${BASH_REMATCH[1]} and its value is in ${BASH_REMATCH[2]}.
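For example, a sketch of that change which declares prefixed shell variables instead of filling an associative array (the var_ prefix is just an illustrative choice) would replace the params+= line with:
[[ ${BASH_REMATCH[1]} =~ $re2 ]] && declare "var_${BASH_REMATCH[1]}=${BASH_REMATCH[2]}"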
Here is the same thing as a function in a short test script that iterates over the array and outputs the query string's parameters and their values:
#!/bin/bash
QUERY_STRING='foo=hello&bar=there&baz=freddy'
get_query_string() {
local q="$QUERY_STRING"
local re1='^(\w+=\w+)&?'
local re2='^(\w+)=(\w+)$'
while [[ $q =~ $re1 ]]; do
q=${q##*${BASH_REMATCH[0]}}
[[ ${BASH_REMATCH[1]} =~ $re2 ]] && eval "$1+=([${BASH_REMATCH[1]}]=${BASH_REMATCH[2]})"
done
}
declare -A params
get_query_string params
for k in "${!params[#]}"
do
v="${params[$k]}"
echo "$k : $v"
done
Note the parameters end up in the array in reverse order (it's associative so that shouldn't matter).
Why not this:
$ echo "${QUERY_STRING}"
name=carlo&last=lanza&city=pfungen-CH
$ saveIFS=$IFS
$ IFS='&'
$ eval $QUERY_STRING
$ IFS=$saveIFS
Now you have this:
name = carlo
last = lanza
city = pfungen-CH
$ echo "name is ${name}"
name is carlo
$ echo "last is ${last}"
last is lanza
$ echo "city is ${city}"
city is pfungen-CH
@giacecco
To include a hyphen in the regex, you could change two lines as follows in the answer from @starfry.
Change these two lines:
local re1='^(\w+=\w+)&?'
local re2='^(\w+)=(\w+)$'
To these two lines:
local re1='^(\w+=(\w+|-|)+)&?'
local re2='^(\w+)=((\w+|-|)+)$'
For all those who couldn't get it working with the posted answers (like me),
this guy figured it out.
Can't upvote his post unfortunately...
Let me repost the code here real quick:
#!/bin/sh
if [ "$REQUEST_METHOD" = "POST" ]; then
if [ "$CONTENT_LENGTH" -gt 0 ]; then
read -n $CONTENT_LENGTH POST_DATA <&0
fi
fi
#echo "$POST_DATA" > data.bin
IFS='=&'
set -- $POST_DATA
#2- Value1
#4- Value2
#6- Value3
#8- Value4
echo $2 $4 $6 $8
echo "Content-type: text/html"
echo ""
echo "<html><head><title>Saved</title></head><body>"
echo "Data received: $POST_DATA"
echo "</body></html>"
Hope this is of help for anybody.
Cheers
Actually I liked bolt's answer, so I made a version which works with Busybox as well (ash in Busybox does not support here string).
This code will accept key1 and key2 parameters, all others will be ignored.
while IFS= read -r -d '&' KEYVAL && [[ -n "$KEYVAL" ]]; do
case ${KEYVAL%=*} in
key1) KEY1=${KEYVAL#*=} ;;
key2) KEY2=${KEYVAL#*=} ;;
esac
done <<END
$(echo "${QUERY_STRING}&")
END
One can use bash-cgi.sh, which processes:
the query string into the $QUERY_STRING_GET key and value array;
the post request data (x-www-form-urlencoded) into the $QUERY_STRING_POST key and value array;
the cookies data into the $HTTP_COOKIES key and value array.
It demands bash version 4.0 or higher (to define the key and value arrays above).
All processing is done by bash only (i.e. in one process), without any external dependencies or additional process invocations.
It has:
a check for the maximum length of data that can be passed to its input and processed as query string and cookies;
a redirect() procedure to produce a redirect to itself with the extension changed to .html (useful for one-page sites);
an http_header_tail() procedure to output the last two lines of the HTTP(S) response header;
a sanitizer of the $REMOTE_ADDR value against possible injections;
a parser and evaluator of escaped UTF-8 symbols embedded in the values passed to $QUERY_STRING_GET, $QUERY_STRING_POST and $HTTP_COOKIES;
a sanitizer of the $QUERY_STRING_GET, $QUERY_STRING_POST and $HTTP_COOKIES values against possible SQL injection (escaping like the mysql_real_escape_string PHP function does, plus escaping of # and $).
It is available here:
https://github.com/VladimirBelousov/fancy_scripts
This works in dash using a for-in loop:
IFS='&'
for f in $query_string; do
value=${f##*=}
key=${f%%=*}
# if you need environment variable -> eval "qs_$key=$value"
done
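A minimal usage sketch (the query_string value here is just a made-up example; IFS is saved and restored in case the script continues afterwards):
query_string='a=123&b=456&c=ok'
old_ifs=$IFS
IFS='&'
for f in $query_string; do
    value=${f##*=}
    key=${f%%=*}
    eval "qs_$key=$value"
done
IFS=$old_ifs
echo "$qs_a $qs_b $qs_c"   # 123 456 ok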