Is there a way to avoid positional arguments in bash?

I have to write a function in bash. The function will take about 7 arguments. I know that I can call a function with parameters like this:
function_name $arg1 $arg2
And I can refer to the parameters inside the function like this:
function_name () {
echo "Parameter #1 is $1"
}
My question is: is there a better way to refer to the parameters inside the function? Can I avoid the $1, $2, $3, ... thing and simply use $arg1, $arg2, ...?
Is there a proper method for this or do I need to re-assign these parameters to some other variables inside the function? E.g.:
function_name () {
ARG1=$1
echo "Parameter #1 is $ARG1"
}
Any example would be much appreciated.

The common way of doing that is assigning the arguments to local variables in the function, i.e.:
copy() {
local from=${1}
local to=${2}
# ...
}
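For instance, a filled-in version might look like this (the cp body and the -- guard are illustrative, not part of the original answer):
copy() {
local from=${1}
local to=${2}
cp -- "$from" "$to" # -- guards against filenames that start with a dash
}
copy /tmp/a /tmp/b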
Another solution may be getopts-style option parsing.
copy() {
local arg from to
while getopts 'f:t:' arg
do
case ${arg} in
f) from=${OPTARG};;
t) to=${OPTARG};;
*) return 1 # illegal option
esac
done
}
copy -f /tmp/a -t /tmp/b
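One caveat worth adding: getopts keeps its position in the shell variable OPTIND, which is not reset between calls, so a function like this parses correctly only on its first invocation unless OPTIND is made local. A minimal sketch:
copy() {
local arg from to
local OPTIND=1 # reset so a second call to copy() starts parsing at $1 again
while getopts 'f:t:' arg
do
case ${arg} in
f) from=${OPTARG};;
t) to=${OPTARG};;
*) return 1 # illegal option
esac
done
}
copy -f /tmp/a -t /tmp/b
copy -f /tmp/c -t /tmp/d # works only because OPTIND was made local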
Sadly, the getopts builtin can't handle long options, which would be more readable, i.e.:
copy --from /tmp/a --to /tmp/b
For that, you either need to use the external getopt program (which I think has long option support only on GNU systems) or implement the long option parser by hand, i.e.:
copy() {
local from to
while [[ ${1} ]]; do
case "${1}" in
--from)
from=${2}
shift
;;
--to)
to=${2}
shift
;;
*)
echo "Unknown parameter: ${1}" >&2
return 1
esac
if ! shift; then
echo 'Missing parameter argument.' >&2
return 1
fi
done
}
copy --from /tmp/a --to /tmp/b
Also see: using getopts in bash shell script to get long and short command line options
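For completeness, here is a hedged sketch of the external-getopt route; it assumes GNU getopt from util-linux (BSD getopt lacks long-option support):
copy() {
local parsed from to
parsed=$(getopt -o f:t: --long from:,to: -n copy -- "$@") || return 1
eval set -- "$parsed" # re-set the positional parameters, now normalized
while true; do
case "$1" in
-f|--from) from=$2; shift 2 ;;
-t|--to) to=$2; shift 2 ;;
--) shift; break ;;
esac
done
echo "from=${from} to=${to}"
}
copy --from /tmp/a --to /tmp/b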
You can also be lazy, and just pass the 'variables' as arguments to the function, i.e.:
copy() {
local "${#}"
# ...
}
copy from=/tmp/a to=/tmp/b
and you'll have ${from} and ${to} in the function as local variables.
Just note that the same issue applies as with the global-variable approach below: if a particular variable is not passed, it will be inherited from the parent environment. You may want to add a 'safety line' like:
copy() {
local from to # reset first
local "${#}"
# ...
}
to ensure that ${from} and ${to} will be unset when not passed.
And if something truly bad is what you're after, you could also assign the arguments as global variables when invoking the function, i.e.:
from=/tmp/a to=/tmp/b copy
Then you could just use ${from} and ${to} within the copy() function. Just note that you should then always pass all parameters. Otherwise, a random variable may leak into the function.
from= to=/tmp/b copy # safe
to=/tmp/b copy # unsafe: ${from} may be declared elsewhere
If you have bash 4.1 (I think), you can also try using associative arrays. It will allow you to pass named arguments but it will be ugly. Something like:
declare -A args=( [from]=/tmp/a [to]=/tmp/b )
copy args
And then in copy(), you'd need to grab the array.
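A minimal sketch of grabbing it with a nameref (this needs bash 4.3 for local -n; with 4.1 you'd have to eval the expansion instead):
copy() {
local -n _args=$1 # _args becomes an alias for the caller's associative array
echo "from=${_args[from]} to=${_args[to]}"
}
declare -A args=( [from]=/tmp/a [to]=/tmp/b )
copy args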

You can always pass things through the environment:
#!/bin/sh
foo() {
echo arg1 = "$arg1"
echo arg2 = "$arg2"
}
arg1=banana arg2=apple foo

All you have to do is name variables on the way in to the function call.
function test() {
echo $a
}
a='hello world' test
#prove the variable didn't leak
echo $a .
This isn't just a feature of functions; you could have that function in its own script, call a='hello world' test.sh, and it would work just the same.
As an extra little bit of fun, you can combine this method with positional arguments (say you were making a script and some users mightn't know the variable names).
Heck, why not let it have defaults for those arguments too? Well sure, easy peasy!
function test2() {
[[ -n "$1" ]] && local a="$1"; [[ -z "$a" ]] && local a='hi'
[[ -n "$2" ]] && local b="$2"; [[ -z "$b" ]] && local b='bye'
echo $a $b
}
#see the defaults
test2
#use positional as usual
test2 '' there
#use named parameter
a=well test2
#mix it up
b=one test2 nice
#prove the variables didn't leak
echo $a $b .
Note that if test was its own script, you don't need to use the local keyword.

Shell functions have full access to any variable available in their calling scope, except for those variable names that are used as local variables inside the function itself. In addition, any non-local variable set within a function is available on the outside after the function has been called. Consider the following example:
A=aaa
B=bbb
echo "A=$A B=$B C=$C"
example() {
echo "example(): A=$A B=$B C=$C"
A=AAA
local B=BBB
C=CCC
echo "example(): A=$A B=$B C=$C"
}
example
echo "A=$A B=$B C=$C"
This snippet has the following output:
A=aaa B=bbb C=
example(): A=aaa B=bbb C=
example(): A=AAA B=BBB C=CCC
A=AAA B=bbb C=CCC
The obvious disadvantage of this approach is that functions are not self-contained any more and that setting a variable outside a function may have unintended side-effects. It would also make things harder if you wanted to pass data to a function without assigning it to a variable first, since this function is not using positional parameters any more.
The most common way to handle this is to use local variables for arguments and any temporary variable within a function:
example() {
local A="$1" B="$2" C="$3" TMP="/tmp"
...
}
This avoids polluting the shell namespace with function-local variables.

I think I have a solution for you.
With a few tricks you can actually pass named parameters to functions, along with arrays.
The method I developed allows you to access parameters passed to a function like this:
testPassingParams() {
@var hello
l=4 @array anArrayWithFourElements
l=2 @array anotherArrayWithTwo
@var anotherSingle
@reference table # references only work in bash >=4.3
@params anArrayOfVariedSize
test "$hello" = "$1" && echo correct
#
test "${anArrayWithFourElements[0]}" = "$2" && echo correct
test "${anArrayWithFourElements[1]}" = "$3" && echo correct
test "${anArrayWithFourElements[2]}" = "$4" && echo correct
# etc...
#
test "${anotherArrayWithTwo[0]}" = "$6" && echo correct
test "${anotherArrayWithTwo[1]}" = "$7" && echo correct
#
test "$anotherSingle" = "$8" && echo correct
#
test "${table[test]}" = "works"
table[inside]="adding a new value"
#
# I'm using * just in this example:
test "${anArrayOfVariedSize[*]}" = "${*:10}" && echo correct
}
fourElements=( a1 a2 "a3 with spaces" a4 )
twoElements=( b1 b2 )
declare -A assocArray
assocArray[test]="works"
testPassingParams "first" "${fourElements[#]}" "${twoElements[#]}" "single with spaces" assocArray "and more... " "even more..."
test "${assocArray[inside]}" = "adding a new value"
In other words, not only can you call your parameters by their names (which makes for much more readable code), you can actually pass arrays (and references to variables - this feature works only in bash 4.3 though)! Plus, the mapped variables are all in the local scope, just as $1 (and others).
The code that makes this work is pretty light and works both in bash 3 and bash 4 (these are the only versions I've tested it with). If you're interested in more tricks like this that make developing with bash much nicer and easier, you can take a look at my Bash Infinity Framework, the code below was developed for that purpose.
Function.AssignParamLocally() {
local commandWithArgs=( $1 )
local command="${commandWithArgs[0]}"
shift
if [[ "$command" == "trap" || "$command" == "l="* || "$command" == "_type="* ]]
then
paramNo+=-1
return 0
fi
if [[ "$command" != "local" ]]
then
assignNormalCodeStarted=true
fi
local varDeclaration="${commandWithArgs[1]}"
if [[ $varDeclaration == '-n' ]]
then
varDeclaration="${commandWithArgs[2]}"
fi
local varName="${varDeclaration%%=*}"
# var value is only important if making an object later on from it
local varValue="${varDeclaration#*=}"
if [[ ! -z $assignVarType ]]
then
local previousParamNo=$(expr $paramNo - 1)
if [[ "$assignVarType" == "array" ]]
then
# passing array:
execute="$assignVarName=( \"\${#:$previousParamNo:$assignArrLength}\" )"
eval "$execute"
paramNo+=$(expr $assignArrLength - 1)
unset assignArrLength
elif [[ "$assignVarType" == "params" ]]
then
execute="$assignVarName=( \"\${#:$previousParamNo}\" )"
eval "$execute"
elif [[ "$assignVarType" == "reference" ]]
then
execute="$assignVarName=\"\$$previousParamNo\""
eval "$execute"
elif [[ ! -z "${!previousParamNo}" ]]
then
execute="$assignVarName=\"\$$previousParamNo\""
eval "$execute"
fi
fi
assignVarType="$__capture_type"
assignVarName="$varName"
assignArrLength="$__capture_arrLength"
}
Function.CaptureParams() {
__capture_type="$_type"
__capture_arrLength="$l"
}
alias @trapAssign='Function.CaptureParams; trap "declare -i \"paramNo+=1\"; Function.AssignParamLocally \"\$BASH_COMMAND\" \"\$@\"; [[ \$assignNormalCodeStarted = true ]] && trap - DEBUG && unset assignVarType && unset assignVarName && unset assignNormalCodeStarted && unset paramNo" DEBUG; '
alias @param='@trapAssign local'
alias @reference='_type=reference @trapAssign local -n'
alias @var='_type=var @param'
alias @params='_type=params @param'
alias @array='_type=array @param'

I was personally hoping to see some sort of syntax like
func(a b){
echo $a
echo $b
}
But since that's not a thing, and I see quite a few references to global variables (not without the caveat of scoping and naming conflicts), I'll share my approach.
Using the copy function from Michal's answer:
copy(){
cp $from $to
}
from=/tmp/a
to=/tmp/b
copy
This is bad, because from and to are such broad words that any number of functions could use this. You could quickly end up with a naming conflict or a "leak" on your hands.
letter(){
echo "From: $from"
echo "To: $to"
echo
echo "$1"
}
to=Emily
letter "Hello Emily, you're fired for missing two days of work."
# Result:
# From: /tmp/a
# To: Emily
# Hello Emily, you're fired for missing two days of work.
So my approach is to "namespace" them. I name the variable after the function and delete it after the function is done with it. Of course, I only use it for optional values that have default values. Otherwise, I just use positional args.
copy(){
if [[ $copy_from ]] && [[ $copy_to ]]; then
cp $copy_from $copy_to
unset copy_from copy_to
fi
}
copy_from=/tmp/a
copy_to=/tmp/b
copy # Copies /tmp/a to /tmp/b
copy # Does nothing, as it ought to
letter "Emily, you're 'not' re-hired for the 'not' bribe ;)"
# From: (no /tmp/a here!)
# To:
# Emily, you're 'not' re-hired for the 'not' bribe ;)
I would make a terrible boss...
In practice, my function names are more elaborate than "copy" or "letter".
The most recent example to my memory is get_input(), which has gi_no_sort and gi_prompt.
gi_no_sort is a true/false value that determines whether the completion suggestions are sorted. Defaults to true.
gi_prompt is a string that is...well, that's self-explanatory. Defaults to "".
The actual arguments the function takes are the source of the aforementioned 'completion suggestions' for the input prompt. Since that list is taken from $@ inside the function, the 'named args' are optional[1], and there's no obvious way to distinguish a string meant as a completion suggestion from a boolean or a prompt message (or really anything space-separated in bash, for that matter[2]); the above solution ended up saving me a lot of trouble. A sketch of the pattern follows the notes below.
notes:
[1] So a hard-coded shift and $1, $2, etc. are out of the question.
[2] E.g. is "0 Enter a command: {1..9} $(ls)" a value of 0, "Enter a command:", and a set of 1 2 3 4 5 6 7 8 9 <directory contents>? Or are "0", "Enter", "a", and "command:" part of that set as well? Bash will assume the latter whether you like it or not.
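A hedged sketch of that pattern, with gi_no_sort and gi_prompt as described (the body is illustrative; the real function builds a completion prompt):
get_input() {
local no_sort="${gi_no_sort:-true}"
local prompt="${gi_prompt:-}"
unset gi_no_sort gi_prompt # clean up the namespaced globals, as described
echo "no_sort=${no_sort} prompt='${prompt}' suggestions: $*"
}
gi_prompt='Enter a command: '
get_input {1..9} "$(ls)"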

Arguments are passed to functions as a tuple of individual items, so they have no names as such, just positions. This allows some interesting possibilities, like the loop below, but it does mean that you are stuck with $1, $2, etc. As to whether to map them to better names, the question comes down to how big the function is and how much clearer the names will make the code. If it's complex, then mapping meaningful names ($BatchID, $FirstName, $SourceFilePath) is a good idea. For simple stuff, though, it probably isn't necessary. I certainly wouldn't bother if you are using names like $arg1.
Now, if you just want to echo back the parameters, you can iterate over them:
for arg in "$@"
do
echo "$arg"
done
Just a fun fact; unless you are processing a list, you are probably interested in something more useful.
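If you do map positions to meaningful names, a sketch might look like this (the function and variable names are illustrative, echoing the ones mentioned above):
process_batch() {
local batch_id=$1 first_name=$2 source_file_path=$3
echo "batch ${batch_id}: ${first_name} <- ${source_file_path}"
}
process_batch 42 Ada /tmp/input.csv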

This is an older topic, but I'd still like to share the function below (requires bash 4). It parses named arguments and sets the variables in the script's environment. Just make sure you have sane default values for all parameters you need. The export statement at the end could also just be an eval. It's great in combination with shift for extending existing scripts that already take a few positional parameters, when you don't want to change the syntax but still want to add some flexibility.
parseOptions()
{
args=("$#")
for opt in "${args[#]}"; do
if [[ ! "${opt}" =~ .*=.* ]]; then
echo "badly formatted option \"${opt}\" should be: option=value, stopping..."
return 1
fi
local var="${opt%%=*}"
local value="${opt#*=}"
export ${var}="${value}"
done
return 0
}
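A hypothetical invocation, assuming the script was run as ./myscript.sh mode=fast retries=3:
parseOptions "$@" || exit 1
echo "mode=${mode:-default} retries=${retries:-0}"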

Related

How is the $@ array different from a regular array?

I wrote a function to check whether any argument(s) were passed to the script, so I had to create an alias variable for $@, because inside the function that would refer to the function's own arguments.
Here's the code:
script_args=$@
is_arg_passed() {
local passed=false
for passed_arg in ${script_args}; do
for arg in "$#"; do
if [[ "${passed_arg}" == "${arg}" ]]; then
passed=true
break
fi
done
[[ "${passed}" == true ]] && break
done
echo "${passed}"
}
Though I definitely would like to learn how it can be implemented shorter, that's not the topic of my question (though some advice would be appreciated ;]).
My question is related to the for passed_arg in ${script_args}; do line:
Why does it not work when script_args is expanded within quotes, i.e. "${script_args}", while "${@}" or "$@" does?
Only ${script_args} works.
So how is $@ different from a regular array (like (a b c)), and how is script_args different from $@?
What is the catch?
$@ is not an array, though it is array-like.
The assignment script_args=$@ simply creates a regular parameter whose value is the contents of $@ joined with single spaces. If you really want an array with the same contents, use
script_args=( "$@" ) # The quotes are critical!
is_arg_passed() {
local passed=false
for passed_arg in "${script_args[@]}"; do # So are these quotes!
for arg in "$@"; do
if [[ "${passed_arg}" == "${arg}" ]]; then
passed=true
break
fi
done
[[ "${passed}" == true ]] && break
done
echo "${passed}"
}
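A usage sketch (the flag names are examples):
script_args=( "$@" )
if [[ "$(is_arg_passed --verbose)" == true ]]; then
echo "verbose mode requested"
fi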
This kind of containment check can also be done using an associative array (as long as the arguments are not empty strings).
declare -A script_args
for arg; do
# The value doesn't matter; we'll only be checking
# if the key exists.
script_args["$arg"]=
done
is_arg_passed() {
for arg; do
if ! [[ -v "script_args[$arg]" ]]; then
echo false
return 1
fi
done
echo true
return 0
}
They're not different in how you compare them, because an array, much like the positional parameters, is expanded with quotes: "${script_args[@]}"
Also, you're storing the arguments wrong. With script_args=$@ you store a string value in script_args: the values of $@ joined with spaces.
To store them as an array, use script_args=("$@"). Read the Arrays section of the Bash manual to learn more.

How to pass an argument passed to a bashrc function? [duplicate]

I am trying to search for how to pass parameters in a Bash function, but what comes up is always how to pass parameters from the command line.
I would like to pass parameters within my script. I tried:
myBackupFunction("..", "...", "xx")
function myBackupFunction($directory, $options, $rootPassword) {
...
}
But the syntax is not correct. How can I pass a parameter to my function?
There are two typical ways of declaring a function. I prefer the second approach.
function function_name {
command...
}
or
function_name () {
command...
}
To call a function with arguments:
function_name "$arg1" "$arg2"
The function refers to passed arguments by their position (not by name), that is $1, $2, and so forth. $0 is the name of the script itself.
Example:
function_name () {
echo "Parameter #1 is $1"
}
Also, you need to call your function after it is declared.
#!/usr/bin/env sh
foo 1 # this will fail because foo has not been declared yet.
foo() {
echo "Parameter #1 is $1"
}
foo 2 # this will work.
Output:
./myScript.sh: line 2: foo: command not found
Parameter #1 is 2
Reference: Advanced Bash-Scripting Guide.
Knowledge of high level programming languages (C/C++, Java, PHP, Python, Perl, etc.) would suggest to the layman that Bourne Again Shell (Bash) functions should work like they do in those other languages.
Instead, Bash functions work like shell commands and expect arguments to be passed to them in the same way one might pass an option to a shell command (e.g. ls -l). In effect, function arguments in Bash are treated as positional parameters ($1, $2..$9, ${10}, ${11}, and so on). This is no surprise considering how getopts works. Do not use parentheses to call a function in Bash.
(Note: I happen to be working on OpenSolaris at the moment.)
# Bash style declaration for all you PHP/JavaScript junkies. :-)
# $1 is the directory to archive
# $2 is the name of the tar and zipped file when all is done.
function backupWebRoot ()
{
tar -cvf - "$1" | zip -n .jpg:.gif:.png "$2" - 2>> $errorlog &&
echo -e "\nTarball created!\n"
}
# sh style declaration for the purist in you. ;-)
# $1 is the directory to archive
# $2 is the name of the tar and zipped file when all is done.
backupWebRoot ()
{
tar -cvf - "$1" | zip -n .jpg:.gif:.png "$2" - 2>> $errorlog &&
echo -e "\nTarball created!\n"
}
# In the actual shell script
# $0 $1 $2
backupWebRoot ~/public/www/ webSite.tar.zip
Want to use names for variables? Just do something like this.
local filename=$1 # The keyword declare can be used, but local is semantically more specific.
Be careful, though. If an argument to a function has a space in it, you may want to do this instead! Otherwise, $1 might not be what you think it is.
local filename="$1" # Just to be on the safe side. Although, if $1 was an integer, then what? Is that even possible? Humm.
Want to pass an array to a function by value?
callingSomeFunction "${someArray[#]}" # Expands to all array elements.
Inside the function, handle the arguments like this.
function callingSomeFunction ()
{
for value in "$#" # You want to use "$#" here, not "$*" !!!!!
do
:
done
}
Need to pass a value and an array, but still use "$@" inside the function?
function linearSearch ()
{
local myVar="$1"
shift 1 # Removes $1 from the parameter list
for value in "$#" # Represents the remaining parameters.
do
if [[ $value == $myVar ]]
then
echo -e "Found it!\t... after a while."
return 0
fi
done
return 1
}
linearSearch "$someStringValue" "${someArray[@]}"
In Bash 4.3 and above, you can pass an array to a function by reference by defining the parameter of a function with the -n option.
function callingSomeFunction ()
{
local -n someArray=$1 # also ${1:?} to make the parameter mandatory.
for value in "${someArray[#]}" # Nice!
do
:
done
}
callingSomeFunction myArray # No $ in front of the argument. You pass by name, not expansion / value.
If you prefer named parameters, it's possible (with a few tricks) to actually pass named parameters to functions (also makes it possible to pass arrays and references).
The method I developed allows you to define named parameters passed to a function like this:
function example { args : string firstName , string lastName , integer age } {
echo "My name is ${firstName} ${lastName} and I am ${age} years old."
}
You can also annotate arguments as @required or @readonly, create ...rest arguments, create arrays from sequential arguments (using e.g. string[4]), and optionally list the arguments on multiple lines:
function example {
args
: @required string firstName
: string lastName
: integer age
: string[] ...favoriteHobbies
echo "My name is ${firstName} ${lastName} and I am ${age} years old."
echo "My favorite hobbies include: ${favoriteHobbies[*]}"
}
In other words, not only can you call your parameters by their names (which makes for much more readable code), you can actually pass arrays (and references to variables - this feature works only in Bash 4.3 though)! Plus, the mapped variables are all in the local scope, just as $1 (and others).
The code that makes this work is pretty light and works both in Bash 3 and Bash 4 (these are the only versions I've tested it with). If you're interested in more tricks like this that make developing with bash much nicer and easier, you can take a look at my Bash Infinity Framework, the code below is available as one of its functionalities.
shopt -s expand_aliases
function assignTrap {
local evalString
local -i paramIndex=${__paramIndex-0}
local initialCommand="${1-}"
if [[ "$initialCommand" != ":" ]]
then
echo "trap - DEBUG; eval \"${__previousTrap}\"; unset __previousTrap; unset __paramIndex;"
return
fi
while [[ "${1-}" == "," || "${1-}" == "${initialCommand}" ]] || [[ "${##}" -gt 0 && "$paramIndex" -eq 0 ]]
do
shift # First colon ":" or next parameter's comma ","
paramIndex+=1
local -a decorators=()
while [[ "${1-}" == "#"* ]]
do
decorators+=( "$1" )
shift
done
local declaration=
local wrapLeft='"'
local wrapRight='"'
local nextType="$1"
local length=1
case ${nextType} in
string | boolean) declaration="local " ;;
integer) declaration="local -i" ;;
reference) declaration="local -n" ;;
arrayDeclaration) declaration="local -a"; wrapLeft= ; wrapRight= ;;
assocDeclaration) declaration="local -A"; wrapLeft= ; wrapRight= ;;
"string["*"]") declaration="local -a"; length="${nextType//[a-z\[\]]}" ;;
"integer["*"]") declaration="local -ai"; length="${nextType//[a-z\[\]]}" ;;
esac
if [[ "${declaration}" != "" ]]
then
shift
local nextName="$1"
for decorator in "${decorators[@]}"
do
case ${decorator} in
@readonly) declaration+="r" ;;
@required) evalString+="[[ ! -z \$${paramIndex} ]] || echo \"Parameter '$nextName' ($nextType) is marked as required by '${FUNCNAME[1]}' function.\"; " >&2 ;;
@global) declaration+="g" ;;
esac
done
local paramRange="$paramIndex"
if [[ -z "$length" ]]
then
# ...rest
paramRange="{#:$paramIndex}"
# trim leading ...
nextName="${nextName//\./}"
if [[ "${##}" -gt 1 ]]
then
echo "Unexpected arguments after a rest array ($nextName) in '${FUNCNAME[1]}' function." >&2
fi
elif [[ "$length" -gt 1 ]]
then
paramRange="{#:$paramIndex:$length}"
paramIndex+=$((length - 1))
fi
evalString+="${declaration} ${nextName}=${wrapLeft}\$${paramRange}${wrapRight}; "
# Continue to the next parameter:
shift
fi
done
echo "${evalString} local -i __paramIndex=${paramIndex};"
}
alias args='local __previousTrap=$(trap -p DEBUG); trap "eval \"\$(assignTrap \$BASH_COMMAND)\";" DEBUG;'
Drop the parentheses and commas:
myBackupFunction ".." "..." "xx"
And the function should look like this:
function myBackupFunction() {
# Here $1 is the first parameter, $2 the second, etc.
}
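Put together, a minimal hedged version of the function from the question might be:
function myBackupFunction() {
local directory=$1 options=$2 root_password=$3
echo "Backing up ${directory} with options ${options}"
# Passing a password as an argument is shown only to mirror the question;
# it is visible in the process list, so avoid it in real scripts.
}
myBackupFunction ".." "..." "xx"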
A simple example that covers both cases: passing values from inside the script, and passing through the parameters given when the script itself is executed.
#!/bin/bash
echo "parameterized function example"
function print_param_value(){
value1="${1}" # $1 represent first argument
value2="${2}" # $2 represent second argument
echo "param 1 is ${value1}" # As string
echo "param 2 is ${value2}"
sum=$(($value1+$value2)) # Process them as number
echo "The sum of two value is ${sum}"
}
print_param_value "6" "4" # Space-separated value
# You can also pass parameters during executing the script
print_param_value "$1" "$2" # Parameter $1 and $2 during execution
# Suppose our script name is "param_example".
# Call it like this:
#
# ./param_example 5 5
#
# Now the parameters will be $1=5 and $2=5
It takes two numbers from the user, feeds them to the function called add (in the very last line of the code), and add will sum them up and print them.
#!/bin/bash
read -p "Enter the first value: " x
read -p "Enter the second value: " y
add(){
arg1=$1 # arg1 gets to be the first assigned argument (note there are no spaces)
arg2=$2 # arg2 gets to be the second assigned argument (note there are no spaces)
echo $(($arg1 + $arg2))
}
add x y # Feeding the arguments; this works because arithmetic expansion resolves the names x and y, though add "$x" "$y" would be clearer
Another way to pass named parameters to Bash... is passing by reference. This is supported as of Bash 4.0
#!/bin/bash
function myBackupFunction(){ # directory options destination filename
local directory="$1" options="$2" destination="$3" filename="$4";
echo "tar cz ${!options} ${!directory} | ssh root#backupserver \"cat > /mnt/${!destination}/${!filename}.tgz\"";
}
declare -A backup=([directory]=".." [options]="..." [destination]="backups" [filename]="backup" );
myBackupFunction backup[directory] backup[options] backup[destination] backup[filename];
An alternative syntax for Bash 4.3 is using a nameref.
Although the nameref is a lot more convenient in that it seamlessly dereferences, some older supported distros still ship an older version, so I won't recommend it quite yet.
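For reference, a minimal nameref sketch (requires Bash 4.3+; the keys mirror the example above):
function myBackupFunction() {
local -n cfg=$1 # cfg is now an alias for the caller's associative array
echo "tar cz ${cfg[options]} ${cfg[directory]} | ssh root@backupserver \"cat > /mnt/${cfg[destination]}/${cfg[filename]}.tgz\"";
}
declare -A backup=([directory]=".." [options]="..." [destination]="backups" [filename]="backup" );
myBackupFunction backup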

Evaluate variable at time of function declaration in shell

I'm setting up my shell environments and I want to be able to use some of the same functions/aliases in zsh as in bash. One of these functions opens either .bashrc or .zshrc in an editor (whichever file is relevant), waits for the editor to close, then reloads the rc file.
# a very simplified version of this function
editrc() {
local rcfile=".$(basename $SHELL)rc"
code -w ~/$rcfile
. ~/$rcfile
}
I use the value of rcfile in a few other functions, so I've pulled it out of the function declaration.
_rc=".$(basename $SHELL)rc"
editrc() {
code -w ~/$_rc
. ~/$_rc
}
# ... other functions that use it ...
unset _rc
However, because I'm a neat freak, I want to unset _rc at the end of my script, but I still want my functions to run correctly. Is there a clever way to evaluate $_rc at the time the function is declared?
I know I could use eval and place everything except $_rc instances within single quotes, but that seems like a pain, since the full version of my function uses both single-quotes and double-quotes.
_rc=".$(basename $SHELL)rc"
eval 'editrc() {
echo Here'"'"'s a thing that uses single quotes. As you can see it'"'"'s a pain.
code -w ~/'$_rc'
. ~/'$_rc'
}'
# ... other functions using `_rc`
unset _rc
I'm guessing I could declare my functions, then do some magic with eval "$(declare -f editrc | awk)". It may very well be more pain than it's worth, but I'm always interested in learning new things.
Note: I'd love to generalize this into a utility function that does this.
_myvar=foo
anothervar=bar
myfunc() {
echo $_myvar $anothervar
}
# redeclares myfunc with `$_myvar` expanded, but leaves `$anothervar` as-is
expandfunctionvars myfunc '$_myvar'
Is there a clever way to evaluate $_rc at the time the function is declared?
_rc=".$(basename "$SHELL")rc"
# while you could eval here, source lets you work with a stream
source <(
cat <<EOF
editrc() {
local _rc
# first safely transfer context
$(declare -p _rc)
EOF
# use a quoted here document to do anything inside without caring.
cat <<'EOF'
# do anything else
echo "Here's a thing that uses single quotes. As you can see it's not a pain, just choose proper quoting."
code -w "~/$_rc"
. "~/$_rc"
}
EOF
)
unset _rc
Generally first use declare -p to transfer variables as strings to be evaluated. Then after you "import" variables, use a quoted here document to do anything as in a normal script.
References to read:
<<EOF is a here document. Note the difference in parsing when the here delimiter is quoted vs unquoted.
<(..) is a process substitution
The source command reads a pipe created by process substitution. Inside the process substitution I output the function to be sourced. With the first here document I output the function name definition, with a local for the variable so that it doesn't pollute the global namespace. Then with declare -p I output the variable definition as a properly quoted string, later to be evaluated by source. Then with a quoted here document I output the rest of the function, so that I do not need to care about quoting.
The code is bash specific, I know nothing about zsh and don't use it.
You could do it with eval too:
eval '
editrc() {
local _rc
# first safely trasfer context
'"$(declare -p _rc)"'
# use quoted here string to do anything inside without caring.
# do anything else
echo "Here'\''s a thing that uses single quotes. As you can see it'\''s not a pain, just choose proper quoting."
code -w "~/$_rc"
. "~/$_rc"
}'
But for me using a quoted here document delimiter allows for easier writing.
While KamilCuck was working on their answer, I devised a function that will take in any function name and a set of variable names, expand just those variables, and redeclare the function.
expandFnVars() {
if [[ $# -lt 2 ]]; then
>&2 echo 'expandFnVars requires at least two arguments: the function name and the variable(s) to be expanded'
return 1
fi
local fn="$1"
shift
local vars=("$#")
if [[ -z "$(declare -F $fn 2> /dev/null)" ]]; then
>&2 echo $fn is not a function.
return 1
fi
foundAllVars=true
for v in "${vars[@]}"; do
if [[ -z "$(declare -p $v 2> /dev/null)" ]]; then
>&2 echo $v is not a declared value.
foundAllVars=false
fi
done
[[ $foundAllVars != true ]] && return 1
fn="$(declare -f $fn)"
for v in "${vars[@]}"; do
local val="$(eval 'echo $'$v)" # get the value of the varable represented by $v
val="${val//\"/\\\"}" # escape any double-quotes
val="${val//\\/\\\\\\}" # escape any backslashes
fn="$(echo "$fn" | sed -r 's/"?\$'$v'"?/"'"$val"'"/g')" # replace instances of "$$v" and $$v with $val
done
eval "$fn"
}
Usage:
foo="foo bar"
bar='$foo'
baz=baz
fn() {
echo $bar $baz
}
expandFnVars fn bar
declare -f fn
# prints:
# fn ()
# {
# echo "$foo" $baz
# }
expandFnVars fn foo
declare -f fn
# prints:
# fn ()
# {
# echo "foo bar" $baz
# }
Looking at it now, I see one flaw. Suppose $bar in the original function was in single-quotes. We probably would not want its value to be replaced. This could be fixed by some clever regex lookbehinds to count the number of unescaped 's, but I'm happy with it as-is.

How to parse $QUERY_STRING from a bash CGI script?

I have a bash script that is being used in a CGI. The CGI sets the $QUERY_STRING environment variable by reading everything after the ? in the URL. For example, http://example.com?a=123&b=456&c=ok sets QUERY_STRING=a=123&b=456&c=ok.
Somewhere I found the following ugliness:
b=$(echo "$QUERY_STRING" | sed -n 's/^.*b=\([^&]*\).*$/\1/p' | sed "s/%20/ /g")
which will set $b to whatever was found in $QUERY_STRING for b. However, my script has grown to have over ten input parameters. Is there an easier way to automatically convert the parameters in $QUERY_STRING into environment variables usable by bash?
Maybe I'll just use a for loop of some sort, but it'd be even better if the script was smart enough to automatically detect each parameter and maybe build an array that looks something like this:
${parm[a]}=123
${parm[b]}=456
${parm[c]}=ok
How could I write code to do that?
Try this:
saveIFS=$IFS
IFS='=&'
parm=($QUERY_STRING)
IFS=$saveIFS
Now you have this:
parm[0]=a
parm[1]=123
parm[2]=b
parm[3]=456
parm[4]=c
parm[5]=ok
In Bash 4, which has associative arrays, you can do this (using the array created above):
declare -A array
for ((i=0; i<${#parm[@]}; i+=2))
do
array[${parm[i]}]=${parm[i+1]}
done
which will give you this:
array[a]=123
array[b]=456
array[c]=ok
Edit:
To use indirection in Bash 2 and later (using the parm array created above):
for ((i=0; i<${#parm[@]}; i+=2))
do
declare var_${parm[i]}=${parm[i+1]}
done
Then you will have:
var_a=123
var_b=456
var_c=ok
You can access these directly:
echo $var_a
or indirectly:
for p in a b c
do
name="var$p"
echo ${!name}
done
If possible, it's better to avoid indirection since it can make code messy and be a source of bugs.
You can break $QUERY down using IFS. For example, setting it to &:
$ QUERY="a=123&b=456&c=ok"
$ echo $QUERY
a=123&b=456&c=ok
$ IFS="&"
$ set -- $QUERY
$ echo $1
a=123
$ echo $2
b=456
$ echo $3
c=ok
$ array=($@)
$ for i in "${array[@]}"; do IFS="=" ; set -- $i; echo $1 $2; done
a 123
b 456
c ok
And you can save to a hash/dictionary in Bash 4+
$ declare -A hash
$ for i in "${array[#]}"; do IFS="=" ; set -- $i; hash[$1]=$2; done
$ echo ${hash["b"]}
456
Please don't use the evil eval junk.
Here's how you can reliably parse the string and get an associative array:
declare -A param
while IFS='=' read -r -d '&' key value && [[ -n "$key" ]]; do
param["$key"]=$value
done <<<"${QUERY_STRING}&"
If you don't like the key check, you could do this instead:
declare -A param
while IFS='=' read -r -d '&' key value; do
param["$key"]=$value
done <<<"${QUERY_STRING:+"${QUERY_STRING}&"}"
Listing all the keys and values from the array:
for key in "${!param[#]}"; do
echo "$key: ${param[$key]}"
done
I packaged the sed command up into another script:
$ cat getvar.sh
s='s/^.*'${1}'=\([^&]*\).*$/\1/p'
echo $QUERY_STRING | sed -n $s | sed "s/%20/ /g"
and I call it from my main cgi as:
id=`./getvar.sh id`
ds=`./getvar.sh ds`
dt=`./getvar.sh dt`
...etc, etc - you get the idea.
works for me even with a very basic busybox appliance (my PVR in this case).
To convert the contents of QUERY_STRING into bash variables, use the following command:
eval $(echo ${QUERY_STRING//&/;})
The inner step, echo ${QUERY_STRING//&/;}, substitutes all ampersands with semicolons producing a=123;b=456;c=ok which the eval then evaluates into the current shell.
The result can then be used as bash variables.
echo $a
echo $b
echo $c
The assumptions are:
values will never contain '&'
values will never contain ';'
QUERY_STRING will never contain malicious code
While the accepted answer is probably the most beautiful one, there might be cases where security is super-important, and it needs to be clearly visible from the script itself.
In such a case, first I wouldn't use bash for the task, but if it must be done for some reason, it might be better to avoid the newer array and dictionary features, because you can't be sure how exactly they are escaped.
In this case, the good old primitive solutions might work:
QS="${QUERY_STRING}"
while [ "${QS}" != "" ]
do
nameval="${QS%%&*}"
QS="${QS#$nameval}"
QS="${QS#&}"
name="${nameval%%=*}"
val="${nameval#$name}"
val="${nameval#=}"
# and here we have $name and $val as names and values
# ...
done
This iterates over the name-value pairs of QUERY_STRING, and there is no way to circumvent it with any tricky escape sequence: double quotes are a very strong thing in bash, and apart from a single variable name substitution, which is fully controlled by us, nothing can be tricked.
Furthermore, you can inject your own processing code into "# ...". This enables you to allow only your own, well-defined (and, ideally, short) list of the allowed variable names. Needless to say, LD_PRELOAD shouldn't be one of them. ;-)
Furthermore, no variable will be exported, and exclusively QS, nameval, name and val are used.
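A whitelist along those lines could be dropped in at the marked spot in the loop (a, b and c are placeholders for your allowed parameter names):
case "${name}" in
a|b|c) declare "param_${name}=${val}" ;; # accept only known names, prefixed
*) : ;; # silently drop anything else
esac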
Following the correct answer, I've made some changes myself to support array variables, like in this other question. I also added a decode function, whose author I cannot find to give credit.
The code appears somewhat messy, but it works. Changes and other recommendations would be greatly appreciated.
function cgi_decodevar() {
[ $# -ne 1 ] && return
local v t h
# replace all + with whitespace and append %%
t="${1//+/ }%%"
while [ ${#t} -gt 0 -a "${t}" != "%" ]; do
v="${v}${t%%\%*}" # digest up to the first %
t="${t#*%}" # remove digested part
# decode if there is anything to decode and if not at end of string
if [ ${#t} -gt 0 -a "${t}" != "%" ]; then
h=${t:0:2} # save first two chars
t="${t:2}" # remove these
v="${v}"`echo -e \\\\x${h}` # convert hex to special char
fi
done
# return decoded string
echo "${v}"
return
}
saveIFS=$IFS
IFS='=&'
VARS=($QUERY_STRING)
IFS=$saveIFS
for ((i=0; i<${#VARS[@]}; i+=2))
do
curr="$(cgi_decodevar ${VARS[i]})"
next="$(cgi_decodevar ${VARS[i+2]})"
prev="$(cgi_decodevar ${VARS[i-2]})"
value="$(cgi_decodevar ${VARS[i+1]})"
array=${curr%"[]"}
if [ "$curr" == "$next" ] && [ "$curr" != "$prev" ] ;then
j=0
declare var_${array}[$j]="$value"
elif [ $i -gt 1 ] && [ "$curr" == "$prev" ]; then
j=$((j + 1))
declare var_${array}[$j]="$value"
else
declare var_$curr="$value"
fi
done
I would simply replace the & with ;. The result becomes something like:
a=123;b=456;c=ok
So now you need just evaluate and read your vars:
eval `echo "${QUERY_STRING}"|tr '&' ';'`
echo $a
echo $b
echo $c
A nice way to handle CGI query strings is to use Haserl which acts as a wrapper around your Bash cgi script, and offers convenient and secure query string parsing.
To bring this up to date, if you have a recent Bash version then you can achieve this with regular expressions:
q="$QUERY_STRING"
re1='^(\w+=\w+)&?'
re2='^(\w+)=(\w+)$'
declare -A params
while [[ $q =~ $re1 ]]; do
q=${q##*${BASH_REMATCH[0]}}
[[ ${BASH_REMATCH[1]} =~ $re2 ]] && params+=([${BASH_REMATCH[1]}]=${BASH_REMATCH[2]})
done
If you don't want to use associative arrays then just change the penultimate line to do what you want. For each iteration of the loop the parameter is in ${BASH_REMATCH[1]} and its value is in ${BASH_REMATCH[2]}.
Here is the same thing as a function, in a short test script that iterates over the array and outputs the query string's parameters and their values.
#!/bin/bash
QUERY_STRING='foo=hello&bar=there&baz=freddy'
get_query_string() {
local q="$QUERY_STRING"
local re1='^(\w+=\w+)&?'
local re2='^(\w+)=(\w+)$'
while [[ $q =~ $re1 ]]; do
q=${q##*${BASH_REMATCH[0]}}
[[ ${BASH_REMATCH[1]} =~ $re2 ]] && eval "$1+=([${BASH_REMATCH[1]}]=${BASH_REMATCH[2]})"
done
}
declare -A params
get_query_string params
for k in "${!params[#]}"
do
v="${params[$k]}"
echo "$k : $v"
done
Note the parameters end up in the array in reverse order (it's associative so that shouldn't matter).
why not this
$ echo "${QUERY_STRING}"
name=carlo&last=lanza&city=pfungen-CH
$ saveIFS=$IFS
$ IFS='&'
$ eval $QUERY_STRING
$ IFS=$saveIFS
now you have this
name = carlo
last = lanza
city = pfungen-CH
$ echo "name is ${name}"
name is carlo
$ echo "last is ${last}"
last is lanza
$ echo "city is ${city}"
city is pfungen-CH
@giacecco
To include a hyphen in the regex, you could change the two lines as follows in the answer from @starfry.
Change these two lines:
local re1='^(\w+=\w+)&?'
local re2='^(\w+)=(\w+)$'
To these two lines:
local re1='^(\w+=(\w+|-|)+)&?'
local re2='^(\w+)=((\w+|-|)+)$'
For all those who couldn't get it working with the posted answers (like me),
this guy figured it out.
Can't upvote his post unfortunately...
Let me repost the code here real quick:
#!/bin/sh
if [ "$REQUEST_METHOD" = "POST" ]; then
if [ "$CONTENT_LENGTH" -gt 0 ]; then
read -n $CONTENT_LENGTH POST_DATA <&0
fi
fi
#echo "$POST_DATA" > data.bin
IFS='=&'
set -- $POST_DATA
#2- Value1
#4- Value2
#6- Value3
#8- Value4
echo $2 $4 $6 $8
echo "Content-type: text/html"
echo ""
echo "<html><head><title>Saved</title></head><body>"
echo "Data received: $POST_DATA"
echo "</body></html>"
Hope this is of help for anybody.
Cheers
Actually I liked bolt's answer, so I made a version which works with Busybox as well (ash in Busybox does not support here string).
This code will accept key1 and key2 parameters, all others will be ignored.
while IFS= read -r -d '&' KEYVAL && [[ -n "$KEYVAL" ]]; do
case ${KEYVAL%=*} in
key1) KEY1=${KEYVAL#*=} ;;
key2) KEY2=${KEYVAL#*=} ;;
esac
done <<END
$(echo "${QUERY_STRING}&")
END
One can use bash-cgi.sh, which processes:
the query string into the $QUERY_STRING_GET key and value array;
the post request data (x-www-form-urlencoded) into the $QUERY_STRING_POST key and value array;
the cookies data into the $HTTP_COOKIES key and value array.
Demands bash version 4.0 or higher (to define the key and value arrays above).
All processing is done by bash alone (i.e., in a single process), without any external dependencies and without invoking additional processes.
It has:
a check for the maximum length of data that can be transferred to its input, as well as processed as query string and cookies;
the redirect() procedure to produce a redirect to itself with the extension changed to .html (useful for one-page sites);
the http_header_tail() procedure to output the last two strings of the HTTP(S) response header;
a sanitizer of the $REMOTE_ADDR value against possible injections;
a parser and evaluator of the escaped UTF-8 symbols embedded in the values passed to $QUERY_STRING_GET, $QUERY_STRING_POST and $HTTP_COOKIES;
a sanitizer of the $QUERY_STRING_GET, $QUERY_STRING_POST and $HTTP_COOKIES values against possible SQL injections (escaping like the mysql_real_escape_string PHP function does, plus the escaping of # and $).
It is available here:
https://github.com/VladimirBelousov/fancy_scripts
This works in dash, using a for-in loop:
IFS='&'
for f in $query_string; do
value=${f##*=}
key=${f%%=*}
# if you need environment variable -> eval "qs_$key=$value"
done
