Getting piped data to functions - bash

Example output
Say I have a function, a:
function a() {
    read -r VALUE
    if [[ -n "$VALUE" ]]; then # empty variable check
        echo "$VALUE"
    else
        echo "Default value"
    fi
}
So, to demonstrate piping to that function:
nick@nick-lt:~$ echo "Something" | a
Something
However, piping data to this function should be optional, so this should also be valid and give the following output:
nick@nick-lt:~$ a
Default value
However, the function hangs, as the read command waits for data from stdin.
What I've tried
Honestly not a lot, because I don't know much about this, and searching on Google returned very little.
Conceptually, I thought there might be a way to "push" an empty (or whitespace, whatever works) value to the stdin stream, so that even empty stdin at least has this value appended/prepended, triggering read and then simply trim off that first/last character. I didn't find a way to do this.
Question
How can I, if possible, make both of the above scenarios work for function a, so that piping is optional?
EDIT: Apologies, quickly written question. Should work properly now.

One way is to check whether standard input (fd 0) is a terminal. If so, don't read, because that will cause the user to have to enter something.
function a() {
    value=""
    if [ ! -t 0 ] ; then # read only if fd 0 is a pipe (not a tty)
        read -r value
    fi
    if [ "$value" ] ; then # if nonempty, print it!
        echo "$value"
    else
        echo "Default value"
    fi
}
I checked this on cygwin: a prints "Default value" and echo 42 | a prints "42".
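For completeness, a couple of usage sketches (assuming the function above is already defined in an interactive shell); the -t 0 check also covers stdin redirected from a file or /dev/null, since neither is a terminal:
a                           # at a terminal: prints "Default value" without blocking
echo "Something" | a        # prints "Something"
a < /dev/null               # not a tty, but read gets nothing: prints "Default value"
a <<< "From a here-string"  # prints "From a here-string"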

Two issues:
Syntactic: you need a space before the closing ]].
Algorithmic: you need the -n (non-zero length) variable test, not -z (zero length).
So:
if [[ -n "$VALUE" ]]; then
Or simply:
if [[ "$VALUE" ]]; then
As [[ is a shell keyword (no word splitting or glob expansion happens inside it), you don't strictly need the double quotes:
if [[ $VALUE ]]; then
Also, refrain from using all-uppercase variable names; by convention these denote environment and shell variables, and your own variable might overwrite an existing one. Use a lowercase variable name instead:
if [[ $value ]]; then
unless you are export-ing your variable and strictly need it uppercased, in which case make sure it does not overwrite an existing one.
I would also add a timeout to read, e.g. -t 5 for 5 seconds, so that if no input is entered the default value is printed. I would also change the function name to something more meaningful.
Do:
function myfunc () {
    read -rt5 value
    if [[ "$value" ]]; then
        echo "$value"
    else
        echo "Default value"
    fi
}
Example:
$ function myfunc () { read -rt5 value; if [[ "$value" ]]; then echo "$value"; else echo "Default value"; fi ;}
$ myfunc
Default value
$ echo "something" | myfunc
something
$ myfunc
foobar
foobar

Related

multi dimensional array (not real) variable name in bash 4.1.2

I want to use a manually created (fake multidimensional) array in a bash script, but when using the array in conditions I want to take the array name from a variable.
I'm using bash version 4.1.2, so declare -n doesn't exist.
I guess my example will be more helpful to show what I want to do:
declare -A test
test[ar,action]="dosomething"
test[bc,action2]="doelse"
test[bc,resolv]="dotest"
#works:
echo "this works: ${test[bc,action2]}"
#but if i want to use a variable name, bad substitution error
name="test"
echo "01 this works: ${$name[bc,action2]}"
#another test doesn't work also
echo "02 test2 : ${!name[bc,action2]}"
#final goal is to do something like this:
if [[ "${!name[bc,action2]}" == "doelse" ]]; then
echo "mission completed"
fi
I checked other posts with "eval" but can't get it working.
I also tested the following, which could work, but I lose the index names that way... and I need those too.
all_elems_indirection="${name}[@]"
echo "works, a list of items : ${!all_elems_indirection}"
test3="${name}[$cust,buyer]"
echo "test3 works : ${!test3}"
second_elem_indirection="${name}[bc,action2]"
echo "test 3 works: ${!second_elem_indirection}"
#but when i want to loop through the indexes from the array with the linked values, it doesn't work, i lost the indexes.
for i in "${!all_elems_indirection}"; do
echo "index name: $i"
done
With eval, would you please try the following:
#!/bin/bash
declare -A test
test[bc,action2]="doelse"
name="test"
if [[ $(eval echo '$'{"$name"'[bc,action2]}') == "doelse" ]]; then
echo "mission completed"
fi
As eval allows execution of arbitrary code, we need to pay close attention that the code, variables, and relevant files are fully under our control and that there is no room for alteration or injection.
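Building on the same eval approach, here is a hedged sketch (using the array and variable names from the question) that also loops over the keys, so the index names are not lost; the key list ${!test[@]} is expanded through eval:
#!/bin/bash
declare -A test
test[ar,action]="dosomething"
test[bc,action2]="doelse"
test[bc,resolv]="dotest"
name="test"
# After concatenation, eval sees: for k in "${!test[@]}"; do ... done
eval 'for k in "${!'"$name"'[@]}"; do
    printf "index name: %s  value: %s\n" "$k" "${'"$name"'[$k]}"
done'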
It's just data. It's just text. Don't restrict yourself to Bash data structures. You can build your abstractions upon any underlying storage.
mydata_init() {
    printf -v "$1" ""
}
mydata_put() {
    printf -v "$1" "%s\n%s\n" "${!1}" "${*:2}"
}
mydata_get2() {
    local IFS
    unset IFS
    while read -r a b v; do
        if [[ "$a" == "$2" && "$b" == "$3" ]]; then
            printf -v "$4" "%s" "$v"
            return 0
        fi
    done <<<"${!1}"
    return 1
}
mydata_init test
mydata_put test ar action dosomething
mydata_put test bc action2 doelse
mydata_put test bc resolv dotest
if mydata_get2 test bc action2 var && [[ "$var" == "doelse" ]]; then
    echo "mission completed"
fi
When the built-in features of the language are not enough for you, you can: enhance the language, build your own abstractions, or use another language. Use Perl or Python, in which representing such data structures will be trivial.

Change variable named in argument to bash function [duplicate]

This question already has answers here:
Dynamic variable names in Bash
(19 answers)
How to use a variable's value as another variable's name in bash [duplicate]
(6 answers)
Closed 5 years ago.
In my bash scripts, I often prompt users for y/n answers. Since I often use this several times in a single script, I'd like to have a function that checks if the user input is some variant of Yes / No, and then cleans this answer to "y" or "n". Something like this:
yesno(){
    temp=""
    if [[ "$1" =~ ^([Yy](es|ES)?|[Nn][Oo]?)$ ]] ; then
        temp=$(echo "$1" | tr '[:upper:]' '[:lower:]' | sed 's/es//g' | sed 's/no//g')
        break
    else
        echo "$1 is not a valid answer."
    fi
}
I then would like to use the function as follows:
while read -p "Do you want to do this? " confirm; do # Here the user types "YES"
    yesno $confirm
done
if [[ $confirm == "y" ]]; then
    [do something]
fi
Basically, I want to change the value of the first argument to the value of $confirm, so that when I exit the yesno function, $confirm is either "y" or "n".
I tried using set -- "$temp" within the yesno function, but I can't get it to work.
You could do it by outputting the new value and overwriting the variable in the caller.
yesno() {
    if [[ "$1" =~ ^([Yy](es|ES)?|[Nn][Oo]?)$ ]] ; then
        local answer=${1,,}
        echo "${answer::1}"
    else
        echo "$1 is not a valid answer." >&2
        echo "$1"  # output the original value
        return 1   # indicate failure in case the caller cares
    fi
}
confirm=$(yesno "$confirm")
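A hedged sketch of how this first approach plugs into the asker's loop (the echo in the final if is just a placeholder action):
while read -p "Do you want to do this? " confirm; do
    if confirm=$(yesno "$confirm"); then
        break              # $confirm is now "y" or "n"
    fi                     # otherwise the error was printed; ask again
done
if [[ $confirm == "y" ]]; then
    echo "doing it"        # placeholder
fi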
However, I'd recommend a more direct approach: have the function do the prompting and looping. Move all of that repeated logic inside. Then the call site is super simple.
confirm() {
    local prompt=$1
    local reply
    while true; do
        read -p "$prompt" reply
        case ${reply,,} in
            y*) return 0;;
            n*) return 1;;
            *) echo "$reply is not a valid answer." >&2;;
        esac
    done
}
if confirm "Do you want to do this? "; then
    # Do it.
else
    # Don't do it.
fi
(${reply,,} is a bash-ism that converts $reply to lowercase.)
You could use the nameref attribute of Bash (requires Bash 4.3 or newer) as follows:
#!/bin/bash
yesno () {
    # Declare arg as reference to argument provided
    declare -n arg=$1
    local re1='(y)(es)?'
    local re2='(n)o?'
    # Set to empty and return if no regex matches
    [[ ${arg,,} =~ $re1 ]] || [[ ${arg,,} =~ $re2 ]] || { arg= && return; }
    # Assign "y" or "n" to reference
    arg=${BASH_REMATCH[1]}
}
while read -p "Prompt: " confirm; do
    yesno confirm
    echo "$confirm"
done
A sample test run looks like this:
Prompt: YES
y
Prompt: nOoOoOo
n
Prompt: abc
Prompt:
The expressions are anchored at the start, so yessss etc. all count as well. If this is not desired, an end anchor ($) can be added.
If neither expression matches, the string is set to empty.

how to write a Bash function that confirms the value of an existing variable with a user

I have a large number of configuration variables for which I want users to issue confirmation of the values. So, there could be some variable specifying a run number in existence and I want the script to ask the user if the current value of the variable is ok. If the user responds that the value is not ok, the script requests a new value and assigns it to the variable.
I have made an initial attempt at a function for doing this, but there is some difficulty with its running; it stalls. I would value some assistance in solving the problem and also any criticisms of the approach I'm using. The code is as follows:
confirmVariableValue(){
    variableName="${1}"
    variableValue="${!variableName}"
    while [[ "${userInput}" != "n" && "${userInput}" != "y" ]]; do
        echo "variable "${variableName}" value: "${variableValue}""
        echo "Is this correct? (y: continue / n: change it / other: exit)"
        read userInput
        # Make the user input lowercase.
        userInput="$(echo "${userInput}" | sed 's/\(.*\)/\L\1/')"
        # If the user input is "n", request a new value for the variable. If the
        # user input is anything other than "y" or "n", exit. If the user input
        # is "y", then the user confirmation loop ends.
        if [[ "${userInput}" == "n" ]]; then
            echo "enter variable "${variableName}" value:"
            read variableValue
        elif [[ "${userInput}" != "y" && "${userInput}" != "n" ]]; then
            echo "terminating"
            exit 0
        fi
    done
    echo "${variableValue}"
}
myVariable="run_2014-09-23T1909"
echo "--------------------------------------------------------------------------------"
echo "initial variable value: "${myVariable}""
myVariable="$(confirmVariableValue "myVariable")"
echo "final variable value: "${myVariable}""
echo "--------------------------------------------------------------------------------"
The problem is here:
myVariable="$(confirmVariableValue "myVariable")"
Your questions, like
echo "Is this correct? (y: continue / n: change it / other: exit)"
are going into myVariable instead of to the screen.
Try printing the questions to STDERR, or to any file descriptor other than STDOUT.
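A minimal sketch of that fix, applied to the prompt lines inside the function (the rest of confirmVariableValue stays as in the question):
echo "variable ${variableName} value: ${variableValue}" >&2
echo "Is this correct? (y: continue / n: change it / other: exit)" >&2
read userInput
Only the final echo "${variableValue}" keeps writing to stdout, so that is the only thing captured by the $(...) in the caller.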
Opinion-based comment: I would be unhappy with such a config script; it is way too chatty. For me it is better to:
print out the description and the default value,
and ask: Press Enter to confirm, enter a new value, or <something> to exit.
You can also, use the following technique:
use the bash readline library for the read command with -e
use the -i value for set the default value for the editing
use the printf -v variable to print into variable, so you don't need to use var=$(...) nor any (potentially) dangerous eval...
example:
err() { echo "$@" >&2; return 1; }
getval() {
    while :
    do
        read -e -i "${!1}" -p "$1>" inp
        case "$inp" in
            Q|q) err "Quitting...." || return 1 ;;
            "") err "Must enter some value" ;;
            *)
                #validate the input here
                #and print the new value into the variable
                printf -v "$1" "%s" "$inp"
                return 0
                ;;
        esac
    done
}
somevariable=val1
anotherone=val2
x=val3
for var in somevariable anotherone x
do
    getval "$var" || exit
    echo "new value for $var is: =${!var}="
done
I would not have them answer "Yes" then type in the new value. Just have them type in the new value if they want one, or leave it blank to accept the default.
This little function lets you set multiple variables in one call:
function confirm() {
    echo "Confirming values for several variables."
    for var; do
        read -p "$var = ${!var} ... leave blank to accept or enter a new value: "
        case $REPLY in
            "") # empty use default
                ;;
            *)  # not empty, set the variable using printf -v
                printf -v "$var" "$REPLY"
                ;;
        esac
    done
}
Used like so:
$ foo='foo_default_value'
$ bar='default_for_bar'
$ confirm foo bar
Confirming values for several variables.
foo = foo_default_value ... leave blank to accept or enter a new value: bar
bar = default_for_bar ... leave blank to accept or enter a new value:
foo=[bar], bar=[default_for_bar]
Of course, if blank can be a valid default, then you would need to account for that, like @jm666's use of read -i.

How to parse $QUERY_STRING from a bash CGI script?

I have a bash script that is being used in a CGI. The CGI sets the $QUERY_STRING environment variable by reading everything after the ? in the URL. For example, http://example.com?a=123&b=456&c=ok sets QUERY_STRING=a=123&b=456&c=ok.
Somewhere I found the following ugliness:
b=$(echo "$QUERY_STRING" | sed -n 's/^.*b=\([^&]*\).*$/\1/p' | sed "s/%20/ /g")
which will set $b to whatever was found in $QUERY_STRING for b. However, my script has grown to have over ten input parameters. Is there an easier way to automatically convert the parameters in $QUERY_STRING into environment variables usable by bash?
Maybe I'll just use a for loop of some sort, but it'd be even better if the script was smart enough to automatically detect each parameter and maybe build an array that looks something like this:
${parm[a]}=123
${parm[b]}=456
${parm[c]}=ok
How could I write code to do that?
Try this:
saveIFS=$IFS
IFS='=&'
parm=($QUERY_STRING)
IFS=$saveIFS
Now you have this:
parm[0]=a
parm[1]=123
parm[2]=b
parm[3]=456
parm[4]=c
parm[5]=ok
In Bash 4, which has associative arrays, you can do this (using the array created above):
declare -A array
for ((i=0; i<${#parm[@]}; i+=2))
do
    array[${parm[i]}]=${parm[i+1]}
done
which will give you this:
array[a]=123
array[b]=456
array[c]=ok
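A quick usage check of the associative array built above (a sketch, assuming Bash 4):
echo "${array[b]}"              # prints 456
for k in "${!array[@]}"; do     # iterate over all keys
    echo "$k=${array[$k]}"
done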
Edit:
To use indirection in Bash 2 and later (using the parm array created above):
for ((i=0; i<${#parm[@]}; i+=2))
do
    declare var_${parm[i]}=${parm[i+1]}
done
Then you will have:
var_a=123
var_b=456
var_c=ok
You can access these directly:
echo $var_a
or indirectly:
for p in a b c
do
    name="var_$p"
    echo ${!name}
done
If possible, it's better to avoid indirection since it can make code messy and be a source of bugs.
You can break $QUERY down using IFS. For example, setting it to &:
$ QUERY="a=123&b=456&c=ok"
$ echo $QUERY
a=123&b=456&c=ok
$ IFS="&"
$ set -- $QUERY
$ echo $1
a=123
$ echo $2
b=456
$ echo $3
c=ok
$ array=($@)
$ for i in "${array[@]}"; do IFS="=" ; set -- $i; echo $1 $2; done
a 123
b 456
c ok
And you can save to a hash/dictionary in Bash 4+
$ declare -A hash
$ for i in "${array[@]}"; do IFS="=" ; set -- $i; hash[$1]=$2; done
$ echo ${hash["b"]}
456
Please don't use the evil eval junk.
Here's how you can reliably parse the string and get an associative array:
declare -A param
while IFS='=' read -r -d '&' key value && [[ -n "$key" ]]; do
param["$key"]=$value
done <<<"${QUERY_STRING}&"
If you don't like the key check, you could do this instead:
declare -A param
while IFS='=' read -r -d '&' key value; do
param["$key"]=$value
done <<<"${QUERY_STRING:+"${QUERY_STRING}&"}"
Listing all the keys and values from the array:
for key in "${!param[@]}"; do
echo "$key: ${param[$key]}"
done
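A hedged usage check of that loop, self-contained with the sample query string from the question:
QUERY_STRING='a=123&b=456&c=ok'
declare -A param
while IFS='=' read -r -d '&' key value && [[ -n "$key" ]]; do
    param["$key"]=$value
done <<<"${QUERY_STRING}&"
echo "${param[a]}"   # 123
echo "${param[c]}"   # ok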
I packaged the sed command up into another script:
$ cat getvar.sh
s='s/^.*'${1}'=\([^&]*\).*$/\1/p'
echo $QUERY_STRING | sed -n $s | sed "s/%20/ /g"
and I call it from my main cgi as:
id=`./getvar.sh id`
ds=`./getvar.sh ds`
dt=`./getvar.sh dt`
...etc, etc - you get the idea.
works for me even with a very basic busybox appliance (my PVR in this case).
To convert the contents of QUERY_STRING into bash variables, use the following command:
eval $(echo ${QUERY_STRING//&/;})
The inner step, echo ${QUERY_STRING//&/;}, substitutes all ampersands with semicolons producing a=123;b=456;c=ok which the eval then evaluates into the current shell.
The result can then be used as bash variables.
echo $a
echo $b
echo $c
The assumptions are:
values will never contain '&'
values will never contain ';'
QUERY_STRING will never contain malicious code
While the accepted answer is probably the most beautiful one, there might be cases where security is super-important and also needs to be clearly visible in your script.
In such a case, I wouldn't use bash for the task in the first place, but if it has to be done for some reason, it may be better to avoid the newer array/dictionary features, because you can't be sure how exactly they are escaped.
In this case, the good old primitive solutions might work:
QS="${QUERY_STRING}"
while [ "${QS}" != "" ]
do
    nameval="${QS%%&*}"
    QS="${QS#$nameval}"
    QS="${QS#&}"
    name="${nameval%%=*}"
    val="${nameval#$name}"
    val="${val#=}"
    # and here we have $name and $val as names and values
    # ...
done
This iterates over the name-value pairs of QUERY_STRING, and there is no way to circumvent it with any tricky escape sequence: the double quote is a very strong thing in bash, and apart from a single parameter substitution, which is fully under our control, nothing can be tricked here.
Furthermore, you can inject your own processing code into "# ...". This lets you allow only your own, well-defined (and ideally short) list of permitted variable names. Needless to say, LD_PRELOAD shouldn't be one of them. ;-)
Furthermore, no variable is exported, and only QS, nameval, name and val are used.
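A hedged sketch of that whitelist idea, dropped into the "# ..." slot of the loop above; the allowed names id, ds and dt are only example parameters, not anything mandated by the answer:
case "$name" in
    id|ds|dt)
        # Known parameter: store it under a safe, fixed prefix.
        printf -v "param_$name" '%s' "$val"
        ;;
    *)
        # Anything else (including LD_PRELOAD and friends) is ignored.
        ;;
esac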
Following the accepted answer, I made some changes myself to support array variables, as in this other question. I also added a decode function whose author I cannot find in order to give credit.
The code looks somewhat messy, but it works. Changes and other recommendations would be greatly appreciated.
function cgi_decodevar() {
    [ $# -ne 1 ] && return
    local v t h
    # replace all + with whitespace and append %%
    t="${1//+/ }%%"
    while [ ${#t} -gt 0 -a "${t}" != "%" ]; do
        v="${v}${t%%\%*}" # digest up to the first %
        t="${t#*%}"       # remove digested part
        # decode if there is anything to decode and if not at end of string
        if [ ${#t} -gt 0 -a "${t}" != "%" ]; then
            h=${t:0:2}    # save first two chars
            t="${t:2}"    # remove these
            v="${v}"`echo -e \\\\x${h}` # convert hex to special char
        fi
    done
    # return decoded string
    echo "${v}"
    return
}
saveIFS=$IFS
IFS='=&'
VARS=($QUERY_STRING)
IFS=$saveIFS
for ((i=0; i<${#VARS[@]}; i+=2))
do
    curr="$(cgi_decodevar ${VARS[i]})"
    next="$(cgi_decodevar ${VARS[i+2]})"
    prev="$(cgi_decodevar ${VARS[i-2]})"
    value="$(cgi_decodevar ${VARS[i+1]})"
    array=${curr%"[]"}
    if [ "$curr" == "$next" ] && [ "$curr" != "$prev" ] ;then
        j=0
        declare var_${array}[$j]="$value"
    elif [ $i -gt 1 ] && [ "$curr" == "$prev" ]; then
        j=$((j + 1))
        declare var_${array}[$j]="$value"
    else
        declare var_$curr="$value"
    fi
done
I would simply replace the & with ;. It becomes something like:
a=123;b=456;c=ok
So now you just need to evaluate it and read your vars:
eval `echo "${QUERY_STRING}"|tr '&' ';'`
echo $a
echo $b
echo $c
A nice way to handle CGI query strings is to use Haserl which acts as a wrapper around your Bash cgi script, and offers convenient and secure query string parsing.
To bring this up to date, if you have a recent Bash version then you can achieve this with regular expressions:
q="$QUERY_STRING"
re1='^(\w+=\w+)&?'
re2='^(\w+)=(\w+)$'
declare -A params
while [[ $q =~ $re1 ]]; do
q=${q##*${BASH_REMATCH[0]}}
[[ ${BASH_REMATCH[1]} =~ $re2 ]] && params+=([${BASH_REMATCH[1]}]=${BASH_REMATCH[2]})
done
If you don't want to use associative arrays then just change the penultimate line to do what you want. For each iteration of the loop the parameter is in ${BASH_REMATCH[1]} and its value is in ${BASH_REMATCH[2]}.
Here is the same thing as a function in a short test script that iterates over the array and outputs the query string's parameters and their values:
#!/bin/bash
QUERY_STRING='foo=hello&bar=there&baz=freddy'
get_query_string() {
    local q="$QUERY_STRING"
    local re1='^(\w+=\w+)&?'
    local re2='^(\w+)=(\w+)$'
    while [[ $q =~ $re1 ]]; do
        q=${q##*${BASH_REMATCH[0]}}
        [[ ${BASH_REMATCH[1]} =~ $re2 ]] && eval "$1+=([${BASH_REMATCH[1]}]=${BASH_REMATCH[2]})"
    done
}
declare -A params
get_query_string params
for k in "${!params[@]}"
do
    v="${params[$k]}"
    echo "$k : $v"
done
Note the parameters end up in the array in reverse order (it's associative so that shouldn't matter).
Why not this:
$ echo "${QUERY_STRING}"
name=carlo&last=lanza&city=pfungen-CH
$ saveIFS=$IFS
$ IFS='&'
$ eval $QUERY_STRING
$ IFS=$saveIFS
now you have this
name = carlo
last = lanza
city = pfungen-CH
$ echo "name is ${name}"
name is carlo
$ echo "last is ${last}"
last is lanza
$ echo "city is ${city}"
city is pfungen-CH
@giacecco: To include a hyphen in the regex, you could change these two lines in the answer from @starfry.
Change these two lines:
local re1='^(\w+=\w+)&?'
local re2='^(\w+)=(\w+)$'
To these two lines:
local re1='^(\w+=(\w+|-|)+)&?'
local re2='^(\w+)=((\w+|-|)+)$'
For all those who couldn't get it working with the posted answers (like me),
this guy figured it out.
Can't upvote his post unfortunately...
Let me repost the code here real quick:
#!/bin/sh
if [ "$REQUEST_METHOD" = "POST" ]; then
    if [ "$CONTENT_LENGTH" -gt 0 ]; then
        read -n $CONTENT_LENGTH POST_DATA <&0
    fi
fi
#echo "$POST_DATA" > data.bin
IFS='=&'
set -- $POST_DATA
#2- Value1
#4- Value2
#6- Value3
#8- Value4
echo $2 $4 $6 $8
echo "Content-type: text/html"
echo ""
echo "<html><head><title>Saved</title></head><body>"
echo "Data received: $POST_DATA"
echo "</body></html>"
Hope this is of help for anybody.
Cheers
Actually I liked bolt's answer, so I made a version which works with Busybox as well (ash in Busybox does not support here string).
This code will accept key1 and key2 parameters, all others will be ignored.
while IFS= read -r -d '&' KEYVAL && [[ -n "$KEYVAL" ]]; do
    case ${KEYVAL%=*} in
        key1) KEY1=${KEYVAL#*=} ;;
        key2) KEY2=${KEYVAL#*=} ;;
    esac
done <<END
$(echo "${QUERY_STRING}&")
END
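A quick usage check (a sketch; the query string value is made up for illustration): with QUERY_STRING='key1=foo&key2=bar&other=baz', running the loop above leaves
echo "$KEY1"   # foo
echo "$KEY2"   # bar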
One can use bash-cgi.sh, which processes:
the query string into the $QUERY_STRING_GET key and value array;
the POST request data (x-www-form-urlencoded) into the $QUERY_STRING_POST key and value array;
the cookie data into the $HTTP_COOKIES key and value array.
It requires bash version 4.0 or higher (to define the key and value arrays above).
All processing is done by bash alone (i.e. in a single process), without any external dependencies or additional process invocations.
It has:
a check on the maximum length of data that can be passed to its input and processed as query string and cookies;
a redirect() procedure to produce a redirect to itself with the extension changed to .html (useful for one-page sites);
an http_header_tail() procedure to output the last two lines of the HTTP(S) response header;
a sanitizer for the $REMOTE_ADDR value against possible injections;
a parser and evaluator for escaped UTF-8 symbols embedded in the values passed to $QUERY_STRING_GET, $QUERY_STRING_POST and $HTTP_COOKIES;
a sanitizer for the $QUERY_STRING_GET, $QUERY_STRING_POST and $HTTP_COOKIES values against possible SQL injections (escaping like the mysql_real_escape_string PHP function does, plus escaping of # and $).
It is available here:
https://github.com/VladimirBelousov/fancy_scripts
This works in dash using a for-in loop:
IFS='&'
for f in $query_string; do
    value=${f##*=}
    key=${f%%=*}
    # if you need environment variable -> eval "qs_$key=$value"
done
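A hedged, self-contained usage check (query_string is the lowercase variable used above; in a real CGI it would come from $QUERY_STRING):
query_string='a=123&b=456&c=ok'
IFS='&'
for f in $query_string; do
    value=${f##*=}
    key=${f%%=*}
    printf '%s -> %s\n' "$key" "$value"   # a -> 123, b -> 456, c -> ok
done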

Test for a Bash variable being unset, using a function

A simple Bash variable test goes:
${varName:? "${varName} is not defined"}
I'd like to reuse this, by putting it in a function. How can I do it?
The following fails
#
# Test a variable exists
tvar(){
    val=${1:? "${1} must be defined, preferably in $basedir"}
    if [ -z ${val} ]
    then
        echo Zero length value
    else
        echo ${1} exists, value ${1}
    fi
}
I.e., I need to exit if the test fails.
Thanks to lhunath's answer, I was led to a part of the Bash man page that I've overlooked hundreds of times:
When not performing substring expansion, bash tests for a parameter that is unset or null; omitting the colon results in a test only for a parameter that is unset.
This prompted me to create the following truth table:
Expansion    | Unset | Set, but null | Set and not null | Meaning
${var-_}     |   T   |       F       |        T         | Not null or not set
${var:-_}    |   T   |       T       |        T         | Always true, use for subst.
$var         |   F   |       F       |        T         | 'var' is set and not null
${!var[@]}   |   F   |       T       |        T         | 'var' is set
The last row introduces the ${!var[@]} expansion. The Bash man page says "If name is not an array, expands to 0 if name is set and null otherwise." For the purposes of this truth table, it behaves the same even if it's an array.
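A small hedged demonstration of the set/unset distinction behind the table (plain bash, no special setup needed):
unset var
[[ ${!var[@]} ]] && echo "set"  || echo "unset"   # unset
[[ ${var-_}   ]] && echo "true" || echo "false"   # true  (unset: _ substituted)
var=""
[[ ${!var[@]} ]] && echo "set"  || echo "unset"   # set
[[ ${var-_}   ]] && echo "true" || echo "false"   # false (set, but null)
var="something"
[[ $var       ]] && echo "true" || echo "false"   # true  (set and not null)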
You're looking for indirection.
assertNotEmpty() {
: "${!1:? "$1 is empty, aborting."}"
}
That causes the script to abort with an error message if you do something like this:
$ foo=""
$ assertNotEmpty foo
bash: !1: foo is empty, aborting.
If you just want to test whether foo is empty, instead of aborting the script, use this instead of a function:
[[ $foo ]]
For example:
until read -p "What is your name? " name && [[ $name ]]; do
echo "You didn't enter your name. Please, try again." >&2
done
Also, note that there is a very important difference between an empty and an unset parameter. You should take care not to confuse these terms! An empty parameter is one that is set, but just set to an empty string. An unset parameter is one that doesn't exist at all.
The previous examples all test for empty parameters. If you want to test for unset parameters and consider all set parameters OK, whether they're empty or not, use this:
[[ ! $foo && ${foo-_} ]]
Use it in a function like this:
assertIsSet() {
    [[ ! ${!1} && ${!1-_} ]] && {
        echo "$1 is not set, aborting." >&2
        exit 1
    }
}
Which only aborts the script when the parameter name you pass denotes a parameter that isn't set:
$ ( foo="blah"; assertIsSet foo; echo "Still running." )
Still running.
$ ( foo=""; assertIsSet foo; echo "Still running." )
Still running.
$ ( unset foo; assertIsSet foo; echo "Still running." )
foo is not set, aborting.
You want to use [ -z ${parameter+word} ]
Some part of man bash:
Parameter Expansion
...
In each of the cases below, word is subject to tilde expansion, parameter expansion, command substitution, and
arithmetic expansion. When not performing substring expansion, bash tests for a parameter that is unset or null;
omitting the colon results in a test only for a parameter that is unset.
...
${parameter:+word}
Use Alternate Value. If parameter is null or unset, nothing is substituted, otherwise the expansion of
word is substituted.
...
in other words:
${parameter+word}
Use Alternate Value. If parameter is unset, nothing is substituted, otherwise the expansion of
word is substituted.
some examples:
$ set | grep FOOBAR
$ if [ -z "${FOOBAR+something}" ]; then echo "it is unset"; fi
it is unset
$ declare FOOBAR
$ if [ -z "${FOOBAR+something}" ]; then echo "it is unset"; fi
$ FOOBAR=
$ if [ -z "${FOOBAR+something}" ]; then echo "it is unset"; fi
$ FOOBAR=1
$ if [ -z "${FOOBAR+something}" ]; then echo "it is unset"; fi
$ unset FOOBAR
$ if [ -z "${FOOBAR+something}" ]; then echo "it is unset"; fi
it is unset
$
This function tests for variables that are currently set. The variable may even be an array. Note that in Bash: 0 == TRUE, 1 == FALSE.
function var.defined {
    eval '[[ ${!'$1'[@]} ]]'
}
# Typical usage of var.defined {}
declare you="Your Name Here" ref='you';
read -p "What's your name: " you;
if var.defined you; then # Simple demo using literal text
    echo "BASH recognizes $you";
    echo "BASH also knows a reference to $ref as ${!ref}, by indirection.";
fi
unset you # Have just been killed by a master :D
if ! var.defined $ref; then # Standard demo using an expanded literal value
    echo "BASH doesn't know $ref any longer";
fi
read -s -N 1 -p "Press any key to continue...";
echo "";
So to be clear here, the function tests literal text. Every time a command is called in Bash, variables are generally 'swapped out' or 'substituted' with the underlying value unless:
the $ is escaped: \$varRef
$varRef is single-quoted: '$varRef'
I.e., I need to exit if the test fails.
The code:
${varName:? "${varName} is not defined"}
will return a nonzero exit code when there is not a variable named "varName". The exit code of the last command is saved in $?.
About your code:
val=${1:? "${1} must be defined, preferably in $basedir"}
Maybe it is not doing what you need. In the case that $1 is not defined, "${1}" inside the message will be substituted with nothing. You probably want to use single quotes, which write ${1} literally without substitution:
val=${1:?'${1} must be defined, preferably in $basedir'}
I am unsure if this is exactly what you want, but a handy trick I use when writing a new and complex script is to use "set -o nounset":
set -o nounset # Will make the script bomb out when it finds an unset variable
For example,
$ grep '$1' chex.sh
case "$1" in
$ ./chex.sh
./chex.sh: line 111: $1: unbound variable
$ ./chex.sh foo
incorrect/no options passed.. exiting
if set | grep -q '^VARIABLE='
then
echo VARIABLE is set
fi

Resources