Is there any way to retrieve a list of the options in a case statement? For example if I have this code:
tool=$1
case ${tool} in
brdf)
# Do stuff
;;
drift)
# Do other stuff
;;
*)
echo "ERROR: I don't know this tool. Valid options are: brdf, drift"
exit 1
;;
esac
This is easy to read, but the error message could easily get out of date when tools are added to or removed from the list, as I need to remember to update the names there too.
The repetition could be avoided using an array, something like this:
tool=$1
validtools=(brdf drift)
case ${tool} in
${validtools[0]})
# Do stuff
;;
${validtools[1]})
# Do other stuff
;;
*)
echo "ERROR: I don't know this tool. Valid options are: ${validtools[#]}"
exit 1
;;
esac
But that is pretty horrible to read, and in any case would be even worse to maintain with the hardcoded array indices.
Is there a good way of doing this, perhaps some variable or command that retrieves a list of the available options, or do I just have to remember to update the error message when I add a new option?
The most common way is your first example; see all the init scripts on Linux.
And there is a reason for it: you can use constructions like:
case "$a" in
arg1|arg2|arg3) ... ;;
brg1|brg2) ... ;;
brg2) ... ;;
esac
and it would be hard to construct the right usage message automatically from all the possible variants.
And there is shopt -s extglob too, which allows you to use extended pattern matching in case statements. For examples see this answer: https://stackoverflow.com/a/4555979/632407
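For instance, with extglob enabled a single pattern can cover several spellings (a minimal sketch; the alternative names are made up):
shopt -s extglob
case ${tool} in
    brdf?(_v2))    # matches "brdf" and "brdf_v2"
        # Do stuff
        ;;
    dri@(ft|p))    # matches "drift" and "drip"
        # Do other stuff
        ;;
esac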
But if you want to use arrays, try an associative array, which adds a bit of readability. (It is terrible anyway.) :) Like this:
declare -A arg
initargs() { for a in "$@"; do arg[$a]="$a"; done; }
initargs brd lbrs ubrs
for myarg
do
case "$myarg" in
${arg[brd]}) echo "brd";;
${arg[ubrs]}) echo "ubrs";;
${arg[lbrs]}) echo "lbrs";;
*) echo "Unknown arg =$myarg=. Known are: ${arg[#]}" ;;
esac
done
So the allowed args are "brd", "lbrs" and "ubrs", and for the input
$ bash argtest ubrs badarg brd
the script produces:
ubrs
Unknown arg =badarg=. Known are: lbrs ubrs brd
brd
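If the only goal is to keep the error message in sync with the list, a readable middle ground (just a sketch) is to validate against a single array up front and keep the literal patterns in the case:
tool=$1
validtools=(brdf drift)

# One list drives both the validity check and the error message.
if [[ " ${validtools[*]} " != *" ${tool} "* ]]; then
    echo "ERROR: I don't know this tool. Valid options are: ${validtools[*]}"
    exit 1
fi

case ${tool} in
brdf)
    # Do stuff
    ;;
drift)
    # Do other stuff
    ;;
esac
The case still needs a new clause when a tool is added, but the usage message now comes from one place.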
I am trying to parse a command that includes multiple arguments, like the one below.
python run.py --country India Bangladesh Singapore --prefix country_prefix --tmp_dir /tmp/tmp.
I tried achieving it in a way similar to the code below:
while true ; do
case "${1}" in
--prefix)
PREFIX="${2}"
shift 2
;;
--country)
COUNTRY="${2}"
shift 2
;;
--)
shift
break
;;
*)
echo "Not implemented: $1"
exit 1
;;
esac
done
However, I am aware that the above code won't work, because the possible values for the country argument are dynamic. Can you please guide me: how can I script this for my requirement, where the number of possible values (i.e. country names) can be dynamic?
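One possible sketch for the dynamic part: after --country, keep consuming words until the next token that starts with --, and collect them into an array (COUNTRIES and TMP_DIR are names chosen here for illustration):
COUNTRIES=()
while [[ $# -gt 0 ]]; do
    case "${1}" in
        --country)
            shift
            # take every following word that is not another --option
            while [[ $# -gt 0 && "${1}" != --* ]]; do
                COUNTRIES+=("${1}")
                shift
            done
            ;;
        --prefix)
            PREFIX="${2}"
            shift 2
            ;;
        --tmp_dir)
            TMP_DIR="${2}"
            shift 2
            ;;
        *)
            echo "Not implemented: ${1}" >&2
            exit 1
            ;;
    esac
done
echo "countries: ${COUNTRIES[*]}"
For the command in the question this collects India, Bangladesh and Singapore into COUNTRIES, while --prefix and --tmp_dir are handled as ordinary two-word options.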
case "$action" in
a|b)
echo for a or b
;;&
b|c)
echo for c or b
;;&
*)
echo for everything ELSE
;;&
esac
So, as you can see, I'm using ;;& instead of ;; so that if action=b it will trigger both of the first two cases.
However, a drawback of this is that *) no longer 'means' "everything else", but will match "everything" instead; thus b will trigger the final one also.
PowerShell is able to do what I want because it has a dedicated default keyword, does Bash have something similar?
What about an exhaustive work-around like [!(a|b|c)]) or something?
It would have been handy to do something like case 5 in 4) echo bob; ;;& esac || echo DEFAULT but case doesn't seem to return any code.
From the bash manual:
If the ‘;;’ operator is used, no subsequent matches are attempted after the first pattern match. Using ‘;&’ in place of ‘;;’ causes execution to continue with the command-list associated with the next clause, if any. Using ‘;;&’ in place of ‘;;’ causes the shell to test the patterns in the next clause, if any, and execute any associated command-list on a successful match, continuing the case statement execution as if the pattern list had not matched.
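A small illustration of the two operators: ;& falls through to the next clause's commands without testing its pattern, while ;;& goes on testing the remaining patterns:
case b in
    a|b) echo "first"  ;&    # fall through: the next command list runs unconditionally
    c)   echo "second" ;;&   # keep testing the following patterns
    *)   echo "third"  ;;
esac
# prints: first, second, third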
Maybe something like this:
case "$action" in
a|b)
echo for a or b
;;&
b|c)
echo for c or b
;;&
a|b|c) ;;
*)
echo for everything ELSE
esac
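With the extra a|b|c) clause ending in a plain ;;, matching stops there for the known values, so *) is only reached by anything else. A quick check:
for action in a b c d; do
    echo "action=$action"
    case "$action" in
        a|b) echo "  for a or b" ;;&
        b|c) echo "  for c or b" ;;&
        a|b|c) ;;
        *) echo "  for everything ELSE" ;;
    esac
done
# only action=d reaches the ELSE branch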
I have a wrapper shell script that needs to add some library and include paths before calling a compiler.
#!/bin/sh
LDIRS="-L/opt/lib"
LDFGS="-llibA -llibB"
exec /opt/bin/xxx "$@" $LDIRS $LDFGS
This works fine for compiling a simple test case:
compiler -o test test.c
It falls apart if another program calls my compiler and passes in include directories, like this:
compiler -o in_file.xx -I/xxx -I/xxx
How could I generalize this to get the expected behavior of appending those includes to LDFGS?
I take it that the order of LDIRS and LDFGS is the relevant issue; the additional things the user provides are supposed to end up in LDFGS (i.e. they need to be given to /opt/bin/xxx after all the LDIRS) but don't with your current implementation. So the -I/xxx arguments are to be added to LDFGS. If I misunderstood your issue, please state more clearly what you want. Otherwise read on.
You have two options to solve an issue like this. The first is to implement some kind of intelligence which understands the given options and sorts them properly into the lists where they belong. An approach could look like this:
#!/bin/sh
LDIRS="-L/opt/lib"
LDFGS="-llibA -llibB"
PRE=""
for argument in "$#"
do
case "$argument" in
-I*)
LDFGS="$LDFGS $argument"
;;
# other patterns could go here as well
*) # else case
PRE="$PRE $argument"
;;
esac
done
exec /opt/bin/xxx $PRE $LDIRS $LDFGS
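With this version, the call from the question ends up with the include flags after the library settings:
compiler -o in_file.xx -I/xxx -I/xxx
# effectively runs:
#   /opt/bin/xxx -o in_file.xx -L/opt/lib -llibA -llibB -I/xxx -I/xxx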
As @charles-duffy already pointed out, using bash or a similar shell which supports proper lists would be way more robust in case any of your arguments contain spaces. If you are sure this won't happen, you should be fine for now. But your code's maintainer will hate you for such things when they run into trouble because of this. So here's a less readable version in sh which should take care of it (a bash-array variant follows further down):
#!/bin/sh
LDIRS="-L/opt/lib"
LDFGS="-llibA -llibB"
PRE=""
for argument in "$#"
do
case "$argument" in
-I*)
LDFGS=`printf "%s %q" "$LDFGS" "$argument"`
;;
# other patterns could go here as well
*) # else case
PRE=`printf "%s %q" "$PRE" "$argument"`
;;
esac
done
eval "exec /opt/bin/xxx $PRE $LDIRS $LDFGS"
(Whenever eval is used, a disclaimer needs to be added: eval has its dangers, some of them security-relevant, so please learn more about it before applying it wildly. The current unmodified case should be fine, though.)
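If switching the shebang to bash is an option, the same sorting can be done with real arrays, which keeps arguments containing spaces intact without needing eval (a sketch of the array approach mentioned above):
#!/bin/bash
LDIRS=(-L/opt/lib)
LDFGS=(-llibA -llibB)
PRE=()
for argument in "$@"
do
    case "$argument" in
        -I*)
            LDFGS+=("$argument")
            ;;
        # other patterns could go here as well
        *) # else case
            PRE+=("$argument")
            ;;
    esac
done
exec /opt/bin/xxx "${PRE[@]}" "${LDIRS[@]}" "${LDFGS[@]}"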
If you think that you have no way of knowing all the patterns which are supposed to go into LDFGS, LDIRS, and PRE, you need to hand this over to the user. Unfortunately, the user then needs to know more and pass this information along, and the calls will need to look different.
One way would be this:
#!/bin/sh
LDIRS="-L/opt/lib"
LDFGS="-llibA -llibB"
PRE=""
while [ $# -gt 1 ]
do
case "$1" in
-p) # go into PRE
PRE="$PRE $2"
shift 2
;;
-i) # go into LDIRS
LDIRS="$LDIRS $2"
shift 2
;;
-f) # go into LDFGS
LDFGS="$LDFGS $2"
shift 2
;;
*)
echo "Not understood: $1"
exit 1
;;
esac
done
eval "exec /opt/bin/xxx $PRE $LDIRS $LDFGS"
Now the call needs to look like this:
compiler -p "-o in_file.xx" -f "-I/xxx -I/xxx"
And again, if you have spaces issues, you should consider using proper array solutions or at least the ugly workaround I proposed above.
I have been trying to find a way to source an options file into my case script, as shown below:
main file
case "$1" in
. options
esac
options file
#options file
1 ) do something
2 ) do something else
Has anyone got any ideas? All I am getting is a 'parse error'.
Thanks in advance for the help
Reading between the lines, what you really want to do is to be able to add handlers for different values of $1 from external source files -- potentially an arbitrary number of such files, without them being aware of each other.
That's a perfectly fine thing to want. Don't use case to do it.
Consider the following:
# base-options
do__1() { echo "doing something"; }
do__2() { echo "doing something else"; }
...and, perhaps, a separate file:
# extra-options
do__foo() { echo "doing yet another thing"; }
Now, from your main file, you can just source all these in:
# this loop will source in both base-options and extra-options
for f in *options; do
source "$f"
done
# and here, if we *have* a command -- either as a shell function or an external command --
# corresponding to our $1, we run that command.
if type "do__$1" >/dev/null; then
cmd="do__$1"; shift
"$cmd" "$#"
else
echo "Handler for $1 not found" >&2
exit 1
fi
Going with the approach above also means that your handlers don't need to be shell functions at all! If you have an executable file named do__1, it'll be found by type so long as it's in your PATH, so you can write handlers in any language you want -- you're not locking yourself into bash.
This is closely related to how git handles subcommands -- see all the git-foo executables handling git foo commands. It's a common practice, and a good one. Use it.
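As a tiny illustration (the file name do__1 is just an example), such an external handler could be nothing more than a small script placed on your PATH and made executable with chmod +x:
#!/bin/sh
# do__1 -- a standalone executable somewhere on PATH;
# `type do__1` finds it just like it finds a shell function
echo "doing something, extra args: $*"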
TL;DR
The following works fine:
# options
case "$1" in
1) something ;;
2) something else ;;
esac
and, in your original code:
. options "$#"
...and all that differs from what you were trying to do is that you're adding a header and footer to options.
Explanation
. (or source) is itself a command. If you can't put arbitrary commands in a position, then you can't put a source command there either.
Since your options file is necessarily written knowing that it'll go inside a case statement, why not just put the case and esac lines inside it? This doesn't make it any less flexible: $1 can be overridden in the source command itself:
. my.sh "$#" # this passes the current arguments through...
whereas:
# this still works without changing options
. my.sh "$foo" # this passes $foo as $1
Alternate Ugly Literal Broken Hack
As a very ugly, awful hack that'll break a bunch of debugging functionality, you could do something like:
# Don't do this; it'll cause headaches later.
eval 'case "$1" in '"$(<options)"'; esac'
You need something to match against $1 first, otherwise what is the point of the case statement?
case "$1" in
options) . options ;;
things) echo "What should I do with things?" ;;
*) echo "I don't know what you want" ;;
esac
Upon seeing the update, the correct thing to do is to wrap the entire case statement in a function defined in your external file.
handle_options () {
case "$1" in
1 ) do something ;;
2 ) do something else ;;
esac
}
Then call that function from the script that sources your file.
. somefile
handle_options "$1"
I have a script that receives parameters from user input, but it can also have parameters coming from an environment variable.
e.g.:
export ADZ_DEPLOY_OPT="--from-git --rootDir XXXXXXX"
and in my script, I attempt to do something like :
$*="${ADZ_DEPLOY_OPT} $*"
while [[ $# -gt 1 ]]
do
key="$1"
case $key in
--rootDir)
ADZ_ROOT_DIR="$2"
shift # past argument
;;
--tagVersion)
ADZ_TAG_VERSION="$2"
shift # past argument
;;
--from-git)
ADZ_DEPLOY_FROM_GIT=1
;;
*)
# unknown option
;;
esac
shift # past argument or value
done
Except it doesn't work: it generates an error when I try to update $* to prepend the additional values, and I couldn't find a way to achieve this behavior.
The idea is also that the env variable will act as a 'default' that may be overridden by what the user explicitly provides on the command line. That is, with my example, if the user specifies another rootDir, I want the user's rootDir to be taken into account.
Lastly, I would like to keep long parameter names (I like being explicit), and therefore getopts doesn't seem like an option.
Any help would be much appreciated !
Thanks in advance.
Replace
$*="${ADZ_DEPLOY_OPT} $*"
with:
set -- ${ADZ_DEPLOY_OPT} "$@"
Notes:
As bash is designed, one cannot assign directly to $*; the positional parameters are changed with the set builtin.
You want to use "$@", not $*. The form $* is subject to word splitting and the expansion of "$@" is not.
From your sample value for ADZ_DEPLOY_OPT, you do need word splitting for it. If you try to put complicated arguments in ADZ_DEPLOY_OPT, this won't work: you will need to use an array instead.
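A sketch of that array variant (ADZ_DEPLOY_OPTS and the example path are made up; note that bash cannot export arrays, so the defaults would have to live in the script or a sourced file rather than in an exported environment variable):
# defaults kept in an array so values containing spaces survive intact
ADZ_DEPLOY_OPTS=(--from-git --rootDir "/some path/XXXXXXX")

# prepend the defaults; anything the user passes comes later and wins,
# because the while/case loop processes it last and overwrites the variable
set -- "${ADZ_DEPLOY_OPTS[@]}" "$@"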