Bash script with options to run another script

I wish to create a simple bash script.
buildapp.sh -build1
buildapp.sh -build2
etc
The option -build1/-build2/-build3 etc. should call an external script depending on which option is given.
So something like
buildapp.sh -build1 → script1.sh
buildapp.sh -build2 → script2.sh

I think this is what you are looking for:
if [ "$1" = "-build1" ]; then
path/to/script1.sh
elif [ "$1" = "-build2" ]; then
path/to/script2.sh
elif [ "$1" = "-build3" ]; then
path/to/script3.sh
else
echo "Incorrect parameter"
fi
Another option is to use getopts (see An example of how to use getopts in bash)
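A rough sketch of the getopts route, assuming you're willing to shorten the option to something like -b 1 / -b 2 (the getopts builtin only understands single-character options, so it can't parse -build1 directly):
#!/bin/bash
while getopts 'b:' opt; do
    case $opt in
        b) "./script${OPTARG}.sh" ;;                         # -b 2 runs ./script2.sh
        *) echo "usage: buildapp.sh -b <number>" >&2; exit 1 ;;
    esac
done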

Solution
#!/bin/bash
./script${1//[!0-9]/}.sh # './' is the path to scriptX.sh, you may need to adjust it
A very tiny solution that works with any number, simply by using the numeric suffix of the argument. E.g. -build123 calls ./script123.sh.
Solution (extended)
#!/bin/bash
if [[ '-build' == "${1//[0-9]/}" ]]
then
./script${1//[!0-9]/}.sh
fi
This extends the version above so that it only runs ./scriptXXX.sh if the argument's prefix is -build.
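For example, with the extended version:
./buildapp.sh -build2    # runs ./script2.sh
./buildapp.sh -build15   # runs ./script15.sh
./buildapp.sh -qux2      # does nothing, because the prefix is not -build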

Related

Bash == not found

I have a small bash script:
#!/bin/bash
if [[ "$#" == "pull" ]]
then
# stuff
elif [[ "$#" == "push" ]]
then
# stuff
else
echo "Command not recognised"
fi
it's located in /usr/bin/local and I made it executable. However whenever I run it I get script:1: == not found
Any ideas?
This is macOS if that matters.
Don't use [[, not defined by POSIX. Instead use [
Don't use ==, use =
Don't use $@, use $1
Don't put double quotes around pull and push in this situation; in fact, they aren't needed at all here
Don't use Bash when sh will do
Updated script:
#!/bin/sh
if [ "$1" = pull ]
then
# stuff
elif [ "$1" = push ]
then
# stuff
else
echo 'Command not recognised'
fi
Sticking with bash as your interpreter, your only issue is with your uses of "$@", which expands to all of the arguments (inside tests like bash's [[ it behaves just like "$*", joining them with spaces). You probably want "$1" to test just the first argument.
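A quick illustration of the difference (using set -- to fake the script's arguments):
#!/bin/bash
set -- pull now                 # simulate the script being invoked as: ./script pull now
echo "first argument: $1"       # pull
echo "all arguments:  $@"       # pull now
[[ "$1" == "pull" ]] && echo 'comparing "$1" works'   # prints: comparing "$1" works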
You can also consider using a case (switch) statement:
#!/bin/bash
case "$1" in
( pull ) echo "you said pull" ;;
( push ) echo "you said push" ;;
( * ) echo "Command '$1' is not recognised" ;;
esac
(The above code will work in bash, sh, and zsh. I assume you still require bash due to other aspects of your code.)

Execute command in bash for a set of corresponding file names

I have a directory with lots of config files in it, they look like this:
us-sfo-building1.foo us-sfo-building1.bar
mx-mex-building15.foo mx-mex-building15.bar
Now, I want to execute a script which takes us-sfo-building1.foo and us-sfo-building1.bar as input parameters.
I want basically this: ./script $.foo $.bar but I have to make sure that I always have the matching pair of foo and bar, otherwise the script complains. I tried this, but it did not work as expected:
#/bin/bash
for x in "*.foo*; do
x=${x%.foo}
if [ -e "$x.bar" ]; then
./script "$x.foo" "$x.bar"
fi
done
Any idea on how to solve this or where my mistake is?
The " in for x in "*.foo*; do shouldn't be there: quoting a glob stops it from expanding (and the quote is unbalanced anyway), and the trailing * after .foo isn't needed either.
Fix:
#!/bin/bash
for x in *.foo; do
x=${x%.foo}
if [ -e "$x.bar" ]; then
./script "$x.foo" "$x.bar"
fi
done
And a suggestion (includes preferred use of [[ ]]):
#!/bin/bash
for x in *.foo; do
y=${x%.foo}.bar
[[ -e $y ]] && ./script "$x" "$y"
done
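One more detail worth guarding against: if no *.foo files exist at all, the glob stays unexpanded and the loop runs once with the literal string *.foo (the -e check will then fail, unless you literally have a file called *.bar, so nothing runs, but it's wasted work). Enabling nullglob makes the loop body simply not run in that case:
#!/bin/bash
shopt -s nullglob    # unmatched globs expand to nothing instead of themselves
for x in *.foo; do
    y=${x%.foo}.bar
    [[ -e $y ]] && ./script "$x" "$y"
done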

bash: if statement with multiple checks

In my bash script I need to check if the first CLI argument is defined and the second one is an existing file.
Here is what I have:
if [!$2] && [! -f $1 ]; then
....
fi
So $2 should exist (string) and $1 should be the existing file on the filesystem!
Any suggestions ?
If by suggestions you mean what you need to make it work: add spaces around the brackets. It is also good practice to quote the variables:
if [ -n "$2" ] && [ ! -f "$1" ]; then
...
fi
From man test:
-n STRING
the length of STRING is nonzero
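If you're staying with bash, the two checks can also be combined into a single [[ ]] test; a small sketch with the same logic as the snippet above (drop the ! if you actually want $1 to be an existing file, which is what the question's wording suggests):
#!/bin/bash
if [[ -n "$2" && ! -f "$1" ]]; then
    echo "second argument is set and first argument is not an existing file"
fi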

Capturing multiple values for single argument in bash

I have a bash script like this :
usage="setup.sh [-localsource path to dir] [-help]";
for i in $@
do
if [ "$localSourceOpt" = 1 ]
then
localSource=$i
localSourceOpt=0;
fi
if [ "$i" = "-localsource" ]
then
localSourceOpt=1;
fi
if [ "$i" = "-help" ]
then
echo "$usage";
exit;
fi
done
which requires one argument, e.g.
setup.sh -localsource PATH
What I need is to add another argument which MIGHT have multiple values, e.g.
setup.sh -localsource PATH -locbranches one two three
What should I do to capture the values passed for the "-locbranches" argument?
thanks in advance
I note that you're having to code a lot of logic to handle the simplest command-line argument mechanism, and I'd perhaps suggest using the bash getopts functionality instead.
This makes the single-argument option trivial to handle. For multiple values it doesn't work as well: you would have to quote them, e.g. -option "1 2 3". However, it does handle multiple arguments in the following scenario.
setup.sh -localsource PATH one two three
i.e. the one two three aren't linked to any command line option. An alternative is to specify the option for each argument e.g.
setup.sh -localsource PATH -locbranch one -locbranch two -locbranch three
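A minimal sketch of that repeated-option style with the getopts builtin (getopts only understands single-character options, so -s and -b below are stand-ins for -localsource and -locbranch):
#!/bin/bash
localsource=''
locbranches=()
while getopts 's:b:h' opt; do
    case $opt in
        s) localsource=$OPTARG ;;
        b) locbranches+=("$OPTARG") ;;   # -b may be given several times
        h) echo "usage: setup.sh -s path [-b branch]..."; exit 0 ;;
        *) exit 1 ;;
    esac
done
echo "localsource: $localsource"
echo "locbranches: ${locbranches[*]}"
Called as setup.sh -s PATH -b one -b two -b three, the branch names all end up in the locbranches array.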
You can do a switch with four cases (-localsource, -locbranches, -help and default), where each option puts the loop into a state:
for i in "$@"; do
    case "$i" in
        "-help") echo "$usage"
        ;;
        "-localsource") STATE="localsource"
        ;;
        "-locbranches") STATE="locbranches"
        ;;
        *)
        if [ "$STATE" == "localsource" ]; then
            localsource=$i       # don't assign to PATH, that would clobber the command search path
        elif [ "$STATE" == "locbranches" ]; then
            locbranches+=("$i")  # collect every value that follows -locbranches
        else
            echo "Wrong state!"
        fi
        ;;
    esac
done
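With the loop above, a call like the one from the question leaves the path in localsource and the branch names in the locbranches array (those variable names are just the ones used in the snippet; adjust to taste):
./setup.sh -localsource /some/path -locbranches one two three
# afterwards, inside setup.sh:
echo "$localsource"          # /some/path
echo "${locbranches[@]}"     # one two three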

Is there a way to avoid positional arguments in bash?

I have to write a function in bash. The function will take about 7 arguments. I know that I can call a function like this:
To call a function with parameters:
function_name $arg1 $arg2
And I can refer my parameters like this inside the function:
function_name () {
echo "Parameter #1 is $1"
}
My question is, is there a better way to refer to the parameters inside the function? Can I avoid the $1, $2, $3, ... thing and simply use $arg1, $arg2, ...?
Is there a proper method for this or do I need to re-assign these parameters to some other variables inside the function? E.g.:
function_name () {
ARG1=$1
echo "Parameter #1 is $ARG1"
}
Any example would be much appreciated.
The common way of doing that is assigning the arguments to local variables in the function, i.e.:
copy() {
local from=${1}
local to=${2}
# ...
}
Another solution may be getopt-style option parsing.
copy() {
    local arg from to
    local OPTIND=1   # reset so repeated calls start parsing from the first argument
    while getopts 'f:t:' arg
    do
        case ${arg} in
            f) from=${OPTARG};;
            t) to=${OPTARG};;
            *) return 1 # illegal option
        esac
    done
}
copy -f /tmp/a -t /tmp/b
Sadly, bash's getopts can't handle long options, which would be more readable, e.g.:
copy --from /tmp/a --to /tmp/b
For that, you either need to use the external getopt program (which I think has long option support only on GNU systems) or implement the long option parser by hand, i.e.:
copy() {
    local from to
    while [[ ${1} ]]; do
        case "${1}" in
            --from)
                from=${2}
                shift
                ;;
            --to)
                to=${2}
                shift
                ;;
            *)
                echo "Unknown parameter: ${1}" >&2
                return 1
        esac

        if ! shift; then
            echo 'Missing parameter argument.' >&2
            return 1
        fi
    done
}
copy --from /tmp/a --to /tmp/b
Also see: using getopts in bash shell script to get long and short command line options
You can also be lazy, and just pass the 'variables' as arguments to the function, i.e.:
copy() {
    local "${@}"
    # ...
}
copy from=/tmp/a to=/tmp/b
and you'll have ${from} and ${to} in the function as local variables.
Just note that the same issue as below applies: if a particular variable is not passed, it will be inherited from the parent environment. You may want to add a 'safety line' like:
copy() {
    local from to # reset first
    local "${@}"
    # ...
}
to ensure that ${from} and ${to} will be unset when not passed.
And if something very bad is of your interest, you could also assign the arguments as global variables when invoking the function, i.e.:
from=/tmp/a to=/tmp/b copy
Then you could just use ${from} and ${to} within the copy() function. Just note that you should then always pass all parameters. Otherwise, a random variable may leak into the function.
from= to=/tmp/b copy # safe
to=/tmp/b copy # unsafe: ${from} may be declared elsewhere
If you have bash 4.1 (I think), you can also try using associative arrays. It will allow you to pass named arguments but it will be ugly. Something like:
declare -A args=( [from]=/tmp/a [to]=/tmp/b )
copy args
And then in copy(), you'd need to grab the array.
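The 'grab' step isn't shown in that answer; one possible sketch, assuming bash 4.3+ so that a nameref is available (on 4.1 you'd have to fall back to eval or to copying the keys out by hand):
#!/bin/bash
copy() {
    local -n _args=$1            # nameref to the caller's associative array (bash 4.3+)
    echo "from: ${_args[from]}"
    echo "to:   ${_args[to]}"
}
declare -A args=( [from]=/tmp/a [to]=/tmp/b )
copy args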
You can always pass things through the environment:
#!/bin/sh
foo() {
echo arg1 = "$arg1"
echo arg2 = "$arg2"
}
arg1=banana arg2=apple foo
All you have to do is name variables on the way in to the function call.
function test() {
echo $a
}
a='hello world' test
#prove the variable didn't leak
echo $a .
This isn't just a feature of functions; you could have that function in its own script, call a='hello world' test.sh, and it would work just the same.
As an extra little bit of fun, you can combine this method with positional arguments (say you were making a script and some users mightn't know the variable names).
Heck, why not let it have defaults for those arguments too? Well sure, easy peasy!
function test2() {
[[ -n "$1" ]] && local a="$1"; [[ -z "$a" ]] && local a='hi'
[[ -n "$2" ]] && local b="$2"; [[ -z "$b" ]] && local b='bye'
echo $a $b
}
#see the defaults
test2
#use positional as usual
test2 '' there
#use named parameter
a=well test2
#mix it up
b=one test2 nice
#prove the variables didn't leak
echo $a $b .
Note that if test were its own script, you wouldn't need to use the local keyword.
Shell functions have full access to any variable available in their calling scope, except for those variable names that are used as local variables inside the function itself. In addition, any non-local variable set within a function is available on the outside after the function has been called. Consider the following example:
A=aaa
B=bbb
echo "A=$A B=$B C=$C"
example() {
echo "example(): A=$A B=$B C=$C"
A=AAA
local B=BBB
C=CCC
echo "example(): A=$A B=$B C=$C"
}
example
echo "A=$A B=$B C=$C"
This snippet has the following output:
A=aaa B=bbb C=
example(): A=aaa B=bbb C=
example(): A=AAA B=BBB C=CCC
A=AAA B=bbb C=CCC
The obvious disadvantage of this approach is that functions are not self-contained any more and that setting a variable outside a function may have unintended side-effects. It would also make things harder if you wanted to pass data to a function without assigning it to a variable first, since this function is not using positional parameters any more.
The most common way to handle this is to use local variables for arguments and any temporary variable within a function:
example() {
local A="$1" B="$2" C="$3" TMP="/tmp"
...
}
This avoids polluting the shell namespace with function-local variables.
I think I have a solution for you.
With a few tricks you can actually pass named parameters to functions, along with arrays.
The method I developed allows you to access parameters passed to a function like this:
testPassingParams() {
    @var hello
    l=4 @array anArrayWithFourElements
    l=2 @array anotherArrayWithTwo
    @var anotherSingle
    @reference table # references only work in bash >=4.3
    @params anArrayOfVariedSize

    test "$hello" = "$1" && echo correct
    #
    test "${anArrayWithFourElements[0]}" = "$2" && echo correct
    test "${anArrayWithFourElements[1]}" = "$3" && echo correct
    test "${anArrayWithFourElements[2]}" = "$4" && echo correct
    # etc...
    #
    test "${anotherArrayWithTwo[0]}" = "$6" && echo correct
    test "${anotherArrayWithTwo[1]}" = "$7" && echo correct
    #
    test "$anotherSingle" = "$8" && echo correct
    #
    test "${table[test]}" = "works"
    table[inside]="adding a new value"
    #
    # I'm using * just in this example:
    test "${anArrayOfVariedSize[*]}" = "${*:10}" && echo correct
}
fourElements=( a1 a2 "a3 with spaces" a4 )
twoElements=( b1 b2 )
declare -A assocArray
assocArray[test]="works"
testPassingParams "first" "${fourElements[#]}" "${twoElements[#]}" "single with spaces" assocArray "and more... " "even more..."
test "${assocArray[inside]}" = "adding a new value"
In other words, not only can you call your parameters by their names (which makes for more readable code), you can actually pass arrays (and references to variables, though this feature only works in bash 4.3)! Plus, the mapped variables are all in the local scope, just like $1 (and the others).
The code that makes this work is pretty light and works both in bash 3 and bash 4 (these are the only versions I've tested it with). If you're interested in more tricks like this that make developing with bash much nicer and easier, you can take a look at my Bash Infinity Framework, the code below was developed for that purpose.
Function.AssignParamLocally() {
    local commandWithArgs=( $1 )
    local command="${commandWithArgs[0]}"

    shift

    if [[ "$command" == "trap" || "$command" == "l="* || "$command" == "_type="* ]]
    then
        paramNo+=-1
        return 0
    fi

    if [[ "$command" != "local" ]]
    then
        assignNormalCodeStarted=true
    fi

    local varDeclaration="${commandWithArgs[1]}"
    if [[ $varDeclaration == '-n' ]]
    then
        varDeclaration="${commandWithArgs[2]}"
    fi
    local varName="${varDeclaration%%=*}"

    # var value is only important if making an object later on from it
    local varValue="${varDeclaration#*=}"

    if [[ ! -z $assignVarType ]]
    then
        local previousParamNo=$(expr $paramNo - 1)

        if [[ "$assignVarType" == "array" ]]
        then
            # passing array:
            execute="$assignVarName=( \"\${@:$previousParamNo:$assignArrLength}\" )"
            eval "$execute"
            paramNo+=$(expr $assignArrLength - 1)
            unset assignArrLength
        elif [[ "$assignVarType" == "params" ]]
        then
            execute="$assignVarName=( \"\${@:$previousParamNo}\" )"
            eval "$execute"
        elif [[ "$assignVarType" == "reference" ]]
        then
            execute="$assignVarName=\"\$$previousParamNo\""
            eval "$execute"
        elif [[ ! -z "${!previousParamNo}" ]]
        then
            execute="$assignVarName=\"\$$previousParamNo\""
            eval "$execute"
        fi
    fi

    assignVarType="$__capture_type"
    assignVarName="$varName"
    assignArrLength="$__capture_arrLength"
}

Function.CaptureParams() {
    __capture_type="$_type"
    __capture_arrLength="$l"
}

alias @trapAssign='Function.CaptureParams; trap "declare -i \"paramNo+=1\"; Function.AssignParamLocally \"\$BASH_COMMAND\" \"\$@\"; [[ \$assignNormalCodeStarted = true ]] && trap - DEBUG && unset assignVarType && unset assignVarName && unset assignNormalCodeStarted && unset paramNo" DEBUG; '
alias @param='@trapAssign local'
alias @reference='_type=reference @trapAssign local -n'
alias @var='_type=var @param'
alias @params='_type=params @param'
alias @array='_type=array @param'
I was personally hoping to see some sort of syntax like
func(a b){
echo $a
echo $b
}
But since that's not a thing, and I see quite a few references to global variables (not without the caveat of scoping and naming conflicts), I'll share my approach.
Using the copy function from Michal's answer:
copy(){
cp $from $to
}
from=/tmp/a
to=/tmp/b
copy
This is bad, because from and to are such broad words that any number of functions could use this. You could quickly end up with a naming conflict or a "leak" on your hands.
letter(){
echo "From: $from"
echo "To: $to"
echo
echo "$1"
}
to=Emily
letter "Hello Emily, you're fired for missing two days of work."
# Result:
# From: /tmp/a
# To: Emily
# Hello Emily, you're fired for missing two days of work.
So my approach is to "namespace" them. I name the variable after the function and delete it after the function is done with it. Of course, I only use it for optional values that have default values. Otherwise, I just use positional args.
copy(){
if [[ $copy_from ]] && [[ $copy_to ]]; then
cp $copy_from $copy_to
unset copy_from copy_to
fi
}
copy_from=/tmp/a
copy_to=/tmp/b
copy # Copies /tmp/a to /tmp/b
copy # Does nothing, as it ought to
letter "Emily, you're 'not' re-hired for the 'not' bribe ;)"
# From: (no /tmp/a here!)
# To:
# Emily, you're 'not' re-hired for the 'not' bribe ;)
I would make a terrible boss...
In practice, my function names are more elaborate than "copy" or "letter".
The most recent example to my memory is get_input(), which has gi_no_sort and gi_prompt.
gi_no_sort is a true/false value that determines whether the completion suggestions are sorted or not. Defaults to true
gi_prompt is a string that is...well, that's self-explanatory. Defaults to "".
The actual arguments the function takes are the source of the aforementioned 'completion suggestions' for the input prompt, and since that list is taken from $@ in the function, the "named args" are optional[1], and there's no obvious way to distinguish between a string meant as a completion and a boolean/prompt-message, or really anything space-separated in bash, for that matter[2]; the above solution ended up saving me a lot of trouble.
notes:
[1] So a hard-coded shift and $1, $2, etc. are out of the question.
[2] E.g. is "0 Enter a command: {1..9} $(ls)" a value of 0, "Enter a command:", and a set of 1 2 3 4 5 6 7 8 9 <directory contents>? Or are "0", "Enter", "a", and "command:" part of that set as well? Bash will assume the latter whether you like it or not.
Arguments get sent to functions as a tuple of individual items, so they have no names as such, just positions. This allows some interesting possibilities like the one below, but it does mean that you are stuck with $1, $2, etc. As to whether to map them to better names, the question comes down to how big the function is and how much clearer the code becomes to read. If it's complex, then mapping meaningful names ($BatchID, $FirstName, $SourceFilePath) is a good idea. For simple stuff, though, it probably isn't necessary. I certainly wouldn't bother if you are using names like $arg1.
Now, if you just want to echo back the parameters, you can iterate over them:
for arg in "$@"
do
    echo "$arg"
done
Just a fun fact; unless you are processing a list, you are probably interested in something more useful.
This is an older topic, but I'd still like to share the function below (requires bash 4). It parses named arguments and sets the variables in the script's environment. Just make sure you have sane default values for all parameters you need. The export statement at the end could also just be an eval. It's great in combination with shift for extending existing scripts that already take a few positional parameters, where you don't want to change the syntax but still want to add some flexibility.
parseOptions()
{
    args=("$@")
    for opt in "${args[@]}"; do
        if [[ ! "${opt}" =~ .*=.* ]]; then
            echo "badly formatted option \"${opt}\" should be: option=value, stopping..."
            return 1
        fi
        local var="${opt%%=*}"
        local value="${opt#*=}"
        export ${var}="${value}"
    done
    return 0
}
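A short usage sketch (the option names target and jobs are made up for the example; parseOptions is the function above):
# in a script that defines parseOptions as above:
parseOptions "$@" || exit 1
echo "building ${target:-all} with ${jobs:-1} job(s)"   # sane defaults, as suggested above
# $ ./build.sh target=docs jobs=4
# building docs with 4 job(s)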
