Read bash script arguments multiple times - bash

I have a bash script that can be passed a number of different arguments and variables to be used by the script itself. Some of the parameters get assigned to variables. The second while loop does not appear to run when the script executes. I've simplified the following script for confidentiality/simplicity reasons.
The script is invoked like this:
./myscript --dir2 /new/path
while :; do
    case $1 in
        --var1) var1=$2
        ;;
        --var2) var2=$2
        ;;
        "") break
    esac
    shift
done
dir1=/default/directory
dir2=/default/directory
while :; do
    case $1 in
        --dir1) dir1=$2
        ;;
        --dir2) dir2=$2
        ;;
        "") break
    esac
    shift
done
echo "Expect /default/directory, returned: $dir1"
echo "Expect /new/path, returned: $dir2"
Here's what my program would effectively return.
Expect /default/directory, returned: /default/directory
Expect /new/path, returned: /default/directory
Is there a better way to go about this? Or another way to iterate over the parameters originally passed to the script? Thanks for the help!

If you want to preserve your arguments, you can copy them into an array, and then restore the original list from that array later:
#!/usr/bin/env bash
# ^^^^ - must be invoked as bash, not sh, for array support
# Copy arguments into an array
original_args=( "$@" )
# ...consume them during parsing...
while :; do # ...parse your arguments here...
    shift
done
# ...restore from the array...
set -- "${original_args[@]}"
# ...and now you can parse them again.
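For instance, applied to the dir1/dir2 script from the question above, the pattern might look like this (a sketch, not the original script; option handling is simplified):
#!/usr/bin/env bash
original_args=( "$@" )          # save a copy before the first pass consumes them

dir2=/default/directory
while (( $# )); do              # first pass: only --dir2
    case $1 in
        --dir2) dir2=$2; shift ;;
    esac
    shift
done

set -- "${original_args[@]}"    # restore the original argument list

dir1=/default/directory
while (( $# )); do              # second pass: only --dir1
    case $1 in
        --dir1) dir1=$2; shift ;;
    esac
    shift
done

echo "dir1=$dir1 dir2=$dir2"
Called as ./myscript --dir2 /new/path, this prints dir1=/default/directory dir2=/new/path.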

It's because you're using shift, which consumes the elements. Try using a for loop to iterate over the args. There are a few ways to do this. Here is my preferred method:
for elem in "$@"   # "$@" expands to each argument as a separate word
do
    ...            # the current element is available as $elem
done
Another way is to access them by index; you can look up a tutorial on bash array syntax for that.
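If you go the index route, a minimal sketch (assuming bash) looks like this:
args=("$@")                       # copy the positional parameters into an array
echo "count: ${#args[@]}"
for (( i = 0; i < ${#args[@]}; i++ )); do
    echo "arg $i: ${args[$i]}"    # access each element by index
done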

Related

How to consume N arbitrary arguments by name (position not guaranteed) and pass on the rest to a subcommand?

So for instance, if the shell script itself were to consume -env and -local, it would then need to pass on all other arguments (without knowing their number or ordering) EXCEPT -env and -local to some other script, while pulling out the values of -env and -local for its own personal use.
If the arguments the script were to consume were guaranteed to be the first ones, it would be possible to use shift and then pass on the remaining arguments with "$@", but since they're not, that doesn't work.
You can populate a new array with the remaining arguments.
#! /bin/bash
newargs=()
env=0
local=0
for arg ; do
    case $arg in
        (-env=*)
            env=${arg#-env=} ;;
        (-local=*)
            local=${arg#-local=} ;;
        (*)
            newargs+=("$arg") ;;
    esac
done
echo Env: $env
echo Local: $local
printf 'Args: '
printf '%s ' "${newargs[@]}"
To extract the values from the -arg=val pairs, I used parameter expansion: ${arg#pattern} removes the given pattern from the beginning of the variable's value.
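A quick illustration of that expansion (the value here is just an example):
arg='-env=production'
echo "${arg#-env=}"    # prints: production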

Shell: two loops over command-line parameters

In a shell script I have a loop over the positional parameters using the shift command. After the loop I'd like to reset and start another loop over the parameters. Is it possible to go back to the start?
while [ $# -gt 0 ]; do
    case "$1" in
        "--bla")
            # doing sth
            shift 2
            ;;
        *)
            shift 1
            ;;
    esac
done
You can save arguments in a temporary array. Then restore positional arguments from it.
args=("$#") # save
while .....
done
set -- "${args[#]}" # restore
Don't use shift if you need to process the arguments twice. Use a for loop, twice:
for arg in "$@"
do
    …
done
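Applied to the --bla option from this question, a two-pass sketch without shift could look like the following; the echo lines stand in for the real work:
expect_value=false
for arg in "$@"; do              # first pass: handle --bla and its value
    if $expect_value; then
        echo "first pass: --bla value is $arg"
        expect_value=false
    elif [ "$arg" = "--bla" ]; then
        expect_value=true
    fi
done

for arg in "$@"; do              # second pass: the full list is still intact
    echo "second pass sees: $arg"
done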
If you need to process argument options, consider using the GNU version of getopt (rather than the Bash built-in getopts because that only handles short options). See Using getopts in bash shell script to get long and short command line options for many details on how to do that.
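As a rough sketch of what GNU getopt usage can look like (the --dir1/--dir2 long-option names here are only illustrative):
#!/usr/bin/env bash
parsed=$(getopt -o '' --long dir1:,dir2: -n "$0" -- "$@") || exit 1
eval set -- "$parsed"            # replace the positional parameters with the normalized list

dir1=/default/directory
dir2=/default/directory
while true; do
    case $1 in
        --dir1) dir1=$2; shift 2 ;;
        --dir2) dir2=$2; shift 2 ;;
        --)     shift; break ;;  # end of options; anything left is a non-option argument
    esac
done
echo "dir1=$dir1 dir2=$dir2 remaining: $*"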

How to read -argument from within a .sh shell script

I'm calling a bash script with the following arguments:
myscript.sh -d /tmp -e dev -id 12345 -payload /tmp/test.payload
and inside the script, would like to get the value for the -payload. I don't really care about the other arguments, but they will be present in the call.
Here's some code that almost works on retrieving the argument:
while getopts "d:e:payload:id:" arg; do
    case $arg in
        payload)
            echo "payload"
            ;;
    esac
done
Of course payload) in the case control structure doesn't work, so how can I grab the value for -payload and assign it to a variable?
I'm not sure if this is the best way to handle it, but in your case I'd use:
while test $# -gt 0; do
    case "$1" in
        -payload)
            shift
            PAYLOAD=$1
            ;;
        *)
            # Catch other parameters here.
            # This part is not relevant to the answer,
            # but I added it to avoid the infinite loop mentioned.
            shift
            ;;
    esac
done
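With the invocation from the question, $PAYLOAD ends up holding /tmp/test.payload, so after the loop you can use it directly, for example:
# ./myscript.sh -d /tmp -e dev -id 12345 -payload /tmp/test.payload
echo "payload file: $PAYLOAD"    # prints: payload file: /tmp/test.payload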

How to read arguments from bash [duplicate]

This question already has answers here: Creating bash scripts that take arguments (closed as a duplicate 8 years ago).
I am curious how to pass arguments to a bash script from the terminal, read them, and run the script's functions based on those arguments.
So if I did something like:
./scriptname.sh install
# or
./scriptname.sh assets install
How would I say: OK, the first argument installs something, while the second says to do something else based on the first argument?
$0 is the name of the command
$1 first parameter
$2 second parameter
$3 third parameter, and so on
$# total number of parameters
for arg in "$@"
do
    # blah blah
done
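For the invocations in the question, a minimal dispatch on those parameters might look like this (the echo lines are placeholders for the real actions):
case $1 in
    install)
        echo "installing..." ;;
    assets)
        case $2 in
            install) echo "installing assets..." ;;
            *)       echo "unknown assets action: $2" ;;
        esac ;;
    *)
        echo "usage: $0 {install|assets install}" ;;
esac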
One great way to pass arguments to a script is to use the bash builtin getopts.
You can use it like this:
# a script that accepts -h, -a <argument> and -b
while getopts "ha:b" OPTION
do
    case $OPTION in
        h)
            # if -h, print help function and exit
            helpFunction
            exit 0
            ;;
        a)
            # -a requires an argument (because of ":" in the definition), so:
            myScriptVariable=$OPTARG
            ;;
        b)
            # do something special
            doSomeThingSpecial
            ;;
        ?)
            echo "ERROR: unknown option!! ABORT!!"
            helpFunction
            exit 1
            ;;
    esac
done
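With that option string, the script would be invoked along these lines (scriptname.sh is the name from the question; helpFunction and doSomeThingSpecial are the placeholders above):
./scriptname.sh -a somevalue -b    # -a takes an argument, -b is a plain flag
./scriptname.sh -h                 # prints help via helpFunction and exits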
You can access a particular argument with $1, $2, ... See e.g. What does "$1/*" mean in "for file in $1/*"
You can also use "$@" to loop over your arguments. Ex: https://github.com/gturri/dotfiles/blob/master/bootstrap.sh#L64

Using getopts within user-defined-function in bourne shell

Is it possible to pass command line arguments into a function from within a Bourne script, in order to allow getopts to process them?
The rest of my script is nicely packed into functions, but it's starting to look like I'll have to move the argument processing into the main logic.
The following is how it's written now, but it doesn't work:
processArgs()
{
    while getopts j:f: arg
    do
        echo "${arg} -- ${OPTARG}"
        case "${arg}" in
            j) if [ -z "${filename}" ]; then
                   job_number=$OPTARG
               else
                   echo "Filename ${filename} already set."
                   echo "Job number ${OPTARG} will be ignored.
               fi;;
            f) if [ -z "${job_number}" ]; then
                   filename=$OPTARG
               else
                   echo "Job number ${job_number} already set."
                   echo "Filename ${OPTARG} will be ignored."
               fi;;
        esac
    done
}
doStuff1
processArgs
doStuff2
Is it possible to define the function in a way that lets it read the script's args? Can this be done some other way? I like the functionality of getopts, but it looks like in this case I'm going to have to sacrifice the beauty of the code to get it.
You can provide args to getopts after the variable name. The default is "$@", but that's also what shell functions use to represent their own arguments. The solution is to pass "$@" (all of the script's command-line arguments as individual strings) to processArgs:
processArgs "$@"
Adding that to your script (and fixing the quoting in line 11), and trying out some gibberish test args:
$ ./try -j asdf -f fooo -fasdfasdf -j424pyagnasd
j -- asdf
f -- fooo
Job number asdf already set.
Filename fooo will be ignored.
f -- asdfasdf
Job number asdf already set.
Filename asdfasdf will be ignored.
j -- 424pyagnasd
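For reference, here is a minimal bash sketch of the same pattern; the local OPTIND line is an extra precaution (not part of the original answer) so the function can be called more than once without getopts resuming where it left off:
processArgs()
{
    local OPTIND opt
    while getopts j:f: opt; do
        case $opt in
            j) job_number=$OPTARG ;;
            f) filename=$OPTARG ;;
        esac
    done
}

processArgs "$@"
echo "job=${job_number} file=${filename}"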
