I want to do argument parsing inside one argument.
Here is my launcher: ./bin/kube/launch_market_quality_job.sh -e staging -j job_example
And here is my script:
while [ ${#} -ne 0 ]; do
    case "${1}" in
        --environment | -e)
            shift
            export ONE_ENVIRON=${1}
            case $ONE_ENVIRON in
                staging)
                    export REPOSITORY=<DTR>
                    ;;
                production)
                    export REPOSITORY=<DTR>
                    ;;
                *)
                    fail "You must provide the environment [staging|production]"
                    ;;
            esac
            ;;
        --job | -j )
            shift
            export JOB=${1}
            case $JOB in
                job_example_extra_args)
                    case "${1}" in
                        --name | -n )
                            export NAME=${1}
                            [... extra args]
                            ;;
                    esac
                    ;;
            esac
            ;;
        *)
            help
            exit
            ;;
    esac
    shift
done
What I want to do, depending on the "-j | --job" option, is to parse two more options depending on which job was given.
For example, a normal job called 'job_example' works correctly with the previous launcher.
But if my job is 'job_example_extra_args' I need a new argument called 'name':
`./bin/kube/launch_market_quality_job.sh -e staging -j job_example_extra_args --name "Jesus"`
I don't know the correct way to do this: if 'job_example_extra_args' is requested, I want to obtain the extra flag correctly, and if it is not included the script should fail or stop.
I want to add the extra flag options inside the --job flag, so they are only active when the job is the one that needs them.
I would catch all arguments and perform a check after all arguments are parsed, e.g.:
unset ONE_ENVIRON JOB NAME
while [ ${#} -ne 0 ]; do
    case "${1}" in
        --job | -j )
            shift
            export JOB=${1}
            ;;
        --name | -n )
            shift
            export NAME=${1}
            ;;
    esac
    shift
done
if [ "$JOB" = job_example_extra_args ] && [ -z "$NAME" ]; then
fail "You must provide a name for this job"
fi
This way, it does not matter whether the user passes the --name NAME argument before or after the --job JOBNAME argument. All that matters is that the complete list of necessary arguments has been given at some point.
You could also report an error if the --name must be passed exclusively with the job_example_extra_args job:
if [ "$JOB" != job_example_extra_args ] && [ -n "$NAME" ]; then
fail "You must NOT provide a name for this job"
fi
PS: I am not showing --environment|-e nor the * default case, for clarity, but there’s nothing wrong with them.
You could embed the secondary argument parsing inside the primary ("job") parse by inserting a complete parse loop inside each alternative job. But that's going to be hard to maintain, because the job-related logic becomes divorced from the job implementation.
Perhaps a better solution is to create an argument-parser-plus-launcher for each job. Since you have already shifted the parsed arguments out of the way, you can just source an appropriate script using the . command. (Many shells allow source as a synonym for ..) Note that if you don't specify arguments after the filename in the . command, the existing arguments will not be altered, so your unprocessed arguments just get passed through. (For POSIX shells, . might not even accept arguments following the filename.)
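For example, here is a minimal sketch of that approach; the jobs/ directory layout and per-job file names are hypothetical, and fail is the helper already used in the question:

# Launcher sketch: handle the common options, stop at the first option we don't know,
# then let the job's own parser (sourced with `.`) deal with whatever is left in "$@".
while [ ${#} -ne 0 ]; do
    case "${1}" in
        --environment | -e) shift; export ONE_ENVIRON=${1}; shift ;;
        --job | -j)         shift; export JOB=${1};         shift ;;
        *)                  break ;;   # leave job-specific options for the per-job parser
    esac
done
. "./bin/kube/jobs/${JOB}.sh"          # hypothetical per-job script; it sees the remaining "$@"

# ./bin/kube/jobs/job_example_extra_args.sh (sketch): parse only this job's options.
while [ ${#} -ne 0 ]; do
    case "${1}" in
        --name | -n) shift; export NAME=${1} ;;
    esac
    shift
done
[ -n "${NAME}" ] || fail "You must provide a name for this job"

A plain job such as job_example would simply have an (almost empty) jobs/job_example.sh, keeping each job's option handling next to the job itself.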
Related
I have a program which executes shell functions using a call like sh -c "<command_string>". I cannot change the way these functions are called.
From this program I call various self-written helper shell functions, which are sourced into my environment. One of them looks like this; it unzips files matching a given file pattern into a given directory.
function dwhUnzipFiles() {
    declare OPTIND=1
    while getopts "P:F:T:" opt; do
        case "$opt" in
            P) declare FILEPATTERN="$OPTARG" ;;
            F) declare FROMDIR="$OPTARG" ;;
            T) declare TODIR="$OPTARG" ;;
            *) echo "Unknown option | Usage: dwhUnzipFiles -P <filepattern> -F <fromdir> -T <todir>"
        esac
    done
    shift $((OPTIND-1))
    for currentfile in "${FROMDIR}"/"${FILEPATTERN}" ; do
        unzip -o "$currentfile" -d "${TODIR}"
    done
    # error handling
    # some more stuff
    return $?
}
For this function I use arguments with wildcards for the FILEPATTERN variable. The function gets called by my program like this:
sh -c ". ~/dwh_env.sh && dwhUnzipFiles -P ${DWH_FILEPATTERN_MJF_WLTO}.xml.zip -F ${DWH_DIR_SRC_XML_CURR} -T ${DWH_DIR_SRC_XML_CURR}/workDir" where ${DWH_FILEPATTERN_MJF_WLTO} contains wildcards.
This works as intended. My confusion starts with another helper function, which is constructed in a similar way, but I'm not able to control the wildcard expansion correctly. It just deletes files in a directory depending on a given file pattern.
function dwhDeleteFiles() {
    declare retFlag=0
    declare OPTIND=1
    while getopts "D:P:" opt; do
        case "$opt" in
            D) declare DIRECTORY="$OPTARG" ;;
            P) declare FILEPATTERN="$OPTARG" ;;
            *) echo "Unknown option | Usage: dwhDeleteFiles -D <Directory> -P <Filepattern>"
        esac
    done
    shift $((OPTIND-1))
    for currentfile in "${DIRECTORY}"/"${FILEPATTERN}" ; do
        rm -fv "${currentfile}"
    done
    # error handling
    # some more stuff
    return $retFlag
}
This function is called like this:
sh -c ". ~/dwh_env.sh && dwhDeleteFiles -P ${DWH_FILEPATTERN_MJF_WLTO}.xml -D ${DWH_DIR_SRC_XML_CURR}/workDir" where again ${DWH_FILEPATTERN_MJF_WLTO} contains wildcards. When I call this function with my program it results in doing nothing. I tried to play around with adding "" and \"\" to the arguments of my functions, but all what is happening is that instead of deleting all files in the given directory the function deletes only the first one in an alphanumerical order.
Can somebody explain to me, what is happening here? My idea is that the multiple passing of the variable, containing the wildcard, is not working. But how do I fix this and is it even possible in bash? And why is the dwhUnzipFilesfunction working and the dwhDeleteFiles is not?
Suppose that DWH_FILEPATTERN_MJF_WLTO is * and you have a bunch of .xml files; then the command is
dwhDeleteFiles -P *.xml -D ${DWH_DIR_SRC_XML_CURR}/workDir
which expands to
dwhDeleteFiles -P bar.xml baz.xml foo.xml zap.xml -D ${DWH_DIR_SRC_XML_CURR}/workDir
(note the alphabetical order of the .xml files). But the -P option only takes one argument, bar.xml (the first); the remaining names are treated as separate arguments.
Try adding set -x to your script to see this in action.
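One possible fix (a sketch only; the exact quoting depends on how your program builds the command string) is to quote the pattern inside the command string so it reaches the function unexpanded, and leave ${FILEPATTERN} unquoted in the loop so the glob expands inside the function:

# Call site: the inner double quotes stop the calling shell from expanding the glob.
sh -c '. ~/dwh_env.sh && dwhDeleteFiles -P "${DWH_FILEPATTERN_MJF_WLTO}.xml" -D "${DWH_DIR_SRC_XML_CURR}/workDir"'

# Inside dwhDeleteFiles: unquoted ${FILEPATTERN} lets the glob expand here, one file per iteration.
for currentfile in "${DIRECTORY}"/${FILEPATTERN}; do
    rm -fv "${currentfile}"
done

Both changes are needed: if you only quote the call, the quoted "${FILEPATTERN}" in the loop never expands and rm is handed the literal pattern.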
I use the following lines to handle command-line options (I hope this is best practice; if not, please correct me):
#!/usr/bin/bash
read -r -d '' HELP <<EOF
OPTIONS:
-c Enable color output
-d Enable debug output
-v Enable verbose output
-n Only download files (mutually exclusive with -r)
-r Only remove files (mutually exclusive with -n)
-h Display this help
EOF
# DECLARE VARIABLES WITH DEFAULT VALUES
color=0
debug=0
verbose=0
download=0
remove=0
OPTIND=1 # Reset in case getopts has been used previously in the shell
invalid_options=(); # Array for invalid options
while getopts ":cdvnrh" opt; do
echo "Actual opt: $opt"
case $opt in
c)
color=1
;;
d)
debug=1
;;
v)
verbose=1
;;
n)
download=1
;;
r)
remove=1
;;
h)
echo "$HELP"
exit 1
;;
\?)
invalid_options+=($OPTARG)
;;
*)
invalid_options+=($OPTARG)
;;
esac
done
# HANDLE INVALID OPTIONS
if [ ${#invalid_options[@]} -ne 0 ]; then
    echo "Invalid option(s):" >&2
    for i in "${invalid_options[@]}"; do
        echo "$i" >&2
    done
    echo "" >&2
    echo "$HELP" >&2
    exit 1
fi
# SET $1 TO FIRST MASS ARGUMENT, $2 TO SECOND MASS ARGUMENT ETC
shift $((OPTIND - 1))
# HANDLE CORRECT NUMBER OF MASS OPTIONS
if [ $# -ne 2 ]; then
    echo "Correct number of mass arguments are 2"
    echo "" >&2
    echo "$HELP" >&2
    exit 1
fi
# HANDLE MUTUALLY EXCLUSIVE OPTIONS
if [ $download -eq 1 ] && [ $remove -eq 1 ]; then
    echo "Options for download and remove are mutually exclusive" >&2
    echo "$HELP" >&2
    exit 1
fi
echo "color: $color"
echo "debug: $debug"
echo "verbose: $verbose"
echo "download: $download"
echo "remove: $remove"
echo "\$1: $1"
echo "\$2: $2"
If I call the script in such a way that the mass arguments (those that are not switches or arguments to switches) come last, everything works correctly:
$ ./getopts.sh -c -d -v -r a b
Actual opt: c
Actual opt: d
Actual opt: v
Actual opt: r
color: 1
debug: 1
verbose: 1
download: 0
remove: 1
$1: a
$2: b
The problem arises when I call the script with the mass arguments first (or somewhere in the middle of switches that do not take arguments):
$ ./getopts.sh a b -c -d -v -r
Correct number of mass arguments are 2
OPTIONS:
-c Enable color output
-d Enable debug output
-v Enable verbose output
-n Only download files (mutually exclusive with -r)
-r Only remove files (mutually exclusive with -n)
-h Display this help
or
$ ./getopts.sh -c a b -d -v -r
Actual opt: c
Correct number of mass arguments are 2
OPTIONS:
-c Enable color output
-d Enable debug output
-v Enable verbose output
-n Only download files (mutually exclusive with -r)
-r Only remove files (mutually exclusive with -n)
-h Display this help
I think this should be OK according to (POSIX) standards, because the following syntax, which is basically the same thing, works as expected on my system:
$ cp test1/ test2/ -r
$ cp test1/ -r test2/
I have searched the Internet, but the only thing that came close to my problem was one question related to C.
getopts breaks out of the while loop as soon as it detects a non-option parameter (not counting the arguments given to options that take them). The POSIX convention is for the options to come first, followed by the operands: plain and simple.
GNU utilities, however, don't stick to that convention. It's in their nature to be "better" than the standard Unix utilities: more features, more options, and handling things a bit differently.
With many GNU utilities on Linux, command-line options can also come after file operands.
For example:
$ cp -R foo bar
works on my UNIX-certified Mac OS X and on Linux. However,
$ cp foo bar -R
only works on Linux.
If you want getopts to work like a lot of Linux utilities, you need to do a wee bit of work.
First, you have to process your arguments yourself, and not depend upon $OPTIND to parse them. You also need to verify that you have an argument.
I came up with this as an example of doing what you want.
#! /bin/bash
while [[ $* ]]
do
    OPTIND=1
    echo $1
    if [[ $1 =~ ^- ]]
    then
        getopts :a:b:cd parameter
        case $parameter in
            a)  echo "a"
                echo "the value is $OPTARG"
                shift
                ;;
            b)  echo "b"
                echo "the value is $OPTARG"
                shift
                ;;
            c)  echo "c"
                ;;
            d)  echo "d"
                ;;
            *)  echo "This is an invalid argument: $parameter"
                ;;
        esac
    else
        other_arguments="$other_arguments $1"
    fi
    shift
done
echo "$other_arguments"
I now loop as long as $* is set. (Maybe I should use $#?) I have to do a shift at the end of the loop. I also reset $OPTIND to 1 each time because I'm shifting the arguments off myself. $OPTARG is still set, but I have to do another shift to make sure everything works.
I also have to check whether an argument begins with a dash, using a regular expression in my if statement.
Basic testing shows it works; I can't say it's error-free, but it should give you an idea of how to handle this in your program.
There's still plenty of power you're getting from getopts, but it does take a bit more work.
Bash provides two methods for argument parsing.
The built-in command getopts is the newer, easy-to-use mechanism for parsing arguments, but it is not very flexible: getopts does not allow mixing options and mass arguments.
The external command getopt is an older and more complex mechanism. It supports long and short options, and the GNU extension allows mixing options and mass arguments.
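For example, here is a minimal sketch of the same option set using the external GNU getopt (from util-linux), which reorders the arguments so the mass arguments can appear anywhere on the command line:

#!/usr/bin/bash
# Normalize the command line first; getopt moves all options in front of the operands.
parsed=$(getopt -o cdvnrh -n "$0" -- "$@") || exit 1
eval set -- "$parsed"

color=0; debug=0; verbose=0; download=0; remove=0
while true; do
    case "$1" in
        -c) color=1 ;;
        -d) debug=1 ;;
        -v) verbose=1 ;;
        -n) download=1 ;;
        -r) remove=1 ;;
        -h) echo "$HELP"; exit 1 ;;   # $HELP as defined in the script above
        --) shift; break ;;           # end of options; "$@" now holds the mass arguments
    esac
    shift
done
echo "mass arguments: $*"

With this, both ./getopts.sh a b -c -d -v -r and ./getopts.sh -c a b -d -v -r end up with a and b as $1 and $2 after the loop.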
I'm writing a script in bash. It will receive from 2 to 5 arguments. For example:
./foo.sh -n -v SomeString Type Directory
-n, -v and Directory are optional.
If the script doesn't receive the Directory argument, it will search for a string in the current directory.
Otherwise it will follow the received path and search there; if that directory doesn't exist, it will print a message.
The question is: Is there a way to check if the last arg is a path or not?
You can get the last argument using an indirect variable reference:
numArgs=$#
lastArg="${!numArgs}"
# check if last argument is directory
if [[ -d "$lastArg" ]]; then
echo "it is a directory"
else
echo "it is not a directory"
fi
You can use this:
#!/bin/bash
if [[ -d ${!#} ]]
then
    echo "DIR EXISTS"
else
    echo "doesn't exist"
fi
First, use getopts to parse the options -n and -v (they will have to be used before any non-options, but that's not usually an issue).
while getopts nv opt; do
    case $opt in
        n) nflag=1 ;;
        v) vflag=1 ;;
        *) printf >&2 "Unrecognized option $opt\n"; exit 1 ;;
    esac
done
shift $((OPTIND-1))
Now, you will have only your two required arguments, and possibly your third optional argument, in "$@".
string_arg=$1
type_arg=$2
dir_arg=$3
if [ -d "$dir_arg" ]; then
# Do something with valid directory
fi
Note that this code will work in any POSIX-compliant shell, not just bash.
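For example, with the invocation from the question (SomeDir standing in for the optional directory):

./foo.sh -n -v SomeString Type SomeDir
# after the getopts loop and the shift: nflag=1, vflag=1,
# string_arg=SomeString, type_arg=Type, dir_arg=SomeDir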
I wrote a script with mbratch's help.
I run it as below:
./scriptname folder1
However, I see neither errors nor results, and I'm not sure what's wrong.
sh -x ./scriptname folder1
+ MAIN
+ check_directories
+ [ -d ]
This works fine for me:
Note: updated to support additional options.
opt="$1"
folder1="$2"
folder2="$3"
case "$opt" in
-d)
if [ -d "${folder1}" ] && [ -d "${folder2}" ] ; then
for i in "${folder1}/*" ; do
echo "test <$i>"
if [ -f "${folder2}/${i##*/}" ] ; then
echo "<${i##*/}>"
else
echo "! <${i##*/}>"
fi
done
fi
;;
# Example option
-h)
# Print help and exit.
;;
# Default case
*)
echo "Unknown option '$opt'" >&2
exit 1
;;
esac
Try replacing ~/ with $HOME/, and be sure to set folder1 and folder2 before using them. Note also that this will break if your directory or file names include spaces. In that case, use find; check the find man page for details.
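A rough sketch of that find-based variant, assuming bash (for read -d '') and a find that supports -maxdepth and -print0:

# Emit NUL-separated names so spaces (and even newlines) in file names are safe.
find "${folder1}" -maxdepth 1 -type f -print0 |
while IFS= read -r -d '' i; do
    if [ -f "${folder2}/${i##*/}" ]; then
        echo "<${i##*/}>"
    else
        echo "! <${i##*/}>"
    fi
done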
Is that the entirety of your script? The variable $folder1 is never defined anywhere. By default, the script takes the first two arguments passed to it as $1 and $2, so use those variables instead.
Put some echo statements in there to see what variables have what values.
You have a for loop already, so go with that. However, you might have to put the part where you get the file list inside $() so it is assigned to a variable, and then loop over that.
Do a quick search on "Looping through files in bash" and you will find a good template for the for loop.
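A minimal template for that kind of loop (the variable names are just placeholders):

folder1="$1"                      # first command-line argument
for f in "$folder1"/*; do
    [ -e "$f" ] || continue       # skip the literal pattern if nothing matched
    echo "found: $f"
done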
I'm writing a script in bash to automate some work and practice bash along the way. The script automates some git cloning and updating. It will have three options (-g, -f, -h). When someone types -h, it should display the help message (which I have written but omitted below). When someone passes -g, it should accept at least one argument, with an optional second one: the first is the URL of the repository to clone and the second is the directory to clone it into. On the other hand, if someone passes -f, the script should get just one argument, a file name. I then want to read that file line by line and do the git cloning and updating for each git URL inside it.
If I run the script I get the following error message for each option, and if I call it without any option, or with a valid option followed by some other argument, it still doesn't do anything, just returns:
./gitupdate.sh: option requires an argument -- g
I guess it somehow doesn't use $2 and $3 in my code. But then again, when I pass -h, all it has to do is call the help function and display the message; it doesn't need any other argument.
I guess the problem is that I have something wrong at the bottom, where I use getopts to get the option specified by the user. In my code I assume that the option is the first argument, $1; that the second one, $2, is the URL or filename, depending on the option; and that $3 is the optional one that only works with the -g option.
Below you can find my code:
#!/bin/bash

declare default_directory=$HOME
declare action
declare src

function clone_repo() {
    if [ -n "$2" ]; then
        if [ "$(ls -A "$2" 2>/dev/null)" ]; then
            cd "$2"
        else
            git clone "$1" "$2"
        fi
    else
        git clone "$1"
        #TODO: Get the directory name created by cloning
        # and cd to it.
    fi
    git remote add upstream "$1"
    git fetch upstream
}

function read_repos_from_file() {
    if [ -f "$1" ]; then
        while read -r line; do
            clone_repo "$line" "$2"
        done < "$1"
    else
        echo -e "Error: The specified file could not be found."
        exit 1
    fi
}
while getopts "f:h:r" option
do
case "${option}" in
f) action=read_repos_from_file; src="$OPTARG";;
g) action=clone_repo; src="$OPTARG";;
h) help ; exit 1 ;;
esac
done
shift $((OPTIND-1))
[ -z "$action" ] && ( help; exit 1 )
If someone could help me, I would be glad.
EDIT: Corrected some typos in the above code and updated the error message. The script still doesn't work correctly. I guess I need to change something in my code to make it do something and actually get the $2 and $3 arguments. It doesn't even display the help message when the -h option is passed, and all I do there is call the help function that I created earlier. Maybe I need to change my getopts part somehow.
EDIT 2: Made the advised changes, and changed the above code.
git() is the beginning of a function definition (the keyword function is optional when the function name is followed by parentheses). If you want to call a function git() you need to define it first, and call it without the parentheses:
function git() {
# do stuff
}
git
It's not good practice to create functions with the same name as existing binaries, though. In your case you should probably just call git clone with the line read from the file:
while read -r line; do
git clone "$line"
done < "${file}"
Edit: Updated, since the question changed significantly.
Your argument processing is … weird, to be frank. When you're using an option parser, you shouldn't work around the way that option parser works. "g:" means an option -g with exactly one argument. Don't try to make it an option with more than one argument, one of them being optional on top of it. If you need an additional (optional) argument for an output directory, make that either another option (e.g. "d:") or a non-option argument.
I'd suggest to change your option-processing to something like this:
while getopts "f:g:h" option; do
case "$option" in
f) action=file; src="$OPTARG";;
g) action=repo; src="$OPTARG";;
h) help; exit 1;;
esac
done
shift $((OPTIND-1))
[ -z "$action" ] && ( help; exit 1 )
After this "$#" holds only non-option arguments (in this case the optional output directory), so you could call your functions like this:
$action $src "$#"
with the functions simplified to something like this:
function repo() {
    if [ -n "$2" ]; then
        if [ "$(ls -A "$2" 2>/dev/null)" ]; then
            cd "$2"
        else
            git clone "$1" "$2"
        fi
    else
        git clone "$1"
    fi
    ...
}

function file() {
    if [ -f "$1" ]; then
        while read -r line; do
            repo "$line" "$2"
        done < "$1"
    else
        echo "Error: The specified file could not be found."
        exit 1
    fi
}
On a more general note, you should make your names more self-explanatory. Better names for functions for cloning a repository or reading repositories from a file would be something like clone_repo and read_repos_from_file respectively. Also, -c or -r would be better mnemonics for an option that triggers the cloning of a repository.
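For instance, the option parsing might then read like this (a sketch only, using -c for the cloning option):

while getopts "f:c:h" option; do
    case "$option" in
        f) action=read_repos_from_file; src="$OPTARG" ;;
        c) action=clone_repo;           src="$OPTARG" ;;
        h) help; exit 1 ;;
    esac
done
shift $((OPTIND-1))

[ -z "$action" ] && { help; exit 1; }
"$action" "$src" "$@"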