Why does my bash script think an argument is not present? - bash

I am trying to write a simple bash script that would execute as follows
$ ./export.sh -n <my-file-name> -a <my-api-key>
I am using this as a way to pass some arguments at build time in a Go project.
A very simple version of the script is:
#!/bin/bash
while getopts n:a option
do
case "${option}"
in
n) FILENAME=${OPTARG};;
a) APIKEY=${OPTARG};;
esac
done
if [ -z "$FILENAME" ]
then
FILENAME=downloader
fi
if [ -z "$APIKEY" ]
then
echo "[ERROR] Missing API key"
exit 1
fi
cd src && go build -o ../build/${FILENAME}.exe downloader -ldflags "-X api.APIServiceKey="${APIKEY}
If FILENAME is not provided I fall back to a default value; however, if APIKEY is missing I would like to exit and show a message.
Running the script with all arguments, however, still produces the error as if APIKEY were missing.

You are missing a colon in the getopts call. Since you expect an argument to -a, there must be a colon after it in the optstring: while getopts n:a: option
Quoting the getopts man page:
When the option requires an option-argument, the getopts utility shall
place it in the shell variable OPTARG. [...] If a character is followed
by a <colon>, the option shall be expected to have an argument which
should be supplied as a separate argument.
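For reference, this is the loop from the question with only the optstring corrected (everything else unchanged):
while getopts n:a: option
do
case "${option}" in
n) FILENAME=${OPTARG};;
a) APIKEY=${OPTARG};;
esac
done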

Related

How can I use getopts in a script that appends lines from files in a separate directory to a new file?

I am trying to write a bash script that takes in a directory, reads each file in the directory, and then appends the first line of each file in that directory to a new file. When I hard-code the variables in my script, it works fine.
This works:
#!/bin/bash
rm /local/SomePath/multigene.firstline.btab
touch /local/SomePath/multigene.firstline.btab
btabdir=/local/SomePath/test/*
outfile=/local/SomePath/multigene.firstline.btab
for f in $btabdir
do
head -1 $f >> $outfile
done
This does not work:
#!/bin/bash
while getopts ":d:o:" opt; do
case ${opt} in
d) btabdir=$OPTARG;;
o) outfile=$OPTARG;;
esac
done
rm $outfile
touch $outfile
for f in $btabdir
do
head -1 $f >> $outfile
done
Here is how I call the script:
bash /local/SomePath/Scripts/btab.besthits.wBp-q_wBm-r.sh -d /local/SomePath/test/* -o /local/SomePath/out.test/multigene.firstline.btab
And here is what I get when I run it:
rm: missing operand
Try 'rm --help' for more information.
touch: missing file operand
Try 'touch --help' for more information.
/local/SomePath/Scripts/btab.besthits.wBp-q_wBm-r.sh: line 23: $outfile: ambiguous redirect
Any suggestions? I'd like to be able to use getopts so I can make the script more generic. Thanks!
You have to pay extra attention to quoting and globbing when writing bash scripts.
When you call the script with a glob (* here), it gets expanded and split into words by your shell. This happens before your script even gets executed.
If you, for example, run cat *.txt, cat will get all the .txt files in the directory as its arguments. It is the same as calling cat afile.txt nextfile.txt (and so on); cat never sees the asterisk.
In your script this means that the input -d /local/SomePath/test/* gets expanded to something like /local/SomePath/test/someFile /local/SomePath/test/someOtherFile /local/SomePath/test/someThirdFile.
Consequently getopts takes only the first file after -d as the value of $btabdir, and -o never gets handled in the case switch.
I suggest you start by quoting every variable, preferably in the "${name}" style, and only invoke the script with quoted input.
It might also be better to pass in a directory path instead, test that it is a directory (test -d), and change your for loop to for f in "${btabdir}"/* (see the sketch below).
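A minimal sketch of that suggestion, reusing the names from the question (the error message wording is mine):
#!/bin/bash
while getopts ":d:o:" opt; do
case ${opt} in
d) btabdir="${OPTARG}";;
o) outfile="${OPTARG}";;
esac
done
if [ ! -d "${btabdir}" ]; then
echo "Not a directory: ${btabdir}" >&2
exit 1
fi
rm -f "${outfile}"
touch "${outfile}"
for f in "${btabdir}"/*
do
head -1 "${f}" >> "${outfile}"
done
It would then be invoked with a plain directory rather than a glob:
bash /local/SomePath/Scripts/btab.besthits.wBp-q_wBm-r.sh -d /local/SomePath/test -o /local/SomePath/out.test/multigene.firstline.btab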
This also works:
head -n1 -q /local/SomePath/test/* >> /local/SomePath/out.test/multigene.firstline.btab
I think the right answer here is "don't do it that way." :-)
The reason your current script isn't working may be that the wildcard is expanded by your interactive shell, not by your script. Try running your command with an echo at the beginning of the line for a hint at what's really happening. Once getopts sees the second of the matched files in the glob, it stops processing options, so -o never gets read and $outfile remains unset. And since you don't quote your variable in rm $outfile, it's as if you're running rm with no arguments at all. Test the difference in your shell between rm on its own and rm "".
Also, what happens to your for loop if there's a space in a filename? Since you have bash, you have arrays. And arrays are much better for processing lists of files.
Perhaps use something like this instead:
#!/bin/bash
# initialize an array
files=()
while getopts :d:o: opt; do
case "$opt" in
d)
if [[ ! -d "$OPTARG" ]]; then
printf 'ERROR: not a directory: %s\n' "$OPTARG" >&2
exit 65
fi
# add to the array
files+=( "$OPTARG"/* )
;;
o) outfile="$OPTARG" ;;
*)
printf 'ERROR: unknown option: %s\n' "$opt" >&2
exit 64
;;
esac
done
if ! { rm -f "$outfile" && touch "$outfile"; }; then
printf 'ERROR: cannot create %s\n' "$outfile" >&2
exit 73
fi
for f in "${files[@]}"; do
read -r < "$f"
printf '%s\n' "$REPLY"
done > "$outfile"
Here are some highlights of the changes....
We're using arrays, of course. The array ${files[@]} will contain one file per element, without relying on whitespace, so with proper quoting you'll avoid problems with special characters in filenames.
We test for more error conditions, and actually show errors and exit if we see them. (The exit values are sysexits.)
Instead of using head, we use read and a single redirect to $outfile. This saves multiple forks to an external program, and multiple fopen() calls to your output file.
Note that the argument to -d should be a directory, not a glob. And you can specify options multiple times. Multiple -d options will be added together, but only the last -o option will be used.
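For example, first lines from two directories could be collected into one output file like this (the second directory name is just a placeholder):
bash /local/SomePath/Scripts/btab.besthits.wBp-q_wBm-r.sh -d /local/SomePath/test -d /local/SomePath/other.test -o /local/SomePath/out.test/multigene.firstline.btab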

Control wildcard expansion in sh command

I have a program which executes shell functions via calls like sh -c "<command_string>". I cannot change the way these functions are called.
In this program I call various self-written helper shell functions, which are sourced into my environment. One of them looks like this; it unzips files matching a given file pattern into a given directory.
function dwhUnzipFiles() {
declare OPTIND=1
while getopts "P:F:T:" opt; do
case "$opt" in
P) declare FILEPATTERN="$OPTARG" ;;
F) declare FROMDIR="$OPTARG" ;;
T) declare TODIR="$OPTARG" ;;
*) echo "Unbekannte Option | Usage: dwhUnzipFiles -P <filepattern> -F <fromdir> -T <todir>"
esac
done
shift $((OPTIND-1))
for currentfile in "${FROMDIR}"/"${FILEPATTERN}" ; do
unzip -o "$currentfile" -d "${TODIR}";
done
# error handling
# some more stuff
return $?
}
For this function I use arguments with wildcards for the FILEPATTERN variable. The function gets called by my program like this:
sh -c ". ~/dwh_env.sh && dwhUnzipFiles -P ${DWH_FILEPATTERN_MJF_WLTO}.xml.zip -F ${DWH_DIR_SRC_XML_CURR} -T ${DWH_DIR_SRC_XML_CURR}/workDir" where ${DWH_FILEPATTERN_MJF_WLTO} contains wildcards.
This works as intended. My confusion starts with another helper function, which is constructed in a similar way, but there I'm not able to control the wildcard expansion correctly. It simply deletes files in a directory matching a given file pattern.
function dwhDeleteFiles() {
declare retFlag=0
declare OPTIND=1
while getopts "D:P:" opt; do
case "$opt" in
D) declare DIRECTORY="$OPTARG" ;;
P) declare FILEPATTERN="$OPTARG" ;;
*) echo "Unbekannte Option | Usage: dwhDeleteFiles -D <Directory> -P <Filepattern>"
esac
done
shift $((OPTIND-1))
for currentfile in "${DIRECTORY}"/"${FILEPATTERN}" ; do
rm -fv "${currentfile}";
done
# error handling
# some more stuff
return $retFlag
}
This function is called like this:
sh -c ". ~/dwh_env.sh && dwhDeleteFiles -P ${DWH_FILEPATTERN_MJF_WLTO}.xml -D ${DWH_DIR_SRC_XML_CURR}/workDir" where again ${DWH_FILEPATTERN_MJF_WLTO} contains wildcards. When I call this function with my program it results in doing nothing. I tried to play around with adding "" and \"\" to the arguments of my functions, but all what is happening is that instead of deleting all files in the given directory the function deletes only the first one in an alphanumerical order.
Can somebody explain to me, what is happening here? My idea is that the multiple passing of the variable, containing the wildcard, is not working. But how do I fix this and is it even possible in bash? And why is the dwhUnzipFilesfunction working and the dwhDeleteFiles is not?
Suppose that DWH_FILEPATTERN_MJF_WLTO is * and you have a bunch of .xml files; then the command is
dwhDeleteFiles -P *.xml -D ${DWH_DIR_SRC_XML_CURR}/workDir
which expands to
dwhDeleteFiles -P bar.xml baz.xml foo.xml zap.xml -D ${DWH_DIR_SRC_XML_CURR}/workDir
(Note the alphabetical order of the .xml files.) But the -P option takes only one argument, bar.xml (the first one); the remaining words are treated as file arguments, so getopts stops there and -D is never processed.
Try adding set -x to your script to see this in action.
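A quick way to see this without touching the real function is to let echo stand in for it; run in a directory containing those hypothetical .xml files, the output should look roughly like this:
sh -xc 'echo -P *.xml -D workDir'
+ echo -P bar.xml baz.xml foo.xml zap.xml -D workDir
-P bar.xml baz.xml foo.xml zap.xml -D workDir
The + trace line shows that the glob has already been expanded into separate words by the time the command runs, which is exactly what getopts then sees.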

Changing the value of an argument and using options in bash

I'm writing a bash script to automate some work and, along the way, practice with bash. The script automates some git cloning and updating. It will have three options (-g, -f, -h). When someone passes -h it should display the help message (which I have written but omitted below). When someone passes -g, it should accept at least one argument and optionally a second one: the first is the URL of the repository to clone, the second is the directory to clone it into. On the other hand, if someone passes -f, the script should take just one argument, a file name. I then want to read the file line by line and do the git cloning and updating for each git URL inside it.
If I run the script I get the following error message for each option; and if I call it without any option, or with a valid option followed by some other argument, it still doesn't do anything, it just returns:
./gitupdate.sh: option requires an argument -- g
I guess it somehow doesn't use $2 and $3 in my code. But then again, when I pass -h, all it has to do is call the help function and display the message; it doesn't need any other argument.
I guess the problem is that I have something wrong at the bottom, where I use getopts to get the option name specified by the user. In my code I assume that the option is the first argument, $1; that the second one, $2, is the URL or filename, depending on the option specified; and that $3 is the optional one that works only with the -g option.
Below you can find my code:
#!/bin/bash
declare default_directory=$HOME
declare action
declare src
function clone_repo() {
if [ -n "$2" ]; then
if [ "$(ls -A "$2" 2>/dev/null)" ]; then
cd "$2"
else
git clone "$1" "$2"
fi
else
git clone "$1"
#TODO: Get the directory name created by cloning
# and cd to it.
fi
git remote add upstream "$1"
git fetch upstream
}
function read_repos_from_file() {
if [ -f "$1" ]; then
while read -r line; do
clone_repo "$line" "$2"
done < "$1"
else
echo -e "Error: The specified file could not be found."
exit 1
fi
}
while getopts "f:h:r" option
do
case "${option}" in
f) action=read_repos_from_file; src="$OPTARG";;
g) action=clone_repo; src="$OPTARG";;
h) help ; exit 1 ;;
esac
done
shift $((OPTIND-1))
[ -z "$action" ] && ( help; exit 1 )
If someone could help me, I would be glad.
EDIT: Corrected some typos in the above code and updated the error message. The script still doesn't work correctly. I guess I need to change something in my code to make it actually do something and pick up the $2 and $3 arguments. It doesn't even display the help message when the -h option is passed, and all I do there is call the help function that I created earlier. Maybe I need to change my getopts part somehow.
EDIT 2: Made the advised changes, and changed the above code.
git() is the beginning of a function definition (the keyword function is optional when the function name is followed by parentheses). If you want to call a function git() you need to define it first, and call it without the parentheses:
function git() {
# do stuff
}
git
It's not good practice to create functions with the same name as existing binaries, though. In your case you should probably just call git clone with the line read from the file:
while read -r line; do
git clone "$line"
done < "${file}"
Edit: Updated, since the question changed significantly.
Your argument processing is … weird, to be frank. When you're using an option parser, you shouldn't work around the way that option parser works. "g:" means an option -g with exactly one argument. Don't try to make it an option with more than one argument, one of them being optional on top of it. If you need an additional (optional) argument for an output directory, make that either another option (e.g. "d:") or a non-option argument.
I'd suggest to change your option-processing to something like this:
while getopts "f:g:h" option; do
case "$option" in
f) action=file; src="$OPTARG";;
g) action=repo; src="$OPTARG";;
h) help; exit 1;;
esac
done
shift $((OPTIND-1))
[ -z "$action" ] && ( help; exit 1 )
After this "$#" holds only non-option arguments (in this case the optional output directory), so you could call your functions like this:
$action $src "$#"
with the functions simplified to something like this:
function repo() {
if [ -n "$2" ]; then
if [ "$(ls -A "$2" 2>/dev/null)" ]; then
cd "$2"
else
git clone "$1" "$2"
fi
else
git clone "$1"
fi
...
}
function file() {
if [ -f "$1" ]; then
while read -r line; do
repo "$line" "$2"
done < "$1"
else
echo "Error: The specified file could not be found."
exit 1
fi
}
On a more general note, you should make your names more self-explanatory. Better names for functions for cloning a repository or reading repositories from a file would be something like clone_repo and read_repos_from_file respectively. Also, -c or -r would be better mnemonics for an option that triggers the cloning of a repository.
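With that option handling in place, hypothetical invocations might look like this (the script name is from the question; the URL, file, and directory names are made up for illustration):
./gitupdate.sh -g https://example.com/some/repo.git ~/work/repo
./gitupdate.sh -f repos.txt
./gitupdate.sh -h
In the first case the directory ends up in "$@" after the shift and is passed to the repo function as its second argument; in the second case each line of repos.txt is handed to repo in turn.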

Sourcing a script file in bash before starting an executable

I'm trying to write a bash script that "wraps" whatever the user wants to invoke (and its parameters) by sourcing a fixed file just before actually invoking it.
To clarify: I have a "ConfigureMyEnvironment.bash" script that must be sourced before starting certain executables, so I'd like to have a "LaunchInMyEnvironment.bash" script that you can use as in:
LaunchInMyEnvironment <whatever_executable_i_want_to_wrap> arg0 arg1 arg2
I tried the following LaunchInMyEnvironment.bash:
#!/usr/bin/bash
launchee="$@"
if [ -e ConfigureMyEnvironment.bash ];
then source ConfigureMyEnvironment.bash;
fi
exec "$launchee"
where I have to use the "launchee" variable to save the $@ var, because after executing source, $@ becomes empty.
Anyway, this doesn't work and fails as follows:
myhost $ LaunchInMyEnvironment my_executable -h
myhost $ /home/me/LaunchInMyEnvironment.bash: line 7: /home/bin/my_executable -h: No such file or directory
myhost $ /home/me/LaunchInMyEnvironment.bash: line 7: exec: /home/bin/my_executable -h: cannot execute: No such file or directory
That is, it seems like the "-h" parameter is being seen as part of the executable filename and not as a parameter... But it doesn't really make sense to me.
I also tried to use $* instead of $@, but with no better outcome.
What am I doing wrong?
Andrea.
Have you tried removing the double quotes in the exec command?
Try this:
#!/usr/bin/bash
typeset -a launchee
launchee=("$@")
if [ -e ConfigureMyEnvironment.bash ];
then source ConfigureMyEnvironment.bash;
fi
exec "${launchee[#]}"
That uses an array for storing the arguments, so it will handle even calls like "space delimited string" and "string with ; inside".
Update: a simple example:
test_array() { abc=("$@"); for x in "${abc[@]}"; do echo ">>$x<<"; done; }
test_array "abc def" ghi
should give
>>abc def<<
>>ghi<<
You might want to try this (untested):
#!/usr/bin/bash
launchee="$1"
shift
if [ -e ConfigureMyEnvironment.bash ];
then source ConfigureMyEnvironment.bash;
fi
exec "$launchee" $#
The syntax for exec is exec command [arguments]; however, because you've quoted $launchee, it is treated as a single argument, i.e. the command, rather than a command and its arguments. Another variation may be to simply do exec $@.
Just execute it normally without exec
#!/usr/bin/bash
launchee="$@"
if [ -e ConfigureMyEnvironment.bash ];
then source ConfigureMyEnvironment.bash;
fi
$launchee
Try dividing your list of arguments:
ALL_ARG="${@}"
Executable="${1}"
Rest_of_Args=${ALL_ARG##$Executable}
And try then:
$Executable $Rest_of_Args
(or exec $Executable $Rest_of_Args)

How do I pass in optional flags and parameters to bash script?

I have a bash script that I pass parameters into (and access via $1). These parameters form a single command that must be processed (e.g. git pull, checkout dev, etc.).
I run my script like ./script_name git pull
Now, I want to add an optional flag to my script to do some other functionality. So if I call my script like ./script_name -t git pull it will have a different functionality from ./script_name git pull.
How do I access this new flag along with the parameters passed in. I've tried using getopts, but can't seem to make it work with the other non-flag parameters that are passed into the script.
Using getopts is indeed the way to go:
has_t_option=false
while getopts :ht opt; do
case $opt in
h) show_some_help; exit ;;
t) has_t_option=true ;;
:) echo "Missing argument for option -$OPTARG"; exit 1;;
\?) echo "Unknown option -$OPTARG"; exit 1;;
esac
done
# here's the key part: remove the parsed options from the positional params
shift $(( OPTIND - 1 ))
# now, $1=="git", $2=="pull"
if $has_t_option; then
do_something
else
do_something_else
fi
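For example, with the script name from the question, both of the following end up with $1 set to git and $2 set to pull after the shift; only the flag variable differs:
./script_name git pull
./script_name -t git pull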
