I wanted to use Getopt::Long::GetOptions for getting command line options to the script.
I have a requirement like this:
perl script.pl -c <name1> -c <name2> -m <name3> argument
Here the option flags -c and -m are optional, and argument is mandatory.
Can anyone point out the correct usage for GetOptions?
From the Getopt::Long documentation:
GetOptions does not return a false result when an option is not supplied
That's why they're called 'options'.
In other words, if you are expecting a mandatory parameter, you need to explicitly check for it outside of the GetOptions call.
If argument is meant to be part of @ARGV and not the options, use -- to signal the end of options. In the example below, the script would access argument via $ARGV[0]:
perl script.pl -c <name1> -c <name2> -m <name3> -- argument
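Putting those two points together, here's a minimal sketch (variable names are mine, not from your script) that collects repeated -c values and checks for the mandatory trailing argument by hand:
use strict;
use warnings;
use Getopt::Long;

my @c_names;
my $m_name;
GetOptions(
    'c=s' => \@c_names,   # passing an array reference makes -c repeatable
    'm=s' => \$m_name,
) or die "Bad command line options\n";

# GetOptions removes the options it recognised from @ARGV, so whatever is
# left over is the positional argument -- check for it yourself.
die "Usage: $0 -c <name> [-c <name>] -m <name> argument\n" unless @ARGV == 1;
my $argument = shift @ARGV;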
Here's some sample code and the result:
https://gist.github.com/kyanny/5634832
If you want to know more about how to handle options with multiple values, see the documentation: http://perldoc.perl.org/Getopt/Long.html#Options-with-multiple-values
One more thing: Getopt::Long::GetOptions does not provide a way to handle mandatory options. You should check for the mandatory arguments in @ARGV yourself and raise an error if they are missing.
Related
I have a list/array like so:
['path/to/folder/a', 'path/to/folder/b']
This is an example, the array can be of any length. But for each item in the array I'd like to set up the following as a single command:
$ someTool <command> --flag <item-1> --flag <item-2> ... --flag <item-N>
At the moment I'm looping over the array, but I'm wondering whether running the tool once per item behaves differently from passing them all at once (which the tool says I should do).
for i in "${array[@]}"; do
someTool command --flag $i
done
Whether passing all flag arguments to a single invocation of the tool does the same thing as passing them one-at-a-time to separate invocations depends entirely on the tool and what it does. Without more information, it's impossible to say for sure, but if the instructions recommend passing them all at once, I'd go with that.
The simplest way to do this in bash is generally to create a second array with the flags and arguments as they need to be passed to the tool:
flagsArray=()
for i in "${array[@]}"; do
flagsArray+=(--flag "$i")
done
someTool command "${flagsArray[@]}"
Note: all of the above syntax -- the quotes, braces, brackets, parentheses, etc. -- matters to making this run properly and robustly. Don't leave anything out unless you know why it's there and that leaving it out won't cause trouble.
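If you want to confirm exactly what the tool will receive, you can temporarily substitute printf for someTool as a quick sanity check (this is just for inspection, not part of the solution):
array=('path/to/folder/a' 'path/to/folder/b')
flagsArray=()
for i in "${array[@]}"; do
  flagsArray+=(--flag "$i")
done
printf '%s\n' command "${flagsArray[@]}"   # prints each argument on its own line:
# command
# --flag
# path/to/folder/a
# --flag
# path/to/folder/b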
BTW, if the option (--flag) doesn't have to be passed as a separate argument (i.e. if the tool allows --flag=path/to/folder/a instead of --flag path/to/folder/a), then you can use a substitution to add the --flag= bit to each element of the array in a single step:
someTool command "${array[@]/#/--flag=}"
Explanation: the /# means "replace at the beginning (of each element)", then the empty string for the thing to replace, / to delimit that from the replacement string, and --flag= as the replacement (/addition) string.
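The same printf trick shows what that substitution expands to with the example array from the question:
array=('path/to/folder/a' 'path/to/folder/b')
printf '%s\n' "${array[@]/#/--flag=}"
# --flag=path/to/folder/a
# --flag=path/to/folder/b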
I'm writing a bash script to call functions of the Veeam Backup CLI.
In this script I have a function to configure a new backup job.
function configureBackupJob() {
jobname="$1"
reponame="$2"
objects="$3"
advancedOptions="$4"
scheduleOptions="$5"
activeFullBackupOptions="$6"
indexingOptions="$7"
command="veeamconfig job create filelevel --name ${jobname} --reponame ${reponame} --includedirs ${objects} ${advancedOptions} ${scheduleOptions} ${activeFullBackupOptions} ${indexingOptions} --nosnap"
echo "${command}"
veeamconfig job create filelevel --name "$jobname" --reponame "$reponame" --includedirs "$objects" "$advancedOptions" "$scheduleOptions" "$activeFullBackupOptions" "$indexingOptions" --nosnap
}
When calling the script I use a case to determine which function shall be called:
case $command in
# More cases before and after this one
configureBackupJob)
configureBackupJob "$2" "$3" "$4" "$5" "$6" "$7" "$8"
;;
*)
showHelp
;;
esac
I call the script like this:
sudo ./script.sh configureBackupJob "TheJobsName" "RepositoryName" "/path/FoldertoBeBackedUpByVeeam" "--daily --at 12:15" "--weekdays-full Monday,Wednesday" "--indexall"
I used this page from the Veeam Help Center to look up the arguments: Veeam Help Center: Creating File-Level Backup Job
Calling the script results in an error message
Unknown argument: [--daily --at 12:15].
If I run the command that my echo prints manually, veeamconfig works fine.
Why can I call the command directly but not from within the script? I tried calling the function without the double quotation marks but that doesn't work.
I can't hardcode all of the arguments the way --includedirs is hardcoded, so I need a way to pass argument groups like --daily --at 12:15.
The basic problem is that you're passing "--daily --at 12:15" to the veeamconfig command as a single argument, rather than three separate arguments ("--daily", "--at", and "12:15"). This confuses veeamconfig. It looks OK when you echo it because echoing loses the distinction between spaces between arguments and spaces within arguments.
The best way to handle this depends on a couple of things: First, does your script care which options have to do with scheduling vs indexing vs full backups vs whatever, or is it ok if there's just a list of options to be given to the veeamconfig command? Second, is it possible that the paths/options/whatever might contain spaces (or filename wildcards) in them, as well as between them? (Note: I mostly use macOS, where paths with spaces are very common.)
If the script doesn't have to differentiate between the different types of options, it's pretty simple: just pass all of the options as separate arguments all the way through, and where necessary store them as an array rather than as separate strings (and use bash printf's %q option to properly quote/escape the command options for printing):
function configureBackupJob() {
jobname="$1"
reponame="$2"
object="$3"
allOptions=("${@:4}") # This stores all arguments starting with $4 in an array
printf -v command '%q ' veeamconfig job create filelevel --name "$jobname" --reponame "$reponame" --includedirs "$object" "${allOptions[@]}" --nosnap
echo "${command}"
veeamconfig job create filelevel --name "$jobname" --reponame "$reponame" --includedirs "$object" "${allOptions[@]}" --nosnap
}
And call it like this:
case $command in
# More cases before and after this one
configureBackupJob)
configureBackupJob "${@:2}"
...
And run the overall script like this:
sudo ./script.sh configureBackupJob "TheJobsName" "RepositoryName" "/path/FoldertoBeBackedUpByVeeam" --daily --at 12:15 --weekdays-full Monday,Wednesday --indexall
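A side note on the printf '%q ' line above: %q re-escapes each argument so that the echoed command can be pasted back into a shell and keep the same argument boundaries. For example (made-up values):
printf '%q ' --at '12:15 daily'; echo
# prints: --at 12:15\ daily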
If the script needs to tell the different types of option apart, things get messier. If there's no possibility of spaces or wildcard-like characters in the options, you could leave those variables unquoted when you pass them to the veeamconfig command, and let the shell's word splitting break them up into individual arguments:
veeamconfig job create filelevel --name "$jobname" --reponame "$reponame" --includedirs "$objects" $advancedOptions $scheduleOptions $activeFullBackupOptions $indexingOptions --nosnap
Note that if you go this route, you need to keep them safely double-quoted at all other points in the process, especially when passing them to the configureBackupJob function. If word-splitting happens too early, it'll just make a mess.
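Here's a minimal illustration of that word splitting (the variable name is just for the example; printf prints one line per argument it receives):
opts="--daily --at 12:15"
printf 'arg: %s\n' "$opts"   # quoted: one argument, printed as a single line
printf 'arg: %s\n' $opts     # unquoted: split into three arguments, three lines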
If you need to keep types of options separate and also allow spaces and/or funny characters in the options, it's even more difficult. You might be tempted to put quotes and/or escapes within the options to control this, but word splitting doesn't respect those, so it doesn't work. I think I'll just refer you to this question and hope this doesn't apply.
I have a text file called OPTIONS.txt storing all flags of Makefile:
arg1=foo arg2="-foo -bar"
I want to pass all flags in this file to make. However,
make `cat OPTIONS.txt`
fails with make: invalid option -- 'a'. It seems the shell interprets it as:
make arg1=foo arg2="-foo -bar"
     ^argv[1] ^argv[2]   ^argv[3]
Is there any way to have it interpreted as:
make arg1=foo arg2="-foo -bar"
     ^argv[1] ^--------argv[2]
Since you control the options file, store the options one per line:
arg1=foo
arg2="-foo -bar"
Then in the shell, you'll read the file into an array, one element per line:
readarray -t opts < OPTIONS.txt
Now you can invoke make and keep the options whole:
make "${opts[@]}"
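If you want to double-check what make will receive, printf can show the array one element per line (note that the quotes in the file stay part of the value):
readarray -t opts < OPTIONS.txt
printf '<%s>\n' "${opts[@]}"
# <arg1=foo>
# <arg2="-foo -bar">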
If you want the shell to interpret quotes after backtick expansion you need to use eval, like this:
eval make `cat OPTIONS.txt`
However, just be aware that this evaluates everything, so if you have quoted content outside of the backticks you'll get the same issue:
eval make `cat OPTIONS.txt` arg4="one two"
will give an error. You'd have to double-quote the arg4, something like this:
eval make `cat OPTIONS.txt` arg4='"one two"'
In general it's tricky to do stuff like this from the command line, outside of scripts.
ETA
The real problem here is that we don't have a set of requirements. Why do you want to put these into a file, and what kind of things are you adding; are they only makefile variable assignments, or are there other make options here as well such as -k or similar?
IF the OP controls (can change) the format of the file AND the file contains content only used by make AND the OP doesn't care about the variables being command line assignments vs. regular assignments AND there are only variable assignments and not other options, then they can just (a) put each variable assignment on its own line, (b) remove all quotes, and (c) use include OPTIONS.txt from inside the makefile to "import" them.
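Under those assumptions, a small sketch of what that could look like (the echo target is only there to show the variables are visible inside the makefile):
# OPTIONS.txt -- one plain assignment per line, no quotes
arg1=foo
arg2=-foo -bar

# Makefile (the recipe line must be indented with a tab)
include OPTIONS.txt

show:
	@echo "arg1 is $(arg1) and arg2 is $(arg2)"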
I have to pass a command with its arguments in a scheduled task, while separating the arguments from the command. I used:
split(/(?=\s-)/)
to do this, but it doesn't work when the arguments are not passed in -arg format.
Example of commands can be passed in format:
"ping http://www.google.com" here url is argument
"abc-abc -V"
"abc-abc -L c:\\folder name\\test.log"
'"C:\\Program Files\\example\\program.exe" -arg1 -arg2'
"C:\\Program Files\\example\\program.exe"
To make this clearer: these commands are not passed as command-line arguments that would show up in ARGV. The command gets set in a command property, which accepts its input as a string:
command '"C:\\Program Files\\example\\program.exe" -arg1 -arg2'
Use Shellwords.split, from the standard library:
Shellwords.split("ping http:\\www.google.com here url is argument")
#=> ["ping", "http:www.google.com", "here", "url", "is", "argument"]
Shellwords.split("abc-abc -V")
#=> ["abc-abc", "-V"]
Shellwords.split("abc-abc -L c:\\folder name\\test.log")
#=> ["abc-abc", "-L", "c:folder", "nametest.log"]
Shellwords.split('"C:\\Program Files\\example\\program.exe" -arg1 -arg2')
#=> ["C:\\Program Files\\example\\program.exe", "-arg1", "-arg2"]
Shellwords.split('"C:\\Program Files\\example\\program.exe"')
#=> ["C:\\Program Files\\example\\program.exe"]
No need to reinvent the wheel with a custom regex/splitter, or an external system call.
It seems to me that if there's no consistent pattern to your command syntax, then any regex based approach will inevitably fail. It seems better instead to solve this problem the way a human would, i.e. with some knowledge of context.
In a *nix terminal, you can use the compgen command to list available commands. This Ruby script invokes that command and prints the first few commands from that list:
list = `cd ~ && compgen -c`        # ask the shell for the list of available commands
list_arr = list.split("\n")        # one command name per array element
list_arr[0,6].each { |x| puts x }  # print the first few entries
(The cd in the first line seems to be needed because of the context in which my Ruby is running with rvm.) For Windows, you may find this thread a useful starting point.
I'd match against the elements of this list to identify my commands, and take it from there.
Tom Lord's answer is far better than this one.
You probably want to look at OptionParser or GetoptLong if you need to parse command-line arguments provided to a Ruby program.
If you are interested in parsing some strings that may or may not be commands with arguments, here's a quick-and-dirty:
I'd use scan instead of split with the following regex: /(".*"|[\w\:\:\.\-\\]+)/.
Best results come from: 'some string'.scan(/(".*"|[\w\:\:\.\-\\]+)/).flatten:
["ping", "http:\\www.google.com"]
["abc-abc", "-V"]
["abc-abc", "-L", "c:\\folder\\", "name\\test.log"]
# Technically, this is wrong, but so is the non-escaped whitespace.
["\"C:\\Program Files\\example\\program.exe\"", "-arg1", "-arg2"]
["\"C:\\Program Files\\example\\program.exe\""]
As an example, double dash or two hyphens -- is used like so:
npm test -- --coverage
Running npm without the double dash does not run in coverage mode, so it seems the -- passes the subsequent flags through; is this correct? I couldn't find documentation on this.
-- as an argument on its own is standardized across all UNIX commands: It means that further arguments should be treated as positional arguments, not options. See Guideline 10 in POSIX Utility Syntax Conventions.
To give you a non-NPM-based example, ls -- -l will look for a file named -l, because the -- specified that all subsequent arguments are positional.
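If you want to see that behaviour for yourself, a quick shell experiment (run in a scratch directory so you don't touch real files):
cd "$(mktemp -d)"   # throwaway directory
touch -- -l         # creates a file literally named "-l"
ls -- -l            # lists the file named "-l" instead of switching on long format
rm -- -l            # removes it; without --, rm would reject -l as an unknown option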
In this context, it means that --coverage isn't an option to npm itself; presumably, then, it's subsequently read by the test subcommand. For a tool that followed the conventions strictly this wouldn't be necessary, because Guideline 9 specifies that all options shall be given before any arguments (so --coverage, coming after the argument test, would already be treated as an argument); however, since npm only partially follows the guidelines, this is a foreseeable result.
(Long --option-style options are actually a GNU extension as a whole, so what we have here is a mishmash of multiple parsing styles; such is life, unfortunately.)
I've done some further digging; according to the docs for my node version -
"--" Indicates the end of node options. Pass the rest of the arguments to the script. If no script filename or eval/print script is supplied prior to this, then the next argument will be used as a script filename.
But, a simple script that contains -
console.log(`process.execArgv:${process.execArgv}`);
console.log(`process.argv:${process.argv}`);
behaves as -
>node --prof argv.js --myArg
process.execArgv:--prof
process.argv:C:\Program Files\nodejs\node.exe,C:\Dev\Web\QA_Web_POC\argv.js,--myArg
>node --prof argv.js -- --myArg
process.execArgv:--prof
process.argv:C:\Program Files\nodejs\node.exe,C:\Dev\Web\QA_Web_POC\argv.js,--,--myArg
>node argv.js --prof -- --myArg
process.execArgv:
process.argv:C:\Program Files\nodejs\node.exe,C:\Dev\Web\QA_Web_POC\argv.js,--prof,--,--myArg
>node argv.js -- --prof --myArg
process.execArgv:
process.argv:C:\Program Files\nodejs\node.exe,C:\Dev\Web\QA_Web_POC\argv.js,--,--prof,--myArg
So, it seems there's a bug?