I want to implement a bash function that runs its arguments as a command, while (optionally) printing the command first. Think of an installation script or a test-runner script.
Just using
function run () {
echo "Running $#"
"$#"
}
would not allow me to distinguish a call of run foo arg1 arg2 from run foo "arg1 arg2", so I need to properly escape the arguments.
My best shot so far is
function run () {
echo -n "Running"
printf " %q" "$#"
echo
"$#"
}
Which works:
$ run echo "one_argument" "second argument" argument\"with\'quotes
Running echo one_argument second\ argument argument\"with\'quotes
one_argument second argument argument"with'quotes
but is not very elegant. How can I achieve an output of
$ run echo "one_argument" "second argument" argument\"with\'quotes
Running echo one_argument "second argument" "argument\"with'quotes"
one_argument second argument argument"with'quotes
i.e. how can I make printf put quotation marks around arguments that need quoting, and properly escape quotes therein, so that the output can be copied and pasted correctly?
This will quote everything:
run() {
printf "Running:"
for arg; do
printf ' "%s"' "${arg//\"/\\\"}"
done
echo
"$#"
}
run echo "one_argument" "second argument" argument\"with\'quotes
Running: "echo" "one_argument" "second argument" "argument\"with'quotes"
one_argument second argument argument"with'quotes
This version only quotes arguments containing double quotes or whitespace:
run() {
local fmt arg
printf "Running:"
for arg; do
[[ $arg == *[\"[:space:]]* ]] && fmt=' "%s"' || fmt=" %s"
printf "$fmt" "${arg//\"/\\\"}"
done
echo
"$#"
}
run echo "one_argument" "second argument" argument\"with\'quotes
Running: echo one_argument "second argument" "argument\"with'quotes"
one_argument second argument argument"with'quotes
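If you can rely on bash 4.4 or newer, the ${parameter@Q} expansion gives you reusable quoting without any manual escaping; a minimal sketch of that variant (not part of the original answers):
run() {
    local display=""
    for arg; do
        display+=" ${arg@Q}"    # ${arg@Q} expands to a shell-quoted form of $arg (bash >= 4.4)
    done
    echo "Running:$display"
    "$@"
}
The quoting style differs from the requested output (single quotes or $'...' rather than double quotes), but the result can always be pasted back into a shell.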
I don't think there's an elegant solution to what you want, because "$@" is handled by bash before any command ever gets to see it. You'll have to manually reconstruct the command line:
#!/bin/bash
function run() {
echo -n "Running:"
for arg in "$#"; do
arg="$(sed 's/"/\\&/g' <<<$arg)"
[[ $arg =~ [[:space:]\\\'] ]] && arg=\"arg\"
echo -n " $arg"
done
echo ""
"$#"
}
run "$#"
Output:
$ ./test.sh echo arg1 "arg 2" "arg3\"with'other\'\nstuff"
Running: echo arg1 "arg 2" "arg3\"with'other\'\nstuff"
arg1 arg 2 arg3"with'other\'\nstuff
Note that there are some corner cases where you won't get the exact input command line. This happens when you pass arguments that bash expands before passing them on, e.g.:
$ ./test.sh echo foo'bar'baz
Running: echo foobarbaz
foobarbaz
$ ./test.sh echo "foo\\bar"
Running: echo "foo\bar"
foobar
In a script that takes some arguments (arg) and options (-a), I would like to give the user the possibility to place the options wherever they want on the command line.
Here is my code:
while getopts "a" opt; do
echo "$opt"
done
shift $((OPTIND-1))
echo "all_end : $*"
With this order, I get the expected behaviour:
./test.sh -a arg
a
all_end : arg
I would like to get the same result with this order:
./test.sh arg -a
all_end : arg -a
The getopt command (part of the util-linux package and different from getopts) will do what you want. The bash FAQ has some opinions about using it, but honestly, these days most systems ship the modern version of getopt.
Consider the following example:
#!/bin/sh
options=$(getopt -o o: --long option: -- "$@")
eval set -- "$options"
while :; do
case "$1" in
-o|--option)
shift
OPTION=$1
;;
--)
shift
break
;;
esac
shift
done
echo "got option: $OPTION"
echo "remaining args are: $#"
We can call this like this:
$ ./options.sh -o foo arg1 arg2
got option: foo
remaining args are: arg1 arg2
Or like this:
$ ./options.sh arg1 arg2 -o foo
got option: foo
remaining args are: arg1 arg2
You can still go with argument parsing by looking at each of them:
#!/bin/bash
for i in "$#"
do
case $i in
-a)
ARG1="set"
shift
;;
*)
# the rest (not -a)
ARGS="${ARGS} $i"
;;
esac
done
if [ -z "$ARG1" ]; then
echo "You haven't passed -a"
else
echo "You have passed -a"
fi
echo "You have also passed: ${ARGS}"
and then you will get:
> ./script.sh arg -a
You have passed -a
You have also passed: arg
> ./script.sh -a arg
You have passed -a
You have also passed: arg
> ./script.sh -a
You have passed -a
You have also passed:
> ./script.sh arg
You haven't passed -a
You have also passed: arg
Consider this approach:
#!/bin/bash
opt(){
case $1 in
-o|--option) option="$2";;
-h|--help ) echo "$help"; exit 0;;
*) echo "unknown option: $1"; exit 1;;
esac
}
while (( $# )); do
case $1 in
arg1) var1=$1 ;;
arg2) var2=$1 ;;
-*) opt "$@"; shift;;
*) echo "unknown option: $1"; exit 1;;
esac
shift
done
echo args: $var1 $var2
echo opts: $option
In the many years of bash programming, I've found it useful to get rid of the bone in my head that says functions should look like f(x,y), especially since bash requires clumsy/inefficient code to handle command line arguments.
Option arguments often have default values, and can more readily be passed as environment variables whose scope is limited to the called script; save positional arguments for what must be provided.
Applying this to your example, the script would look like:
OPTION=${OPTION:="fee"}
echo "option: $OPTION"
echo "remaining args are: $*"
And would be called with:
OPTION="foo" ./options.sh arg1 arg2
I'm working on a bash script that greps a file and then runs a perl command. I know the commands work, since I have been using them, but I can't seem to get them to work from the bash script. I would appreciate any help.
When I output $1 and so on, they hold the right values, but when I run the grep command with them, I get "file can't be found" errors or blank output.
#! /bin/bash
usage()
{
echo "Usage: $0"
echo $'\a'Supply 4 arguments
echo $0 arg1 arg2 arg3 arg4
echo "1. search parameters for bookmarks"
echo "2. What file to grep"
echo "3. file to temporaly store results"
echo "4. what file to store the final version"
exit 1
}
main()
{
# if [ "$#" -lt "4" ]
# then
# usage
# else
echo "beginning grep"
grep -ir "$1" "$2" >> "$3"
echo "grep complete"
# echo "beginning perl command"
# perl -0ne 'print "$2\n$1\n" while (/a href=\"(.*?)\">(.*?)<\/a>/igs)' "$3" >> "$4"
# echo "perl command complete"
# echo "done"
# exit 1
# fi
}
main
echo $0
echo $1
echo $2
echo $3
echo $4
Remember that when a bash function is called, the positional parameters are temporarily replaced with the function's own arguments. So either don't make your mainline a function, or pass your main function the script's parameters. To pass the script's parameters to your main function, do this:
main "$#"
I want to check if a command line option exists when running a shell script, example
./test.sh arg1 arg2 arg3 arg4
Want to check if one of the argument is say arg3 (not necessarily the third argument)
The quick solution is to use a for loop and check whether one of the arguments matches a given string, but is there a better way to do it, something of the form 'arg3' in "$@"?
(assuming bash)
I would do this:
have_arg3=false
for arg; do
if [[ $arg == "arg3" ]]; then
have_arg3=true
break
fi
done
if $have_arg3; then
echo "arg3 is present
fi
but you could do this (all quotes and spaces below are required!):
if [[ " $* " == *" arg3 "* ]]; then
echo "arg3 is present"
fi
This can be encapsulated in a function:
$ arg_in_args () (
arg=$1
shift
IFS=:
[[ "$IFS$*$IFS" == *"$IFS$arg$IFS"* ]]
)
$ set -- foo bar baz
$ if arg_in_args "arg3" "$#"; then echo Y; else echo N; fi
N
$ if arg_in_args "baz" "$#"; then echo Y; else echo N; fi
Y
I'm trying to compare strings, but I get a "command not found" error. How do I compare the strings?
Code:
#!/bin/bash
STR="Hello World"
if [$STR="Hello World"]; then
echo "passed test"
else
echo "didn't pass test"
fi
Output:
test.sh: line 4: [Hello: command not found
didn't pass test
You should add spaces. Treat [[ or [ as if it were another command, like test and the other builtins; like any other command, it requires a space after its name. It's also recommended that you use [[ ]] over [ ] in Bash, since [[ ]] doesn't word-split its variables or perform pathname expansion. It also has more features than [ ].
#!/bin/bash
STR="Hello World"
if [[ $STR = "Hello World" ]]; then
echo "passed test"
else
echo "didn't pass test"
fi
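To see why the spaces and the double brackets matter, here is a small demonstration of the word-splitting difference (the failing [ ] line is left commented out because it aborts with an error):
#!/bin/bash
STR="Hello World"
# With [ ], the unquoted $STR splits into two words, so the test becomes
# [ Hello World = "Hello World" ] and bash complains: "[: too many arguments".
# [ $STR = "Hello World" ] && echo "never reached"
# With [[ ]], no word splitting or pathname expansion happens, so the unquoted variable is safe:
if [[ $STR = "Hello World" ]]; then
    echo "matched without quoting \$STR"
fi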
Is it possible to implement sub-commands for bash scripts? I have something like this in mind:
http://docs.python.org/dev/library/argparse.html#sub-commands
Here's a simple but unsafe technique:
#!/bin/bash
clean() {
echo rm -fR .
echo Thanks to koola, I let you off this time,
echo but you really shouldn\'t run random code you download from the net.
}
help() {
echo Whatever you do, don\'t use clean
}
args() {
printf "%s" options:
while getopts a:b:c:d:e:f:g:h:i:j:k:l:m:n:o:p:q:r:s:t:u:v:w:x:y:z: OPTION "$@"; do
printf " -%s '%s'" "$OPTION" "$OPTARG"
done
shift $((OPTIND - 1))
printf " arg: '%s'" "$@"
echo
}
"$#"
That's all very cool, but it doesn't limit what a subcommand could be. So you might want to replace the last line with:
if [[ $1 =~ ^(clean|help|args)$ ]]; then
"$#"
else
echo "Invalid subcommand $1" >&2
exit 1
fi
Some systems let you put "global" options before the subcommand. You can put a getopts loop before the subcommand dispatch if you want to. Remember to shift before you fall through to the subcommand execution; also, reset OPTIND to 1 so that the subcommand's own getopts doesn't get confused.
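A minimal sketch of that pattern, reusing the clean/help/args functions from the script above (the -v flag and the variable names here are made up for illustration):
#!/bin/bash
verbose=0
# parse "global" options that appear before the subcommand
while getopts v OPTION; do
    case $OPTION in
        v) verbose=1 ;;
        *) echo "usage: $0 [-v] subcommand [args...]" >&2; exit 1 ;;
    esac
done
shift $((OPTIND - 1))   # drop the global options we just consumed
OPTIND=1                # reset so a getopts loop inside a subcommand starts fresh
subcommand=$1
shift
(( verbose )) && echo "dispatching to: $subcommand" >&2
case $subcommand in
    clean|help|args) "$subcommand" "$@" ;;
    *) echo "Invalid subcommand $subcommand" >&2; exit 1 ;;
esac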