Specify command line arguments as name=value pairs for a shell script

Is it possible to pass command line arguments to shell script as name value pairs, something like
myscript action=build module=core
and then in my script, get the variable like
$action and process it?
I know that $1, $2, and so on can be used to get the positional arguments, but those aren't name=value pairs. Even if I treat them that way, the developer using the script would have to pass the arguments in a fixed order, which I want to avoid.

This worked for me:
for ARGUMENT in "$@"
do
KEY=$(echo "$ARGUMENT" | cut -f1 -d=)
KEY_LENGTH=${#KEY}
VALUE="${ARGUMENT:$KEY_LENGTH+1}"
export "$KEY"="$VALUE"
done
# from this line, you could use your variables as you need
cd "$FOLDER"
mkdir "$REPOSITORY_NAME"
Usage
bash my_scripts.sh FOLDER="/tmp/foo" REPOSITORY_NAME="stackexchange"
FOLDER and REPOSITORY_NAME are ready to use in the script.
It does not matter what order the arguments are in.

In the Bourne shell, there is a seldom-used option '-k' which automatically places any values specified as name=value on the command line into the environment. Of course, the Bourne/Korn/POSIX shell family (including bash) also does that for name=value items before the command name:
name1=value1 name2=value2 command name3=value3 -x name4=value4 abc
Under normal POSIX-shell behaviour, the command is invoked with name1 and name2 in the environment, and with four arguments. Under the Bourne (and Korn and bash, but not POSIX) shell -k option, it is invoked with name1, name2, name3, and name4 in the environment and just two arguments. The bash manual page (as in man bash) doesn't mention the equivalent of -k but it works like the Bourne and Korn shells do.
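For instance, here is a quick sketch from an interactive bash session (the sh -c command is just a stand-in for any program that reads the variable):
$ set -k                                      # enable keyword arguments in this shell
$ sh -c 'echo "action=$action"' action=build  # the pair follows the command name
action=build
$ set +k                                      # restore normal behaviour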
I don't think I've ever used it (the -k option) seriously.
There is no way to tell from within the script (command) that the environment variables were specified solely for this command; they are simply environment variables in the environment of that script.
This is the closest approach I know of to what you are asking for. I do not think anything equivalent exists for the C shell family. I don't know of any other argument parser that sets variables from name=value pairs on the command line.
With some fairly major caveats (it is relatively easy to do for simple values, but hard to deal with values containing shell meta-characters), you can do:
case $1 in
(*=*) eval $1;;
esac
This is not the C shell family. The eval effectively does the shell assignment.
arg=name1=value1
echo $name1    # prints a blank line; name1 is not set yet
eval $arg      # performs the assignment name1=value1
echo $name1    # prints value1

env action=build module=core myscript
You said you're using tcsh. For Bourne-based shells, you can drop the "env", though it's harmless to leave it there. Note that this applies to the shell from which you run the command, not to the shell used to implement myscript.
If you specifically want the name=value pairs to follow the command name, you'll need to do some work inside myscript.
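For example, a minimal sketch of such a myscript (assuming each key is a valid shell identifier; the echo line is just for illustration):
#!/bin/sh
# Sketch: turn name=value arguments into shell variables.
for arg in "$@"
do
case $arg in
(*=*) eval "${arg%%=*}=\${arg#*=}";;   # the value is expanded only once, so spaces survive
esac
done
echo "action=$action module=$module"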

It's quite an old question, but still valid.
I have not found a cookie-cutter solution, so I combined the above answers into the following; it works even with whitespace in an argument's value.
Save this as argparse.sh
#!/bin/bash
: ${1?
'Usage:
$0 --<key1>="<val1a> <val1b>" [ --<key2>="<val2a> <val2b>" | --<key3>="<val3>" ]'
}
declare -A args
while [[ "$#" > "0" ]]; do
case "$1" in
(*=*)
_key="${1%%=*}" && _key="${_key/--/}" && _val="${1#*=}"
args[${_key}]="${_val}"
(>&2 echo -e "key:val => ${_key}:${_val}")
;;
esac
shift
done
(>&2 echo -e "Total args: ${#args[#]}; Options: ${args[#]}")
## Additionally, you can check for a specific key
[[ -n "${args['path']+1}" ]] && (>&2 echo -e "key: 'path' exists") || (>&2 echo -e "key: 'path' does NOT exist");
#Example: Note, arguments to the script can have optional prefix --
./argparse.sh --x="blah"
./argparse.sh --x="blah" --yy="qwert bye"
./argparse.sh x="blah" yy="qwert bye"
Some interesting use cases for this script:
./argparse.sh --path="$(ls -1)"
./argparse.sh --path="$(ls -d -1 "$PWD"/**)"
The script above is also available as a gist; refer to argparse.sh.
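For reference, a run might print something like this (bash associative arrays are unordered, so the Options list may come out in a different order):
$ ./argparse.sh --x="blah" --yy="qwert bye"
key:val => x:blah
key:val => yy:qwert bye
Total args: 2; Options: blah qwert bye
key: 'path' does NOT exist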

Extending on Jonathan's answer, this worked nicely for me:
#!/bin/bash
if [ "$#" -eq "0" ]; then
echo "Error! Usage: Remind me how this works again ..."
exit 1
fi
while [[ "$#" > "0" ]]
do
case $1 in
(*=*) eval $1;;
esac
shift
done
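Assuming the script above is saved as myscript.sh (name chosen for illustration), calling it like this sets the variables inside the script:
./myscript.sh action=build module=core
# after the loop, $action is "build" and $module is "core" inside the script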

Related

What is the meaning of "${psql[@]}" in this script?

I came across a script that is supposed to set up postgis in a docker container, but it references this "${psql[@]}" command in several places:
#!/bin/sh
# Perform all actions as $POSTGRES_USER
export PGUSER="$POSTGRES_USER"
# Create the 'template_postgis' template db
"${psql[#]}" <<- 'EOSQL'
CREATE DATABASE template_postgis;
UPDATE pg_database SET datistemplate = TRUE WHERE datname = 'template_postgis';
EOSQL
I'm guessing it's supposed to use the psql command, but the command is always empty so it gives an error. Replacing it with psql makes the script run as expected. Is my guess correct?
Edit: In case it's important, the command is being run in a container based on postgres:11-alpine.
$psql is supposed to be an array containing the psql command and its arguments.
The script is apparently expected to be run from here, which does
psql=( psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" --no-password )
and later sources the script in this loop:
for f in /docker-entrypoint-initdb.d/*; do
case "$f" in
*.sh)
# https://github.com/docker-library/postgres/issues/450#issuecomment-393167936
# https://github.com/docker-library/postgres/pull/452
if [ -x "$f" ]; then
echo "$0: running $f"
"$f"
else
echo "$0: sourcing $f"
. "$f"
fi
;;
*.sql) echo "$0: running $f"; "${psql[@]}" -f "$f"; echo ;;
*.sql.gz) echo "$0: running $f"; gunzip -c "$f" | "${psql[@]}"; echo ;;
*) echo "$0: ignoring $f" ;;
esac
echo
done
See Setting an argument with bash for the reason to use an array rather than a string.
The #!/bin/sh and the [@] are incongruous. This is a bash-ism, where the psql variable is an array. This literal quote, dollar sign, psql, bracket, at, bracket, quote is expanded into "psql" "array" "values" "each" "listed" "and" "quoted" "separately." It's the safer way, e.g., to accumulate arguments to a command where any of them might have spaces in them.
psql=(/foo/psql arg arg arg) is the best way to define the array you need there.
It might look obscure, but it would work like so...
Let's say we have a bash array wc, which contains a command wc, and an argument -w, and we feed that a here document with some words:
wc=(wc -w)
"${wc[#]}" <<- words
one
two three
four
words
Since there are four words in the here document, the output is:
4
In the quoted code, there needs to be some prior point, (perhaps a calling script), that does something like:
psql=(psql -option1 -option2 arg1 arg2 ... )
As to why the programmer chose to invoke a command with an array, rather than just invoke the command, I can only guess... Maybe it's a crude sort of operator overloading to compensate for different *nix distros, (i.e. BSD vs. Linux), where the local variants of some necessary command might use different names or options, or even be different commands entirely. So one might check for BSD or Linux or a given version, and reset psql accordingly.
The answer from @Barmar is correct.
The script was intended to be "sourced" and not "executed".
I faced the same problem and came to the same answer after I read that it had been reported and fixed with chmod here:
https://github.com/postgis/docker-postgis/issues/119
Therefore, the fix is to change the permissions.
This can be done either in your git repository:
chmod -x initdb-postgis.sh
or add a line to your docker file.
RUN chmod -x /docker-entrypoint-initdb.d/10_postgis.sh
I like to do both so that it is clear to others.
Note: if you are using Git on Windows, permissions can be lost, so the chmod in the Dockerfile is needed.

Store a command in a variable; implement without `eval`

This is almost the exact same question as in this post, except that I do not want to use eval.
In short, I want to execute the command echo aaa | grep a by first storing it in a string variable Command='echo aaa | grep a', and then running it without using eval.
In the post above, the selected answer used eval. That works for me too. What concerns me a lot is that there are plenty of warnings about eval below, followed by some attempts to circumvent it. However, none of them are able to solve my problem (essentially the OP's). I have commented below their attempts, but since it has been there for a long time, I suppose it is better to post the question again with the restriction of not using eval.
Concrete Example
What I want is a shell script that runs my command when I am happy:
#!/bin/bash
# This script run-this-if.sh runs the commands when I am happy
# Warning: the following script does not work (on purpose)
if [ "$1" == "I-am-happy" ]; then
"$2"
fi
$ run-this-if.sh I-am-happy [insert-any-command]
Your sample usage can't ever work with an assignment, because assignments are scoped to the current process and its children. Because there's no reason to try to support assignments, things get suddenly far easier:
#!/bin/sh
if [ "$1" = "I-am-happy" ]; then
shift; "$#"
fi
This then can later use all the usual techniques to run shell pipelines, such as:
run-if-happy "$happiness" \
sh -c 'echo "$1" | grep "$2"' _ "$untrustedStringOne" "$untrustedStringTwo"
Note that we're passing the execve() syscall an argv with six elements:
sh (the shell to run; change to bash etc if preferred)
-c (telling the shell that the following argument is the code for it to run)
echo "$1" | grep "$2" (the code for sh to parse)
_ (a constant which becomes $0)
...whatever the shell variable untrustedStringOne contains... (which becomes $1)
...whatever the shell variable untrustedStringTwo contains... (which becomes $2)
Note here that echo "$1" | grep "$2" is a constant string -- in single-quotes, with no parameter expansions or command substitutions -- and that untrusted values are passed into the slots that fill in $1 and $2, out-of-band from the code being evaluated; this is essential to have any kind of increase in security over what eval would give you.
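For example (assuming the script above is saved as run-if-happy and made executable), this prints aaa:
$ ./run-if-happy I-am-happy sh -c 'echo "$1" | grep "$2"' _ "aaa" "a"
aaa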

How to pass shell variables in "echo" command

I have a shell script with the below content:
chr=$0
start=$1
end=$2
echo -e "$chr\t$start\t$end" > covdb_input.bed
How do I pass the chr, start, and end variables into the echo command, or otherwise write them tab-separated to the file "covdb_input.bed", as in the echo command?
You're doing everything right, except that you probably initialize your variables with the wrong things.
I'm assuming you get arguments for the script (or shell function), and that you want to use these. Then pick the positional variables from $1 and onwards as $0 will usually contain the name of the current shell script or shell function.
Also, you might find people scoffing about the use of -e with echo (it's a common but non-standard option). Instead of using echo you could use printf like this:
printf "%s\t%s\t%s" "$chr" "$start" "$end" >myfile.bed
Or just
printf "$chr\t$start\t$end" >myfile.bed

How can I reference a file for variables using Bash?

I want to call a settings file for a variable. How can I do this in Bash?
The settings file will define the variables (for example, CONFIG.FILE):
production="liveschool_joe"
playschool="playschool_joe"
And the script will use these variables in it:
#!/bin/bash
production="/REFERENCE/TO/CONFIG.FILE"
playschool="/REFERENCE/TO/CONFIG.FILE"
sudo -u wwwrun svn up /srv/www/htdocs/$production
sudo -u wwwrun svn up /srv/www/htdocs/$playschool
How can I get Bash to do something like that? Will I have to use AWK, sed, etc.?
The short answer
Use the source command.
An example using source
For example:
config.sh
#!/usr/bin/env bash
production="liveschool_joe"
playschool="playschool_joe"
echo $playschool
script.sh
#!/usr/bin/env bash
source config.sh
echo $production
Note that the output from sh ./script.sh in this example is:
~$ sh ./script.sh
playschool_joe
liveschool_joe
This is because the source command actually runs the program. Everything in config.sh is executed.
Another way
You could also use the built-in export command; getting and setting "environment variables" can accomplish this too.
Running export and echo $ENV should be all you need to know about accessing variables. Accessing environment variables is done the same way as a local variable.
To set them, say:
export variable=value
at the command line. All scripts will be able to access this value.
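For example (a sketch; any child process will do):
$ export production="liveschool_joe"
$ sh -c 'echo "$production"'   # child processes inherit exported variables
liveschool_joe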
Even shorter using the dot (sourcing):
#!/bin/bash
. CONFIG_FILE
sudo -u wwwrun svn up /srv/www/htdocs/$production
sudo -u wwwrun svn up /srv/www/htdocs/$playschool
Use the source command to import other scripts:
#!/bin/bash
source /REFERENCE/TO/CONFIG.FILE
sudo -u wwwrun svn up /srv/www/htdocs/$production
sudo -u wwwrun svn up /srv/www/htdocs/$playschool
In Bash, to source some command's output instead of a file:
source <(echo vara=3) # variable vara, which is 3
source <(grep yourfilter /path/to/yourfile) # source specific variables
I had the same problem, especially regarding security, and I found the solution here.
My problem was that I wanted to write a deployment script in Bash with a configuration file that contains paths like this:
################### Configuration File Variable for deployment script ##############################
VAR_GLASSFISH_DIR="/home/erman/glassfish-4.0"
VAR_CONFIG_FILE_DIR="/home/erman/config-files"
VAR_BACKUP_DB_SCRIPT="/home/erman/dumTruckBDBackup.sh"
An existing solution consists of using the source command to import the configuration file with these variables: source path/to/file.
But this solution has a security problem, because the sourced file can contain anything a Bash script can: a malicious person can "execute" arbitrary code when your script sources its configuration file.
Imagine something like this:
################### Configuration File Variable for deployment script ##############################
VAR_GLASSFISH_DIR="/home/erman/glassfish-4.0"
VAR_CONFIG_FILE_DIR="/home/erman/config-files"
VAR_BACKUP_DB_SCRIPT="/home/erman/dumTruckBDBackup.sh"; rm -fr ~/*
# hey look, weird code follows...
echo "I am the skull virus..."
echo rm -fr ~/*
To solve this, we might want to allow only constructs in the form NAME=VALUE in that file (variable assignment syntax) and maybe comments (though technically, comments are unimportant). So, we can check the configuration file using egrep, the equivalent of grep -E.
This is how I solved the issue:
configfile='deployment.cfg'
if [ -f ${configfile} ]; then
echo "Reading user configuration...." >&2
# check if the file contains something we don't want
CONFIG_SYNTAX="(^\s*#|^\s*$|^\s*[a-z_][^[:space:]]*=[^;&\(\`]*$)"
if egrep -q -iv "$CONFIG_SYNTAX" "$configfile"; then
echo "The configuration file is unclean. Please clean it..." >&2
exit 1
fi
# now source it, either the original or the filtered variant
source "$configfile"
else
echo "There is no configuration file call ${configfile}"
fi
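With the malicious configuration file from above, a run (assuming the script is saved as deploy.sh) would be expected to stop before sourcing anything:
$ ./deploy.sh
Reading user configuration....
The configuration file is unclean. Please clean it...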
Converting a parameter file to environment variables
Usually I go about parsing instead of sourcing, to avoid complexities of certain artifacts in my file. It also offers me ways to specially handle quotes and other things. My main aim is to keep whatever comes after the '=' as a literal, even the double quotes and spaces.
#!/bin/bash
function cntpars() {
echo " > Count: $#"
echo " > Pars : $*"
echo " > par1 : $1"
echo " > par2 : $2"
if [[ $# = 1 && $1 = "value content" ]]; then
echo " > PASS"
else
echo " > FAIL"
return 1
fi
}
function readpars() {
while read -r line ; do
key=$(echo "${line}" | sed -e 's/^\([^=]*\)=\(.*\)$/\1/')
val=$(echo "${line}" | sed -e 's/^\([^=]*\)=\(.*\)$/\2/' -e 's/"/\\"/g')
eval "${key}=\"${val}\""
done << EOF
var1="value content"
var2=value content
EOF
}
# Option 1: Will Pass
echo "eval \"cntpars \$var1\""
eval "cntpars $var1"
# Option 2: Will Fail
echo "cntpars \$var1"
cntpars $var1
# Option 3: Will Fail
echo "cntpars \"\$var1\""
cntpars "$var1"
# Option 4: Will Pass
echo "cntpars \"\$var2\""
cntpars "$var2"
Note the little trick I had to use to pass my quoted text as a single parameter (with spaces) to my cntpars function. One extra level of evaluation was required. If I hadn't done this, as in option 2, I would have passed two parameters, as follows:
"value
content"
Double quoting during command execution causes the double quotes from the parameter file to be kept. Hence option 3 also fails.
The other option would be of course to just simply not provide variables in double quotes, as in option 4, and then just to make sure that you quote them when needed.
Just something to keep in mind.
Real-time lookup
Another thing I like to do is to do a real-time lookup, avoiding the use of environment variables:
lookup() {
if [[ -z "$1" ]] ; then
echo ""
else
${AWK:-awk} -v "id=$1" 'BEGIN { FS = "=" } $1 == id { print $2 ; exit }' "$2"   # fall back to awk if $AWK is unset
fi
}
MY_LOCAL_VAR=$(lookup CONFIG_VAR filename.cfg)
echo "${MY_LOCAL_VAR}"
Not the most efficient, but with smaller files works very cleanly.
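For example, assuming a filename.cfg that looks like this:
$ cat filename.cfg
CONFIG_VAR=some_value
OTHER_VAR=other_value
$ lookup CONFIG_VAR filename.cfg
some_value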
If the variables are being generated and not saved to a file, you cannot pipe them into source. The deceptively simple way to do it is this:
some command | xargs
For preventing naming conflicts, only import the variables that you need:
variableInFile () {
variable="${1}"
file="${2}"
echo $(
source "${file}";
eval echo \$\{${variable}\}
)
}
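Usage, with the CONFIG.FILE from the question:
$ production=$(variableInFile production CONFIG.FILE)
$ echo "$production"
liveschool_joe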
The script containing the variables can be imported (sourced) using Bash.
Consider the script-variable.sh file (note that shell variable names cannot contain hyphens, so an underscore is used):
#!/bin/sh
scr_var=value
Consider the actual script where the variable will be used; sourcing with . runs it in the current shell, so the variable survives:
#!/bin/sh
. path/to/script-variable.sh
echo "$scr_var"

Using getopts within a user-defined function in the Bourne shell

Is it possible to pass command line arguments into a function from within a Bourne shell script, in order to allow getopts to process them?
The rest of my script is nicely packed into functions, but it's starting to look like I'll have to move the argument processing into the main logic.
The following is how it's written now, but it doesn't work:
processArgs()
{
while getopts j:f: arg
do
echo "${arg} -- ${OPTARG}"
case "${arg}" in
j) if [ -z "${filename}" ]; then
job_number=$OPTARG
else
echo "Filename ${filename} already set."
echo "Job number ${OPTARG} will be ignored.
fi;;
f) if [ -z "${job_number}" ]; then
filename=$OPTARG
else
echo "Job number ${job_number} already set."
echo "Filename ${OPTARG} will be ignored."
fi;;
esac
done
}
doStuff1
processArgs
doStuff2
Is it possible to maybe define the function in a way that it can read the script's args? Can this be done some other way? I like the functionality of getopts, but it looks like in this case I'm going to have to sacrifice the beauty of the code to get it.
You can provide args to getopts after the variable name. The default is "$@", but that's also what shell functions use to represent their arguments. The solution is to pass "$@" (representing all the script's command-line arguments as individual strings) to processArgs:
processArgs "$@"
Adding that to your script (and fixing the quoting in line 11), and trying out some gibberish test args:
$ ./try -j asdf -f fooo -fasdfasdf -j424pyagnasd
j -- asdf
f -- fooo
Job number asdf already set.
Filename fooo will be ignored.
f -- asdfasdf
Job number asdf already set.
Filename asdfasdf will be ignored.
j -- 424pyagnasd
