How to print script parameters sequentially? - bash

My shell script is named test.sh, and I want to output all of the parameters that are passed to test.sh, but my code doesn't work:
#!/bin/bash
i=1
num=$#
while [ $i -le $num ]; do
echo $($i)
((i++))
done
When I run ./test.sh -a -b -c, my expected output is:
-a
-b
-c
but, it tells me
./test.sh: line 5: 1: command not found
./test.sh: line 5: 2: command not found
./test.sh: line 5: 3: command not found
How can I resolve this issue and output all the parameters using the echo command?

You are looking for variable indirection:
#!/bin/bash
i=1
num=$#
while [ $i -le $num ]; do
echo ${!i} # <--- print the content of $1, $2...
((i++))
done
Upon execution, this returns:
$ bash test.sh -a -b "-c d"
-a
-b
-c d
From Bash Reference Manual → 3.5.3 Shell Parameter Expansion:
If the first character of parameter is an exclamation point (!), and parameter is not a nameref, it introduces a level of variable indirection. Bash uses the value of the variable formed from the rest of parameter as the name of the variable; this variable is then expanded and that value is used in the rest of the substitution, rather than the value of parameter itself. This is known as indirect expansion. If parameter is a nameref, this expands to the name of the variable referenced by parameter instead of performing the complete indirect expansion. The exceptions to this are the expansions of ${!prefix*} and ${!name[@]} described below. The exclamation point must immediately follow the left brace in order to introduce indirection.
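As a quick illustration of the indirection and nameref behaviour described in that passage (this sketch assumes bash 4.3+ for declare -n):
var=hello
ref=var             # an ordinary variable holding a variable name
echo "${!ref}"      # indirect expansion: prints "hello"
declare -n nref=var
echo "${!nref}"     # nref is a nameref, so this prints the referenced name: "var"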
If you wish to make it more verbose and show the mapping of each $1..$n to its value:
#!/bin/bash
i=1
num=$#
while [ $i -le $num ]; do
printf "$%s = %s\n" "$i" "${!i}"
((i++))
done
See the output:
$ bash test.sh -a -b "-c d"
$1 = -a
$2 = -b
$3 = -c d

In Bash, the sequence $(...) creates a subshell, executes the command inside it, and substitutes the command's output. It is the modern replacement for backticks.
The expression
$($i)
is equivalent to
`$i`
So what you are doing is running the value of $i (1, 2, 3, ...) as a command in a subshell, which is why the shell reports 1: command not found and so on.
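You can reproduce this directly in an interactive shell. With i=1, $($i) becomes $(1), so the subshell tries to run 1 as a command:
$ set -- -a -b -c   # simulate the script's positional parameters
$ i=1
$ echo $($i)
bash: 1: command not found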

You could also use a one-line script
#!/bin/bash
echo $*
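Note that echo $* joins everything onto one line and word-splits the parameters. If you want one parameter per line while preserving parameters that contain spaces, printf with "$@" does it in one line (a small variation, not part of the original answer):
#!/bin/bash
printf '%s\n' "$@"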


Is there any way to convert a literal to a non literal variable in bash? [duplicate]

My goal is to have an array that includes a specific command for each value. However, to even get bash to process the array I need to create literal values surrounded by ''.
When processing the array, the literal values (which include variable syntax) are treated as their literal text instead of having the variables expanded. Is there any way to convert a literal value to a non-literal value?
Here is an example of what I mean:
#!/bin/bash
dir=C
echo "Non literal"
ls $dir
echo "Literal"
'ls $dir'
echo "variable with literal
cmd='ls $dir'
echo $cmd
echo "$cmd"
$ ./test.sh
Non literal
01_hello_world Modern_C PLE_LinkedIn_Learning_C The_C_Programming_Language
Literal
./test.sh: line 6: ls $dir: command not found
variable with literal
ls $dir
ls $dir
From the "Literal statement" I want to be able to convert 'ls $dir' to "ls $dir" so $dir gets processed as C.
Is this possible?
EDIT
I want to include my actual script that will process a list of commands from an array (my original goal):
#!/bin/bash
dir=C
tree_cmd=tree
run_cmds(){
if [[ -z "$#" ]]; then
echo "Array is empty"
else
for i in "$#"; do
$i
done
fi
}
arr=(
'ls $dir'
'cat $dir/01_hello_world'
'$tree_cmd $dir'
)
run_cmds "${arr[#]}"
Do not store commands in strings. Use functions.
Quote variable expansion.
Use shellcheck to check your script.
#!/bin/bash
dir=C
tree_cmd=tree
run_cmds(){
if ((!$#)); then
echo "No argumetns given"
else
local i
for i in "$#"; do
"$i"
done
fi
}
cmd_1() {
ls "$dir"
}
cmd_2() {
cat "$dir"/01_hello_world
}
cmd_3() {
"$tree_cmd" "$dir"
}
arr=( cmd_1 cmd_2 cmd_3 )
run_cmds "${arr[#]}"
If you really want to store commands in strings, for example for short testing purposes, ignoring some best practices, see https://mywiki.wooledge.org/BashFAQ/048. Still quote variable expansions. You can do:
#!/bin/bash
dir=C
tree_cmd=tree
run_cmds(){
local i
for i in "$#"; do
eval "$i"
done
}
arr=(
'ls "$dir"'
'cat "$dir"/01_hello_world'
'"$tree_cmd" "$dir"'
)
run_cmds "${arr[#]}"

How do I nest parameter expansions for uppercasing and substitution in Bash?

I have two bash string parameter expansions that work fine independently, but when nested they generate an error message no matter what I try. Here are the two individual expansions that work:
$ A="etc/.java"
$ echo $A
/etc/.java
$ B="${A//$'\057\056'/$'\057'}"
$ echo $B
/etc/java
$ B="${A^^}"
$ echo $B
/ETC/.JAVA
Now trying to combine the two commands together I get errors:
$ B="${${A^^}//$'\057\056'/$'\057'}"
bash: ${${A^^}///.//}: bad substitution
$ B="${ ${A^^}//$'\057\056'/$'\057'}"
bash: ${ ${A^^}///.//}: bad substitution
$ B="${ ${A^^} //$'\057\056'/$'\057'}"
bash: ${ ${A^^} ///.//}: bad substitution
$ B="${"${A^^}"//$'\057\056'/$'\057'}"
bash: ${"${A^^}"//'/.'/'/'}: bad substitution
$ B="${ "${A^^}" //$'\057\056'/$'\057'}"
bash: ${ "${A^^}" //'/.'/'/'}: bad substitution
$ B="${${A^^} //$'\057\056'/$'\057'}"
bash: ${${A^^} ///.//}: bad substitution
Simplified examples are presented above so one can copy and paste to their own terminal. Piping or redirection would be complicated because my real world code is this:
while [[ $i -lt $DirsArrCnt ]] ; do
DirsArr[$i]=false
CurrNdx=$i
CurrKey="${DirsArr[$(( $i + 1 ))]}"
# ^^ = convert to upper-case
# ${Variable//$'\041\056'/$'\041'} = Change /. to / for hidden directory sorting
if [[ "${"${CurrKey^^}"//$'\041\056'/$'\041'}" > \
"${"${LastKey^^}"//$'\041\056'/$'\041'}" ]] || \
[[ "${"${CurrKey^^}"//$'\041\056'/$'\041'}" = \
"${"${LastKey^^}"//$'\041\056'/$'\041'}" ]] ; then
LastNdx=$CurrNdx
LastKey="$CurrKey"
i=$(( $i + $OneDirArrCnt))
continue
fi
In the special case of one of the expansions being upper casing, it can be done in a single expansion, using declare -u (introduced in Bash 4.0). declare -u converts to uppercase on assignment.
Combining upper casing and substitution then becomes this:
$ declare -u A='/etc/.java'
$ echo "${A//\/./\/}"
/ETC/JAVA
There is the analogous -l for lower casing and the (undocumented) -c for title casing, but these are the only cases where you can do "nested" parameter expansion.
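Outside those special cases the two expansions cannot be nested directly; the usual workaround is simply an intermediate variable, applying one expansion per step (a sketch based on the example above):
A='/etc/.java'
B="${A^^}"                      # first expansion: uppercase
B="${B//$'\057\056'/$'\057'}"   # second expansion: change "/." to "/"
echo "$B"                       # /ETC/JAVA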

Bash script - variable content as a command to run

I have a Perl script that gives me a defined list of random numbers that correspond to the lines of a file. Next I want to extract those lines from the file using sed.
#!/bin/bash
count=$(cat last_queries.txt | wc -l)
var=$(perl test.pl test2 $count)
The variable var returns an output like: cat last_queries.txt | sed -n '12p;500p;700p'. The problem is that I can't run this last command. I tried with $var, but the output is not correct (if I run manually the command it works fine, so no problem there). What is the correct way to do this?
P.S: Sure I could do all the work in Perl, but I'm trying to learn this way, because it could help me in other situations.
You just need to do:
#!/bin/bash
count=$(cat last_queries.txt | wc -l)
$(perl test.pl test2 $count)
However, if you want to call your Perl command later, and that's why you want to assign it to a variable, then:
#!/bin/bash
count=$(cat last_queries.txt | wc -l)
var="perl test.pl test2 $count" # You need double quotes to get your $count value substituted.
...stuff...
eval $var
As per Bash's help:
~$ help eval
eval: eval [arg ...]
Execute arguments as a shell command.
Combine ARGs into a single string, use the result as input to the shell,
and execute the resulting commands.
Exit Status:
Returns exit status of command or success if command is null.
You're probably looking for eval $var.
line=$((${RANDOM} % $(wc -l < /etc/passwd)))
sed -n "${line}p" /etc/passwd
just with your file instead.
In this example I used the file /etc/passwd and the special variable ${RANDOM} (about which I learned here), together with the sed expression you had; the only difference is that I am using double quotes instead of single quotes to allow variable expansion.
There are two basic ways of executing a command stored in a string in a shell script, whether it's given as a parameter or not:
COMMAND="ls -lah"
$(echo $COMMAND)
or
COMMAND="ls -lah"
bash -c "$COMMAND"
In the case where you have multiple variables containing the arguments for a command you're running, and not just a single string, you should not use eval directly, as it will fail in the following case:
function echo_arguments() {
echo "Argument 1: $1"
echo "Argument 2: $2"
echo "Argument 3: $3"
echo "Argument 4: $4"
}
# Note we are passing 3 arguments to `echo_arguments`, not 4
eval echo_arguments arg1 arg2 "Some arg"
Result:
Argument 1: arg1
Argument 2: arg2
Argument 3: Some
Argument 4: arg
Note that even though "Some arg" was passed as a single argument, eval read it as two.
Instead, you can just use the string as the command itself:
# The regular bash eval works by jamming all its arguments into a string then
# evaluating the string. This function treats its arguments as individual
# arguments to be passed to the command being run.
function eval_command() {
"$#";
}
Note the difference between the output of eval and the new eval_command function:
eval_command echo_arguments arg1 arg2 "Some arg"
Result:
Argument 1: arg1
Argument 2: arg2
Argument 3: Some arg
Argument 4:
Better ways to do it
Using a function:
# define it
myls() {
ls -l "/tmp/test/my dir"
}
# run it
myls
Using an array:
# define the array
mycmd=(ls -l "/tmp/test/my dir")
# run the command
"${mycmd[#]}"
cmd="ls -atr ${HOME} | tail -1" <br/>
echo "$cmd" <br/>
VAR_FIRST_FILE=$( eval "${cmd}" ) <br/>
or
cmd=("ls -atr ${HOME} | tail -1") <br/>
echo "$cmd" <br/>
VAR_FIRST_FILE=$( eval "${cmd[#]}" )

Indirect parameter substitution in shell script

I'm having a problem with a shell script (POSIX shell under HP-UX, FWIW). I have a function called print_arg into which I'm passing the name of a parameter as $1. Given the name of the parameter, I then want to print the name and the value of that parameter. However, I keep getting an error. Here's an example of what I'm trying to do:
#!/usr/bin/sh
function print_arg
{
# $1 holds the name of the argument to be shown
arg=$1
# The following line errors off with
# ./test_print.sh[9]: argval=${"$arg"}: The specified substitution is not valid for this command.
argval=${"$arg"}
if [[ $argval != '' ]] ; then
printf "ftp_func: $arg='$argval'\n"
fi
}
COMMAND="XYZ"
print_arg "COMMAND"
I've tried re-writing the offending line every way I can think of. I've consulted the local oracles. I've checked the online "BASH Scripting Guide". And I sharpened up the ol' wavy-bladed knife and scrubbed the altar until it gleamed, but then I discovered that our local supply of virgins has been cut down to, like, nothin'. Drat!
Any advice regarding how to get the value of a parameter whose name is passed into a function as a parameter will be received appreciatively.
You could use eval, though using direct indirection as suggested by SiegeX is probably nicer if you can use bash.
#!/bin/sh
foo=bar
print_arg () {
arg=$1
eval argval=\"\$$arg\"
echo "$argval"
}
print_arg foo
In bash (but not in other sh implementations), indirection is done by: ${!arg}
Input
foo=bar
bar=baz
echo $foo
echo ${!foo}
Output
bar
baz
This worked surprisingly well:
#!/bin/sh
foo=bar
print_arg () {
local line name value
set | \
while read line; do
name=${line%=*} value=${line#*=\'}
if [ "$name" = "$1" ]; then
echo ${value%\'}
fi
done
}
print_arg foo
It has all the POSIX clunkiness; in Bash it would be much shorter, but then again, you won't need it there because you have ${!}. This (in case it proves solid) has the advantage of using only builtins and no eval. If I were to construct this function using an external command, it would have to be sed, which would obviate the need for the read loop and the substitutions. Mind that asking for indirection in POSIX without eval has to be paid for with clunkiness, so don't beat me!
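For what it's worth, a rough sketch of that sed variant might look like the one below; it only handles simple single-line values and makes assumptions about how set quotes its output, so treat it as illustrative rather than solid:
foo=bar
print_arg () {
    set | sed -n "s/^$1=//p" | sed "s/^'//; s/'\$//"
}
print_arg foo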
Even though the answer's already accepted, here's another method for those who need to preserve newlines and special characters like Escape ( \033 ): Storing the variable in base64.
You need: bc, wc, echo, tail, tr, uuencode, uudecode
Example
#!/bin/sh
#====== Definition =======#
varA="a
b
c"
# uuencode the variable
varB="`echo "$varA" | uuencode -m -`"
# Skip the first line of the uuencode output.
varB="`NUM=\`(echo "$varB"|wc -l|tr -d "\n"; echo -1)|bc \`; echo "$varB" | tail -n $NUM)`"
#====== Access =======#
namevar1=varB
namevar2=varA
echo simple eval:
eval "echo \$$namevar2"
echo simple echo:
echo $varB
echo precise echo:
echo "$varB"
echo echo of base64
eval "echo \$$namevar1"
echo echo of base64 - with updated newlines
eval "echo \$$namevar1 | tr ' ' '\n'"
echo echo of un-based, using sh instead of eval (but could be made with eval, too)
export $namevar1
sh -c "(echo 'begin-base64 644 -'; echo \$$namevar1 | tr ' ' '\n' )|uudecode"
Result
simple eval:
a b c
simple echo:
YQpiCmMK ====
precise echo:
YQpiCmMK
====
echo of base64
YQpiCmMK ====
echo of base64 - with updated newlines
YQpiCmMK
====
echo of un-based, using sh instead of eval (but could be made with eval, too)
a
b
c
Alternative
You also could use the set command and parse its output; with that, you don't need to treat the variable in a special way before it's accessed.
A safer solution with eval:
v=1
valid_var_name='[[:alpha:]_][[:alnum:]_]*$'
print_arg() {
local arg=$1
if ! expr "$arg" : "$valid_var_name" >/dev/null; then
echo "$0: invalid variable name ($arg)" >&2
exit 1
fi
local argval
eval argval=\$$arg
echo "$argval"
}
print_arg v
print_arg 'v; echo test'
Inspired by the following answer.

What's a concise way to check that environment variables are set in a Unix shell script?

I've got a few Unix shell scripts where I need to check that certain environment variables are set before I start doing stuff, so I do this sort of thing:
if [ -z "$STATE" ]; then
echo "Need to set STATE"
exit 1
fi
if [ -z "$DEST" ]; then
echo "Need to set DEST"
exit 1
fi
which is a lot of typing. Is there a more elegant idiom for checking that a set of environment variables is set?
EDIT: I should mention that these variables have no meaningful default value - the script should error out if any are unset.
Parameter Expansion
The obvious answer is to use one of the special forms of parameter expansion:
: ${STATE?"Need to set STATE"}
: ${DEST:?"Need to set DEST non-empty"}
Or, better (see section on 'Position of double quotes' below):
: "${STATE?Need to set STATE}"
: "${DEST:?Need to set DEST non-empty}"
The first variant (using just ?) requires STATE to be set, but STATE="" (an empty string) is OK; that's not exactly what you want, but it is the alternative and older notation.
The second variant (using :?) requires DEST to be set and non-empty.
If you supply no message, the shell provides a default message.
The ${var?} construct is portable back to Version 7 UNIX and the Bourne Shell (1978 or thereabouts). The ${var:?} construct is slightly more recent: I think it was in System III UNIX circa 1981, but it may have been in PWB UNIX before that. It is therefore in the Korn Shell, and in the POSIX shells, including specifically Bash.
It is usually documented in the shell's man page in a section called Parameter Expansion. For example, the bash manual says:
${parameter:?word}
Display Error if Null or Unset. If parameter is null or unset, the expansion of word (or a message to that effect if word is not present) is written to the standard error and the shell, if it is not interactive, exits. Otherwise, the value of parameter is substituted.
The Colon Command
I should probably add that the colon command simply has its arguments evaluated and then succeeds. It is the original shell comment notation (before '#' to end of line). For a long time, Bourne shell scripts had a colon as the first character. The C Shell would read a script and use the first character to determine whether it was for the C Shell (a '#' hash) or the Bourne shell (a ':' colon). Then the kernel got in on the act and added support for '#!/path/to/program' and the Bourne shell got '#' comments, and the colon convention went by the wayside. But if you come across a script that starts with a colon, now you will know why.
Position of double quotes
blong asked in a comment:
Any thoughts on this discussion? https://github.com/koalaman/shellcheck/issues/380#issuecomment-145872749
The gist of the discussion is:
… However, when I shellcheck it (with version 0.4.1), I get this message:
In script.sh line 13:
: ${FOO:?"The environment variable 'FOO' must be set and non-empty"}
^-- SC2086: Double quote to prevent globbing and word splitting.
Any advice on what I should do in this case?
The short answer is "do as shellcheck suggests":
: "${STATE?Need to set STATE}"
: "${DEST:?Need to set DEST non-empty}"
To illustrate why, study the following. Note that the : command doesn't echo its arguments (but the shell does evaluate the arguments). We want to see the arguments, so the code below uses printf "%s\n" in place of :.
$ mkdir junk
$ cd junk
$ > abc
$ > def
$ > ghi
$
$ x="*"
$ printf "%s\n" ${x:?You must set x} # Careless; not recommended
abc
def
ghi
$ unset x
$ printf "%s\n" ${x:?You must set x} # Careless; not recommended
bash: x: You must set x
$ printf "%s\n" "${x:?You must set x}" # Careful: should be used
bash: x: You must set x
$ x="*"
$ printf "%s\n" "${x:?You must set x}" # Careful: should be used
*
$ printf "%s\n" ${x:?"You must set x"} # Not quite careful enough
abc
def
ghi
$ x=
$ printf "%s\n" ${x:?"You must set x"} # Not quite careful enough
bash: x: You must set x
$ unset x
$ printf "%s\n" ${x:?"You must set x"} # Not quite careful enough
bash: x: You must set x
$
Note how the value in $x is expanded to first * and then a list of file names when the overall expression is not in double quotes. This is what shellcheck is recommending should be fixed. I have not verified that it doesn't object to the form where the expression is enclosed in double quotes, but it is a reasonable assumption that it would be OK.
Try this:
[ -z "$STATE" ] && echo "Need to set STATE" && exit 1;
Your question is dependent on the shell that you are using.
Bourne shell leaves very little in the way of what you're after.
BUT...
It does work, just about everywhere.
Just try and stay away from csh. It was good for the bells and whistles it added, compared to the Bourne shell, but it is really creaking now. If you don't believe me, just try and separate out STDERR in csh! (-:
There are two possibilities here. One is the example above, namely using
${MyVariable:=SomeDefault}
the first time you need to refer to $MyVariable. This takes the environment variable MyVariable and, if it is currently not set, assigns the value SomeDefault to it for later use.
You also have the possibility of:
${MyVariable:-SomeDefault}
which just substitutes SomeDefault for the variable where you are using this construct. It doesn't assign the value SomeDefault to the variable, and the value of MyVariable will still be null after this statement is encountered.
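A quick demonstration of the difference between the two forms:
unset MyVariable
echo "${MyVariable:-SomeDefault}"    # prints SomeDefault
echo "now: '$MyVariable'"            # still empty: :- does not assign
echo "${MyVariable:=SomeDefault}"    # prints SomeDefault
echo "now: '$MyVariable'"            # prints SomeDefault: := assigned it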
Surely the simplest approach is to add the -u switch to the shebang (the line at the top of your script), assuming you’re using bash:
#!/bin/sh -u
This will cause the script to exit if any unbound variables lurk within.
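If you prefer not to touch the shebang (for example because the script is sometimes sourced, or run as bash script.sh, which bypasses shebang options), the equivalent is to enable the option inside the script:
#!/bin/sh
set -u                # or: set -o nounset
: "$STATE" "$DEST"    # the script aborts here if either variable is unset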
${MyVariable:=SomeDefault}
If MyVariable is set and not null, it will reset the variable value (= nothing happens).
Else, MyVariable is set to SomeDefault.
The above will attempt to execute ${MyVariable}, so if you just want to set the variable do:
MyVariable=${MyVariable:=SomeDefault}
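The usual idiom to avoid that accidental execution is to put the expansion after the : (no-op) command, which evaluates its arguments and discards them:
: "${MyVariable:=SomeDefault}"
echo "$MyVariable"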
In my opinion the simplest and most compatible check for #!/bin/sh is:
if [ "$MYVAR" = "" ]
then
echo "Does not exist"
else
echo "Exists"
fi
Again, this is for /bin/sh and is compatible also on old Solaris systems.
bash 4.2 introduced the -v operator which tests if a name is set to any value, even the empty string.
$ unset a
$ b=
$ c=
$ [[ -v a ]] && echo "a is set"
$ [[ -v b ]] && echo "b is set"
b is set
$ [[ -v c ]] && echo "c is set"
c is set
I always used:
if [ "x$STATE" == "x" ]; then echo "Need to set State"; exit 1; fi
Not that much more concise, I'm afraid.
Under CSH you have $?STATE.
For future people like me, I wanted to go a step forward and parameterize the var name, so I can loop over a variable sized list of variable names:
#!/bin/bash
declare -a vars=(NAME GITLAB_URL GITLAB_TOKEN)
for var_name in "${vars[#]}"
do
if [ -z "$(eval "echo \$$var_name")" ]; then
echo "Missing environment variable $var_name"
exit 1
fi
done
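Since this is bash, the eval can also be replaced by indirect expansion, which avoids re-parsing the variable name as code (the same loop, a minor variation):
#!/bin/bash
declare -a vars=(NAME GITLAB_URL GITLAB_TOKEN)
for var_name in "${vars[@]}"
do
  if [ -z "${!var_name}" ]; then
    echo "Missing environment variable $var_name"
    exit 1
  fi
done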
We can write a nice assertion to check a bunch of variables all at once:
#
# assert if variables are set (to a non-empty string)
# if any variable is not set, exit 1 (when -f option is set) or return 1 otherwise
#
# Usage: assert_var_not_null [-f] variable ...
#
function assert_var_not_null() {
local fatal var num_null=0
[[ "$1" = "-f" ]] && { shift; fatal=1; }
for var in "$#"; do
[[ -z "${!var}" ]] &&
printf '%s\n' "Variable '$var' not set" >&2 &&
((num_null++))
done
if ((num_null > 0)); then
[[ "$fatal" ]] && exit 1
return 1
fi
return 0
}
Sample invocation:
one=1 two=2
assert_var_not_null one two
echo test 1: return_code=$?
assert_var_not_null one two three
echo test 2: return_code=$?
assert_var_not_null -f one two three
echo test 3: return_code=$? # this code shouldn't execute
Output:
test 1: return_code=0
Variable 'three' not set
test 2: return_code=1
Variable 'three' not set
More such assertions here: https://github.com/codeforester/base/blob/master/lib/assertions.sh
This can be a way to do it too:
if (set -u; : $HOME) 2> /dev/null
...
...
http://unstableme.blogspot.com/2007/02/checks-whether-envvar-is-set-or-not.html
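The elided lines are just the usual then/else branches; a complete sketch of the idiom might look like this:
if (set -u; : "$HOME") 2> /dev/null; then
    echo "HOME is set"
else
    echo "Need to set HOME" >&2
    exit 1
fi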
None of the above solutions worked for my purposes, in part because I was checking the environment for an open-ended list of variables that need to be set before starting a lengthy process. I ended up with this:
mapfile -t arr < variables.txt
EXITCODE=0
for i in "${arr[#]}"
do
ISSET=$(env | grep ^${i}= | wc -l)
if [ "${ISSET}" = "0" ];
then
EXITCODE=-1
echo "ENV variable $i is required."
fi
done
exit ${EXITCODE}
Rather than using external shell scripts I tend to load in functions in my login shell. I use something like this as a helper function to check for environment variables rather than any set variable:
is_this_an_env_variable () {
local var="$1"
if env |grep -q "^$var"; then
return 0
else
return 1
fi
}
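A quick usage sketch (note that the grep pattern above matches on a name prefix; anchoring it as grep -q "^$var=" would make the match exact):
is_this_an_env_variable HOME && echo "HOME is in the environment"
is_this_an_env_variable NO_SUCH_VAR || echo "NO_SUCH_VAR is not in the environment"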
The $? syntax is pretty neat:
if [ $?BLAH == 1 ]; then
echo "Exists";
else
echo "Does not exist";
fi
