grep on variable passed to function - bash

I am passing a variable, loaded from a config file, to a function. This variable is in turn passed on to a grep. I have tried several ways to expand the search variable ($1) but none seem to work; however, the file to search works fine ($2).
Can someone advise the correct way to present the variable to grep? I have tried:
"$1", "${1}", $1, ${1}, "${!1}"
UPDATE Based on the answers I have updated script.sh to reflect the actual script's complexity, as it still does not work.
thanks
Art
Working code with no variable for RouteTableId
MyVar=$(grep -m 1 "RouteTableId" $2)
Not Working code with variable for RouteTableId
config.file
myTerm="RouteTableId"
myFile="my.file"
script.sh
. config.file
myFunction(){
mySubFunction(){
myVar=$(grep -m 1 "$1" $2)
echo $myVar
}
mySubFunction ${!1} ${!2}
}
myFunction "myTerm" "myFile"
UPDATE
I have done some more tests, and it turns out that when passing variables between functions, in some circumstances the numbering order changes if one of the arguments in the sequence is null. For example, if I call myFunction with the following, the 2nd variable being null:
myFunction "myTerm" "blah" "myFile"
then myFunction will see the following
echo "${!1}, ${!2}, ${!3}"
RouteTableId,,my.file
and then passing this to mySubFunction
mySubFunction ${!1} ${!3}
gives from within mySubFunction
echo "${!1}, ${!2}, ${!3}"
RouteTableId,my.file,
so it seems the null is being removed and the subsequent variables are being brought forward in the numbering order.
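What happens here is ordinary word splitting: when an expansion such as ${!2} is unquoted and the variable is empty, it produces no word at all, so the later arguments move up one position. A minimal sketch of the effect (the function name f is illustrative):

```shell
# An unquoted empty expansion disappears during word splitting,
# shifting the remaining arguments down one position.
f() { echo "$#:$1:$2"; }
empty=""
f $empty "my.file"      # unquoted: prints 1:my.file:
f "$empty" "my.file"    # quoted:   prints 2::my.file
```

Quoting the expansions at the call site (e.g. mySubFunction "${!1}" "${!2}") keeps empty arguments in their positions.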
If someone can validate and explain why I would appreciate it.
thx
Art

You are not passing the values of the variables but the variable names themselves as values.
i.e.
myFunction "myTerm" "myFile"
should be
myFunction "$myTerm" "$myFile"

If this isn't what you expect, you should edit your question to better define what it is you do expect:
$ cat my.file
foo
Here it is: RouteTableId
bar
$
$ cat config.file
myTerm="RouteTableId"
myFile="my.file"
$
$ cat script.sh
. config.file
myFunction() {
mySubFunction() {
myVar=$(grep -m 1 "$1" "$2")
echo "$myVar"
}
mySubFunction "${!1}" "${!2}"
}
myFunction "myTerm" "myFile"
$
$ ./script.sh
Here it is: RouteTableId
From your recent update, I think you are confused about what $1, etc. means. In a shell script but outside of any function $1 is the first arg passed to the shell script. In a function within a shell script $1 is the first arg passed to that function, NOT the first arg passed to the shell script. So if you do:
func() {
echo "$1,$2,$3"
}
func "$1" "$3"
then you will get the values of the first arg passed to the script (since that is also the 1st arg passed to the function), then a comma, then the 3rd arg passed to the script (since that is the 2nd arg passed to the function) and then another comma and that is all (because there is no 3rd arg passed to the function).
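To make that concrete, here is a small sketch that uses set -- to simulate the script's arguments:

```shell
# Simulate a script invoked as: ./script.sh a b c
set -- a b c
func() { echo "$1,$2,$3"; }
# The function receives only two arguments, so its $3 is empty:
func "$1" "$3"    # prints a,c,
```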
Hope that makes sense.

Related

Parsing command line arguments in a function [duplicate]

This question already has answers here:
How to access command line arguments of the caller inside a function?
(10 answers)
Closed 5 years ago.
Inside a function, $1 ... $n are the parameters passed to that function.
Outside a function $1 ... $n are the parameters passed to the script.
Can I somehow access the parameters passed to the script inside a function?
Usually you just pass them as parameters to the function at call time.
The (uglier) alternative is to put them in global variables.
(I know this is an old post, but none of the answers actually answered the question.)
Use the BASH_ARGV array. It contains the arguments passed to the invoking script in reverse order (i.e., it's a stack with the top at index 0). You may have to turn on extended debugging in the shebang (e.g., #!/bin/bash -O extdebug) or with shopt (e.g., shopt -s extdebug), but it works for me in bash 4.2_p37 without it turned on.
From man bash:
An array variable containing all of the parameters in the current bash execution call stack. The final parameter of the last subroutine call is at the top of the stack; the first parameter of the initial call is at the bottom. When a subroutine is executed, the parameters supplied are pushed onto BASH_ARGV. The shell sets BASH_ARGV only when in extended debugging mode….
Here's a function I use to print all arguments in order on a single line:
# Print the arguments of the calling script, in order.
function get_script_args
{
# Get the number of arguments passed to this script.
# (The BASH_ARGV array does not include $0.)
local n=${#BASH_ARGV[@]}
if (( $n > 0 ))
then
# Get the last index of the args in BASH_ARGV.
local n_index=$(( $n - 1 ))
# Loop through the indexes from largest to smallest.
for i in $(seq ${n_index} -1 0)
do
# Print a space if necessary.
if (( $i < $n_index ))
then
echo -n ' '
fi
# Print the actual argument.
echo -n "${BASH_ARGV[$i]}"
done
# Print a newline.
echo
fi
}
As Benoit stated, the simplest solution is to pass the command-line arguments to the function as function arguments with $@; then you can reference them in exactly the same way as outside the function. You'll actually be referencing the values passed to the function that just happen to have the same values as the command-line arguments; keep that in mind.
Note that this pretty much precludes you from passing any other arguments to the function, unless you know exactly how many arguments will be passed at the command line (unlikely as that is up to the user and isn't bound by your constraints)
i.e.
function fname {
# do something with $1 $2 $3...$n #
}
# "$@" represents all the arguments passed at the command line #
fname "$@"
A better way is to pass only the arguments you know you will be using, that way you can use them in the function AND also pass other parameters from within your code if you wish
i.e.
function fname {
# do something with $1 $count $2 and $3 #
}
count=1
fname $1 $count $2 $3
You can store all of your script arguments in a global array:
args=("$@")
and then access them in a function:
f(){
echo "${args[0]}" "${args[1]}"
}
You should probably use "$@" and pass that at the end of your function's argument list. Inside the function, shift after parsing your own arguments and use $1 to $n as normal.
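A sketch of that pattern (names are illustrative): the function's own options come first, and the script's arguments are appended, becoming $1..$n again after the shift.

```shell
with_script_args() {
  local mode=$1   # function-specific option
  shift           # now $1..$n are the forwarded script arguments
  echo "mode=$mode first=$1 count=$#"
}
set -- alpha beta               # simulate script args
with_script_args verbose "$@"   # prints mode=verbose first=alpha count=2
```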
Thanks for the tips - they inspired me to write a callstack function. I used the 'column' command for esthetics.
callstack() {
local j=0 k prog=$(basename $0)
for ((i=1; ((i<${#BASH_ARGC[*]})); i++))
do
echo -n "${FUNCNAME[$i]/main/$prog} " # function name
args=""
for ((k=0; ((k<${BASH_ARGC[$i]})); k++))
do
args="${BASH_ARGV[$j]} $args" # arguments
let j++
done
echo -e "$args\t|${BASH_LINENO[$i]}" $(sed -n ${BASH_LINENO[$i]}p "$0" 2>/dev/null) # line calling the function
done | column -t -s $'\t' -o ' ' | sed 1d # delete callstack entry
}
compareTemplates brother_001270_1.jpg |163 compareTemplates "$f" # process the rest
processPdf brother_001270.pdf |233 filetype "${f%[*}" pdf && processPdf "$f"
process brother_001270.pdf |371 --process) shift; process "$@"; exit ;; # process jpg or pdf
sm --quiet --process brother_001270.pdf |0

Linux shell scripting: Is it possible to modify the variable sent as parameter to a function?

I am a bit new to linux shell scripting, so this could be a very silly question.
This is a simple example code (of course it does not work, but I am hoping to show what I wanna do):
#!/bin/bash
function AppendLetters() {
# Value sent as parameter: $1
$1= "$1"LLL
}
var="foo"
AppendLetters $var
echo "$var"
So, when calling the program from command line:
$ ./example.sh
I would like to modify the internal variable to obtain some sort of:
fooLLL
Reason for doing this: I have a script that loads multiple variables from a config file, and I would like to make the same modification to those variables in order to use them in my program.
Is it possible? Can a function modify variables sent as parameter?
Without needing bash 4.3, using variable indirection:
AppendLetters() {
declare -g "$1"="${!1}LLL"
}
var=f00
AppendLetters var
echo "$var"
f00LLL
The -g option for declare is necessary so that the assignment is not treated as local to the function.
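A quick sketch of the difference (function names are illustrative): without -g, the declare inside the function creates a local variable, so the caller never sees the assignment.

```shell
set_local()  { declare    x="inner"; }   # local to the function
set_global() { declare -g x="outer"; }   # assigns at global scope
x="start"
set_local
echo "$x"   # still start
set_global
echo "$x"   # now outer
```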
Given:
I have a script that loads multiple variables from a config file, and would like to make the same modification to that variables in order to use them in my program.
I would do this, not in a function, using bash's += assignment operator
varlist=( var1 var2 var3 )
for varname in "${varlist[@]}"; do
declare "$varname"+="LLL"
done
Starting from Bash 4.3.0 you can use declare -n to get reference of a variable in a function:
AppendLetters() {
declare -n ref="$1"
ref+="LLL"
}
var="foo"
AppendLetters "var"
echo "$var"
fooLLL
From help declare:
-n make NAME a reference to the variable named by its value
EDIT: Thanks to @gniourf_gniourf for suggesting printf; you can do this in older Bash:
AppendLetters() { printf -v "$1" '%sLLL' "${!1}"; }
var="foo"
AppendLetters "var"
echo "$var"
fooLLL


How to access command line arguments of the caller inside a function?

I'm attempting to write a function in bash that will access the script's command-line arguments, but they are replaced with the positional arguments of the function. Is there any way for the function to access the command-line arguments if they aren't passed in explicitly?
# Demo function
function stuff {
echo $0 $*
}
# Echoes the name of the script, but no command-line arguments
stuff
# Echoes everything I want, but I'm trying to avoid this
stuff $*
If you want to have your arguments C style (array of arguments + number of arguments) you can use $@ and $#.
$# gives you the number of arguments.
$@ gives you all arguments. You can turn this into an array by args=("$@").
So for example:
args=("$@")
echo $# arguments passed
echo ${args[0]} ${args[1]} ${args[2]}
Note that here ${args[0]} actually is the 1st argument and not the name of your script.
My reading of the Bash Reference Manual says this stuff is captured in BASH_ARGV,
although it talks about "the stack" a lot.
#!/bin/bash
shopt -s extdebug
function argv {
for a in ${BASH_ARGV[*]} ; do
echo -n "$a "
done
echo
}
function f {
echo f $1 $2 $3
echo -n f ; argv
}
function g {
echo g $1 $2 $3
echo -n g; argv
f
}
f boo bar baz
g goo gar gaz
Save in f.sh
$ ./f.sh arg0 arg1 arg2
f boo bar baz
fbaz bar boo arg2 arg1 arg0
g goo gar gaz
ggaz gar goo arg2 arg1 arg0
f
fgaz gar goo arg2 arg1 arg0
#!/usr/bin/env bash
echo name of script is $0
echo first argument is $1
echo second argument is $2
echo seventeenth argument is ${17}
echo number of arguments is $#
Edit: please see my comment on question
Ravi's comment is essentially the answer. Functions take their own arguments. If you want them to be the same as the command-line arguments, you must pass them in. Otherwise, you're clearly calling a function without arguments.
That said, you could if you like store the command-line arguments in a global array to use within other functions:
my_function() {
echo "stored arguments:"
for arg in "${commandline_args[@]}"; do
echo " $arg"
done
}
commandline_args=("$@")
my_function
You have to access the command-line arguments through the commandline_args variable, not $@, $1, $2, etc., but they're available. I'm unaware of any way to assign directly to the argument array, but if someone knows one, please enlighten me!
Also, note the way I've used and quoted "$@" - this is how you ensure special characters (whitespace) don't get mucked up.
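The difference is easy to demonstrate with an argument that contains a space (sketch):

```shell
# Count how many words actually arrive in the function.
count_args() { echo "$#"; }
set -- "one arg" two    # simulate: ./script.sh "one arg" two
count_args "$@"   # 2 : quoted "$@" keeps each original argument intact
count_args $*     # 3 : unquoted $* re-splits on whitespace
```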
# Save the script arguments
SCRIPT_NAME=$0
ARG_1=$1
ARGS_ALL=$*
function stuff {
# use script args via the variables you saved
# or the function args via $1, $2, etc.
echo $0 $*
}
# Call the function with arguments
stuff 1 2 3 4
One can do it like this as well
#!/bin/bash
# script_name function_test.sh
function argument(){
for i in "$@"; do
echo $i
done;
}
argument "$@"
Now call your script like
./function_test.sh argument1 argument2
This is @mcarifio's response with several comments incorporated:
#!/bin/bash
shopt -s extdebug
function stuff() {
local argIndex="${#BASH_ARGV[@]}"
while [[ argIndex -gt 0 ]] ; do
argIndex=$((argIndex - 1))
echo -n "${BASH_ARGV[$argIndex]} "
done
echo
}
stuff
I want to highlight:
The shopt -s extdebug is important. Without this the BASH_ARGV array will be empty unless you use it in top level part of the script (it means outside of the stuff function). Details here: Why does the variable BASH_ARGV have a different value in a function, depending on whether it is used before calling the function
BASH_ARGV is a stack so arguments are stored there in backward order. That's the reason why I decrement the index inside loop so we get arguments in the right order.
Double quotes around ${BASH_ARGV[@]} and the @ as an index instead of * are needed so arguments with spaces are handled properly. Details here: bash arrays - what is difference between ${#array_name[*]} and ${#array_name[@]}
You can use the shift keyword (operator?) to iterate through them.
Example:
#!/bin/bash
function print()
{
while [ $# -gt 0 ]
do
echo "$1"
shift 1
done
}
print "$@"
I do it like this:
#! /bin/bash
ORIGARGS="$@"
function init(){
ORIGOPT="- $ORIGARGS -" # dashes are for sed -E
echo "$ORIGOPT"
}
The simplest and likely the best way to get arguments passed from the command line to a particular function is to include the arguments directly in the function call.
# first you define your function
function func_ImportantPrints() {
printf '%s\n' "$1"
printf '%s\n' "$2"
printf '%s\n' "$3"
}
# then when you make your function call you do this:
func_ImportantPrints "$@"
This is useful no matter if you are sending the arguments to main or some function like func_parseArguments (a function containing a case statement as seen in previous examples) or any function in the script.

Passing a string with spaces as a function argument in Bash

I'm writing a Bash script where I need to pass a string containing spaces to a function in my Bash script.
For example:
#!/bin/bash
myFunction
{
echo $1
echo $2
echo $3
}
myFunction "firstString" "second string with spaces" "thirdString"
When run, the output I'd expect is:
firstString
second string with spaces
thirdString
However, what's actually output is:
firstString
second
string
Is there a way to pass a string with spaces as a single argument to a function in Bash?
You should add quotes and also, your function declaration is wrong.
myFunction()
{
echo "$1"
echo "$2"
echo "$3"
}
And like the others, it works for me as well.
Another solution to the issue above is to set each string to a variable, call the function with variables denoted by a literal dollar sign \$. Then in the function use eval to read the variable and output as expected.
#!/usr/bin/ksh
myFunction()
{
eval string1="$1"
eval string2="$2"
eval string3="$3"
echo "string1 = ${string1}"
echo "string2 = ${string2}"
echo "string3 = ${string3}"
}
var1="firstString"
var2="second string with spaces"
var3="thirdString"
myFunction "\${var1}" "\${var2}" "\${var3}"
exit 0
Output is then:
string1 = firstString
string2 = second string with spaces
string3 = thirdString
In trying to solve a similar problem to this, I was running into the issue of UNIX thinking my variables were space delimited. I was trying to pass a pipe-delimited string to a function using awk to set a series of variables later used to create a report. I initially tried the solution posted by ghostdog74 but could not get it to work, as not all of my parameters were being passed in quotes. After adding double quotes to each parameter it then began to function as expected.
Below is the before state of my code and fully functioning after state.
Before - Non Functioning Code
#!/usr/bin/ksh
#*******************************************************************************
# Setup Function To Extract Each Field For The Error Report
#*******************************************************************************
getField(){
detailedString="$1"
fieldNumber=$2
# Retrieves Column ${fieldNumber} From The Pipe Delimited ${detailedString}
# And Strips Leading And Trailing Spaces
echo ${detailedString} | awk -F '|' -v VAR=${fieldNumber} '{ print $VAR }' | sed 's/^[ \t]*//;s/[ \t]*$//'
}
while read LINE
do
var1="$LINE"
# Below Does Not Work Since There Are Not Quotes Around The 3
iputId=$(getField "${var1}" 3)
done<${someFile}
exit 0
After - Functioning Code
#!/usr/bin/ksh
#*******************************************************************************
# Setup Function To Extract Each Field For The Report
#*******************************************************************************
getField(){
detailedString="$1"
fieldNumber=$2
# Retrieves Column ${fieldNumber} From The Pipe Delimited ${detailedString}
# And Strips Leading And Trailing Spaces
echo ${detailedString} | awk -F '|' -v VAR=${fieldNumber} '{ print $VAR }' | sed 's/^[ \t]*//;s/[ \t]*$//'
}
while read LINE
do
var1="$LINE"
# Below Now Works As There Are Quotes Around The 3
iputId=$(getField "${var1}" "3")
done<${someFile}
exit 0
A more dynamic way would be:
function myFunction {
for i in "$@"; do echo "$i"; done;
}
The simplest solution to this problem is that you just need to use \" for space separated arguments when running a shell script:
#!/bin/bash
myFunction() {
echo $1
echo $2
echo $3
}
myFunction "firstString" "\"Hello World\"" "thirdString"
Your definition of myFunction is wrong. It should be:
myFunction()
{
# same as before
}
or:
function myFunction
{
# same as before
}
Anyway, it looks fine and works fine for me on Bash 3.2.48.
Simple solution that worked for me -- quoted "$@"
Test(){
set -x
grep "$@" /etc/hosts
set +x
}
Test -i "3 rb"
+ grep -i '3 rb' /etc/hosts
I could verify the actual grep command (thanks to set -x).
You can run into an extension of this problem when your initial text is stored in a string variable, for example:
function status(){
if [ $1 != "stopped" ]; then
artist="ABC";
track="CDE";
album="DEF";
status_message="The current track is $track at $album by $artist";
echo $status_message;
read_status $1 "$status_message";
fi
}
function read_status(){
if [ $1 != "playing" ]; then
echo $2
fi
}
In this case, if you don't pass the status_message variable forward as a string (surrounded by "") it will be split into a number of different arguments.
"$variable": The current track is CDE at DEF by ABC
$variable: The
I had the same kind of problem and in fact the problem was not the function nor the function call, but what I passed as arguments to the function.
The function was called from the body of the script - the 'main' - so I passed "st1 a b" "st2 c d" "st3 e f" from the command line and passed it over to the function using myFunction $*
The $* causes the problem as it expands into a set of characters which will be interpreted in the call to the function using whitespace as a delimiter.
The solution was to change the call to the function to explicit argument handling from the 'main' towards the function: the call would then be myFunction "$1" "$2" "$3", which preserves the whitespace inside the strings as the quotes delimit the arguments...
So if a parameter can contain spaces, it should be handled explicitly throughout all calls of functions.
As this may be the reason for long searches to problems, it may be wise never to use $* to pass arguments...
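A sketch of the failure mode described above, with a wrapper that forwards its arguments both ways (function names are illustrative):

```shell
inner() { echo "$#"; }
forward_unquoted() { inner $*; }     # loses the grouping
forward_quoted()   { inner "$@"; }   # preserves each argument
forward_unquoted "st1 a b" "st2 c d"   # prints 6
forward_quoted   "st1 a b" "st2 c d"   # prints 2
```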
