Access value of read input from within a function in Bash

I have a function wrapper for the read command. I'm trying to automate most of the input prompting in my script with a function. Below is my non-working code.
ask_input() {
_question="$1"
_thevar="$2"
_finalvar=$(eval $(echo $_thevar))
read -ep "${_question}: " "$_thevar"
printf "%s\n" "$_question" "$_thevar" "$_finalvar"
}
Basically, what I'm hoping for is that when I execute the following:
ask_input "Do you like apple (yes/no)" the_answer
If a user types yes, each variable should contain the following (without quotes, of course; they're just for readability).
$_question --> "Do you like apple (yes/no)"
$_thevar --> "the_answer"
$_finalvar --> "yes"
The eval command is my attempt to solve the problem, but I have not found the actual solution to this.

The two main things you need to change are: (1) use indirect expansion with ! (_finalvar="${!_thevar}") instead of messing with eval, and (2) do that after something has been read into the variable. I'd also recommend making all those variables local to the function. So something like this:
ask_input() {
local _question="$1"
local _thevar="$2"
read -ep "${_question}: " "$_thevar"
local _finalvar="${!_thevar}"
printf "%s\n" "$_question" "$_thevar" "$_finalvar"
}
Since those variables are local now, you could probably also remove the _ prefixes (unless you're worried about a conflict with the variable name supplied as $2).
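To make the risk of such a conflict concrete, here is a minimal sketch (the names question and finalvar are purely illustrative) of what goes wrong when the caller's variable name happens to match one of the function's own locals:
ask_input() {
local question="$1"
local thevar="$2"
read -ep "${question}: " "$thevar"    # if $2 is "question", this writes into the function's local, not the caller's variable
local finalvar="${!thevar}"
printf "%s\n" "$question" "$thevar" "$finalvar"
}
ask_input "Do you like apple (yes/no)" question    # the caller's $question never gets set
Keeping an unusual prefix like _ on the locals makes such a clash much less likely.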

Related

Delayed expansion of composite variable in Bash

I'm defining a variable as a composition of other variables and some text, and I'm trying to get this variable not to expand the variables it contains at assignment time, but to expand them when it is used later. That way I could reuse the same template to print different results as the inner variables keep changing. I'm trying to avoid eval as much as possible, as I will be receiving some of the inner variables from third parties and I do not know what to expect.
My use case, as below, is to have some "calling stack" so I can log all messages with the same format and keep a record of the script, function, and line of the logged message in some format like this: script.sh:this_function:42.
My attempted solution
called.sh:
#!/bin/bash
SCRIPT_NAME="`basename "${BASH_SOURCE[0]}"`"
CURR_STACK="${SCRIPT_NAME}:${FUNCNAME[0]}:${LINENO[0]}"
echo "${SCRIPT_NAME}:${FUNCNAME[0]}:${LINENO[0]}"
echo "${CURR_STACK}"
echo
function _func_1 {
echo "${SCRIPT_NAME}:${FUNCNAME[0]}:${LINENO[0]}"
echo "${CURR_STACK}"
}
_func_1
So, I intend to get the same results when printing "${CURR_STACK}" as when printing the previous line.
If there is some built-in or other clever way to log this 'call stack', by all means, let me know! I'll gladly wave my code good-bye, but I'd still like to know how to prevent the variables from expanding right away when CURR_STACK is assigned, while still letting them expand later when it is used.
Am I missing some shopt?
What I've tried:
Case 1 (expanding on line 4):
CURR_STACK="${SCRIPT_NAME}:${FUNNAME[0]}:${LINENO[0]}"
CURR_STACK="`echo "${SCRIPT_NAME}:${FUNCNAME[0]}:${LINENO[0]}"`"
CURR_STACK="`echo "\${SCRIPT_NAME}:\${FUNCNAME[0]}:\${LINENO[0]}"`"
called.sh::7            <-- control line: the result I expect
called.sh::4            <-- CURR_STACK: the values expanded on line 4, when it was set

called.sh:_func_1:12    <-- control line: the result I expect
called.sh::4            <-- CURR_STACK: again expanded on line 4
Case 2 (not expanding at all):
CURR_STACK="\${SCRIPT_NAME}:\${FUNNAME[0]}:\${LINENO[0]}"
CURR_STACK=\${SCRIPT_NAME}:\${FUNCNAME[0]}:\${LINENO[0]}
CURR_STACK="`echo '${SCRIPT_NAME}:${FUNCNAME[0]}:${LINENO[0]}'`"
called.sh::7
${SCRIPT_NAME}:${FUNCNAME[0]}:${LINENO[0]}    <-- no expansion at all!

called.sh:_func_1:12
${SCRIPT_NAME}:${FUNCNAME[0]}:${LINENO[0]}    <-- no expansion at all!
Shell variables store plain inert text(*), not executable code; there isn't really any concept of delayed evaluation here. To make something that does something when used, create a function instead of a variable:
print_curr_stack() {
echo "$(basename "${BASH_SOURCE[1]}"):${FUNCNAME[1]}:${BASH_LINENO[0]}"
}
# ...
echo "We are now at $(print_curr_stack)"
# Or just run it directly:
print_curr_stack
Note: using BASH_SOURCE[1] and FUNCNAME[1] gets info about the context the function was called from, rather than about the function itself. BASH_LINENO is offset differently: BASH_LINENO[i] is the line (in BASH_SOURCE[i+1]) from which FUNCNAME[i] was called, so BASH_LINENO[0] -- not [1] -- is the caller's line number, which is what you want here.
You could also write it to allow the caller to specify additional text to print:
print_curr_stack() {
echo "$#" "$(basename "${BASH_SOURCE[1]}"):${FUNCNAME[1]}:${BASH_LINENO[0]}"
}
# ...
print_curr_stack "We are now at"
(* There's an exception to what I said about variables just containing inert text: some variables -- like $LINENO, $RANDOM, etc. -- are handled specially by the shell itself. But you can't create new ones like this except by modifying the shell itself.)
Are you familiar with eval?
$ a=this; b=is; c=a; d=test;
$ e='echo "$a $b $c $d"';
$ eval $e;
this is a test
$ b='is NOT'; # modify one of the variables
$ eval $e;
this is NOT a test
$ f=$(eval $e); # capture the value of the "eval" statement
$ echo $f;
this is NOT a test

Write Bash function with named input parameters

I would like to write a Bash function that uses named input parameters instead of positional parameters (e.g. ${0} or ${1}). Is this possible, and if so, how do I achieve this?
Just reassign the parameters to something more intuitive:
function test {
local foo=$1
local bar=$2
local baz=$3
local msg='Function got called with parameters %s, %s, and %s\n'
printf "$msg" "$foo" "$bar" "$baz"
}
If you're looking for something to make calling the function more user-friendly, look into getopt.
Not out of the box, not the way you'd like. Every bash scripter runs into this pretty quickly.
If you need the script to work on more than one machine, you pretty much have to roll your own solution. If it only has to work in one environment, you can load it up with dependencies and make your life easier.
A quick google for 'bash named parameters' will turn up a myriad of resources whichever way you want to go.
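For example, one common roll-your-own pattern is to pass name=value pairs and split them inside the function. A minimal sketch, assuming you are happy to validate the names yourself (the function and parameter names below are purely illustrative):
greet() {
local name="World" greeting="Hello"    # defaults
local arg
for arg in "$@"; do
case "$arg" in
name=*) name="${arg#name=}" ;;
greeting=*) greeting="${arg#greeting=}" ;;
*) echo "unknown parameter: $arg" >&2; return 1 ;;
esac
done
printf '%s, %s!\n' "$greeting" "$name"
}
greet greeting=Hi name=Bash    # prints "Hi, Bash!"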
You can use getopts builtin to assign values to a one-letter-named parameters:
doSomething () {
local key
local -A param
while getopts ':a:b:c' key; do
param[$key]=${OPTARG:-1}
done
declare -p param
}
doSomething -a 'Be Patient' \
-b 'ThinkBeforeYouGo' \
-c
After "while" cycle in doSomething you will have:
param=( [a]="Be Patient" [b]="ThinkBeforeYouGo" [c]="1" )
You can use getopt instead of getopts to process "long" keys, but getopt is not a Bash builtin and, unfortunately (and unlike the very simple and intuitive getopts), it is too complex and heavy to use in most cases.
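For completeness, a rough sketch of what the external getopt (the enhanced util-linux version) approach looks like with long options; the option names here are just illustrative:
doSomething() {
local parsed a b c=0
parsed=$(getopt --options 'a:b:c' --longoptions 'alpha:,bravo:,charlie' --name "$0" -- "$@") || return 1
eval set -- "$parsed"    # reset the positional parameters to getopt's normalized form
while true; do
case "$1" in
-a|--alpha)   a="$2"; shift 2 ;;
-b|--bravo)   b="$2"; shift 2 ;;
-c|--charlie) c=1; shift ;;
--) shift; break ;;
esac
done
printf 'a=%s b=%s c=%s\n' "$a" "$b" "$c"
}
doSomething --alpha 'Be Patient' -b 'ThinkBeforeYouGo' --charlie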

Simple map for pipeline in shell script

I'm dealing with a pipeline of predominantly shell and Perl files, all of which pass parameters (paths) to the next. I decided it would be better to use a single file to store all the paths and just call that for every file. The issue is I am using awk to grab the files at the beginning of each file, and it's turning out to be a lot of repetition.
My question is: is there a way to store key-value pairs in a file so that the shell can natively look up a key and return its value? It needs to be an external file, because the pipeline uses many scripts, and keeping the map inside any one of them would mean passing parameters everywhere. Is there some little quirk I do not know of that performs a map lookup on an external file?
You can make a file of env var assignments and source that file as needed, i.e.
$ cat myEnvFile
path1=/x/y/z
path2=/w/xy
path3=/r/s/t
otherOpt1="-x"
Inside your script you can source it with either . myEnvFile or the more verbose version of the same feature, source myEnvFile (assuming the bash shell), i.e.
$ cat myScript
#!/bin/bash
. /path/to/myEnvFile
# main logic below
....
# references to defined var
if [[ -d "$path2" ]] ; then
cd "$path2"
else
echo "no path2=$path2 found, can't continue" 1>&2
exit 1
fi
Based on how you've described your problem, this should work well and provide a one-stop shop for all of your variable settings.
IHTH
In bash, there's mapfile, but that reads the lines of a file into a numerically-indexed array. To read a whitespace-separated file into an associative array, I would
declare -A map
while read -r key value; do
map[$key]=$value
done < filename
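Once the loop has run, lookups are plain associative-array accesses. For example, assuming a hypothetical paths.conf containing whitespace-separated pairs:
# paths.conf:
#   input /data/incoming
#   output /data/processed
declare -A map
while read -r key value; do
map[$key]=$value
done < paths.conf
echo "${map[output]}"    # prints /data/processed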
However, this sounds like an XY problem. Can you give us an example (in code) of what you're actually doing? When I see long pipelines of grep|awk|sed, there's usually a way to simplify. For example, is passing data by parameters better than passing via stdout|stdin?
In other words, I'm questioning your statement "I decided it would be better..."

how to access an automatically named variable in a Bash shell script

I have some code that creates a variable of some name automatically and assigns some value to it. The code is something like the following:
myVariableName="zappo"
eval "${myVariableName}=zappo_value"
How would I access the value of this variable using the automatically generated name of the variable? So, I'm looking for some code a bit like the following (but working):
eval "echo ${${myVariableName}}"
(... which may be used in something such as myVariableValue="$(eval "echo ${${myVariableName}}")"...).
Thanks muchly for any assistance
If you think this approach is madness and want to offer more general advice, the general idea I'm working on is having variables defined in functions in a library, with names such as ${usage} and ${prerequisiteFunctions}. These variables defined within functions would be accessed by an interrogation function that can, for instance, ensure that prerequisites etc. are installed. So a loop within this interrogation function is something like this:
for currentFunction in ${functionList}; do
echo "function: ${currentFunction}"
${currentFunction} -interrogate # (This puts the function variables into memory.)
currentInterrogationVariables="${interrogationVariables}" # The variable interrogationVariables contains a list of all function variables available for interrogation.
for currentInterrogationVariable in ${currentInterrogationVariables}; do
echo "content of ${currentInterrogationVariable}:"
eval "echo ${${currentInterrogationVariable}}"
done
done
Thanks again for any ideas!
IIRC, indirection in bash is by !, so try ${!myVariableName}
Try:
echo ${!myVariableName}
It will echo the variable whose name is contained in $myVariableName.
For example:
#!/bin/bash
VAR1="ONE"
VAR2="TWO"
VARx="VAR1"
echo ${VARx} # prints "VAR1"
echo ${!VARx} # prints "ONE"
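For the assignment side of the question, eval can usually be avoided as well. A small sketch using printf -v, plus a declare -n nameref (bash 4.3+); the value strings are just illustrative:
myVariableName="zappo"
printf -v "$myVariableName" '%s' "zappo_value"    # assigns zappo without eval
echo "${!myVariableName}"                         # prints "zappo_value"

# bash 4.3+: a nameref reads and writes through the stored name
declare -n ref="$myVariableName"
ref="another_value"
echo "$zappo"                                     # prints "another_value"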

Multi layer variable substitution in shell.. possible?

I have multiple variables in a shell script; I was trying to save some code duplication and wanted to do something like the following:
# variables
FLAG=SIM
SIM_ICR_KEY_VAL="http://www.example.com/simi/icr"
REAL_ICR_KEY_VAL="http://www.example.com/real"
Based on the FLAG value I want to access the correct variable (without using ifs).
When I try this, it echoes the variable name and not the value itself.
echo $(echo ${FLAG}_ICR_KEY_VAL)
On a further note, I need to use these substitutions inline in a sed statement:
sed "s!${ISTR_KEY}=.*!${ISTR_KEY}=${SIM_ICR_KEY_VAL}!" > tmp.file
... I am not sure whether it's possible or not; please suggest.
Reflection can be achieved with the infamous eval:
eval thisvar=\$${FLAG}_ICR_KEY_VAL;
echo "We are using $thisvar"
Whenever you find yourself dynamically synthesizing a variable name, though, you are probably Doing It Wrong. You should consider alternatives like arrays:
ICR_KEY_VAL[0]="http://www.example.com/simi/icr"
ICR_KEY_VAL[1]="http://www.example.com/real"
SIM=0
echo ${ICR_KEY_VAL[$SIM]}
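If your bash has associative arrays (4.0+), you can also key the table directly on the flag text instead of mapping it to a numeric index; a minimal sketch along the lines of the question's variables:
declare -A ICR_KEY_VAL=(
[SIM]="http://www.example.com/simi/icr"
[REAL]="http://www.example.com/real"
)
FLAG=SIM
echo "${ICR_KEY_VAL[$FLAG]}"    # http://www.example.com/simi/icr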
I don't know how to do it directly, but in bash you can do it indirectly:
FLAG=SIM
SIM_ICR_KEY_VAL="http://www.example.com/simi/icr"
REAL_ICR_KEY_VAL="http://www.example.com/real"
FLAG_ICR_KEY_VAL=${FLAG}_ICR_KEY_VAL
sed "s!${ISTR_KEY}=.*!${ISTR_KEY}=${!FLAG_ISTR_KEY_VAL}!" > tmp.file
