How to access a variable in a shell script - bash

I'm currently stuck on how to do the following:
I have a settings file that looks like this:
USER_ID=12
ROLE=admin
STARTED=10/20/2010
...
I need to access the role and map it to one of the variables in the script below. Afterwards I will use that variable to open a doc with the correct name.
test.sh
#!/bin/sh
ADMIN=master_doc
STAFF=staff_doc
GUEST=welcome_doc
echo "accessing role type"
cd /setting
#open `settings` file to access role?
#call correct service
#chmod 555 master_doc.service
Is there a way to interpolate strings in bash like there is in JavaScript? Also, I'm guessing I would need to traverse the settings file to access the role?

With bash and grep and assuming that the settings file has exactly one line beginning with ROLE=:
#!/bin/bash
admin=master_doc
staff=staff_doc
guest=welcome_doc
cd /setting || exit
role=$(grep '^ROLE=' settings)  # e.g. role='ROLE=admin'
role=${role#*=}                 # strip everything up to the first '=': role='admin'
echo chmod 555 "${!role}.service"
Drop the echo after making sure it works as intended.
Look into Shell Parameter Expansion for indirect expansion.
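For illustration, here is a minimal standalone sketch of indirect expansion (the variable names are invented for the demo):
#!/bin/bash
admin=master_doc
role=admin        # imagine this value came from the settings file
echo "$role"      # ordinary expansion: prints "admin"
echo "${!role}"   # indirect expansion: prints "master_doc"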

From what I understand, you want to get the variables from settings, use $role as an indirect reference to $admin, i.e. master_doc, then turn that into a string, master_doc.service.
Firstly, instead of indirection, I recommend an associative array since it's cleaner.
You can use source to get variables from another file, as well as functions and other stuff.
Lastly, to dereference a variable, you need to use the dollar sign, like $role. Variable references are expanded inside double-quotes, so that's sort of the equivalent of string interpolation.
#!/bin/bash
# Associative array with doc names
declare -A docs=(
    [admin]=master_doc
    [staff]=staff_doc
    [guest]=welcome_doc
)
echo "accessing role type"
cd /setting || exit
source settings # Import variables
true "${ROLE?}" # Exit if unset
echo chmod 555 "${docs[$ROLE]}.service" # Select from associative array
# ^ Using "echo" to test. Remove if the script works properly.

You can source the settings file to load the variables:
source settings
And then you can use them in your script:
chmod 555 "${admin}.service" # will replace ${admin} with master_doc

I'd certainly use source (i.e. .).
#!/bin/sh
ADMIN=master_doc
STAFF=staff_doc
GUEST=welcome_doc
echo "accessing role type"
. /setting/settings 2> /dev/null || { echo 'Settings file not found!'; exit 1; }
role=${ROLE^^} # uppercase the role name
echo "${!role}.service" # test echo

Related

How to source a variable list pulled using sqlplus in bash without creating a file

I'm trying to source a variable list which is populated into one single variable in bash.
I then want to source this single variable so that its contents (which are other variables) are available to the script.
I want to achieve this without having to spool the sqlplus output to a file and then source that file (that approach already works, as I tried it).
Please find below what I'm trying:
#!/bin/bash
var_list=$(sqlplus -S /@mydatabase << EOF
set pagesize 0
set trimspool on
set headsep off
set echo off
set feedback off
set linesize 1000
set verify off
set termout off
select varlist from table;
EOF
)
#This already works when I echo any variable from the list
#echo "$var_list" > var_list.dat
#. var_list.dat
#echo "$var1, $var2, $var3"
#I'm trying to achieve the following
. $(echo "var_list")
echo "$any_variable_from_var_list"
The contents of var_list from the database are as follows:
var1="Test1"
var2="Test2"
var3="Test3"
I also tried sourcing it other ways such as:
. <<< $(echo "$var_list")
. $(cat "$var_list")
I'm not sure if I now need to read in each line using a while loop.
Any advice is appreciated.
You can:
. /dev/stdin <<<"$varlist"
<<< is a here string. It redirects the content of data behind <<< to standard input.
/dev/stdin represents standard input. So reading from file descriptor 0 is like opening /dev/stdin and calling read() on the resulting file descriptor.
Because the source command needs a filename, we pass it /dev/stdin and redirect the data to be read to standard input. That way source reads the commands from standard input while thinking it's reading from a file, and we get to pass it the data we want through that input.
Using /dev/stdin for tools that expect a file is quite common. I have no idea what references to give; I'll link: bash manual here strings, POSIX.1 base definitions 2.1.1p4 last bullet point, Linux kernel documentation on /dev/ directory entries, bash manual shell builtins, maybe C99 7.19.3p7.
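As a minimal sketch of the idea (the variable contents here are hypothetical):
vars='var1="Test1"
var2="Test2"'
. /dev/stdin <<< "$vars"   # source reads the here string via standard input
echo "$var1 $var2"         # prints: Test1 Test2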
I needed a way to store dotenv values in files locally and in vars for DevOps pipelines, so I could then source them into the runtime environment on demand (from a file when available and from vars when not). More than that, I needed to store different dotenv sets in different vars and use them based on the source branch (which I load into $ENV in .gitlab-ci.yml via export ENV=${CI_COMMIT_BRANCH:=develop}). With this I'll have developEnv, qaEnv, and productionEnv, each being a var containing its appropriate dotenv contents (being redundant to be clear).
unset FOO; # Clear so we can confirm loading after
ENV=develop; # normally set from CI_COMMIT_BRANCH, as described above
developEnv="VERSION=1.2.3
FOO=bar"; # Creating a simple dotenv in a var, with linebreak (as the files passed through will have)
envVarName=${ENV}Env # Our dynamic var-name
source <(cat <<< "${!envVarName}") # Source the contents of the dynamically named var
echo $FOO;
# bar

Exporting environment variables for local development and heroku deployment

I would like to setup some files for development, staging and production with environment variables, for example:
application_root/development.env
KEY1=value1
KEY2=value2
There would be similar files staging.env and production.env.
I am looking for a couple different bash scripts which would allow the loading of all these variables in either development or staging/production.
In local development I want to effectively run export KEY1=value1 for each line in the file.
For staging/production I will be deploying to Heroku and would like to effectively run heroku config:set -a herokuappname KEY1=value1 for each line in the staging or production.env files.
I know there are some gems designed for doing this but it seems like this might be pretty simple. I also like the flexibility of having the .env files as simple lists of keys and values and not specifically being tied to any language/framework. If I would have to change something about the way these variables need to be loaded it would be a matter of changing the script but not the .env files.
In the simplest form, you can load the key-value pairs into a bash array as follows:
IFS=$'\n' read -d '' -ra nameValuePairs < ./development.env
In Bash v4+, it's even simpler:
readarray -t nameValuePairs < ./development.env
You can then pass the resulting "${nameValuePairs[@]}" array to commands such as export or heroku config:set ...; e.g.:
export "${nameValuePairs[@]}"
Note, however, that the above only works as intended if the input *.env file meets all of the following criteria:
the keys are syntactically valid shell variable names and the lines have the form <key>=<value>, with no whitespace around =
the lines contain no quoting and no leading or trailing whitespace
there are no empty/blank lines or comment lines in the file.
the values are confined to a single line each.
A different approach is needed with files that do not adhere to this strict format; for instance, this related question deals with files that may contain quoted values.
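If the only deviations are blank lines and # comment lines, a small pre-filter is one possible sketch (still assuming no quoting, no surrounding whitespace, and single-line values):
readarray -t nameValuePairs < <(grep -Ev '^[[:space:]]*(#|$)' ./development.env)
export "${nameValuePairs[@]}"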
Below is the source code for a bash script named load_env (the .sh suffix is generally not necessary and ambiguous):
You'd invoke it with the *.env file of interest, and it would perform the appropriate action (running heroku config:set … or export) based on the filename.
However, as stated, you must source the script (using source or its effective bash alias, .) in order to create environment variables (export) visible to the current shell.
To prevent obscure failures, the script complains if you pass a development.env file and have invoked the script without sourcing.
Examples:
./load_env ./staging.env
. ./load_env ./development.env # !! Note the need to source
load_env source code
#!/usr/bin/env bash

# Helper function that keeps its aux. variables localized.
# Note that the function itself remains defined after sourced invocation, however.
configOrExport() {

    local envFile=$1 doConfig=0 doExport=0 appName

    case "$(basename "$envFile" '.env')" in
        staging)
            doConfig=1
            # Set the desired app name here.
            appName=stagingapp
            ;;
        production)
            doConfig=1
            # Set the desired app name here.
            appName=productionapp
            ;;
        development)
            doExport=1
            ;;
        *)
            echo "ERROR: Invalid or missing *.env file name: $(basename "$envFile" '.env')" >&2; exit 2
    esac

    # Make sure the file exists and is readable.
    [[ -r "$envFile" ]] || { echo "ERROR: *.env file not found or not readable: $envFile" >&2; exit 2; }

    # If variables must be exported, make sure the script is being sourced.
    [[ $doExport -eq 1 && $0 == "$BASH_SOURCE" ]] && { echo "ERROR: To define environment variables, you must *source* this script." >&2; exit 2; }

    # Read all key-value pairs from the *.env file into an array.
    # Note: This assumes that:
    #  - the keys are syntactically valid shell variable names
    #  - the lines contain no quoting and no leading or trailing whitespace
    #  - there are no empty/blank lines or comment lines in the file.
    IFS=$'\n' read -d '' -ra nameValuePairs < "$envFile"

    # Run configuration command.
    (( doConfig )) && { heroku config:set -a "$appName" "${nameValuePairs[@]}" || exit; }

    # Export variables (define as environment variables).
    (( doExport )) && { export "${nameValuePairs[@]}" || exit; }

}

# Invoke the helper function.
configOrExport "$@"

How can I access bash variables in a Tcl (expect) script

How can I access bash variables in a Tcl (expect) script?
I have a bash file, say f1.sh, which sets some variables like
export var1=a1
export var2=a2
I need to use these variables in my expect script.
I tried using this in my script, which does not work:
system "./f1.sh"
puts "var1 is $::env(var1)"
puts "var2 is $::env(var2)"
But this does not seem to work.
I see that none of the variables from f1.sh are getting set as environment variables.
system "./f1.sh" # Is this command in my script right?
How do I access these bash variables from the Tcl file?
I would say that this problem is rather general. I first met it when I wanted to initialize the Microsoft Visual Studio environment (which is done using a .cmd script) in PowerShell. Later I faced the same problem with other scripting languages in various combinations (Bash, Tcl, Python etc.).
The solution provided by Hai Vu is good. It works well if you know from the beginning which variables you need. However, if you are going to use a script to initialize some environment, it may contain dozens of variables (which you don't even need to know about, but which are needed for normal operation of the environment).
In general, the solution to the problem is the following:
1. Execute the script and at the end print ALL environment variables, capturing the output.
2. Match the lines of output against a pattern like "variable=value" to extract the variables and values you want.
3. Set the environment variables using the facilities of your language.
I do not have a ready-made solution, but I guess something similar to this should work (note that the snippets below were not tested - they are aimed only at giving an idea of the solution):
Execute the script; print the vars and capture the output (argument expansion - {*} - requires Tcl 8.5; we could do without it here, but I prefer to use it):
set bashCommand [list bash -c {myScriptName arg1 arg2 2>&1 >/dev/null && export -p}]
if {[catch {exec {*}$bashCommand} output]} {
    set errMsg "ERROR: Failed to run script."
    append errMsg "\n" $output
    error $errMsg
}
;# If we get here, output contains the output of "export -p" command
Parse the output of the command:
set vars [dict create]
foreach line [split $output "\n"] {
    regexp -- {^declare -x ([[:alpha:]_]*)=\"(.*)\"$} $line dummy var val
    ;# 3. Store the var-val pair or set the env var.
}
Store var-val pair or set env var. Here several approaches can be used:
3.1. Set Tcl variables and use them like this (depending on context):
set $var $val
or
variable $var $val
3.2. Set environment variable (actually, sub-case of 3.1):
global ::env
set ::env($var) $val
3.3 Set dict or array and use it within your application (or script) without modification of global environment:
set myEnv($var) $val ;# set array
dict set myEnvDict $var $val ;# set dict
I'd like to repeat that this is only the idea of the recipe. More importantly, since most modern scripting languages support regexes, this recipe can provide a bridge between almost any pair of languages, not only Bash<->Tcl.
You can use a here-document, like this:
#!/bin/bash
process=ssh
expect <<EOF
spawn $process
...
EOF
Exported variables are only passed from a parent process to its children, not the other way around. The script f1.sh (actually the bash instance that's running the script) gets its own copies of var1 and var2, and it doesn't matter if it changes them; the changes are lost when it exits. For variable exporting to work, you would need to start the expect script from the bash script, for example as shown below.
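A minimal sketch of that arrangement (f2.exp is a hypothetical expect script name):
#!/bin/bash
. ./f1.sh         # run f1.sh in the current shell, so var1/var2 are exported here
expect ./f2.exp   # the expect child process inherits var1 and var2
Inside f2.exp the values are then available as $::env(var1) and $::env(var2).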
In f1.sh, printf what you want to return...
printf '%s\n%s\n' "$var1" "$var2"
...and read it with exec in Tcl:
lassign [split [exec ./f1.sh] \n] var1 var2
Perhaps I did not look hard enough, but I don't see any way to do this. When you execute the bash script, you create a different process. What happens in that process does not propagate back to the current process.
We can work around this issue by doing the following (thanks to potrzebie for the idea):
Duplicate the bash script to a temp script
Append to the temp script some commands at the end to echo a marker, and a list of variables and their values
Execute the temp script and parse the output
The result is a list of alternating names and values. We use this list to set the environment variables for our process.
#!/usr/bin/env tclsh
package require fileutil
# Execute a bash script and extract some environment variables
proc getBashVar {bashScript varsList} {
    # Duplicate the bash script to a temp script
    set tempScriptName [fileutil::tempfile getBashVar]
    file copy -force $bashScript $tempScriptName

    # Append a marker to the end of the script. We need this marker to
    # identify where in the output to begin extracting the variables.
    # After that, append the list of specified variables and their values.
    set f [open $tempScriptName a]
    set marker "#XXX-MARKER"
    puts $f "\necho \\$marker"
    foreach var $varsList {
        puts $f "echo $var \\\"$$var\\\" "
    }
    close $f

    # Execute the temp script and parse the output
    set scriptOutput [exec bash $tempScriptName]
    append pattern $marker {\s*(.*)}
    regexp $pattern $scriptOutput all vars

    # Set the environment
    array set ::env $vars

    # Finally, delete the temp script to clean up
    file delete $tempScriptName
}
# Test
getBashVar f1.sh {var1 var2}
puts "var1 = $::env(var1)"
puts "var2 = $::env(var2)"

Why doesn't this bit of code work? Setting variables and config file

I have recently just made this script:
if test -s $HOME/koolaid.txt ; then
    Billz=$(grep / $HOME/koolaid.txt)
    echo $Billz
else
    Billz=$HOME/notkoolaid
    echo $Billz
fi

if test -d $Billz ; then
    echo "Ok"
else
    touch $Billz
fi
So basically, if the file $HOME/koolaid.txt does NOT exist, then Billz will be set to $HOME/notkoolaid. It then successfully creates the file.
However, if I do create koolaid.txt, then I get this:
mkdir: cannot create directory : No such file or directory
Any help would be appreciated
There is a difference between the content of a variable and its evaluated content...
If your variable contains the string $HOME/some, you need to expand it to get /home/login/some.
One dangerous method is eval.
bin=$(grep / ~/.rm.cfg)
eval rbin=${bin:-$HOME/deleted}
echo "==$rbin=="
Don't eval unless you're absolutely sure what you're evaling...
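If all that needs expanding is a literal leading $HOME or ~, a less dangerous sketch (same .rm.cfg setup as above) uses plain parameter substitution instead of eval:
bin=$(grep / ~/.rm.cfg)
rbin=${bin:-$HOME/deleted}
rbin=${rbin/#'$HOME'/$HOME}   # replace a literal leading "$HOME" with its value
rbin=${rbin/#'~'/$HOME}       # likewise for a leading tilde
echo "==$rbin=="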
Here are a couple things to fix:
Start your script with a "shebang," such as:
#!/bin/sh
This way the shell will know that you want to run this as a Bourne shell script.
Also, your conditional at the top of the script doesn't handle the case well in which .rm.cfg exists but doesn't contain a slash character anywhere in it. In that case the rbin variable never gets set.
Finally, try adding the line
ls ~
at the top so you can see how the shell is interpreting the tilde character; that might be the problem.

How to access an automatically named variable in a Bash shell script

I have some code that creates a variable of some name automatically and assigns some value to it. The code is something like the following:
myVariableName="zappo"
eval "${myVariableName}=zappo_value"
How would I access the value of this variable using the automatically generated name of the variable? So, I'm looking for some code a bit like the following (but working):
eval "echo ${${myVariableName}}"
(... which may be used in something such as myVariableValue="$(eval "echo ${${myVariableName}}")"...).
Thanks muchly for any assistance
If you think this approach is madness and want to offer more general advice: the general idea I'm working on is having variables defined in functions in a library, with such names as ${usage} and ${prerequisiteFunctions}. These variables defined within functions would be accessed by an interrogation function that can, for instance, ensure that prerequisites are installed. So a loop within this interrogation function is something like this:
for currentFunction in ${functionList}; do
    echo "function: ${currentFunction}"
    ${currentFunction} -interrogate # (This puts the function variables into memory.)
    currentInterrogationVariables="${interrogationVariables}" # The variable interrogationVariables contains a list of all function variables available for interrogation.
    for currentInterrogationVariable in ${currentInterrogationVariables}; do
        echo "content of ${currentInterrogationVariable}:"
        eval "echo ${${currentInterrogationVariable}}"
    done
done
Thanks again for any ideas!
IIRC, indirection in bash is by !, so try ${!myVariableName}
Try:
echo ${!myVariableName}
It will echo the variable whose name is contained in $myVariableName.
For example:
#!/bin/bash
VAR1="ONE"
VAR2="TWO"
VARx="VAR1"
echo ${VARx} # prints "VAR1"
echo ${!VARx} # prints "ONE"
