I'm using nodedock.
It has a start.sh script to start your Docker containers:
#!/usr/bin/env bash
set -e
cd "$( dirname "${BASH_SOURCE[0]}" )"
if [ ! -f .env ]; then
    echo "Having .env is required. Maybe you forgot to copy env-example?"
    exit 1
fi

while read -r line; do
    VARNAME=$(echo ${line} | awk '{sub(/\=.*/,x)}1')
    if [[ -z ${!VARNAME} ]]; then
        declare -x ${line}
    fi
done < <(egrep -v "(^#|^\s|^$)" .env)
docker-compose up -d ${NODEDOCK_SERVICES}
docker-compose logs -t -f ${NODEDOCK_LOG_AFTER_START}
NODEDOCK_SERVICES = nginx node workspace mongo
I found that if you need a variable whose value contains spaces, you have to write your env variable with double quotes: "nginx node workspace mongo"
The problem is that this expression, VARNAME=$(echo ${line} | awk '{sub(/\=.*/,x)}1'), doesn't work with double quotes.
Any solution?
The problem is not with your awk expression but with the call to the declare built-in. Use proper quotes when declaring:
declare -x "${line}"
because without the quotes, your assignment would look like
declare -x NODEDOCK_SERVICES=nginx node workspace mongo
which word-splits: NODEDOCK_SERVICES is assigned only the first word, nginx, and node, workspace and mongo are declared as separate exported variables. With proper quotes, the assignment remains intact, preserving the spaces in the value.
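You can see the difference directly (a minimal sketch to run in a bash shell):

line='NODEDOCK_SERVICES=nginx node workspace mongo'
declare -x ${line}     # NODEDOCK_SERVICES gets "nginx"; node, workspace, mongo are declared as empty exported variables
declare -x "${line}"   # NODEDOCK_SERVICES gets the full string "nginx node workspace mongo"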
That said, your whole loop can be simplified by making read split each line on = as the delimiter, so you can easily parse the key/value pairs. At this point it is not clear whether the assignments in your file are of form 1 or 2 below:
NODEDOCK_SERVICES = nginx node workspace mongo
NODEDOCK_SERVICES=nginx node workspace mongo
The logic below works for both cases:
shopt -s extglob
while IFS== read -r key value; do
    key=${key%%+([[:space:]])}        # trim trailing whitespace from the key
    value=${value##+([[:space:]])}    # trim leading whitespace from the value
    if [[ -z ${!key} ]]; then         # only set variables that aren't already set
        declare -x "$key=$value"
    fi
done < <(grep -Ev "(^#|^\s|^$)" .env)
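For instance, with the spaced form above in your .env, the value survives intact (a quick check after the loop has run):

# .env contains: NODEDOCK_SERVICES = nginx node workspace mongo
echo "$NODEDOCK_SERVICES"    # nginx node workspace mongo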
As a good practice, always quote your variables in bash unless you have a good reason not to. Lower-casing user-defined variables also helps you distinguish them from the environment variables maintained by the shell itself.
If you want to read specific variables from a file in .env format (maybe not exactly your question, but it might help others, since the title might be misleading):
read_var() {
    # print the value of variable $1 from env file $2
    local VAR
    VAR=$(grep "^$1=" "$2" | xargs)   # xargs trims whitespace and strips surrounding quotes
    IFS="=" read -ra VAR <<< "$VAR"   # the prefix assignment scopes IFS to this read only
    echo "${VAR[1]}"
}
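Usage would look like this (DB_HOST and the file name are illustrative):

# .env contains the line: DB_HOST=localhost
host=$(read_var DB_HOST .env)
echo "$host"    # localhost

Note that a value containing a second = would be cut at it, since read splits on every =.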
Related
I'm currently loading multiple variables into my shell (from a .env file) like so:
eval $(grep '^VAR_1' .env) && eval $(grep '^VAR_2' .env) && ...
I then use them in a script like so: echo $VAR_1.
Is there any way to condense this script into something like eval $(grep -E '^(VAR_1|VAR_2)' .env)? Maybe it needs something other than grep.
I would use source (.) command with process substitution:
. <(grep -e '^VAR_1=' -e '^VAR_2=' .env)
or
. <(grep '^VAR_[12]=' .env) # for these particular variables
The file .env must be from a trusted source, of course.
Note that the grep method won't work if a variable is assigned a string containing an embedded newline character.
You may use grep with the ERE option to filter all the variables you want from the .env file:
grep -E '^(VAR_1|VAR_2)=' .env
This pattern matches VAR_1= or VAR_2= at the start of a line.
To set the variables, use:
declare $(grep -E '^(VAR_1|VAR_2)=' .env)
# or
eval "$(grep -E '^(VAR_1|VAR_2)=' .env)"
This part of my script is comparing each line of a file to find a preset string. If the string does NOT exist as a line in the file, it should append it to the end of the file.
STRING=foobar
cat "$FILE" | while read LINE
do
    if [ "$STRING" == "$LINE" ]; then
        export ISLINEINFILE="yes"
    fi
done
    if [ ! "$ISLINEINFILE" == yes ]; then
        echo "$LINE" >> "$FILE"
    fi
However, it appears as if both $LINE and $ISLINEINFILE are both cleared upon finishing the do loop. How can I avoid this?
Using shell
If we want to make just the minimal change to your code to get it working, all we need to do is switch the input redirection:
string=foobar
while read line
do
    if [ "$string" == "$line" ]; then
        islineinfile="yes"
    fi
done <"$file"
if [ ! "$islineinfile" == yes ]; then
    echo "$string" >> "$file"
fi
In the above, we changed cat "$file" | while do ...done to while do...done<"$file". With this one change, the while loop is no longer in a subshell and, consequently, shell variables created in the loop live on after the loop completes.
Using sed
I believe that the whole of your script can be replaced with:
sed -i.bak '/^foobar$/H; ${x;s/././;x;t; s/$/\nfoobar/}' file*
The above adds line foobar to the end of each file that doesn't already have a line that matches ^foobar$.
The above shows file* as the final argument to sed. This will apply the change to all files matching the glob. You could list specific files individually if you prefer.
The above was tested on GNU sed (Linux). Minor modifications may be needed for BSD/OSX sed.
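For readability, here is the same program written out with comments (same commands as the one-liner above; GNU sed syntax):

sed -i.bak '
    # append every line matching ^foobar$ to the hold space
    /^foobar$/H
    # on the last line:
    ${
        # swap the hold space into the pattern space
        x
        # this substitution succeeds only if the hold space is non-empty
        s/././
        # swap back, restoring the last line
        x
        # if the substitution succeeded, the line already exists: stop
        t
        # otherwise append a foobar line at the end
        s/$/\nfoobar/
    }' file*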
Using GNU awk (gawk)
awk -i inplace -v s="foobar" '$0==s{f=1} {print} ENDFILE{if (f==0) print s; f=0}' file*
Like the sed command, this can tackle multiple files all in one command.
Why does my variable set in a do loop disappear?
It disappears because it is set in a component of a shell pipeline. Most shells run each part of a pipeline in a subshell, and by Unix design, variables set in a subshell cannot affect their parent or any other already-running shell.
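A two-line illustration of the effect (run in bash with default settings):

unset x
echo hello | while read -r line; do x=$line; done
echo "x=$x"    # prints "x=": the assignment happened in a subshell and is gone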
How can I avoid this?
There are several ways:
The simplest is to use a shell that doesn't run the last component of a pipeline in a subshell. This is ksh's default behavior, so you can use this shebang:
#!/bin/ksh
This can also be bash's behavior when the lastpipe option is set:
shopt -s lastpipe
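For example (a minimal sketch; in an interactive shell, lastpipe additionally requires job control to be off, e.g. set +m):

#!/bin/bash
shopt -s lastpipe
echo foobar | while read -r line; do found=$line; done
echo "$found"    # prints foobar: the last pipeline component ran in the current shell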
You might use the variable in the same subshell that sets it. Note that your original script's indentation is wrong and might lead to the incorrect assumption that the trailing if block is inside the pipeline, which isn't the case. Enclosing the whole block in parentheses rectifies that and is the minimal change (two extra characters) to make it work:
STRING=foobar
cat "$FILE" | ( while read LINE
do
    if [ "$STRING" == "$LINE" ]; then
        export ISLINEINFILE="yes"
    fi
done
if [ ! "$ISLINEINFILE" == yes ]; then
    echo "$LINE" >> "$FILE"
fi
)
The variable would still be lost after that block though.
You might simply avoid the pipeline, which is straightforward in your case, as the cat is unnecessary:
STRING=foobar
while read LINE
do
    if [ "$STRING" == "$LINE" ]; then
        export ISLINEINFILE="yes"
    fi
done < "$FILE"
if [ ! "$ISLINEINFILE" == yes ]; then
    echo "$LINE" >> "$FILE"
fi
You might use another algorithmic approach, like sed or gawk, as suggested by John1024.
See also https://unix.stackexchange.com/a/144137/2594 for standard compliance details.
To get started, here's the script I'm running to get the offending string:
# sed finds all sourced file paths from inputted file.
#
# while reads each match output from sed to $SOURCEFILE variable.
# Each should be a file path, or a variable that represents a file path.
# Any variables found should be expanded to the full path.
#
# echo and the calls are used for demonstrative purposes only;
# I intend to do something else with the path once it's expanded.
PATH_SOME_SCRIPT="/path/to/bash/script"
while read -r SOURCEFILE; do
    echo "$SOURCEFILE"
    "$SOURCEFILE"
    $SOURCEFILE
done < <(cat $PATH_SOME_SCRIPT | sed -n -e "s/^\(source\|\.\|\$include\) //p")
You may also wish to use the following to test this out as mock data:
[ /path/to/bash/script ]
#!/bin/bash
source "$HOME/bash_file"
source "$GLOBAL_VAR_SCRIPT_PATH"
echo "No cow powers here"
For the tl;dr crew, basically the while loop spits out the following on the mock data:
"$HOME/bash_file"
bash: "$HOME/bash_file": no such file or directory
bash: "$HOME/bash_file": no such file or directory
"$GLOBAL_VAR_SCRIPT_PATH"
"$GLOBAL_VAR_SCRIPT_PATH": command not found
"$GLOBAL_VAR_SCRIPT_PATH": command not found
My question is, can you get the variable to expand correctly, e.g., print "/home//bash_file" and "/expanded/variable/path"? I should also state that although eval works I do not intend to use it because of its potential insecurities.
Protip: any variable used in the cat | sed pipeline is available globally, including to the calling script, so the failure is not because the script cannot see the variable's value.
FIRST SOLUTION ATTEMPT
Using anubhava's envsubst solution:
SOMEVARIABLE="/home/nick/.some_path"
while read -r SOURCEFILE; do
    echo "$SOURCEFILE"
    envsubst <<< "$SOURCEFILE";
done < <(echo -e "\"\$SOMEVARIABLE\"\n\"$HOME/.another_file\"")
This outputs the following:
"$SOMEVARIABLE"
""
"/home/nick/.another_file"
"/home/nick/.another_file"
Unfortunately, it does not expand the variable (envsubst only sees exported environment variables)! Oh dear :(
SECOND SOLUTION ATTEMPT
Based upon the first attempt:
export SOMEVARIABLE="/home/nick/.some_path"
while read -r SOURCEFILE; do
    echo "$SOURCEFILE"
    envsubst <<< "$SOURCEFILE";
done < <(echo -e "\"\$SOMEVARIABLE\"\n\"$HOME/.another_file\"")
unset SOMEVARIABLE
which produces the results we wanted without eval and without messing with global variables (for too long anyway), hoorah!
Good runner-up suggestions used eval (though it is potentially unsafe); they can be found in this answer and here (links courtesy of anubhava's extended comments).
My question is, can you get the variable to expand correctly, e.g., print "/home//bash_file" and "/expanded/variable/path"?
Yes you can use envsubst program, that substitutes the values of environment variables:
while read -r sourceFile; do
    envsubst <<< "$sourceFile"
done < <(sed -n "s/^\(source\|\.\|\$include\) //p" "$PATH_SOME_SCRIPT")
I think you are asking how to recursively expand variables in bash. Try
expanded=$(eval echo $SOURCEFILE)
inside your loop. eval runs the expanded command you give it. Since $SOURCEFILE isn't in quotes, it will be expanded to, e.g., $HOME/whatever. Then the eval will expand the $HOME before passing it to echo. echo will print the result, and expanded=$(...) will put the printed result in $expanded.
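A concrete run (the value is hypothetical; remember the security caveat about eval above):

SOURCEFILE='"$HOME/bash_file"'
expanded=$(eval echo $SOURCEFILE)
echo "$expanded"    # /home/nick/bash_file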
I have a number of bash scripts, each doing its own thing merrily. Do note that while I program in other languages, I only use Bash to automate things, and am not very good at it.
I'm now trying to combine a number of them to create "meta" scripts, if you will, which use other scripts as steps. The problem is that I need to parse the output of each step to be able to pass a part of it as params to the next step.
An example:
stepA.sh
[...does stuff here...]
echo "Task complete successfuly"
echo "Files available at: $d1/$1"
echo "Logs available at: $d2/$1"
Both of the above are paths, such as /var/www/thisisatest and /var/log/thisisatest (note that files always start with /var/www and logs always start with /var/log). I'm only interested in the files path.
stepB.sh
[...does stuff here...]
echo "Creation of $d1 complete."
echo "Access with username $usr and password $pass"
all variables here are simple strings that may contain special characters (but no spaces)
What I'm trying to build is a script that runs stepA.sh, then stepB.sh and uses the output of each to do its own stuff. What I'm currently doing (both above scripts are symlinked to /usr/local/bin without the .sh part and made executable):
#!/bin/bash
stepA $1 | while read -r line; do
    # Create the container, and grab the file location
    # then pass it to the next pipe
    if [[ "$line" == *:* ]]
    then
        POS=`expr index "$line" "/"`
        PTH="/${line:$POS}"
        if [[ "$PTH" == *www* ]]
        then
            # OK, have what I need here, now what?
            echo $PTH;
        fi
    fi
done
# Somehow get $PTH here
stepB $1 | while read -r line; do
    ...
done
#somehow have the required strings here
I'm stuck on passing PTH to the next step. I understand this is because piping runs it in a subshell; however, all the examples I've seen refer to files and not commands, and I could not make this work. I tried piping the echo to a "next step" such as:
stepA | while ...
    echo $PTH
done | while ...
    # Got my var here, but cannot run stuff
done
How can I run stepA and have the PTH variable available for later?
Is there a "better way" to extract the path I need from the output than nested ifs ?
Thanks in advance!
Since you're using bash explicitly (in the shebang line), you can use its process substitution feature instead of a pipe:
while read -r line; do
    if [[ "$line" == *:* ]]
    .....
    fi
done < <(stepA $1)
Alternately, you could capture the command's output to a string variable, and then parse that:
output="$(stepA $1)"
tmp="${output#*$'\nFiles available at: '}" # output with everything before the filepath trimmed
filepath="${tmp%%$'\n'*}" # trim the first newline and everything after it from $tmp
tmp="${output#*$'\nLogs available at: '}"
logpath="${tmp%%$'\n'*}"
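With the sample output from stepA in the question, the expansions resolve like this (illustrative values):

output=$'Task complete successfully\nFiles available at: /var/www/thisisatest\nLogs available at: /var/log/thisisatest'
tmp="${output#*$'\nFiles available at: '}"   # strip everything up to and including the label
filepath="${tmp%%$'\n'*}"                    # keep only up to the next newline
echo "$filepath"    # /var/www/thisisatest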
I want to call a settings file for a variable. How can I do this in Bash?
The settings file will define the variables (for example, CONFIG.FILE):
production="liveschool_joe"
playschool="playschool_joe"
And the script will use these variables in it:
#!/bin/bash
production="/REFERENCE/TO/CONFIG.FILE"
playschool="/REFERENCE/TO/CONFIG.FILE"
sudo -u wwwrun svn up /srv/www/htdocs/$production
sudo -u wwwrun svn up /srv/www/htdocs/$playschool
How can I get Bash to do something like that? Will I have to use AWK, sed, etc.?
The short answer
Use the source command.
An example using source
For example:
config.sh
#!/usr/bin/env bash
production="liveschool_joe"
playschool="playschool_joe"
echo $playschool
script.sh
#!/usr/bin/env bash
source config.sh
echo $production
Note that the output from sh ./script.sh in this example is:
~$ sh ./script.sh
playschool_joe
liveschool_joe
This is because the source command actually runs the script; everything in config.sh is executed.
Another way
You could use the built-in export command instead; getting and setting environment variables can also accomplish this.
Running export and echo $ENV should be all you need to know about accessing variables. Accessing an environment variable is done the same way as accessing a local variable.
To set them, say:
export variable=value
at the command line. All scripts will be able to access this value.
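For example (illustrative names):

export production="liveschool_joe"    # set in the parent shell
./script.sh                           # every script started from this shell now sees $production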
Even shorter using the dot (sourcing):
#!/bin/bash
. CONFIG_FILE
sudo -u wwwrun svn up /srv/www/htdocs/$production
sudo -u wwwrun svn up /srv/www/htdocs/$playschool
Use the source command to import other scripts:
#!/bin/bash
source /REFERENCE/TO/CONFIG.FILE
sudo -u wwwrun svn up /srv/www/htdocs/$production
sudo -u wwwrun svn up /srv/www/htdocs/$playschool
In Bash, to source some command's output instead of a file:
source <(echo vara=3) # variable vara, which is 3
source <(grep yourfilter /path/to/yourfile) # source specific variables
I had the same problem, especially regarding security, and I found the solution here.
My problem was that I wanted to write a deployment script in Bash with a configuration file that contains paths like this:
################### Configuration File Variable for deployment script ##############################
VAR_GLASSFISH_DIR="/home/erman/glassfish-4.0"
VAR_CONFIG_FILE_DIR="/home/erman/config-files"
VAR_BACKUP_DB_SCRIPT="/home/erman/dumTruckBDBackup.sh"
An existing solution consists of using the source command to import the configuration file with these variables: source path/to/file
But this solution has a security problem, because the sourced file can contain anything a Bash script can.
That creates security issues. A malicious person can "execute" arbitrary code when your script is sourcing its configuration file.
Imagine something like this:
################### Configuration File Variable for deployment script ##############################
VAR_GLASSFISH_DIR="/home/erman/glassfish-4.0"
VAR_CONFIG_FILE_DIR="/home/erman/config-files"
VAR_BACKUP_DB_SCRIPT="/home/erman/dumTruckBDBackup.sh"; rm -fr ~/*
# hey look, weird code follows...
echo "I am the skull virus..."
echo rm -fr ~/*
To solve this, we might want to allow only constructs of the form NAME=VALUE in that file (variable-assignment syntax), and maybe comments (though technically, comments are unimportant). So we can check the configuration file using grep -E (the equivalent of egrep).
This is how I solved the issue:
configfile='deployment.cfg'
if [ -f "${configfile}" ]; then
    echo "Reading user configuration...." >&2
    # check if the file contains something we don't want
    CONFIG_SYNTAX="(^\s*#|^\s*$|^\s*[a-z_][^[:space:]]*=[^;&\(\`]*$)"
    if grep -E -q -iv "$CONFIG_SYNTAX" "$configfile"; then
        echo "The configuration file is unclean. Please clean it..." >&2
        exit 1
    fi
    # now source it, either the original or the filtered variant
    source "$configfile"
else
    echo "There is no configuration file called ${configfile}"
fi
Converting a parameter file to environment variables
Usually I parse instead of sourcing, to avoid the complexities of certain artifacts in my file. It also offers me ways to specially handle quotes and other things. My main aim is to keep whatever comes after the = as a literal, even the double quotes and spaces.
#!/bin/bash

function cntpars() {
    echo " > Count: $#"
    echo " > Pars : $*"
    echo " > par1 : $1"
    echo " > par2 : $2"
    if [[ $# = 1 && $1 = "value content" ]]; then
        echo " > PASS"
    else
        echo " > FAIL"
        return 1
    fi
}

function readpars() {
    while read -r line ; do
        key=$(echo "${line}" | sed -e 's/^\([^=]*\)=\(.*\)$/\1/')
        val=$(echo "${line}" | sed -e 's/^\([^=]*\)=\(.*\)$/\2/' -e 's/"/\\"/g')
        eval "${key}=\"${val}\""
    done << EOF
var1="value content"
var2=value content
EOF
}

# load the parameters before testing; without this call var1/var2 stay unset
readpars
# Option 1: Will Pass
echo "eval \"cntpars \$var1\""
eval "cntpars $var1"
# Option 2: Will Fail
echo "cntpars \$var1"
cntpars $var1
# Option 3: Will Fail
echo "cntpars \"\$var1\""
cntpars "$var1"
# Option 4: Will Pass
echo "cntpars \"\$var2\""
cntpars "$var2"
Note the little trick I had to use so that my quoted text is treated as a single parameter with spaces by my cntpars function. One extra level of evaluation was required. Without it, as in option 2, I would have passed two parameters, as follows:
"value
content"
Double quoting during command execution causes the double quotes from the parameter file to be kept. Hence the third option also fails.
The other option would, of course, be to simply not provide variables in double quotes, as in option 4, and then just make sure that you quote them when needed.
Just something to keep in mind.
Real-time lookup
Another thing I like to do is to do a real-time lookup, avoiding the use of environment variables:
lookup() {
    if [[ -z "$1" ]] ; then
        echo ""
    else
        # print the value for key $1 from file $2 (stops at the first match)
        awk -v "id=$1" 'BEGIN { FS = "=" } $1 == id { print $2 ; exit }' "$2"
    fi
}
MY_LOCAL_VAR=$(lookup CONFIG_VAR filename.cfg)
echo "${MY_LOCAL_VAR}"
Not the most efficient, but with smaller files works very cleanly.
If the variables are being generated and not saved to a file, you cannot pipe them into source. The deceptively simple way to do it is this:
export $(some command | xargs)
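As a sketch of the same idea applied to a file (the .env name is illustrative; this works only for simple values without spaces, quotes, or embedded newlines):

# flatten all non-comment lines of .env onto one line and export them
export $(grep -v '^#' .env | xargs)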
For preventing naming conflicts, only import the variables that you need:
variableInFile () {
    variable="${1}"
    file="${2}"
    # source the file in a subshell so nothing leaks into the caller,
    # then echo only the one variable we asked for
    echo $(
        source "${file}";
        eval echo \$\{${variable}\}
    )
}
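Usage (file and variable names from the question's example):

production=$(variableInFile production CONFIG.FILE)
echo "$production"    # liveschool_joe, with nothing else from CONFIG.FILE leaking into this shell

Note that the file's code still runs in the subshell, so this limits namespace pollution, not execution of untrusted code.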
The script containing the variables can be imported by sourcing it from another Bash script.
Consider the script-variable.sh file (note that hyphens are not valid in variable names, so scr-var must be written scr_var):
#!/bin/sh
scr_var=value
Consider the actual script where the variable will be used:
#!/bin/sh
. path/to/script-variable.sh
echo "$scr_var"
Running the first script with bash path/to/script-variable.sh instead would execute it in a child process, whose variables are lost when it exits; sourcing with . runs it in the current shell.