Can't export env vars inside env loop? - bash

I have a function where I tried to export some env vars:
env -0 | while IFS='=' read -r -d '' env_var_name env_var_value; do
    # Some logic here, then:
    export MY_ENV_VAR=hello
done
However, I just noticed that export and unset do not work inside this loop. What's the best way to perform these exports if I can't do them inside the loop? Store them somewhere and execute them outside the loop?

The loop isn't the issue; the problem is the pipe. When you pipe into a command, that command runs in a subshell, and any variables you set or export inside that subshell go away when it exits. You can work around this using process substitution:
while IFS='=' read -r -d '' env_var_name env_var_value; do
    # Some logic here, then:
    export MY_ENV_VAR=hello
done < <(env -0)
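Another option, if you want to keep the pipeline itself: bash 4.2+ has shopt -s lastpipe, which runs the last element of a pipeline in the current shell. A minimal sketch (it only takes effect when job control is off, which is the default in scripts):
shopt -s lastpipe
env -0 | while IFS='=' read -r -d '' env_var_name env_var_value; do
    # Some logic here, then:
    export MY_ENV_VAR=hello   # persists, because the loop now runs in the current shell
done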

Related

Forcing string replacement in declared function of shell script

I'm working on a script to move some files to a remote server (see: Function calls in Here Document for unix shell script for more details). In order to allow the script to work both on a local machine and on a remote server, I'm using declare -f to wrap an existing function so it can be executed remotely. So far I have come up with this:
myscript.sh
REMOTE_HOST=myhost
TMP=eyerep-files
getMoveCommand()
{
    echo Src Dir: $2
    sudo cp ~/$TMP/start.ini ~/$1/start_b.ini
    ls ~/$2
    echo Target Dir: $1
    ls ~/$1
}
moveRemote()
{
    echo "attempting move with here doc"
    echo $(declare -fp getMoveCommand)
    ssh -t "$REMOTE_HOST" "$(declare -fp getMoveCommand); getMoveCommand ${1@Q} ${TMP@Q}"
}
moveFiles()
{
    case "$1" in
        # remote deploy
        remote)
            moveRemote $2
            ;;
        # local deploy
        local)
            getMoveCommand $2
            ;;
        *)
            echo "Usage: myscript.sh {local|remote}"
            exit 1
            ;;
    esac
}
moveFiles $1 $2
exit 0
If called with './myscript.sh remote dev' the script should ssh into the remote server and move a file from one folder to another. The problem I'm running into is the string replacement. I have a bunch of global variables acting as constants that getMoveCommand needs access to. In the example here there is only one (TMP) so I can simply pass it as an argument. In the actual script however, the work being done is more complicated and the number of arguments that would need to be passed in would make this solution unwieldy. Since those variables are never expected to change, it seems like it should be possible to force the string replacement to occur before sending the wrapped function along to ssh.
Is what I want to do possible, and if so how? If not, is there another way to handle this that doesn't require passing a large number of arguments to the function?
It is possible to use envsubst if you export the variable:
export TMP=foo
getMoveCommand() {
    echo TMP is $TMP
}
declare -fp getMoveCommand | envsubst
The script above prints:
getMoveCommand ()
{
    echo TMP is foo
}
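Keep in mind that envsubst also replaces any other exported variable referenced in the function body, and unexported references become empty strings. If that matters, a hedged sketch: pass a SHELL-FORMAT argument so only the variables you name are substituted:
export TMP=foo
# Only $TMP is replaced; every other $reference in the function body is left untouched:
declare -fp getMoveCommand | envsubst '$TMP'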
You can also send global variables using declare -p:
ssh -t "$REMOTE_HOST" "$(declare -fp getMoveCommand; declare -p GLOBAL_VAR_1 GLOBAL_VAR_2)"$'\n'"getMoveCommand ${1#Q} ${TMP#Q}"
You can also keep another global variable that lists them, so you can expand them easily:
GLOBAL_VARS=(GLOBAL_VAR_1 GLOBAL_VAR_2)
...
ssh -t "$REMOTE_HOST" "$(declare -fp getMoveCommand; declare -p "${GLOBAL_VARS[#]}")"$'\n'"getMoveCommand ${1#Q} ${TMP#Q}"
If your variables have a common prefix, you can also expand their names through "${!PREFIX@}". No need to store them in a variable.
Or you might as well create a small "dump" function to keep things cleaner:
dump_env() {
    declare -fp getMoveCommand
    declare -p GLOBAL_VAR_1 GLOBAL_VAR_2
}
...
ssh -t "$REMOTE_HOST" "$(dump_env)"$'\n'"getMoveCommand ${1#Q} ${TMP#Q}"

Bash: export .env variables [duplicate]

Let's say I have .env file contains lines like below:
USERNAME=ABC
PASSWORD=PASS
Unlike a normal env script, the lines have no export prefix, so simply sourcing the file won't export the variables.
What's the easiest way to write a shell script that loads the contents of the .env file and sets them as environment variables?
If your lines are valid, trusted shell, just missing the export keyword
This requires appropriate shell quoting. It's thus appropriate if you have a line like foo='bar baz', but not if that same line is written foo=bar baz.
set -a # automatically export all variables
source .env
set +a
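For instance, a hypothetical .env that this approach handles, because each line is valid shell:
USERNAME=ABC
PASSWORD='PASS WITH SPACES'
After the set -a / source / set +a sequence, echo "$PASSWORD" prints PASS WITH SPACES; without the quotes, sourcing would try to run WITH as a command.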
If your lines are not valid shell
The below reads key/value pairs, and does not expect or honor shell quoting.
while IFS='=' read -r key value; do
    printf -v "$key" %s "$value" && export "$key"
done <.env
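A quick usage sketch showing what this buys you over sourcing: values with unquoted spaces come through intact:
while IFS='=' read -r key value; do
    printf -v "$key" %s "$value" && export "$key"
done <<'EOF'
GREETING=hello world
EOF
echo "$GREETING"   # -> hello world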
This will export everything in .env:
export $(xargs <.env)
Edit: this requires the environment values to not have whitespace. If this does not match your use case you can use the solution provided by Charles
Edit2: I recommend adding a function to your profile for this in any case so that you don't have to remember the details of set -a or how xargs works.
This is what I use:
function load_dotenv(){
    # https://stackoverflow.com/a/66118031/134904
    source <(sed -e '/^#/d;/^\s*$/d' -e "s/'/'\\\''/g" -e "s/=\(.*\)/='\1'/g" "$1")
}
set -a
[ -f "test.env" ] && load_dotenv "test.env"
set +a
If you're using direnv, know that it already supports .env files out of the box :)
Add this to your .envrc:
[ -f "test.env" ] && dotenv "test.env"
Docs for direnv's stdlib: https://direnv.net/man/direnv-stdlib.1.html
Found this:
http://www.commandlinefu.com/commands/view/12020/export-key-value-pairs-list-as-environment-variables
while read line; do export $line; done < <(cat input)
UPDATE So I've got it working as below:
#!/bin/sh
while read line; do export $line; done < .env
Use the command below on Ubuntu:
$ export $(cat .env)

env -0 dump environment. But how to load it?

The Linux command-line tool env can dump the current environment.
Since some values contain special characters, I want to use env -0 (end each output line with a 0 byte rather than a newline).
But how to load this dump again?
Bash Version: 4.2.53
Don't use env; use declare -px, which outputs the values of exported variables in a form that can be re-executed.
$ declare -px > env.sh
$ source env.sh
This also lets you save non-exported variables, which env has no access to: just use declare -p (dropping the -x option).
For example, if you wrote foo=$'hello\nworld', env produces the output
foo=hello
world
while declare -px produces the output
declare -x foo="hello
world"
If you want to load the output of env, you can use what is described in Set environment variables from file:
env > env_file
set -o allexport
source env_file
set +o allexport
But if you dumped the environment with -0, then (from man env):
-0, --null
end each output line with 0 byte rather than newline
So you can loop through the file using the NUL byte as the delimiter that marks the end of each entry (more on this in What does IFS= do in this bash loop: cat file | while IFS= read -r line; do … done):
env -0 > env_file
while IFS= read -r -d $'\0' var
do
    export "$var"
done < env_file

Bash: echo extract variables

Suppose there's a script called 'test.sh':
#!/bin/bash
while read line; do
    APP=/apps echo "$line"
done < ./lines
And the 'lines':
cd $APP && pwd
If I bash test.sh, it prints out 'cd $APP && pwd'.
But when I type APP=/apps echo "cd $APP && pwd" in the terminal, it prints out 'cd /apps && pwd'.
Is it possible, using echo, to expand variables in lines read from a regular file?
Depending on the contents of the file, you may want to use eval:
#!/bin/bash
APP=/apps
while read line; do
    eval "echo \"$line\"" # WARNING: dangerous
done < ./lines
However, eval is extremely dangerous. Although the quoting here will work for simple cases, it is quite easy to execute arbitrary commands by manipulating the input.
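If all you need is to expand one known variable, a hedged alternative sketch that avoids eval entirely, reusing envsubst from earlier on this page:
export APP=/apps
# Only $APP is substituted; nothing in the file is executed as code:
envsubst '$APP' < ./lines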
You should use eval to evaluate the string line read from the file.
If you know the variable(s) you want to substitute, just substitute them.
sed 's%\$APP\>%/apps%g' ./lines

use external file with variables

The following is an iptables save file, which I modified by inserting some variables, as you can see below.
-A OUTPUT -o $EXTIF -s $UNIVERSE -d $INTNET -j REJECT
I also have a bash script which defines these variables and should call iptables-restore with the save file above.
#!/bin/sh
EXTIF="eth0"
INTIF="eth1"
INTIP="192.168.0.1/32"
EXTIP=$(/sbin/ip addr show dev "$EXTIF" | perl -lne 'if(/inet (\S+)/){print$1;last}');
UNIVERSE="0.0.0.0/0"
INTNET="192.168.0.1/24"
Now I need to use
/sbin/iptables-restore <the content of iptables save file>
in the bash script and somehow pull the text file into the script so that the variables get expanded. Is there any way to do that?
UPDATE: I even tried this:
/sbin/iptables-restore -v <<-EOF;
$(</etc/test.txt)
EOF
Something like this:
while read line; do eval "echo ${line}"; done < iptables.save.file | /sbin/iptables-restore -v
or more nicely formatted:
while read line
do eval "echo ${line}"
done < iptables.save.file | /sbin/iptables-restore -v
The eval of the string forces the variable expansion.
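A related sketch that avoids eval, assuming the template only references variables you export from the script (the /etc/test.txt path is the one from the question's update):
export EXTIF INTIF INTIP EXTIP UNIVERSE INTNET
envsubst < /etc/test.txt | /sbin/iptables-restore -v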
Use the . (dot) command to include one shell script in another:
#!/bin/sh
. /path/to/another/script
In your shell script:
. /path/to/variable-definitions
/sbin/iptables-restore <<< "$(eval "echo \"$(</path/to/template-file)\"")"
or possibly
/sbin/iptables-restore < <(eval "echo \"$(</path/to/template-file)\"")
