env -0 dumps the environment, but how to load it? - bash

The Linux command-line tool env can dump the current environment.
Since some values contain special characters, I want to use env -0 ("end each output line with 0 byte rather than newline").
But how do I load this dump again?
Bash Version: 4.2.53

Don't use env; use declare -px, which outputs the values of exported variables in a form that can be re-executed.
$ declare -px > env.sh
$ source env.sh
This also gives you the possibility of saving non-exported variables, which env does not have access to: just use declare -p (dropping the -x option).
For example, if you wrote foo=$'hello\nworld', env produces the output
foo=hello
world
while declare -px produces the output
declare -x foo="hello
world"

If you want to load the output of env, you can use what is described in Set environment variables from file:
env > env_file
set -o allexport
source env_file
set +o allexport
But if you happen to dump with -0, env uses (from man env):
-0, --null
end each output line with 0 byte rather than newline
So you can loop through the file using the NUL byte as the delimiter that marks the end of each entry (more detail in What does IFS= do in this bash loop: cat file | while IFS= read -r line; do … done):
env -0 > env_file
while IFS= read -r -d $'\0' var
do
export "$var"
done < env_file
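If you would rather skip the intermediate file, the same loop can read env -0 directly through process substitution (a minimal sketch; -d '' is equivalent to -d $'\0', since bash strings cannot contain a NUL byte; see also the subshell discussion further down):
while IFS= read -r -d '' var
do
export "$var"
done < <(env -0)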

Related

Bash: export .env variables [duplicate]

Let's say I have a .env file containing lines like the ones below:
USERNAME=ABC
PASSWORD=PASS
Unlike normal ones, these lines have no export prefix, so I cannot source the file directly.
What's the easiest way to create a shell script that loads content from .env file and set them as environment variables?
If your lines are valid, trusted shell but for the export command
This requires the file to use proper shell quoting. It is thus appropriate if a value with spaces is written foo='bar baz', but not if that same line is written foo=bar baz.
set -a # automatically export all variables
source .env
set +a
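For instance, a hypothetical .env written with shell quoting works as-is with this approach:
cat > .env <<'EOF'
USERNAME=ABC
GREETING='hello world'
EOF
set -a
source .env
set +a
echo "$GREETING"   # hello world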
If your lines are not valid shell
The below reads key/value pairs, and does not expect or honor shell quoting.
while IFS='=' read -r key value; do
printf -v "$key" %s "$value" && export "$key"
done <.env
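For example, with a hypothetical .env whose value contains unquoted whitespace, this reader handles it without any shell quoting:
cat > .env <<'EOF'
USERNAME=ABC
GREETING=hello world
EOF
while IFS='=' read -r key value; do
printf -v "$key" %s "$value" && export "$key"
done <.env
echo "$GREETING"   # hello world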
This will export everything in .env:
export $(xargs <.env)
Edit: this requires the values to not contain whitespace. If that does not match your use case, you can use the solution provided by Charles.
Edit2: I recommend adding a function to your profile for this in any case so that you don't have to remember the details of set -a or how xargs works.
This is what I use:
function load_dotenv(){
# https://stackoverflow.com/a/66118031/134904
source <(sed -e '/^#/d;/^\s*$/d' -e "s/'/'\\\''/g" -e "s/=\(.*\)/='\1'/g" "$1")
}
set -a
[ -f "test.env" ] && load_dotenv "test.env"
set +a
If you're using direnv, know that it already supports .env files out of the box :)
Add this to your .envrc:
[ -f "test.env" ] && dotenv "test.env"
Docs for direnv's stdlib: https://direnv.net/man/direnv-stdlib.1.html
Found this:
http://www.commandlinefu.com/commands/view/12020/export-key-value-pairs-list-as-environment-variables
while read line; do export $line; done < <(cat input)
UPDATE: So I've got it working as below:
#!/bin/sh
while read line; do export $line; done < .env
Use the command below on Ubuntu:
$ export $(cat .env)

Can't export env vars inside env loop?

I have a function where I tried to export some env vars:
env -0 | while IFS='=' read -r -d '' env_var_name env_var_value; do
# Some logic here, then:
export MY_ENV_VAR=hello
done
However, I just noticed that export and unset do not work inside this loop. What's the best way to perform these exports if I can't do them inside the loop? Store them somewhere and execute them outside the loop?
The loop isn't the issue. The problem is actually the pipe. When you pipe to a command, a subshell is created and any variables set inside of that subshell will go away when it exits. You can work around this using process substitution:
while IFS='=' read -r -d '' env_var_name env_var_value; do
# Some logic here, then:
export MY_ENV_VAR=hello
done < <(env -0)
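An alternative on bash 4.2 or newer is shopt -s lastpipe, which runs the last command of a pipeline in the current shell instead of a subshell (a sketch; it only takes effect when job control is off, which is the default in non-interactive scripts):
#!/bin/bash
shopt -s lastpipe
env -0 | while IFS='=' read -r -d '' env_var_name env_var_value; do
export MY_ENV_VAR=hello
done
echo "$MY_ENV_VAR"   # prints hello: the loop ran in the current shell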

Adding extra argument to xargs

I'm trying to kick off multiple processes to work through some test suites. In my bash script I have the following
printf "%s\0" "${SUITE_ARRAY[#]}" | xargs -P 2 -0 bash -c 'run_test_suite "$#" ${EXTRA_ARG}'
Below is the script, cut down to its basics.
SUITE_ARRAY will be a list of one or more suites: {Suite 1, Suite 2, ..., Suite n}
EXTRA_ARG will be something like a name under which another script stores values
#!/bin/bash
run_test_suite(){
suite=$1
someArg=$2
someSaveDir=someArg"/"suite
# some preprocess work happens here, but isn't relevant to running
runSomeScript.sh suite someSaveDir
}
export -f run_test_suite
SUITES=$1
EXTRA_ARG=$2
IFS=','
SUITECOUNT=0
for csuite in ${SUITES}; do
SUITE_ARRAY[$SUITECOUNT]=$csuite
SUITECOUNT=$(($SUITECOUNT+1))
done
unset IFS
printf "%s\0" "${SUITE_ARRAY[#]}" | xargs -P 2 -0 bash -c 'run_test_suite "$#" ${EXTRA_ARG}'
The issue I'm having is how to get the ${EXTRA_ARG} passed into xargs. From how I've come to understand it, xargs will take whatever is piped into it, so the way I have it doesn't seem correct.
Any suggestions on how to correctly pass the values? Thanks in advance.
If you want EXTRA_ARG to be available to the subshell, you need to export it. You can do that either explicitly, with the export keyword, or by putting the var=value assignment in the same simple command as xargs itself:
#!/bin/bash
run_test_suite(){
suite=$1
someArg=$2
someSaveDir=someArg"/"suite
# some preprocess work happens here, but isn't relevant to running
runSomeScript.sh suite someSaveDir
}
export -f run_test_suite
# assuming that the "array" in $1 is comma-separated:
IFS=, read -r -a suite_array <<<"$1"
# see the EXTRA_ARG="$2" just before xargs on the same line; this exports the variable
printf "%s\0" "${suite_array[#]}" | \
EXTRA_ARG="$2" xargs -P 2 -0 bash -c 'run_test_suite "$#" "${EXTRA_ARG}"' _
The _ prevents the first argument passed from xargs to bash from becoming $0 (and thus being excluded from "$@").
Note also that I changed "${suite_array[#]}" to be assigned by splitting $1 on commas. This or something like it (you could use IFS=$'\n' to split on newlines instead, for example) is necessary, as $1 cannot contain a literal array; every shell command-line argument is only a single string.
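Assuming the rewritten script is saved as run_suites.sh (a hypothetical name), it would be invoked like:
./run_suites.sh 'suite one,suite two,suite three' /path/to/save/dir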
This is something of a guess:
#!/bin/bash
run_test_suite(){
suite="$1"
someArg="$2"
someSaveDir="${someArg}/${suite}"
# some preprocess work happens here, but isn't relevant to running
runSomeScript.sh "${suite}" "${someSaveDir}"
}
export -f run_test_suite
SUITE_ARRAY="$1"
EXTRA_ARG="$2"
printf "%s\0" "${SUITE_ARRAY[#]}" |
xargs -n 1 -I '{}' -P 2 -0 bash -c 'run_test_suite {} '"${EXTRA_ARG}"
Using GNU Parallel it looks like this:
#!/bin/bash
run_test_suite(){
suite="$1"
someArg="$2"
someSaveDir="$someArg"/"$suite"
# some preprocess work happens here, but isn't relevant to running
echo runSomeScript.sh "$suite" "$someSaveDir"
}
export -f run_test_suite
EXTRA_ARG="$2"
parallel -d, -q run_test_suite {} "$EXTRA_ARG" ::: "$1"
Called as:
mytester 'suite 1,suite 2,suite "three"' 'extra "quoted" args here'
If you have the suites in an array:
parallel -q run_test_suite {} "$EXTRA_ARG" ::: "${SUITE_ARRAY[@]}"
Added bonus: Any output from the jobs will not be mixed, so you will not have to deal with http://mywiki.wooledge.org/BashPitfalls#Using_output_from_xargs_-P

issue with creating file containing $ from bash script

I want to produce a udev rules file from a bash script. For this I'm using the cat command. Unfortunately, the produced file is missing the "$" characters. Here is an example test.sh script:
#!/bin/sh
rc=`cat <<stmt1 > ./test.txt
-p $tempnode
archive/$env{ID_FS_LABEL_ENC}
stmt1`
The result is the following:
cat test.txt
-p ''
archive/{ID_FS_LABEL_ENC}
Where is the issue?
If you don't want any variable interpolation, use:
#!/bin/sh
group="test_1"
cat <<'stmt1' > ./test.txt
-p $tempnode
archive/$env{ID_FS_LABEL_ENC}
stmt1
rc=$?
(Notice the single quotes around stmt1.)
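If you need some variables expanded while others stay literal, leave the delimiter unquoted and escape only the dollar signs that must survive (a sketch; here $group expands, while the udev placeholders are written out verbatim):
#!/bin/sh
group="test_1"
cat <<stmt1 > ./test.txt
# group: $group
-p \$tempnode
archive/\$env{ID_FS_LABEL_ENC}
stmt1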

use external file with variables

The following is an iptables save file, which I modified by inserting some variables, as you can see below.
-A OUTPUT -o $EXTIF -s $UNIVERSE -d $INTNET -j REJECT
I also have a bash script which defines these variables and should call iptables-restore with the save file above.
#!/bin/sh
EXTIF="eth0"
INTIF="eth1"
INTIP="192.168.0.1/32"
EXTIP=$(/sbin/ip addr show dev "$EXTIF" | perl -lne 'if(/inet (\S+)/){print$1;last}');
UNIVERSE="0.0.0.0/0"
INTNET="192.168.0.1/24"
Now I need to use
/sbin/iptables-restore <the content of iptables save file>
in the bash script and somehow pull the text file into the script so that the variables are expanded. Is there any way to do that?
UPDATE: I even tried this:
/sbin/iptables-restore -v <<-EOF;
$(</etc/test.txt)
EOF
Something like this:
while IFS= read -r line; do eval "echo ${line}"; done < iptables.save.file | /sbin/iptables-restore -v
or more nicely formatted:
while IFS= read -r line
do eval "echo ${line}"
done < iptables.save.file | /sbin/iptables-restore -v
The eval of a string forces the variable expansion stuff.
Use the . (dot) command to include one shell script in another:
#!/bin/sh
. /path/to/another/script
In your shell script:
. /path/to/variable-definitions
/sbin/iptables-restore < <(eval echo "$(</path/to/template-file)")
(Note the process substitution <( ... ); with a plain $( ... ) after <, the shell would treat the expanded text as a filename to redirect from.)
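Another option, if GNU gettext is installed, is envsubst, which substitutes $VAR and ${VAR} references from the environment without any eval (a sketch; the variables must be exported first):
#!/bin/sh
export EXTIF="eth0" UNIVERSE="0.0.0.0/0" INTNET="192.168.0.1/24"
envsubst < /path/to/template-file | /sbin/iptables-restore -v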
