Shell script: Pass associative array to another shell script - bash

I want to pass an associative array as a single argument to another shell script.
mainshellscript.sh :
#!/bin/bash
declare -A cmdOptions=( [branch]=testing [directory]=someDir [function1key]=function1value [function2key]=function2value )
./anothershellscript.sh cmdOptions
anothershellscript.sh :
#!/bin/bash
#--- first approach, didn't work, shows error "declare: : not found"
#opts=$(declare -p "$1")
#declare -A optsArr=${opts#*=}
#---
# second approach, didn't work, output - declare -A optsArr=([0]="" )
#declare -A newArr="${1#*=}"
#declare -p newArr
#---
I suspect that the way I am collecting the array in anothershellscript.sh is wrong. I'm looking for a way to access the map values simply, so that echo "${optsArr[branch]}" gives testing.
I am using bash version 4.4.23(1)-release (x86_64-pc-msys).

Processes are separate: they do not share variables, so a variable set in one process is not visible in another process. The exception is that child processes inherit exported variables, but cmdOptions is not exported, and you cannot export associative arrays anyway.
So pass the string representation of the array to the child, either as an argument, in a file, or in an exported variable:
#!/bin/bash
declare -A cmdOptions=( [branch]=testing [directory]=someDir [function1key]=function1value [function2key]=function2value )
export SOME_VAR="$(declare -p cmdOptions)" # env or
declare -p cmdOptions > some_file.txt # file or
./anothershellscript.sh "$(declare -p cmdOptions)" # arg
# Note - `declare -p` is called _on parent side_!
Then load in the child:
#!/bin/bash
declare -A newArr="${SOME_VAR#*=}" # env or
. some_file.txt # file or
declare -A newArr="${1#*=}" # arg
# How to load from file so that current environment is not affected:
tmp=$(
. some_file.txt
declare -p cmdOptions
)
declare -A newArr="${tmp#*=}"
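For completeness, the argument variant tied back to the question might look like this (a minimal sketch of both files):
# mainshellscript.sh
#!/bin/bash
declare -A cmdOptions=( [branch]=testing [directory]=someDir )
./anothershellscript.sh "$(declare -p cmdOptions)"
# anothershellscript.sh
#!/bin/bash
declare -A optsArr="${1#*=}"
echo "${optsArr[branch]}" # prints: testing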

Passing Associative array to sub script
Associative arrays and regular arrays cannot be exported, but you can pass variables along by using declare -p in a wrapper call.
First script:
#!/bin/bash
declare -A cmdOptions=( [branch]=testing [function1key]=function1value
[directory]=someDir [function2key]=function2value )
declare -a someArray=( foo "bar baz" )
declare -x someVariable="Foo bar baz" # Using *export* for simple variables
bash -c "$(declare -p cmdOptions someArray someVariable);. anothershellscript.sh"
Note: the syntax . anothershellscript.sh could be replaced by source anothershellscript.sh.
Doing it this way avoids the need for temporary files or temporary variables and keeps STDIN/STDOUT free.
Then your anothershellscript.sh could use your variables as is.
#!/bin/bash
declare -p cmdOptions someArray someVariable
could work.
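For example, anothershellscript.sh could read the values directly (a small sketch using the names declared above):
#!/bin/bash
echo "branch: ${cmdOptions[branch]}"
echo "first array item: ${someArray[0]}"
echo "exported variable: $someVariable"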

You'll have to rebuild that array in the second script; here is an example. script1:
#!/bin/bash
declare -A test=(
[v1]=1
[v2]=2
[v3]=3
)
for key in "${!test[@]}"; { data+=" $key=${test[$key]}"; }
echo "$data" | ./test2
script2:
#!/bin/bash
read -ra data
declare -A test2
for item in "${data[@]}"; {
key="${item%=*}"
val="${item#*=}"
test2[$key]="$val"
}
echo "${test2[@]}"
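Note that this key=value serialization relies on word splitting, so it breaks as soon as a value contains whitespace. A variant of the same two-script setup that pipes the declare -p dump instead (a sketch reusing the rebuild trick from the answers above):
script1:
#!/bin/bash
declare -A test=( [v1]=1 [v2]="two words" )
declare -p test | ./test2
script2:
#!/bin/bash
IFS= read -r line # the whole "declare -A test=( ... )" line
declare -A test2="${line#*=}" # strip everything up to the first '=' and rebuild
echo "${test2[v2]}" # prints: two words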

Related

Iterating array in declared function of bash shell script

I've been working through creating a script to move some files from a local machine to a remote server. As part of that process I have a function that can either be called directly or wrapped with 'declare -fp' and sent along to an ssh command. The code I have so far looks like this:
export REMOTE_HOST=myserver
export TMP=eyerep-files
doTest()
{
echo "Test moving files from $TMP with arg $1"
declare -A files=(["abc"]="123" ["xyz"]="789")
echo "Files: ${!files[@]}"
for key in "${!files[@]}"
do
echo "$key => ${files[$key]}"
done
}
moveTest()
{
echo "attempting move with wrapped function"
ssh -t "$REMOTE_HOST" "$(declare -fp doTest|envsubst); doTest ${1@Q}"
}
moveTest $2
If I run the script with something like
./myscript.sh test dev
I get the output
attempting move with wrapped function
Test moving files from eyerep-files with arg dev
Files: abc xyz
bash: line 7: => ${files[]}: bad substitution
It seems like the string expansion for the for loop is not working correctly. Is this expected behaviour? If so, is there an alternative way to loop through an array that would avoid this issue?
If you're confident that your remote account's default shell is bash, this might look like:
moveTest() {
ssh -t "$REMOTE_HOST" "$(declare -f doTest; declare -p $(compgen -e)); doTest ${1@Q}"
}
If you aren't, it might instead be:
moveTest() {
ssh -t "$REMOTE_HOST" 'exec bash -s' <<EOF
set -- ${@@Q}
$(declare -f doTest; declare -p $(compgen -e))
doTest \"\$@\"
EOF
}
I managed to find an answer here: https://unix.stackexchange.com/questions/294378/replacing-only-specific-variables-with-envsubst/294400
Since I'm exporting the global variables, I can get a list of them using compgen and use that list with envsubst to specify which variables I want to replace. My finished function ended up looking like:
moveTest()
{
echo "attempting move with wrapped function"
ssh -t "$REMOTE_HOST" "$(declare -fp doTest|envsubst "$(compgen -e | awk '$0="${"$0"}"') '${1}'"); doTest ${1@Q}"
}
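For illustration, here is roughly what the compgen/awk pipeline hands to envsubst (abridged; the exact list depends on what your environment exports):
$ export REMOTE_HOST=myserver TMP=eyerep-files
$ compgen -e | awk '$0="${"$0"}"'
${HOME}
${PATH}
${REMOTE_HOST}
${TMP}
With that list (plus '${1}') as its SHELL-FORMAT argument, envsubst substitutes only those variables and leaves other $-expressions, such as ${!files[@]} inside doTest, untouched for the remote shell.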

Exporting associative arrays from one script to another

I have a bash script which basically contains all my exported variables, and I am trying to add an associative array to that script.
This is my exports script:
#!/bin/bash
export declare -A oldLinks
oldLinks["A"]="linkA"
oldLinks["B"]="linkB"
oldLinks["C"]="linkC"
oldLinks["D"]="linkD"
export declare -A newLinks
newLinks["E"]="linkE"
newLinks["F"]="linkF"
newLinks["G"]="linkG"
newLinks["H"]="linkH"
This is the main script:
#!/bin/bash
source ArraysFile
for i in "${!oldLinks[@]}"
do
echo "${i} -> ${oldLinks[$i]}"
done
for i in "${!newLinks[@]}"
do
echo "${i} -> ${newLinks[$i]}"
done
This is the error which I'm getting:
export: `-A': not a valid identifier
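The export builtin only takes variable names, so in export declare -A oldLinks it treats declare and -A as names to export, and -A is not a valid identifier, which is exactly the error shown. Since the main script sources the file into the same shell, no export is needed at all; a minimal sketch of the exports file:
#!/bin/bash
declare -A oldLinks=( [A]="linkA" [B]="linkB" [C]="linkC" [D]="linkD" )
declare -A newLinks=( [E]="linkE" [F]="linkF" [G]="linkG" [H]="linkH" )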

Make a typeset function access local variable when executed remotely

I want to create a function locally, echo_a in the example, and pass it to a remote shell through ssh, here with typeset -f. The problem is that the function does not have access to the local variables.
export a=1
echo_a() {
echo a: $a
}
bash <<EOF
$(typeset -f echo_a)
echo local heredoc:
echo_a
echo
echo local raw heredoc:
echo a: $a
echo
EOF
ssh localhost bash <<EOF
$(typeset -f echo_a)
echo remote heredoc:
echo_a
echo
echo remote raw heredoc:
echo a: $a
echo
EOF
Assuming the ssh connection is automatic, running the above script gives me as output:
local heredoc:
a: 1
local raw heredoc:
a: 1
remote heredoc:
a:
remote raw heredoc:
a: 1
See how the "remote heredoc" a is empty? What can I do to get 1 there?
I tested adding quotes and backslashes everywhere without success.
What am I missing? Would something else than typeset make this work?
Thanks to @Guy for the hint: it is indeed because ssh by default does not send environment variables. In my case, changing the server's settings was not wanted.
Fortunately we can hack around it with compgen, eval and declare.
First we identify the added variables generically. This works even if variables are created inside a called function. Using compgen is neat because we don't need to export variables explicitly.
The array diff code comes from https://stackoverflow.com/a/2315459/1013628 and the compgen trick from https://stackoverflow.com/a/16337687/1013628.
# Store in env_before all variables created at this point
IFS=$'\n' read -rd '' -a env_before <<<"$(compgen -v)"
a=1
# Store in env_after all variables created at this point
IFS=$'\n' read -rd '' -a env_after <<<"$(compgen -v)"
# Store in env_added the diff between env_after and env_before
env_added=()
for i in "${env_after[@]}"; do
skip=
for j in "${env_before[@]}"; do
[[ $i == $j ]] && { skip=1; break; }
done
if [[ $i == "env_before" || $i == "PIPESTATUS" ]]; then
skip=1
fi
[[ -n $skip ]] || env_added+=("$i")
done
echo_a() {
echo a: $a
}
env_added now holds an array of the names of all variables added between the two calls to compgen.
$ echo "${env_added[@]}"
a
I also filter out the variables env_before and PIPESTATUS, as they are added automatically by bash.
Then, inside the heredocs, we add eval $(declare -p "${env_added[@]}").
declare -p VAR [VAR ...] prints, for each VAR, the variable name followed by = followed by its value:
$ a=1
$ b=2
$ declare -p a b
declare -- a="1"
declare -- b="2"
And the eval is to actually evaluate the declare lines. The rest of the code looks like:
bash <<EOF
# Eval the variables computed earlier
eval $(declare -p "${env_added[@]}")
$(typeset -f echo_a)
echo local heredoc:
echo_a
echo
echo local raw heredoc:
echo a: $a
echo
EOF
ssh rpi_301 bash <<EOF
# Eval the variables computed earlier
eval $(declare -p "${env_added[@]}")
$(typeset -f echo_a)
echo remote heredoc:
echo_a
echo
echo remote raw heredoc:
echo a: $a
echo
EOF
Finally, running the modified script gives me the wanted behavior:
local heredoc:
a: 1
local raw heredoc:
a: 1
remote heredoc:
a: 1
remote raw heredoc:
a: 1

Why can't I use `declare -r` inside a function to mark a variable readonly while `set -u` is in use?

With GNU bash, version 4.3.11(1)-release (x86_64-pc-linux-gnu),
#! /bin/bash
set -u
exec {FD1}>tmp1.txt
declare -r FD1
echo "fd1: $FD1" # why does this work,
function f1() {
exec {FD2}>tmp2.txt
readonly FD2
echo "fd2: $FD2" # this work,
}
f1
function f2() {
exec {FD3}>tmp3.txt
echo "fd3: $FD3" # and even this work,
declare -r FD3
echo "fd3: $FD3" # when this complains: "FD3: unbound variable"?
}
f2
The goal is to make my file descriptor readonly
I don't think it's a bug. The exec statement assigns a value to the parameter FD3 in the global scope, while the declare statement creates a local parameter that shadows the global:
When used in a function, `declare' makes NAMEs local, as with the `local'
command. The `-g' option suppresses this behavior.
It's the local parameter that is undefined. You can see this with a slightly different example:
$ FD3=foo
$ f () { FD3=bar; declare -r FD3=baz; echo $FD3; }
$ f
baz # The value of the function-local parameter
$ echo $FD3
bar # The value of the global parameter set in f
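Following the quoted help text, the -g option is one way to get the intended behavior inside f2 (a sketch; readonly FD3, as already used in f1, achieves the same thing):
function f2() {
exec {FD3}>tmp3.txt
echo "fd3: $FD3"
declare -gr FD3 # -g: mark the existing global FD3 readonly instead of creating a local
echo "fd3: $FD3" # now prints the descriptor, as in f1
}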

How to export an associative array (hash) in bash?

Related, but not a duplicate of: How to define hash tables in Bash?
I can define and use a bash hash, but I am unable to export it, even with the -x flag. For example, the following works to export (and test exportation of) a normal string variable:
aschirma@graphics9-lnx:/$ export animal_cow="moo"
aschirma@graphics9-lnx:/$ bash -c "echo \$animal_cow"
moo
aschirma@graphics9-lnx:/$
However, if I try to export a hash:
aschirma@graphics9-lnx:/$ declare -A -x animals
aschirma@graphics9-lnx:/$ animals[duck]="quack"
aschirma@graphics9-lnx:/$ echo ${animals[duck]}
quack
aschirma@graphics9-lnx:/$ bash -c "echo \${animals[duck]}"
aschirma@graphics9-lnx:/$
It seems the nested bash shell does not have the hash in its scope. I did verify this also by manually entering the nested bash shell and attempting to use the hash interactively.
There isn't really a good way to encode an array variable into the environment. See
http://www.mail-archive.com/bug-bash@gnu.org/msg01774.html (Chet Ramey is the maintainer of bash)
As a workaround for this harsh Bash limitation I'm using the "serialize to temporary file" method. You can export plain variables, so you can pass an (associative) array through a file path. Of course, this has limitations, but it sometimes works and is good enough.
declare -A MAP # export associative array
MAP[bar]="baz"
declare -x serialized_array=$(mktemp) # create a temporary file, export its path in a plain variable
# declare -p can be used to dump the definition
# of a variable as shell code ready to be interpreted
declare -p MAP > "${serialized_array}" # serialize an array in temporary file
# perform cleanup after finishing script
cleanup() {
rm "${serialized_array}"
}
trap cleanup EXIT
# ... in place where you need this variable ...
source "${serialized_array}" # deserialize an array
echo "map: ${MAP[@]}"
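Split across two scripts, the same idea might look like this (a sketch; child.sh is a made-up name):
# parent.sh
#!/bin/bash
declare -A MAP=( [bar]="baz" )
declare -x serialized_array=$(mktemp) # exported, so the child sees the path
trap 'rm -f "$serialized_array"' EXIT
declare -p MAP > "$serialized_array"
./child.sh
# child.sh
#!/bin/bash
source "$serialized_array" # re-creates MAP from the declare -p dump
echo "map: ${MAP[bar]}" # prints: baz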
This is a bit old but I'll answer anyway: you could use temp files. If you do it right you can wrap them so they behave like arrays. For example, with this function:
var() { # set var or add content
    case $1 in
        *=|*=*)
            local __var_part1=$( echo "$1" | sed -e 's/=.*//' -e 's/[+,-]//' ) # cut +=/=
            local __var_part2=$( echo "$1" | sed -e 's/.*.=//' )
            local __var12=$tmp_dir/$__var_part1
            mkdir -p ${__var12%/*} # create all subdirs if it's an array
            case $1 in
                *+=*)
                    # if it's an array, try to add a new item
                    if [ -d $tmp_dir/$__var_part1 ] ; then
                        printf -- $__var_part2 > $tmp_dir/$__var_part1/\ $((
                            $( echo $tmp_dir/$__var_part2/* \
                                | tail | basename )\ + 1 ))
                    else
                        printf -- "$__var_part2" >> $tmp_dir/$__var_part1
                    fi
                    ;;
                *-=*) false ;;
                # else just add content
                *) printf -- "$__var_part2" > $tmp_dir/$__var_part1 ;;
            esac
            ;;
        *) # just print var
            if [ -d $tmp_dir/$1 ] ; then
                ls $tmp_dir/$1
            elif [ -e $tmp_dir/$1 ] ; then
                cat $tmp_dir/$1
            else
                return 1
            fi
            ;;
    esac
}
# you can use it mostly like you set vars in bash/shell
var test='Hello Welt!'
# if you need arrays set it like this:
var fruits/0='Apple'
var fruits/1='Banana'
# or if you need a dict:
var contacts/1/name="Max"
var contacts/1/surname="Musterman"
This is not the fastest way, but it's very flexible and simple, and it works in nearly all shells.
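Reading values back goes through the same function; note that it assumes a tmp_dir variable pointing at a scratch directory (a short sketch):
tmp_dir=$(mktemp -d) # the function expects this directory to exist
var test='Hello Welt!'
var fruits/0='Apple'
var test # prints: Hello Welt!
var fruits # lists the stored indices: 0
var fruits/0 # prints: Apple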
short answer --> export animals after declaring it
full -->
Try this way as a script:
#!/usr/bin/env bash
declare -A -x animals
export animals
animals[duck]="quack"
echo ${animals[duck]}
bash -c "echo ${animals[duck]}"
Output on my side using Bash version: 5.1.16(1)
quack
quack
or in terminal:
$ declare -A -x animals
$ export animals
$ animals[duck]="quack"
$ echo ${animals[duck]}
quack
$ bash -c "echo ${animals[duck]}"
quack
$
