Bash Save & Load Arrays from file

I am saving my global variable arrays to a file with this:
declare -p hashTable > $File
declare -p testArray >> $File
I would like to load them back to global variables. I was using this:
source $File
That is fine when called from the global scope, but when it is within a function, it loads the variables back as local.
Is there a way to load them to globals?
Is there a way to save with the -g option so it loads globally?

On Bash 4.2+, you can source the script like this inside your function:
fn() {
source <(sed 's/^declare -[aA]/&g/' "$File")
}
# access your array outside the function
declare -p testArray
This sed finds lines starting with declare -a or declare -A and rewrites them as declare -ag or declare -Ag, thus making all the arrays global.
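Putting the pieces together, here is a minimal round-trip sketch (the sample array contents and the mktemp-generated file are assumptions for the demo; Bash 4.2+ is required for declare -g):

```shell
#!/usr/bin/env bash
declare -A hashTable=([k1]=v1 [k2]=v2)
declare -a testArray=(a b c)
File=$(mktemp)

# Save both arrays
declare -p hashTable testArray > "$File"

fn() {
    # Rewrite "declare -a"/"declare -A" to "declare -ag"/"declare -Ag"
    # so the sourced variables land in the global scope.
    source <(sed 's/^declare -[aA]/&g/' "$File")
}

unset hashTable testArray
fn
declare -p hashTable testArray   # both visible again at global scope
rm -f "$File"
```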

My two cents:
There are two ways of doing this:
Use the -g argument of the declare command:
declare -p hashTable testArray | sed 's/ -[aA]/&g/' >$File
Note: I prefer using sed when writing $File, instead of when reading.
fn() { source $File; }
Declare the global variables outside of the function's scope:
declare -p hashTable testArray | sed 's/^.* -[aA] //' >$File
then:
fn() { source $File; }
declare -A hashTable
declare -a testArray
fn
If the associative arrays are declared before the function and the declare command is not used inside the function's scope, this will do the job.


How to use two `IFS` in Bash

So I know I can use a single IFS in a read statement, but is it possible to use two? For instance, if I have the text
variable = 5 + 1;
print variable;
I have code to assign every split word to an array, but I also want to split at ; as well as at spaces, when it comes up.
Here is the code so far
INPUT="$1"
declare -a raw_parse
while IFS=' ' read -r -a raw_input; do
for raw in "${raw_input[@]}"; do
raw_parse+=("$raw")
done
done < "$INPUT"
What comes out:
declare -a raw_parse=([0]="variable" [1]="=" [2]="5" [3]="+" [4]="1;" [5]="print" [6]="variable;")
What I want:
declare -a raw_parse=([0]="variable" [1]="=" [2]="5" [3]="+" [4]="1" [5]=";" [6]="print" [7]="variable" [8]=";")
A workaround with GNU sed. This inserts a space before every ; and replaces every newline with a space.
read -r -a raw_input < <(sed -z 's/;/ ;/g; s/\n/ /g' "$INPUT")
declare -p raw_input
Output:
declare -a raw_input=([0]="variable" [1]="=" [2]="5" [3]="+" [4]="1" [5]=";" [6]="print" [7]="variable" [8]=";")
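If the input sits in a variable rather than a file, the same workaround applies with a here-string (GNU sed's -z is still assumed for the newline replacement; the sample text is the one from the question):

```shell
#!/usr/bin/env bash
input='variable = 5 + 1;
print variable;'

# Insert a space before each ';' and flatten newlines, then word-split.
read -r -a raw_input < <(sed -z 's/;/ ;/g; s/\n/ /g' <<<"$input")
declare -p raw_input
```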

Bash for loop with unique var names issue

I am using a bash for loop to cycle through a directory and print each file size. My issue is that I need to assign a unique variable to each of the values so that they can be used later, but the array stores all the data as a single element. I have tried both a double for loop and an if statement with a nested for loop, but did not get the right results.
Question: How can I fix the below code to match my needs or is there a better method?
declare -a byte
for b in /home/usr/frames/*
do
byte+=$(wc -c < $b)
done
declare -p byte
With an associative array (if available):
#!/usr/bin/env bash
for b in /home/usr/frames/*; do
declare -A byte["$b"]=$(wc -c < "$b")
done
Use Parameter Expansion to extract just the file name.
declare -A byte["${b##*/}"]=$(wc -c < "$b")
Now check the value of byte
declare -p byte
A variation on the OP's original code and jetchisel's answer:
unset byte
declare -A byte # -A == associative array
for b in /home/usr/frames/*
do
byte[$(basename "${b}")]=$(wc -c < "${b}")
done
declare -p byte
Using some *.txt files on my host:
$ pwd
/c/temp/cygwin
$ wc -c *.txt
22 file.txt
405 somefile.txt
214 test.txt
With the modified for clause:
unset byte
declare -A byte
for b in /c/temp/cygwin/*.txt
do
byte[$(basename "${b}")]=$(wc -c < "${b}")
done
declare -p byte
Generates:
declare -A byte=([somefile.txt]="405" [test.txt]="214" [file.txt]="22" )
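A self-contained variant of the same idea, using a temporary directory and made-up file contents so it can be run anywhere (note that wc -c may pad its output with spaces on some systems, which is harmless for numeric use):

```shell
#!/usr/bin/env bash
dir=$(mktemp -d)
printf 'hello'      > "$dir/a.txt"   # 5 bytes
printf 'hello word' > "$dir/b.txt"   # 10 bytes

declare -A byte
for b in "$dir"/*.txt; do
    # ${b##*/} strips the directory part, like basename without a subshell
    byte[${b##*/}]=$(wc -c < "$b")
done
declare -p byte
rm -rf "$dir"
```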

Improving functions to backup and restore a bash dictionary [duplicate]

This question already has an answer here:
How to store state between two consecutive runs of a bash script
(1 answer)
Closed 1 year ago.
I wrote these two simple functions to back up and restore the contents of a bash dictionary:
declare -A dikv
declare -A dict
backup_dikv()
{
FILE=$1
rm -f $FILE
for k in "${!dikv[@]}"
do
echo "$k,${dikv[$k]}" >> $FILE
done
}
restore_dict()
{
FILE=$1
for i in $(cat $FILE)
do
key=$(echo $i | cut -f 1 -d ",")
val=$(echo $i | cut -f 2 -d ",")
dict[$key]=$val
done
}
# Initial values
dikv=( ["k1"]="v1" ["k2"]="v2" ["k3"]="v3" ["k4"]="v4")
backup_dikv /tmp/backup
restore_dict /tmp/backup
echo "${!dict[@]}"
echo "${dict[@]}"
My questions:
As you can see, these two functions are very limited, as the names of the backed-up (dikv) and restored (dict) dictionaries are hardcoded. I would like to pass the dictionary as an input ($2) argument, but I don't know how to pass dictionaries as function arguments in bash.
Is this method of writing keys and values into a file using a string format ("key","value"), then parsing that format to restore the dictionary, the only / most efficient way to do this? Do you know of a better mechanism to back up and restore a dictionary?
Thanks!
Use declare -p to reliably serialize variables regardless of their type:
#!/usr/bin/env bash
if [ -f saved_vars.sh ]; then
# Restore saved variables
. saved_vars.sh
else
# No saved variables, so let's populate them
declare -A dikv=([foo]="foo bar from dikv" [bar]="bar baz from dikv")
declare -A dict=([baz]="baz qux from dict" [qux]="qux corge from dict")
fi
# Serialize dikv and dict into the saved_vars.sh file
declare -p dikv dict >'saved_vars.sh'
printf %s\\n "${!dict[@]}"
printf %s\\n "${dict[@]}"
printf %s\\n "${!dikv[@]}"
printf %s\\n "${dikv[@]}"
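A minimal round-trip sketch of this approach (sample contents and the mktemp file are my own); unlike the comma-separated format, declare -p preserves values containing spaces, commas, or newlines:

```shell
#!/usr/bin/env bash
declare -A dikv=([k1]=v1 [k2]="v2 with spaces")
state=$(mktemp)

declare -p dikv > "$state"    # serialize
unset dikv
source "$state"               # restore (at global scope here)
declare -p dikv
rm -f "$state"
```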
Found a way to pass arrays to functions, using local -n in this way:
declare -A dikv
declare -A dict
backup_dictionary()
{
local -n dict_ref=$1
FILE=/tmp/backup
for k in "${!dict_ref[@]}"
do
echo "$k,${dict_ref[$k]}" >> $FILE
done
}
restore_dictionary()
{
local -n dict_ref=$1
FILE=/tmp/backup
for i in $(cat $FILE)
do
key=$(echo $i | cut -f 1 -d ",")
val=$(echo $i | cut -f 2 -d ",")
dict_ref[$key]=$val
done
}
dikv=( ["k1"]="v1" ["k2"]="v2" ["k3"]="v3" ["k4"]="v4")
backup_dictionary dikv
restore_dictionary dict
echo "${!dict[@]}"
echo "${dict[@]}"
Still trying to find the most convenient way to backup and restore the content.
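The two ideas combine naturally: take the variable name as an argument and let declare -p do the serialization, avoiding the fragile comma parsing. A sketch (the function names are my own, and Bash 4.2+ is assumed for declare -g):

```shell
#!/usr/bin/env bash
backup_dictionary() {   # usage: backup_dictionary varname file
    declare -p "$1" > "$2"
}
restore_dictionary() {  # usage: restore_dictionary file
    # rewrite "declare -A" to "declare -Ag" so the variable stays global
    # even though we source it inside a function
    source <(sed 's/^declare -[aA]/&g/' "$1")
}

declare -A dikv=([k1]="v 1" [k2]="v 2")
f=$(mktemp)
backup_dictionary dikv "$f"
unset dikv
restore_dictionary "$f"
declare -p dikv
rm -f "$f"
```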

How to assign a value to a variable whose name is in a variable in bash?

I have a config file which looks like this:
$ cat .config
PARAM1 = avalue # a comment
PARAM2 = "many values" # another comment
# PARAM3=blabla
I wrote a function to read from it:
get_from_config_file()
{
A=$(grep "$1" ${config_file} | grep -v "^#" | sed s'/^[[:space:]]*//g' | sed s'/#.*$//' | sed s'/^.*=[[:space:]]*//' | sed s'/[[:space:]]*$//' | sed s'/"//g')
echo "$A"
}
Then I can read the parameters from the config file which works fine:
PARAM1=$(get_from_config_file "PARAM1")
PARAM2=$(get_from_config_file "PARAM2")
But I wanted to make it better (I have many parameters in this config file), so I wanted to be able to grab the values of all my parameters and then assign them to variables in a simple for loop -- and here I got into trouble:
for name in PARAM1 PARAM2
do
value=$(get_from_config_file "$name")
echo $name, $value
# How to assign here $value to a variable named PARAM1, PARAM2 which is contained in name ?
# Note that I do not want to use an array for this
# param[$name]="$value"
done
Thanks,
Define variables directly using declare
for name in PARAM1 PARAM2
do
declare -gx "$name"="$(get_from_config_file "$name")"
#echo $name, $value
# How to assign here $value to a variable named PARAM1, PARAM2 which is contained in name ?
# Note that I do not want to use an array for this
# param[$name]="$value"
done
echo PARAM1="$PARAM1"
echo PARAM2="$PARAM2"
When you run the command declare -gx "$name"="$value", Bash expands the variables name and value first, then executes the resulting command, e.g. declare -gx PARAM1=foobar.
declare options:
-g create global variables when used in a shell function; otherwise
ignored
-x to make NAMEs export
for name in PARAM1 PARAM2
do
value=$(get_from_config_file "$name")
eval "$name=\$value"  # escaping \$value keeps values containing spaces intact
done
This should assign the way you want.
Also, for get_from_config_file, how about:
awk -v input="$name" -F" = " '$0 ~ input{split($2, arr, "#"); print arr[1]}' .config
read is the command you are missing. while and read can be used together to read variables from a file. How you process the file to remove the comments is up to you; there are many ways. In the following example, I used sed to remove the # comments and convert the delimiter = to a single space.
sed '/^[[:blank:]]*#/d;s/#.*//;s/[[:blank:]]*=[[:blank:]]*/ /' ".config" \
| while read -r name value; do
echo $name $value
done
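One caveat with that pipeline: the while loop runs in a subshell, so variables assigned inside it vanish when the loop ends. Feeding the loop via process substitution keeps the assignments in the current shell. A sketch with an inline config (written to a mktemp file for the demo; I also strip double quotes in the sed, which the original did not):

```shell
#!/usr/bin/env bash
config=$(mktemp)
cat > "$config" <<'EOF'
PARAM1 = avalue # a comment
PARAM2 = "many values" # another comment
# PARAM3=blabla
EOF

# Drop comment lines, trailing comments, and quotes; turn " = " into a space.
while read -r name value; do
    [ -n "$name" ] || continue
    declare -gx "$name=$value"
done < <(sed '/^[[:blank:]]*#/d;s/#.*//;s/"//g;s/[[:blank:]]*=[[:blank:]]*/ /' "$config")

echo "PARAM1=$PARAM1"
echo "PARAM2=$PARAM2"
rm -f "$config"
```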
If you remove the whitespace around the = signs, the file can be interpreted as a shell script:
. .config
If you don't want to edit the file:
eval "$(sed 's/ = /=/' .config)"

Split output of command into an associative array in bash

The output of a command is
/ ext4
/boot ext2
tank zfs
On each line the delimiter is a space. I need an associative array like:
"/" => "ext4", "/boot" => "ext2", "tank" => "zfs"
How is this done in bash?
If the command output is in file file, then:
$ declare -A arr=(); while read -r a b; do arr["$a"]="$b"; done <file
Or, you can read the data directly from a command cmd into an array as follows:
$ declare -A arr=(); while read -r a b; do arr["$a"]="$b"; done < <(cmd)
The construct <(...) is process substitution. It allows us to read from a command the same as if we were reading from a file. Note that the space between the two < is essential.
You can verify that the data was read correctly using declare -p:
$ declare -p arr
declare -A arr='([tank]="zfs" [/]="ext4" [/boot]="ext2" )'
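A self-contained run of the second form, with a printf standing in for the real command:

```shell
#!/usr/bin/env bash
declare -A arr=()
while read -r a b; do
    arr["$a"]="$b"
done < <(printf '%s\n' '/ ext4' '/boot ext2' 'tank zfs')
declare -p arr
```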
