Zsh: Creating associative array from yaml file using yq - yaml

The real case:
Every time I set up my environment, I'd like to check certain environment variables and create them if they don't exist. So, instead of doing it manually all the time, I thought it would be great to have a file which stores the environment-name (env_N) and environment-value (env_V) pairs. Amongst all text file formats, YAML looks like the simplest and most natural one to store that info.
So I thought it would be great if I could slurp in the YAML file with my environment variables using yq and create an associative array ready to be iterated over by a zsh foreach loop:
for env_N env_V in "${(@kv)my_assoc_arr}"; do
  check_and_create "$env_N" "$env_V"
done
with the final result of:
$ echo $env_N1
env_V1
$ echo $env_N2
env_V2
$ echo $env_N3
env_V3
...
The problem I'm having is getting my YAML into an associative array using yq in a zsh script.
After applying each suggestion from the comments, I was still unable to create my associative array from the YAML with yq. I got errors from yq like bad syntax, or the script worked or didn't depending on whether I had #!/bin/zsh enabled or commented out.
I got the impression that my task is simple, but somehow I can't achieve it.
What am I doing wrong here?
PS: I'm using zsh on macOS

Why would you want to use YAML for this, when it's much easier to write a simple shell script for it?
Simply create a text file with the following line for each env var:
# Set $FOO to 'bar' if $FOO does not yet exist, then export it.
export FOO="${FOO-bar}"
Then source this file from your shell (or another script).
See https://zsh.sourceforge.io/Doc/Release/Expansion.html#Parameter-Expansion
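If you do want the yq route from the question, here is a minimal sketch. It assumes mikefarah's yq v4 (which supports the jq-style to_entries operator), a flat env.yaml (hypothetical name) of name: value pairs, and values that don't themselves contain =; the check-and-create step treats an empty variable as unset:
#!/bin/zsh
# Sketch only: env.yaml is assumed to be a flat map like
#   env_N1: env_V1
#   env_N2: env_V2
typeset -A my_assoc_arr

# Emit one "key=value" line per entry and read the pairs into the array.
while IFS='=' read -r key value; do
  my_assoc_arr[$key]=$value
done < <(yq 'to_entries | .[] | .key + "=" + .value' env.yaml)

# Export each variable only if it is not already set (empty counts as unset here).
for env_N env_V in "${(@kv)my_assoc_arr}"; do
  [[ -z ${(P)env_N} ]] && export "$env_N=$env_V"
done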

Related

Simple configuration files template processor that preserves bash-style variables

I have to introduce some templating over text configuration files (YAML, XML, JSON) that already contain bash-style variables. I need to preserve the existing bash-style variables untouched but substitute my own. The list is dynamic, and the variables should come from the environment. Something like a simple processor that takes a "${{MY_VAR}}" pattern but ignores $MY_VAR. Preferably pure Bash, or with as little extra tooling as possible.
The pattern could be $$(VAR) or anything that can be easily distinguished from ${VAR} and $VAR. The key limitation: it is intended for a Docker container startup procedure that injects environment variables into provided service configuration templates, building the configuration that way. So something like Java or even Perl processing is not an option.
Does anybody have a simple approach?
I was using the following bash processing for such variable substitution, back when the original files had no variables of their own. But now I need something one step smarter.
# Process the input file ($1), placing output into ($2) with shell variable substitution.
# Note: any backticks or $(...) in the input will also be executed by the eval.
process_file() {
  set -e
  eval "cat <<EOF
$(<"$1")
EOF
" > "$2"
}
An obvious clean solution, which is too heavyweight for a Dockerfile because of the number of packages needed:
perl -pe 's/\$\{\{([^}]+)\}\}/defined $ENV{$1} ? $ENV{$1} : $&/eg' test.json
This substitutes ${{VAR}} patterns, and even better, only the ones that are actually set.
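For reference, below is a pure-bash sketch of the same substitution; substitute_template is a hypothetical helper name, and it assumes the ${{VAR}} marker format and that only exported variables should be substituted:
#!/usr/bin/env bash
# Sketch only: replace each ${{NAME}} with the value of the exported
# variable $NAME, leaving plain ${NAME} and $NAME untouched and leaving
# markers for unset variables as-is.
substitute_template() {
  local content name
  content=$(cat)                  # read the whole template from stdin
  for name in $(compgen -e); do   # compgen -e lists exported variable names
    content=${content//"\${{${name}}}"/${!name}}
  done
  printf '%s\n' "$content"
}

# Usage: substitute_template < test.json > test.out.json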

How to obtain variable values from a text file for use by a bash script?

I have a bash script, and I'd like it to read in the values of variables from a text file.
I'm thinking that in the text file where the values are stored, I'd use a different line for each variable / value pair and use the equals sign.
VARIABLE1NAME=VARIABLE1VALUE
VARIABLE2NAME=VARIABLE2VALUE
I'd then like my bash script to assign the value VARIABLE1VALUE to the variable VARIABLE1NAME, and the same for VARIABLE2VALUE / VARIABLE2NAME.
Since the syntax of the file is the syntax you would use in the script itself, the source command should do the trick:
source text-file-with-assignments.txt
Alternatively, you can use . instead of source, but in a case like this, using the full name is clearer.
The documentation can be found in the GNU Bash Reference Manual.
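For example (config.txt is a hypothetical file name):
$ cat config.txt
VARIABLE1NAME=VARIABLE1VALUE
VARIABLE2NAME=VARIABLE2VALUE
$ source ./config.txt
$ echo "$VARIABLE1NAME"
VARIABLE1VALUE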

Using a variable for associative array key in Bash

I'm trying to create associative arrays based on variables. Below is a super simplified version of what I'm trying to do (the cat | wc -l command is not really what I want, it's just used here for illustrative purposes)...
I have a statically defined array (text-a,text-b). I then want to iterate through that array, and create associative arrays with those names and _AA appended to them (so associative arrays called text-a_AA and text-b_AA).
I don't really need the _AA appended, but was thinking it might be
necessary to avoid duplicate names since $NAME is already being used
in the loop.
I will need those defined and will be referencing them in later parts of the script, and not just within the for loop seen below where I'm trying to define them... I want to later, for example, be able to reference text-a_AA[NUM] (again, using variables for the text-a_AA part). Clearly what I have below doesn't work... and from what I can tell, I need to be using namerefs? I've tried to get the syntax right, and just can't seem to figure it out... any help would be greatly appreciated!
#!/usr/bin/env bash
NAMES=('text-a' 'text-b')
for NAME in "${NAMES[@]}"
do
  NAME_AA="${NAME}_AA"
  $NAME_AA[NUM]=$(cat $NAME | wc -l)
done
for NAME in "${NAMES[@]}"
do
  echo "max: ${$NAME_AA[NUM]}"
done
You may want to use NUM as the name of the associative array and the file name as the key. Then you can rewrite your code as:
declare -A NUM
NUM[${NAME}_AA]=$(wc -l < "$NAME")
Then rephrase your loop as:
for NAME in "${NAMES[@]}"
do
  echo "max: ${NUM[${NAME}_AA]}"
done
Check your script at shellcheck.net
As an aside: all uppercase is not a good practice for naming normal shell variables. You may want to take a look at:
Correct Bash and shell script variable capitalization
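Since the question explicitly asks about namerefs, here is a minimal sketch of that route as well. It assumes bash 4.3+ (for declare -n), and, because bash variable names cannot contain hyphens, the names are changed to text_a/text_b (files with those names are assumed to exist, as in the question):
#!/usr/bin/env bash
names=('text_a' 'text_b')

for name in "${names[@]}"; do
  declare -A "${name}_AA"        # create the associative array dynamically
  declare -n aa="${name}_AA"     # nameref: aa now aliases that array
  aa[NUM]=$(wc -l < "$name")
  unset -n aa                    # drop the nameref before the next iteration
done

for name in "${names[@]}"; do
  declare -n aa="${name}_AA"
  echo "max: ${aa[NUM]}"
  unset -n aa
done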

looping a shell command in ansible

I'm trying to get going with some more advanced Ansible playbooks and have hit a wall. I'm trying to get Ansible to do what this /bin/bash for loop does:
for i in $(< /filename.txt); do /some/command options=1 user=usera server="$i"; done
filename.txt contains 50-100 hostnames.
I can't use jinja templates as the command has to be run, not just the config file updated.
Any help would be greatly appreciated.
Thanks in advance,
Jeremy
You can use Jinja templates, but differently.
Your specific code is not doing things in the most advisable way; for multi-line code you should use the shell module.
Example of a multi-line call:
- name: run multiline stuff
  shell: |
    for x in "${envvar}"; do
      echo "${x}"
    done
  args:
    executable: /bin/bash
Note that I'm explicitly setting executable, which ensures bash-isms will work.
I just used envvar as an example of an arbitrary environment variable being available.
If you need to pass specific env variables, you should use the environment clause of the call to the shell module; refer to: http://docs.ansible.com/ansible/playbooks_environment.html
For simple variables you can just use their value in shell: echo "myvar: {{myvar}}"
If you wish to use an Ansible list/tuple variable inside bash code, you can make it a bash variable first. E.g. if you have a list of stuff in mylist, you can expand it, assign it into a bash array, and then iterate over it. The shell code of the call to shell would be:
mylist_for_bash=({{mylist|join(" ")}})
for myitem in "${mylist_for_bash[@]}"; do
  echo "my current item: ${myitem}"
done
Another approach would be to pass it as string env variable, and convert it into an array later in the code.
NOTE: of course, all of this works correctly only with values that contain no spaces; I've never had to pass an array whose items contain spaces.
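For the original question itself, the more idiomatic route is to let Ansible do the looping rather than bash. A sketch, reusing the asker's placeholder command and file path and assuming the builtin file lookup plugin:
- name: run command once per hostname from filename.txt
  command: /some/command options=1 user=usera server={{ item }}
  with_items: "{{ lookup('file', '/filename.txt').splitlines() }}"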

Simple map for pipeline in shell script

I'm dealing with a pipeline of predominantly shell and Perl files, all of which pass parameters (paths) to the next. I decided it would be better to use a single file to store all the paths and just read that file from every script. The issue is that I am using awk to grab the paths at the beginning of each file, and it's turning out to be a lot of repetition.
My question is: is there a way to store key-value pairs in a file so that the shell can natively take a key and return the value? It needs to be an external file, because the pipeline uses many scripts, and a map in one specific file would result in parameters being passed everywhere. Is there some little quirk I do not know of that performs a map function on an external file?
You can make a file of env var assignments and source that file as needed, i.e.:
$ cat myEnvFile
path1=/x/y/z
path2=/w/xy
path3=/r/s/t
otherOpt1="-x"
Inside your script you can source it with either . myEnvFile or the more verbose version of the same feature, source myEnvFile (assuming the bash shell), i.e.:
$ cat myScript
#!/bin/bash
. /path/to/myEnvFile
# main logic below
....
# references to defined var
if [[ -d $path2 ]] ; then
  cd $path2
else
  echo "no path2=$path2 found, can't continue" 1>&2
  exit 1
fi
Based on how you've described your problem, this should work well and provide a one-stop shop for all of your variable settings.
IHTH
In bash there's mapfile, but that reads the lines of a file into a numerically-indexed array. To read a whitespace-separated file into an associative array, I would:
declare -A map
while read -r key value; do
  map[$key]=$value
done < filename
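A small usage sketch of that loop, assuming a hypothetical whitespace-separated file paths.txt:
$ cat paths.txt
input_dir /data/in
output_dir /data/out
$ declare -A map
$ while read -r key value; do map[$key]=$value; done < paths.txt
$ echo "output goes to: ${map[output_dir]}"
output goes to: /data/out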
However, this sounds like an XY problem. Can you give us an example (in code) of what you're actually doing? When I see long pipelines of grep|awk|sed, there's usually a way to simplify. For example, is passing data by parameters better than passing via stdout|stdin?
In other words, I'm questioning your statement "I decided it would be better..."
