Pass bash variable in yq - bash

I am trying to pass a bash variable to yq.
test.yml
configuration:
Properties:
corporate-url: https://stackoverflow.com/
temp='.configuration.Properties.corporate-url'
export $temp
Value1=$(yq '.[env($temp)]' test.yml)
expected output:
https://stackoverflow.com/
but I am getting this error (actual output):
Error: Value for env variable '$variable1' not provided in env()
Please note:
I am trying to fetch the corporate-url value using a bash variable. The constraint is that I cannot pass the string directly to yq: this snippet runs inside a for loop that changes the value of temp on every iteration, so I cannot hard-code it for a particular value.
Reference YQ Documentation:
https://mikefarah.gitbook.io/yq/operators/env-variable-operators
The ApisDraft folder contains multiple yml files:
ApisDraft=$(find drafts/* -maxdepth 1 -type f)
for ApiFixOrgsTags in $ApisDraft
do
my_var=$(yq '.securityDefinitions.[].tokenUrl' $ApiFixOrgsTags)
ConfigProper='.configuration.Properties.'
CatelogProper='.configuration.catalogs.[].Properties.'
variable1=$ConfigProper${my_var}
variable2=$CatelogProper${my_var}
# to remove all white spaces
variable1=$(echo "$variable1" | sed -E 's/(\.) */\1/g')
variable2=$(echo "$variable2" | sed -E 's/(\.) */\1/g')
export $variable1
export $variable2
Value1=$(yq "$variable1" $ApiFixOrgsTags)
Value2=$(yq '.[env($variable2)]' $ApiFixOrgsTags)
done

In this case, you don't need to put it in the environment. Let the shell expand it so yq just sees the value of the variable:
yq "$temp" test.yml # => https://stackoverflow.com/

Related

Is there a way to unpack a config file to cli flags in general?

Basically what foo(**bar) does in python, here I’d want something like
foo **bar.yaml
and that would become
foo --bar1=1 --bar2=2
Where bar.yaml would be
bar1: 1
bar2: 2
You could use a combination of sed and xargs:
sed -E 's/^(.+):[[:space:]]+(.+)$/--\1=\2/' bar.yaml | xargs -d '\n' foo
sed converts the format of bar.yaml lines (e.g. bar1: 1 -> --bar1=1) and xargs feeds the converted lines as arguments to foo.
You could of course modify/extend the sed part to support other formats or single-dash options like -v.
To test if this does what you want, you can run this Bash script instead of foo:
#!/usr/bin/env bash
echo "Arguments: $#"
for ((i=1; i <= $#; i++)); do
echo "Argument $i: '${!i}'"
done
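For example, if you save that test script as ./foo.sh (a hypothetical stand-in for foo, made executable) and run the pipeline against the bar.yaml above, you should get something like:
$ sed -E 's/^(.+):[[:space:]]+(.+)$/--\1=\2/' bar.yaml | xargs -d '\n' ./foo.sh
Arguments: 2
Argument 1: '--bar1=1'
Argument 2: '--bar2=2'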
Here's a version for zsh. Run this code or add it to ~/.zshrc:
function _yamlExpand {
setopt local_options extended_glob
# 'words' array contains the current command line
# yaml filename is the last value
yamlFile=${words[-1]}
# parse 'key : value' lines from file, create associative array
typeset -A parms=("${(@s.:.)${(f)"$(<${yamlFile})"}}")
# trim leading and trailing whitespace from keys and values
# requires extended_glob
parms=("${(@kv)${(@kv)parms##[[:space:]]##}%%[[:space:]]##}")
# add -- and = to create flags
typeset -a flags
for key val in "${(@kv)parms}"; do
flags+=("--${key}='${val}'")
done
# replace the value on the command line
compadd -QU -- "$flags"
}
# add the function as a completion and map it to ctrl-y
compdef -k _yamlExpand expand-or-complete '^Y'
At the zsh shell prompt, type in the command and the yaml file name:
% print -l -- ./bar.yaml▃
With the cursor immediately after the yaml file name, hit ctrl+y. The yaml filename will be replaced with the expanded parameters:
% print -l -- --bar1='1' --bar2='2' ▃
Now you're set; you can hit enter, or add parameters, just like any other command line.
Notes:
This only supports the yaml subset in your example.
You can add more yaml parsing to the function, possibly with yq.
In this version, the cursor must be next to the yaml filename - otherwise the last value in words will be empty. You can add code to detect that case and then alter the words array with compset -n.
compadd and compset are described in the zshcompwid man page.
zshcompsys has details on compdef; the section on autoloaded files describes another way to deploy something like this.

Docker - dynamic regex-based sed substitution in files

I have various environment variables declared inside an env_file referenced in a docker-compose.yml
export FOO=bar
export FAZ=baz
In a config file within a container I reference these environment variables as such:
Today is a great day to ${FOO} and ${FAZ}
I want to be able to capture all instances of text starting with ${ and ending with } (e.g. ${<SOMETHING>}) and then replace each one with the environment variable of key <SOMETHING>.
The following works in the shell, but I cannot get this to work with sed -i against a file, or within a bash script.
echo "TODAY IS ${DAY}" | sed -r 's/(<foobar>\$[\w{}]+)/<foobar>/g
where the environment variable DAY is "FUNDAY" would produce: "TODAY IS FUNDAY"
I think I found something which may interest you. Please note that it sticks to using only sed, in-place editing, and bash. Before we get into it, let's see the input and output:
Environment file/variables: As seen below, our file "myenv" has 3 variables with their values.
%_Host#User> file myenv
myenv: ASCII text
%_Host#User> cat myenv
export FOO='This is FOO'
export BAR='This is BAR'
export BAZ='This is BAZ'
%_Host#User> env|egrep "FOO|BAR|BAZ"
FOO=This is FOO
BAR=This is BAR
BAZ=This is BAZ
%_Host#User>
Target file to perform the operation upon: Our goal is to read the file sample.txt and replace variable references with their actual values, if they are found in the loaded environment file. The variable on the 4th line is not defined in our variable file, so that line should be left untouched.
%_Host#User> cat sample.txt
${FOO} is TRUE
${BAR} is TRUE
${BAZ} is TRUE
${BRAZ} is TRUE
%_Host#User>
SCRIPT result and final output: To demonstrate, I am printing the lines as they are read from the target file sample.txt. If a variable reference (e.g. ${FOO}) is found in a line, we replace it with its value (e.g. This is FOO in this case), which we obtain from the env.
%_Host#User> ./env.sh sample.txt ; cat sample.txt
[1.] Line is: [${FOO} is TRUE]
[1.] VAR:[FOO] has VALUE:[This is FOO]
[2.] Line is: [${BAR} is TRUE]
[2.] VAR:[BAR] has VALUE:[This is BAR]
[3.] Line is: [${BAZ} is TRUE]
[3.] VAR:[BAZ] has VALUE:[This is BAZ]
[4.] Line is: [${BRAZ} is TRUE]
[4.] VAR:[BRAZ] not found and NO CHANGE IN FILE !!
This is FOO is TRUE
This is BAR is TRUE
This is BAZ is TRUE
${BRAZ} is TRUE
%_Host#User>
As seen above, we have managed to replace ${FOO} with This is FOO as expected and left any others (which were not in our env file) untouched.
SCRIPT and how it works:
#!/bin/bash
i=1
cfgfile="$1"
# Read the target file line by line.
cat "$cfgfile" | while read line
do
  echo "[${i}.] Line is: [$line]"
  # Extract the variable name from the ${NAME} reference on this line.
  var=$(echo "$line" | sed 's#^.*\${\(\w*\)}.*$#\1#g')
  # Look up its value in the environment (anchored so FOO does not also match FOOBAR).
  val=$(env | grep "^${var}=" | cut -f2- -d"=")
  env | grep "^${var}=" >/dev/null
  if [ $? -eq 0 ]
  then
    echo "[$i.] VAR:[$var] has VALUE:[$val]"
    # Perform inplace editing.
    sed -i "s#\${$var}#$val#g" "$cfgfile"
  else
    echo "[$i.] VAR:[$var] not found and NO CHANGE IN FILE !!"
  fi
  ((i++)) ; echo
done
The above script follows these steps:
i=1 is just our counter. Not so important; only for demo purposes.
cfgfile="$1" accepts the file name from the command line.
We cat the file and read it line by line in a while loop.
We echo the line, then extract FOO from ${FOO} and its actual value (i.e. This is FOO) from env, and store them in var and val. Simple enough.
We test whether this variable FOO is defined in env or not and check the command status.
If it is defined, i.e. $? is 0 (true), we go ahead and perform in-place editing; otherwise we leave the line alone and make no change.
I believe you should try your original command with double quotes (") instead of single ('). It should work.
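For example (a small sketch of the quoting difference, using the DAY variable from your question):
export DAY=FUNDAY
# single quotes: the shell passes $DAY to sed literally, so nothing is substituted
echo 'TODAY IS ${DAY}' | sed 's/\${DAY}/$DAY/'        # -> TODAY IS $DAY
# double quotes: the shell expands $DAY before sed runs
echo 'TODAY IS ${DAY}' | sed "s/\\\${DAY}/$DAY/"      # -> TODAY IS FUNDAY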
Please let us know if this was useful.
Cheers.

Unix user created variables

I am going through some growing pains with Unix. My question:
I want to be able to print all my user-defined variables in my shell. Let's say I do the following in the shell:
$ x=9
$ y="Help"
$ z=-18
$ R="My 4th variable"
How would I go about printing:
x y z R
You should record your variables first at runtime with set, then compare it later to see which variables were added. Example:
#!/bin/bash
set | grep -E '^[^[:space:]]+=' | cut -f 1 -d = | sort > /tmp/previous.txt
a=1234
b=1234
set | grep -E '^[^[:space:]]+=' | cut -f 1 -d = | sort > /tmp/now.txt
comm -13 /tmp/previous.txt /tmp/now.txt
Output:
a
b
PIPESTATUS
Notice that there are still other variables produced by the shell that were not declared by the user. You can filter them with grep -v. It depends on the shell as well.
Add: grep and cut could simply be just one sed as well: sed -n 's/^\([^[:space:]]\+\)=.*/\1/p'
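So the snapshot lines from the example above could be written as (same idea, sketched with the sed variant):
set | sed -n 's/^\([^[:space:]]\+\)=.*/\1/p' | sort > /tmp/previous.txt
x=9
y="Help"
set | sed -n 's/^\([^[:space:]]\+\)=.*/\1/p' | sort > /tmp/now.txt
comm -13 /tmp/previous.txt /tmp/now.txt   # x, y (plus shell-generated names like PIPESTATUS)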
Type set:
$ set
Apple_PubSub_Socket_Render=/tmp/launch-jiNTOC/Render
BASH=/bin/bash
BASH_ARGC=()
BASH_ARGV=()
BASH_LINENO=()
BASH_SOURCE=()
BASH_VERSINFO=([0]="3" [1]="2" [2]="51" [3]="1" [4]="release" [5]="x86_64-apple-darwin13")
BASH_VERSION='3.2.51(1)-release'
COCOS2DROOT=/Users/andy/Source/cocos2d
COLUMNS=80
DIRSTACK=()
...
(Oh, and BTW, you appear to have your variable syntax incorrect as you assign, say, A but print $A)
If variables are exported, then you can use the env command in Unix.
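For example (a small sketch using one of the variables from the question):
$ export R="My 4th variable"
$ env | grep '^R='
R=My 4th variable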

Shell Script to read value set to zero and set back to old value?

I need to read the value of a variable, set it to zero, and then set it back to the old value.
I tried these steps
value=$(grep -Po '(?<=Max_value=).*' /usr/post_check.ini|awk '{print $1}')
sed -i -r 's/Max_value=[0-9]+/Max_value=0/g' /usr/master.ini
echo "$value" # say $value is 3
sed -i -r 's/Max_value=[0-9]+/Max_value=$value/g' /usr/master.ini
value=$(grep -Po '(?<=Max_value=).*' /usr/master.ini|awk '{print $1}')
echo "$value"
The above code sets Max_value to the literal text $value (the file ends up with "Max_value=$value" instead of Max_value=3) rather than the old value.
If I reset it to zero and then try to set it back to the old value, how would I do that?
Like I said in my answer to your last question, you should use a config parser:
import ConfigParser
config = ConfigParser.ConfigParser()
config.read('post_check.ini')
print config.get('section 1','Max_value')
config.set('section 1','Max_value','0')
print config.get('section 1','Max_value')
Demo
$ cat post_check.ini
[section 1]
Max_value=123
[section 2]
Max_value=456
$ python config.py
123
0
sed -i -r "s/Max_value=[0-9]+/Max_value=$value/g" /usr/master.ini
The particular issue, where the literal text $value was inserted, was because you used single quotes instead of double quotes.
However, I recommend using crudini, which is a dedicated tool for manipulating ini files from the shell:
value=$(crudini --get /usr/post_check.ini section Max_value)
crudini --set /usr/post_check.ini section Max_value 0
echo "$value"
crudini --set /usr/post_check.ini section Max_value "$value"
value=$(crudini --get /usr/post_check.ini section Max_value)
echo "$value"
Details on usage and download at:
http://www.pixelbeat.org/programs/crudini/

How to replace ${} placeholders in a text file?

I want to pipe the output of a "template" file into MySQL, the file having variables like ${dbName} interspersed. What is the command line utility to replace these instances and dump the output to standard output?
The input file is considered to be safe, but faulty substitution definitions could exist. The replacement should avoid unintended code execution.
Update
Here is a solution from yottatsa on a similar question that only does replacement for variables like $VAR or ${VAR}, and is a brief one-liner
i=32 word=foo envsubst < template.txt
Of course if i and word are in your environment, then it is just
envsubst < template.txt
On my Mac it looks like it was installed as part of gettext and from MacGPG2
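envsubst can also be told to substitute only specific variables by passing them as a shell-format argument, which is handy when the template contains other ${...} text that must survive untouched (a small sketch):
i=32 word=foo envsubst '${i} ${word}' < template.txt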
Old Answer
Here is an improvement to the solution from mogsie on a similar question. My solution does not require you to escape double quotes (mogsie's does), but his is a one-liner!
eval "cat <<EOF
$(<template.txt)
EOF
" 2> /dev/null
The power of these two solutions is that you only get a few types of shell expansions that don't occur normally ($((...)), `...`, and $(...)); backslash is an escape character here, but you don't have to worry that the parsing has a bug, and it handles multiple lines just fine.
Sed!
Given template.txt:
The number is ${i}
The word is ${word}
we just have to say:
sed -e "s/\${i}/1/" -e "s/\${word}/dog/" template.txt
Thanks to Jonathan Leffler for the tip to pass multiple -e arguments to the same sed invocation.
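With the template above, that produces:
$ sed -e "s/\${i}/1/" -e "s/\${word}/dog/" template.txt
The number is 1
The word is dog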
Use /bin/sh. Create a small shell script that sets the variables, and then parse the template using the shell itself. Like so (edit to handle newlines correctly):
File template.txt:
the number is ${i}
the word is ${word}
File script.sh:
#!/bin/sh
#Set variables
i=1
word="dog"
#Read in the template one line at a time, and replace variables (more
#natural (and efficient) way, thanks to Jonathan Leffler).
while read line
do
eval echo "$line"
done < "./template.txt"
Output:
#sh script.sh
the number is 1
the word is dog
I was thinking about this again, given the recent interest, and I think that the tool that I was originally thinking of was m4, the macro processor for autotools. So instead of the variable I originally specified, you'd use:
$ echo 'I am a DBNAME' | m4 -DDBNAME="database name"
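The same idea works on a whole template file rather than a pipe; here template.m4 is a hypothetical file that uses DBNAME as the placeholder:
m4 -DDBNAME=testdb template.m4 | mysql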
Create rendertemplate.sh:
#!/usr/bin/env bash
eval "echo \"$(cat $1)\""
And template.tmpl:
Hello, ${WORLD}
Goodbye, ${CHEESE}
Render the template:
$ export WORLD=Foo
$ CHEESE=Bar ./rendertemplate.sh template.tmpl
Hello, Foo
Goodbye, Bar
template.txt
Variable 1 value: ${var1}
Variable 2 value: ${var2}
data.sh
#!/usr/bin/env bash
declare var1="value 1"
declare var2="value 2"
parser.sh
#!/usr/bin/env bash
# args
declare file_data=$1
declare file_input=$2
declare file_output=$3
source $file_data
eval "echo \"$(< $file_input)\"" > $file_output
./parser.sh data.sh template.txt parsed_file.txt
parsed_file.txt
Variable 1 value: value 1
Variable 2 value: value 2
Here's a robust Bash function that - despite using eval - should be safe to use.
All ${varName} variable references in the input text are expanded based on the calling shell's variables.
Nothing else is expanded: neither variable references whose names are not enclosed in {...} (such as $varName), nor command substitutions ($(...) and legacy syntax `...`), nor arithmetic substitutions ($((...)) and legacy syntax $[...]).
To treat a $ as a literal, \-escape it; e.g.: \${HOME}
Note that input is only accepted via stdin.
Example:
$ expandVarsStrict <<<'$HOME is "${HOME}"; `date` and \$(ls)' # only ${HOME} is expanded
$HOME is "/Users/jdoe"; `date` and $(ls)
Function source code:
expandVarsStrict(){
local line lineEscaped
while IFS= read -r line || [[ -n $line ]]; do # the `||` clause ensures that the last line is read even if it doesn't end with \n
# Escape ALL chars. that could trigger an expansion..
IFS= read -r -d '' lineEscaped < <(printf %s "$line" | tr '`([$' '\1\2\3\4')
# ... then selectively reenable ${ references
lineEscaped=${lineEscaped//$'\4'{/\${}
# Finally, escape embedded double quotes to preserve them.
lineEscaped=${lineEscaped//\"/\\\"}
eval "printf '%s\n' \"$lineEscaped\"" | tr '\1\2\3\4' '`([$'
done
}
The function assumes that no 0x1, 0x2, 0x3, and 0x4 control characters are present in the input, because those chars. are used internally - since the function processes text, that should be a safe assumption.
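Applied to the original question, a hypothetical template.sql containing ${dbName} placeholders could be rendered like this (sketch only; expandVarsStrict must already be defined in the current shell):
dbName=testdb
expandVarsStrict < template.sql | mysql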
Here's my solution with Perl, based on a former answer; it replaces environment variables:
perl -p -e 's/\$\{(\w+)\}/(exists $ENV{$1}?$ENV{$1}:"missing variable $1")/eg' < infile > outfile
I would suggest using something like Sigil:
https://github.com/gliderlabs/sigil
It is compiled to a single binary, so it's extremely easy to install on systems.
Then you can do a simple one-liner like the following:
cat my-file.conf.template | sigil -p $(env) > my-file.conf
This is much safer than eval and easier than using regex or sed.
Here is a way to get the shell to do the substitution for you, as if the contents of the file were instead typed between double quotes.
Using the example of template.txt with contents:
The number is ${i}
The word is ${word}
The following line will cause the shell to interpolate the contents of template.txt and write the result to standard out.
i='1' word='dog' sh -c 'echo "'"$(cat template.txt)"'"'
Explanation:
i and word are passed as environment variables scoped to the execution of sh.
sh executes the contents of the string it is passed.
Strings written next to one another become one string, that string is:
'echo "' + "$(cat template.txt)" + '"'
Since the substitution is between ", "$(cat template.txt)" becomes the output of cat template.txt.
So the command executed by sh -c becomes:
echo "The number is ${i}\nThe word is ${word}",
where i and word are the specified environment variables.
If you are open to using Perl, that would be my suggestion, although there are probably some sed and/or AWK experts who know how to do this much more easily. If you have a more complex mapping with more than just dbName for your replacements, you could extend this pretty easily, but you might just as well put it into a standard Perl script at that point.
perl -p -e 's/\$\{dbName\}/testdb/s' yourfile | mysql
A short Perl script to do something slightly more complicated (handle multiple keys):
#!/usr/bin/env perl
my %replace = ( 'dbName' => 'testdb', 'somethingElse' => 'fooBar' );
undef $/;
my $buf = <STDIN>;
$buf =~ s/\$\{$_\}/$replace{$_}/g for keys %replace;
print $buf;
If you name the above script as replace-script, it could then be used as follows:
replace-script < yourfile | mysql
file.tpl:
The following bash function should only replace ${var1} syntax and ignore
other shell special chars such as `backticks` or $var2 or "double quotes".
If I have missed anything - let me know.
script.sh:
template(){
# usage: template file.tpl
while read -r line ; do
line=${line//\"/\\\"}
line=${line//\`/\\\`}
line=${line//\$/\\\$}
line=${line//\\\${/\${}
eval "echo \"$line\"";
done < ${1}
}
var1="*replaced*"
var2="*not replaced*"
template file.tpl > result.txt
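With the values above, result.txt should come out roughly as:
The following bash function should only replace *replaced* syntax and ignore
other shell special chars such as `backticks` or $var2 or "double quotes".
If I have missed anything - let me know.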
I found this thread while wondering the same thing. It inspired me to do this (careful with the backticks):
$ echo $MYTEST
pass!
$ cat FILE
hello $MYTEST world
$ eval echo `cat FILE`
hello pass! world
Lots of choices here, but figured I'd toss mine on the heap. It is perl based, only targets variables of the form ${...}, takes the file to process as an argument and outputs the converted file on stdout:
use Env;
Env::import();
while(<>) { $_ =~ s/(\${\w+})/$1/eeg; $text .= $_; }
print "$text";
Of course I'm not really a perl person, so there could easily be a fatal flaw (works for me though).
It can be done in bash itself if you have control of the configuration file format. You just need to source (".") the configuration file rather than subshell it. That ensures the variables are created in the context of the current shell (and continue to exist) rather than in the subshell (where the variables disappear when the subshell exits).
$ cat config.data
export parm_jdbc=jdbc:db2://box7.co.uk:5000/INSTA
export parm_user=pax
export parm_pwd=never_you_mind
$ cat go.bash
. config.data
echo "JDBC string is " $parm_jdbc
echo "Username is " $parm_user
echo "Password is " $parm_pwd
$ bash go.bash
JDBC string is jdbc:db2://box7.co.uk:5000/INSTA
Username is pax
Password is never_you_mind
If your config file cannot be a shell script, you can just 'compile' it before executing thus (the compilation depends on your input format).
$ cat config.data
parm_jdbc=jdbc:db2://box7.co.uk:5000/INSTA # JDBC URL
parm_user=pax # user name
parm_pwd=never_you_mind # password
$ cat go.bash
cat config.data \
  | sed 's/#.*$//' \
  | sed 's/[ \t]*$//' \
  | sed 's/^[ \t]*//' \
  | grep -v '^$' \
  | sed 's/^/export /' \
  > config.data-compiled
. config.data-compiled
echo "JDBC string is " $parm_jdbc
echo "Username is " $parm_user
echo "Password is " $parm_pwd
$ bash go.bash
JDBC string is jdbc:db2://box7.co.uk:5000/INSTA
Username is pax
Password is never_you_mind
In your specific case, you could use something like:
$ cat config.data
export p_p1=val1
export p_p2=val2
$ cat go.bash
. ./config.data
echo "select * from dbtable where p1 = '$p_p1' and p2 like '$p_p2%' order by p1"
$ bash go.bash
select * from dbtable where p1 = 'val1' and p2 like 'val2%' order by p1
Then pipe the output of go.bash into MySQL and voila, hopefully you won't destroy your database :-).
In-place Perl editing of potentially multiple files, with backups:
perl -e 's/\$\{([^}]+)\}/defined $ENV{$1} ? $ENV{$1} : ""/eg' \
-i.orig \
-p config/test/*
I created a shell templating script named shtpl. My shtpl uses a jinja-like syntax which, now that I use ansible a lot, I'm pretty familiar with:
$ cat /tmp/test
{{ aux=4 }}
{{ myarray=( a b c d ) }}
{{ A_RANDOM=$RANDOM }}
$A_RANDOM
{% if $(( $A_RANDOM%2 )) == 0 %}
$A_RANDOM is even
{% else %}
$A_RANDOM is odd
{% endif %}
{% if $(( $A_RANDOM%2 )) == 0 %}
{% for n in 1 2 3 $aux %}
\$myarray[$((n-1))]: ${myarray[$((n-1))]}
/etc/passwd field #$n: $(grep $USER /etc/passwd | cut -d: -f$n)
{% endfor %}
{% else %}
{% for n in {1..4} %}
\$myarray[$((n-1))]: ${myarray[$((n-1))]}
/etc/group field #$n: $(grep ^$USER /etc/group | cut -d: -f$n)
{% endfor %}
{% endif %}
$ ./shtpl < /tmp/test
6535
6535 is odd
$myarray[0]: a
/etc/group field #1: myusername
$myarray[1]: b
/etc/group field #2: x
$myarray[2]: c
/etc/group field #3: 1001
$myarray[3]: d
/etc/group field #4:
More info on my github
To me this is the easiest and most powerful solution; you can even include other templates using the same command, eval echo "$(<template.txt)":
Example with nested template
Create the template files; the variables are in regular bash syntax, ${VARIABLE_NAME} or $VARIABLE_NAME.
You have to escape special characters with \ in your templates, otherwise they will be interpreted by eval.
template.txt
Hello ${name}!
eval echo $(<nested-template.txt)
nested-template.txt
Nice to have you here ${name} :\)
create source file
template.source
declare name=royman
parse the template
source template.source && eval echo "$(<template.txt)"
the output
Hello royman!
Nice to have you here royman :)
envsubst
please don't use anything else (i.e. don't eval)
