How do I save an output variable from terragrunt apply as a regular shell environment variable? - bash

After running terragrunt apply-all in my CI step (so basically a bash script) I get my outputs; in this case I have only one:
output "cloudrun-hostname" {
value = google_cloud_run_service.cloudrun.status[0].url
description = "API endpoint URL"
}
How do I assign the value of that output to an environment variable, as if I had exported it like this:
export HOSTNAME=terragrunt-cloudrun-hostname-output
I need this variable with that value so I can envsubst it into another file later.

You can use the terraform output command, e.g.
export MY_ENV=$(terraform output cloudrun-hostname)
after your apply-all.
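One caveat: older Terraform versions (before 0.14, which added the -raw flag) wrap string outputs in quotes, which you would need to strip before using the value with envsubst. A minimal sketch, using a stub terraform function standing in for the real CLI so it runs anywhere (the hostname value is made up for illustration):

```shell
# Stub standing in for the real terraform CLI; it mimics the older
# behavior of printing string outputs wrapped in double quotes.
terraform() { echo '"https://cloudrun-abc123.a.run.app"'; }

# Strip the surrounding quotes so downstream tools see the bare URL.
export HOSTNAME="$(terraform output cloudrun-hostname | tr -d '"')"
echo "$HOSTNAME"
```

With a real Terraform >= 0.14, `terraform output -raw cloudrun-hostname` prints the bare value and no stripping is needed.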

You will need to capture and parse the output of the command, e.g.:
export HOSTNAME="$(terragrunt apply-all | awk -F= '/value/ { gsub(" ","",$2);print $2 }')"
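Be aware that parsing human-readable apply output is fragile: the exact format varies by Terragrunt/Terraform version, so adjust the awk pattern to whatever your output actually looks like. A self-contained sketch of how that pipeline behaves, using a simulated output string:

```shell
# Simulated apply-all output; real output formatting varies by version,
# so the /value/ pattern is an assumption you must adapt.
sample_output='Apply complete!
value = https://cloudrun-abc123.a.run.app'

# Split on "=", keep the right-hand side, and strip spaces.
HOSTNAME="$(printf '%s\n' "$sample_output" | awk -F= '/value/ { gsub(" ","",$2); print $2 }')"
echo "$HOSTNAME"
```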

Related

bash: create a variable name containing another variable

I have to create multiple variables while reading a file in bash.
These variables need to have dynamic names based on the content of the file.
E.g:
File content:
abc: 20
1 apple a day
abc: 40
1 keeps the doctor away
Now I have to create variables as:
day_20_id = 1
day_20_fruit = apple
away_40_id = 1
away_40_who = doctor
In all the variable names, only the value of $abc is updated, and the value of each variable is set according to the file content.
Can somebody help me figure out how to achieve this?
You can use the eval command to accomplish this, as illustrated below:
abc=20 # assuming you got this from the input file
val=1 # assuming you also got this from the input file
varName="day_${abc}_id"
command="${varName}=${val}"
eval $command
# now print to output file as you have stated in the comments
outputFile=output.txt # randomly-chosen name
command="echo $varName = $val > $outputFile"
eval $command
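In bash specifically, printf -v assigns to a dynamically named variable without eval's command-injection risk (eval will execute anything in the interpolated strings, including something like $(rm -rf ~)). A short sketch of that alternative, reusing the same sample values:

```shell
abc=20   # assuming this came from the input file
val=1    # assuming this came from the input file

# printf -v writes its formatted output into the variable whose name is
# given as its first argument, so no eval is needed.
varName="day_${abc}_id"
printf -v "$varName" '%s' "$val"

echo "$day_20_id"
```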

Modify a shell variable inside awk block of code

Is there any way to modify a shell variable inside an awk block of code?
--------- [shell_awk.sh]---------------------
#!/bin/bash
shell_variable_1=<value A>
shell_variable_2=<value B>
shell_variable_3=<value C>
awk 'function A(X)
{ return X+1 }
{ a=A('$shell_variable_1')
b=A('$shell_variable_2')
c=A('$shell_variable_3')
shell_variable_1=a
shell_variable_2=b
shell_variable_3=c
}' FILE.TXT
--------- [shell_awk.sh]---------------------
This is a very simple example; the real script loads a file and makes some changes using functions. I need to keep each value from before the change in a specific variable, so that I can record the before and after values in MySQL.
The after value is received from parameters ($1, $2 and so on).
The value before I already know how to get it from the file.
Everything works except setting the shell variables from awk. Outside the awk block it is easy to set them, but is it possible from inside?
No program -- in awk, shell, or any other language -- can directly modify a parent process's memory. That includes variables. However, of course, your awk can write contents to stdout, and the parent shell can read that content and modify its own variables itself.
Here's an example of awk that writes key/value pairs out to be read by bash. It's not perfect -- read the caveats below.
#!/bin/bash
shell_variable_1=10
shell_variable_2=20
shell_variable_3=30
run_awk() {
  awk -v shell_variable_1="$shell_variable_1" \
      -v shell_variable_2="$shell_variable_2" \
      -v shell_variable_3="$shell_variable_3" '
    function A(X) { return X+1 }
    { a=A(shell_variable_1)
      b=A(shell_variable_2)
      c=A(shell_variable_3) }
    END {
      print "shell_variable_1=" a
      print "shell_variable_2=" b
      print "shell_variable_3=" c
    }' <<<""
}
while IFS="=" read -r key value; do
  printf -v "$key" '%s' "$value"
done < <(run_awk)

for var in shell_variable_{1,2,3}; do
  printf 'New value for %s is %s\n' "$var" "${!var}"
done
Advantages
Doesn't use eval. Content such as $(rm -rf ~) in the output from awk won't be executed by your shell.
Disadvantages
Can't handle variable contents with newlines. (You could fix this by NUL-delimiting output from your awk script, and adding -d '' to the read command).
A hostile awk script could modify PATH, LD_LIBRARY_PATH, or other security-sensitive variables. (You could fix this by reading variables into an associative array, rather than the global namespace, or by enforcing a prefix on their names).
The code above uses several ksh extensions also available in bash; however, it will not run with POSIX sh. Thus, be sure not to run this via sh scriptname (which only guarantees POSIX functionality).
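The second caveat above (a hostile awk script clobbering PATH or similar) can be addressed by reading into an associative array rather than the global namespace. A sketch, with a stub awk BEGIN block standing in for the real program:

```shell
# Read awk's key/value output into an associative array instead of the
# global namespace, so a line like PATH=... cannot clobber real variables.
declare -A awkvars
while IFS="=" read -r key value; do
  awkvars[$key]=$value
done < <(awk 'BEGIN { print "shell_variable_1=11"; print "PATH=/tmp/evil" }')

echo "${awkvars[shell_variable_1]}"   # the real PATH is untouched
```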

how to find the position of a string in a file in unix shell script

Can you please help me solve this puzzle? I am trying to print the location of a string (i.e., line #) in a file, first to the std output, and then capture that value in a variable to be used later. The string is "my string", the file name is "myFile", which is defined as follows:
this is first line
this is second line
this is my string on the third line
this is fourth line
the end
Now, when I use this command directly at the command prompt:
% awk 's=index($0, "my string") { print "line=" NR, "position= " s }' myFile
I get exactly the result I want:
line= 3, position= 9
My question is: if I define a variable VAR="my string", why can't I get the same result when I do this:
% awk 's=index($0, $VAR) { print "line=" NR, "position= " s }' myFile
It just won't work! I even tried putting $VAR in quotation marks, to no avail. I tried using VAR (without the $ sign); no luck. I tried everything I could possibly think of ... Am I missing something?
awk variables are not the same as shell variables. You need to define them with the -v flag.
For example:
$ awk -v var="..." '$0~var{print NR}' file
will print the line number(s) of pattern matches. Or, for your case with index:
$ awk -v var="$Var" 'p=index($0,var){print NR,p}' file
Using all-uppercase names may not be good convention, since you may accidentally overwrite other (environment) variables.
To capture the output into a shell variable:
$ info=$(awk ...)
For multi-line output assigned to a shell array, you can do
$ values=( $(awk ...) ); echo ${values[0]}
However, if the output contains more than one field per line, each field will be assigned its own array index. You can change this by setting the IFS variable, such as
$ IFS=$(echo -en "\n\b"); values=( $(awk ...) )
which will capture complete lines as the array values.
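In bash (as opposed to POSIX sh), mapfile is a simpler way to capture whole lines into an array than juggling IFS. A sketch with a printf stand-in for the awk command:

```shell
# mapfile (a.k.a. readarray) stores one input line per array element
# without touching IFS; -t strips the trailing newline from each line.
mapfile -t values < <(printf 'line one\nline two\n')
echo "${values[0]}"
echo "${values[1]}"
```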

Shell Script for Searching text with variable and replace it with content

I'm new to shell scripting, but I need something that searches a file for any given variable and, if the file contains this variable, replaces it with the variable's alias.
The text would be something like:
74304050 = +4574304050#voip1.local
74304051 = +4574304051#voip1.local
20304050 = +4520304050#voip2.local
20304051 = +4520304051#voip2.local
So if I call the shell script with 20304050, I get +4520304050#voip2.local.
How can this be done? I need it for looking up aliases and rewriting them in an OpenSIPS config file.
It's a bit underspecified, but does this do what you want?
awk -v number=20304050 '$1 == number { print $3 }' file
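Note that awk's variable-assignment flag is a lowercase -v, and inside the awk program the variable is referenced without a $ (in awk, $ means field access). A self-contained demo against the sample alias table (file name chosen for illustration):

```shell
# Build the sample alias table from the question.
cat > aliases.txt <<'EOF'
74304050 = +4574304050#voip1.local
20304050 = +4520304050#voip2.local
EOF

# Match the first field against the awk variable and print the alias.
alias_target="$(awk -v number=20304050 '$1 == number { print $3 }' aliases.txt)"
echo "$alias_target"
```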

grep text by function parameter in bash

There is a file named pkg_list:
a-1.2b-1.tar.gz
c-2.5b-1.tar.gz
a xx-1.4.txz
a$xx-1.4.txz
中文-3.txz
xx-3.2-2.tar.gz
xxy-1.3.tar.gz
My bash function takes a package name like 'xx':
pkg_find() { # <pkg_name> as $1
grep "^$1-[0-9]*" pkg_list
}
pkg_find xx # wish it return xx-3.2-2.tar.gz
I know I cannot pass $1 directly into pkg_find. What's the correct method?
[SOLVED]
Because $1 is enclosed in double quotes, I found that even regex metacharacters could be passed as the parameter.
What you're doing looks right to me.
What isn't working?
I tried the code in your question, and pkg_find xx displays ‘xx-3.2-2.tar.gz’ — which you say is the output you were hoping for.
You can pass $1 directly to pkg_find:
pkg_find() { # <pkg_name> as $1
grep "^$1-[0-9]*" pkg_list
}
pkg_find "$1"
In the main body of a script, $1, $2, ... are the script arguments you get from the command line or from another calling script. In a shell function, they refer to the function's arguments.
When you call this on the command line
sh pkg_find.sh xx
you will get
xx-3.2-2.tar.gz
Your code and your question seem to ask for different things; you may want either or both of:
pass a function parameters: Passing parameters to a Bash function
return a string from a function: shell script function return a string
$1, $2, etc. at the top level of a script are the script parameters; within a function they are set to the function parameters, or unset if there are none.
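Putting it together, a self-contained demo that builds a small pkg_list and captures the function's output with command substitution (the sample entries are taken from the question):

```shell
# Build a sample pkg_list file.
printf '%s\n' 'a-1.2b-1.tar.gz' 'xx-3.2-2.tar.gz' 'xxy-1.3.tar.gz' > pkg_list

pkg_find() { # <pkg_name> as $1
    # Anchor at start of line; "$1" is expanded by the shell before grep runs.
    grep "^$1-[0-9]*" pkg_list
}

# Capture the function's stdout as a "return value".
found="$(pkg_find xx)"
echo "$found"
```

The ^ anchor plus the literal - after $1 is what keeps xxy-1.3.tar.gz from matching a search for xx.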
