This question already has answers here:
How to pass terraform outputs variables into ansible as vars_files?
(5 answers)
Closed 2 years ago.
I followed the answer in that question and created a template file tf_ansible_vars_file.yml.tpl like below:
tf_share_location: "${share_location}"
and a terra_render.tf like below:
# Define an Ansible var_file containing Terraform variable values
data "template_file" "tf_ansible_vars_file" {
  template = "${file("/home/deployment_root/app4/tf_ansible_vars_file.yml.tpl")}"
  vars = {
    share_location = var.share_location
    # gitlab_backup_bucket_name = aws_s3_bucket.gitlab_backup.bucket
  }
}

# Render the Ansible var_file containing Terraform variable values
resource "local_file" "tf_ansible_vars_file" {
  content  = data.template_file.tf_ansible_vars_file.rendered
  filename = "/home/deployment_root/app4/tf_ansible_vars_file.yml"
}
I already have a variables.tf file in which I have declared that variable:
variable "share_location" {
type = string
}
and in terraform.tfvars I gave the value as null:
share_location = null
When I run terraform apply, I get the error below:
Error: failed to render : <template_file>:1,23-37: Unknown variable; There is no variable named "share_location".
on terra_render.tf line 2, in data "template_file" "tf_ansible_vars_file":
2: data "template_file" "tf_ansible_vars_file" {
My understanding is that it will create the file as described in that answer, but it is not working.
How do you output variables to Ansible?
If you are generating a supported format like YAML, you do not need a template file.
You can generate YAML directly from Terraform data structures as follows:
resource "local_file" "tf_ansible_vars_file" {
content = yamlencode(
{
tf_share_location = var.share_location
# gitlab_backup_bucket_name = aws_s3_bucket.gitlab_backup.
}
)
filename = var.ansible_vars_filename
}
variable "ansible_vars_filename" {
type = string
default = "/home/deployment_root/app4/tf_ansible_vars_file.yml"
}
Applying that with -var="share_location=example_location" will yield a file like this:
"tf_share_location": "example_location"
Whether the quoting of the configuration variable names matters depends on Ansible. It shouldn't, as it's still valid YAML. Terraform's yamlencode quotes keys regardless of whether they need to be quoted, which is regrettable.
I have extracted ansible_vars_filename into a variable, as you may want to make that configurable.
I have also left in the commented-out gitlab_backup_bucket_name, as adding it to the YAML file is as simple as uncommenting it.
You can learn more about yamlencode here:
https://www.terraform.io/docs/configuration/functions/yamlencode.html
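For completeness, here is a minimal sketch of how the generated vars file might be consumed on the Ansible side. The playbook below is hypothetical; only the vars file path comes from the configuration above:

- hosts: all
  vars_files:
    - /home/deployment_root/app4/tf_ansible_vars_file.yml
  tasks:
    - name: Show the share location rendered by Terraform
      debug:
        msg: "Share location is {{ tf_share_location }}"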
In the code below, instead of var.share_location, I need to give the expression I used in Terraform, in my case ${data.azurerm_storage_account.new.name}/sharename. After that I can remove the variable from variables.tf as well as terraform.tfvars, since the value is generated. Thanks.
Old code:
data "template_file" "tf_ansible_vars_file" {
template = "${file("/home/deployment_root/app4/tf_ansible_vars_file.yml.tpl")}"
vars = {
share_location = var.share_location
# gitlab_backup_bucket_name = aws_s3_bucket.gitlab_backup.bucket
}
}
New code:
# Define an Ansible var_file containing Terraform variable values
data "template_file" "tf_ansible_vars_file" {
  template = "${file("/home/deployment_root/app4/tf_ansible_vars_file.yml.tpl")}"
  vars = {
    share_location = "${data.azurerm_storage_account.new.name}/sharename"
    # gitlab_backup_bucket_name = aws_s3_bucket.gitlab_backup.bucket
  }
}
Related
I want to read a list of objects from a YAML file via Terraform code and map it to a local variable. I also need to search for an object by a key and get its values from the YAML file. Can anyone suggest a suitable solution?
My YAML file looks like below; here use will be the primary key:
list_details:
  some_list:
    - use: a
      path: somepath
      description: "some description"
    - use: b
      path: somepath2
      description: "some description 2"
I have loaded the YAML file in my variables section in Terraform like this:
locals {
  list = yamldecode(file("${path.module}/mylist.yaml"))
}
Now the problem is how I can get one object with its values by passing the "use" value to the list?
"
Assuming that use values are unique, you can re-organize your list into a map:
locals {
  list_as_map = { for val in local.list["list_details"]["some_list"] :
    val["use"] => val["path"] }
}
which gives list_as_map as:
"a" = "somepath"
"b" = "somepath2"
Then you can access the path based on the value of use:
path_for_a = local.list_as_map["a"]
Update
If you want to keep description, it's better to do:
list_as_map = { for val in local.list["list_details"]["some_list"] :
  val["use"] => {
    path        = val["path"]
    description = val["description"]
  }
}
Then you can access the path or description as:
local.list_as_map["a"].path
local.list_as_map["a"].description
I am trying to pass a bash script to the user data of 3 AWS launch templates. This script calls other scripts from GitHub depending on a specific variable. Since each launch template must call different scripts, what is the best way to accomplish this? I am currently trying to configure a template_file data source, but I can't find a way to do what I need.
This is a piece of the bash script, where I put a variable that needs to change its value depending on which launch template is being built each time:
#------------------------------------------------------------------------------------------
# Define here scripts (separated with 1 space) that will be executed on first run:
AMI_SCRIPTS="ami_base_lynis.sh ${ami_script}"
#------------------------------------------------------------------------------------------
download_and_run_scripts
This is the template file data source:
data "template_file" "AMIs"{
template = "${file("../AMIs/s1_aws_userdata.sh")}"
vars = {
ami = var.dci_appserver_ami
}
}
And this is the user data attribute:
user_data_base64 = base64encode(data.template_file.AMIs.rendered)
This is not working for me, as the variable will have the same value for all 3 launch templates. How can I assign a different value each time?
The syntax you used for user_data_base64 tells me that you're using Terraform v0.12 or later, so you should no longer use template_file, as noted in the template_file documentation:
In Terraform 0.12 and later, the templatefile function offers a built-in mechanism for rendering a template from a file. Use that function instead, unless you are using Terraform 0.11 or earlier.
Because of that, I'm going to answer using the templatefile function instead.
Inside each of your launch template resource blocks, you can call templatefile with different values for the template variables in order to get a different result each time:
resource "aws_launch_template" "example1" {
# ...
user_data = base64encode(templatefile("${path.module}/../AMIs/s1_aws_userdata.sh", {
ami = var.dci_appserver_ami
ami_script = "script1.sh"
}))
}
resource "aws_launch_template" "example2" {
# ...
user_data = base64encode(templatefile("${path.module}/../AMIs/s1_aws_userdata.sh", {
ami = var.dci_appserver_ami
ami_script = "script2.sh"
}))
}
resource "aws_launch_template" "example3" {
# ...
user_data = base64encode(templatefile("${path.module}/../AMIs/s1_aws_userdata.sh", {
ami = var.dci_appserver_ami
ami_script = "script3.sh"
}))
}
You could in principle factor out constructing the templates into a local value if you want to do it more systematically, but since your question didn't indicate that you are doing anything special with the launch templates, I've written the simplest possible approach here, where each launch template has its own template-rendering expression.
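For illustration, here is one hedged way to systematize that, using for_each over a map instead of three separate resource blocks; the map keys and script names below are made up:

locals {
  # Hypothetical map from launch template name to the script it should run.
  ami_scripts = {
    example1 = "script1.sh"
    example2 = "script2.sh"
    example3 = "script3.sh"
  }
}

resource "aws_launch_template" "example" {
  for_each = local.ami_scripts
  # ...

  user_data = base64encode(templatefile("${path.module}/../AMIs/s1_aws_userdata.sh", {
    ami        = var.dci_appserver_ami
    ami_script = each.value
  }))
}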
I have a .env file that contains the following data
API_URL=${API_URL}
API_KEY=${API_KEY}
API_SECRET=${API_SECRET}
Setting environment variables in Jenkins and passing them to the pipeline is clear. But it is not clear how I replace ${API_URL}, ${API_KEY} and ${API_SECRET} in the .env file with their values from the Jenkins environment variables. Plus, how do I loop through all the Jenkins variables?
This basically requires two steps:
Get all environment variables
Replace values of environment variables in the template (.env) file
Let's start with #2, because it dictates which kind of data #1 must produce.
2. Replace variables in a template
We can use Groovy's SimpleTemplateEngine for this task.
def result = new SimpleTemplateEngine().createTemplate( templateStr ).make( dataMap )
Here templateStr is the template string (content of your .env file) and dataMap must be a Map consisting of string keys and values (the actual values of the environment variables). Getting the template string is trivial (use Jenkins readFile step), reading the environment variables into a Map is slightly more involved.
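As a plain-Groovy illustration of that call, outside any Jenkins context:

import groovy.text.SimpleTemplateEngine

// The template uses the same ${...} placeholder syntax as the .env file.
def templateStr = 'API_URL=${API_URL}'
def dataMap = [API_URL: 'http://someurl']

def result = new SimpleTemplateEngine().createTemplate( templateStr ).make( dataMap )
assert result.toString() == 'API_URL=http://someurl'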
1. Read environment variables into a Map
I wrote "slightly more involved" because Groovy goodness makes this task quite easy aswell.
#Chris has already shown how to read environment variables into a string. What we need to do is split this string, first into separate lines and then each line into key and value. Fortunately, Groovy provides the member function splitEachLine of the String class, which can do both steps with a single call!
There is a little caveat, because splitEachLine is one of the functions that doesn't behave well in Jenkins pipeline context - it would only return the first line. Moving the critical code into a separate function, annotated with #NonCPS works around this problem.
@NonCPS
Map<String,String> envStrToMap( String envStr ) {
    def envMap = [:]
    envStr.splitEachLine('=') {
        envMap[it[0]] = it[1]
    }
    return envMap
}
Finally
Now we have all ingredients for letting Jenkins cook us a tasty template soup!
Here is a complete pipeline demo. It uses scripted style, but it should be easy to use in declarative style as well. Just replace node with a script block.
import groovy.text.SimpleTemplateEngine

node {
    // TODO: Replace the hardcoded string with:
    // def tmp = readFile file: 'yourfile.env'
    def tmp = '''\
API_URL=${API_URL}
API_KEY=${API_KEY}
API_SECRET=${API_SECRET}'''

    withEnv(['API_URL=http://someurl', 'API_KEY=123', 'API_SECRET=456']) {
        def envMap = getEnvMap()
        echo "envMap:\n$envMap"
        def tmpResolved = new SimpleTemplateEngine().createTemplate( tmp ).make( envMap )
        writeFile file: 'test.env', text: tmpResolved.toString()
        // Just for demo, to let me see the result
        archiveArtifacts artifacts: 'test.env'
    }
}
// Read all environment variables into a map.
// Here, @NonCPS must NOT be used, because we are calling a Jenkins step.
Map<String,String> getEnvMap() {
    def envStr = sh(script: 'env', returnStdout: true)
    return envStrToMap( envStr )
}
// Split a multiline string, where each line consists of key and value separated by '='.
// It is critical to use @NonCPS to make splitEachLine() work!
@NonCPS
Map<String,String> envStrToMap( String envStr ) {
    def envMap = [:]
    envStr.splitEachLine('=') {
        envMap[it[0]] = it[1]
    }
    return envMap
}
The pipeline creates an artifact "test.env" with this content:
API_URL=http://someurl
API_KEY=123
API_SECRET=456
You can access the variables by executing a simple shell command in a scripted pipeline:
def variables = sh(script: 'env|sort', returnStdout: true)
Then you can programmatically convert it to a list in Groovy and iterate over it with an each loop.
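A hedged sketch of that iteration; the readLines/tokenize split is my choice, and as the previous answer explains, it should live in a @NonCPS helper:

@NonCPS
def printEnvVars( String variables ) {
    variables.readLines().each { line ->
        def parts = line.tokenize('=')
        println "${parts[0]} -> ${parts.size() > 1 ? parts[1] : ''}"
    }
}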
As for replacing the variables: if you're not using any solution that can access env variables, then you can use simple text operations, like executing sed on the file.
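For example, a minimal shell sketch of the sed approach; the variable names are from the question, and this assumes the values contain no characters special to sed:

# Substitute each ${VAR} placeholder in .env with the value from the environment.
for var in API_URL API_KEY API_SECRET; do
    sed -i "s|\${$var}|${!var}|g" .env
done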
We have our static stack (CloudFront, S3, ...) defined as a configurable module for different projects. Now some of them need edge Lambdas, and I wanted to make those configurable (and optional!), too.
We are using the module as follows:
module "static" {
..
lambda_function_associations = [
{
event_type = "viewer-request"
lambda_arn = "${aws_lambda_function.onex_lambda_viewer_req.qualified_arn}"
},
{
event_type = "viewer-response"
lambda_arn = "${aws_lambda_function.onex_lambda_viewer_res.qualified_arn}"
},
]
..
}
and the default cache behaviour of CloudFront is defined as follows:
default_cache_behavior {
  ..
  lambda_function_association = ["${var.lambda_function_associations}"]
  ..
}
and our variable within the module:
variable "lambda_function_associations" {
type = "list"
default = []
}
Applying this stack I get:
Error: module.static.aws_cloudfront_distribution.web: "default_cache_behavior.0.lambda_function_association.0.event_type": required field is not set
Error: module.static.aws_cloudfront_distribution.web: "default_cache_behavior.0.lambda_function_association.0.lambda_arn": required field is not set
Is there no way to make them work optionally? I really don't want to duplicate the whole stack when adding an edge Lambda.
Apparently something like this works for lb_health_check configuration blocks:
https://github.com/hashicorp/terraform/issues/17292#issuecomment-393984861
Thanks in advance!
I recently stumbled upon the same issue. This is caused by a Terraform limitation, which prevents us from passing dynamic values to a nested block inside a module.
The only workaround I found was duplicating the resource declaration and creating only one of the resources, based on a condition in the count meta-argument (pass a static variable here, e.g. associate_lambda_function).
You can find more details and an example in this GitLab snippet.
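A condensed sketch of that workaround, in the Terraform 0.11-style syntax of the question; the associate_lambda_function flag and the duplicated resource names are illustrative:

variable "associate_lambda_function" {
  default = false
}

# Variant without any Lambda association.
resource "aws_cloudfront_distribution" "web" {
  count = "${var.associate_lambda_function ? 0 : 1}"
  # ... shared distribution settings ...
}

# Variant whose default_cache_behavior includes the Lambda association.
resource "aws_cloudfront_distribution" "web_with_lambda" {
  count = "${var.associate_lambda_function ? 1 : 0}"
  # ... shared distribution settings plus lambda_function_association ...
}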
I need to create a set of local variables at the beginning of a keyword test and then use them later while executing the test.
Is there any way to create local variables dynamically, just like project variables, which can be created dynamically:
Project.Variables.<variable_name> = "project_variable_value"
In a similar fashion, can we create a variable associated with a keyword test?
KeywordTests.<generic_keyword_test_name>.Variables.<variable_name> = "local_variable_value"
Sure, you can do this. Please see this example:
function Test11()
{
  if (KeywordTests.Test1.Variables.VariableExists("MyVariable") == false) {
    KeywordTests.Test1.Variables.AddVariable("MyVariable", "String");
  }

  KeywordTests.Test1.Variables.MyVariable = "test value";
  Log.Message(KeywordTests.Test1.Variables.MyVariable);
}
Information on the AddVariable method can be found in the AddVariable Method help topic.