When I run Terraform with local variables inside variable.tf, everything works like a charm.
I want to pass Jenkins parameters into the Terraform variable.tf file so it becomes dynamic from Jenkins.
How can I achieve this?
pipeline {
    agent any
    options {
        skipDefaultCheckout true
    }
    environment {
        TF_VAR_datacenter      = "${DATA_CENTER}"
        TF_VAR_cluster         = "${CLUSTER}"
        TF_VAR_esxi            = "${ESXI}"
        TF_VAR_datastore       = "${DATASTORE}"
        TF_VAR_network         = "${NETWORK}"
        TF_VAR_server_hostname = "${SERVER_HOSTNAME}"
        TF_VAR_server_mac      = "${SERVER_MAC}"
    }
    parameters {
        string(name: 'DATA_CENTER', defaultValue: 'xxx', description: 'vcenter data center')
        string(name: 'CLUSTER', defaultValue: 'xxx', description: 'data center cluster')
        string(name: 'ESXI', defaultValue: 'xxx', description: 'esxi hostname')
        string(name: 'DATASTORE', defaultValue: 'xxx', description: 'data center datastore')
        string(name: 'NETWORK', defaultValue: 'xxx', description: 'data center network')
        string(name: 'SERVER_HOSTNAME', defaultValue: 'xxx', description: 'server hostname')
        string(name: 'SERVER_MAC', defaultValue: 'xxx', description: 'server mac')
        string(name: 'SERVER_IP', defaultValue: 'xxx', description: 'server ip')
        string(name: 'SERVER_NETMASK', defaultValue: 'xxx', description: 'server netmask')
        string(name: 'SERVER_GATEWAY', defaultValue: 'xxx', description: 'server gateway')
        string(name: 'COBBLER_PROFILE', defaultValue: 'xxx', description: 'cobbler profile')
        choice(name: 'BUILD_DESTROY', description: '', choices: ['build', 'destroy'])
    }
    stages {
        stage('OS PROVISION') {
            steps {
                dir("/root/terraform") {
                    sh """
                        export TF_VAR_datacenter=${DATA_CENTER}
                        export TF_VAR_cluster=${CLUSTER}
                        export TF_VAR_esxi=${ESXI}
                        export TF_VAR_datastore=${DATASTORE}
                        export TF_VAR_network=${NETWORK}
                        export TF_VAR_server_hostname=${SERVER_HOSTNAME}
                        export TF_VAR_server_mac=${SERVER_MAC}
                        terraform init
                        terraform apply -auto-approve
                    """
                }
            }
        }
    }
    post {
        always {
            echo 'This will always run'
        }
    }
}
I would prefer to use this format:
terraform apply \
  -var "vpc_id=${AWS_VPC_ID}" \
  -var "subnet_id=${AWS_SUBNET_ID}" \
  -var "aws_region=${AWS_REGION}" \
  -var "ami_id=${AMI_ID}" \
  -var "instance_type=${AWS_EC2_TYPE}" \
  -var "key_pair=${KEY_PAIR_NAME}" \
  -var "tags={ \"Owner\":\"${OWNER}\", \"Service\":\"${SERVICE}\", \"Terraform\":\"true\", \"Env\":\"${ENV}\" }"
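In the Jenkins sh step above, I assume that format would look roughly like this (just a sketch; the Terraform variable names are placeholders that would have to match the declarations in variable.tf):
sh """
    terraform init
    terraform apply -auto-approve \\
      -var "datacenter=${params.DATA_CENTER}" \\
      -var "cluster=${params.CLUSTER}" \\
      -var "esxi=${params.ESXI}" \\
      -var "datastore=${params.DATASTORE}" \\
      -var "network=${params.NETWORK}" \\
      -var "server_hostname=${params.SERVER_HOSTNAME}" \\
      -var "server_mac=${params.SERVER_MAC}"
"""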
Your question needs a bit more clarity, but I am going to make a few educated guesses.
with local variables inside variable.tf
Do you mean a locals { } block, or what exactly do you mean by local variables?
pass Jenkins parameters inside terraform variable.tf file
This isn't the right file to 'pass parameters'. *.tf files are for declaring variables.
You are probably looking for *.tfvars files.
You have 3 options with *.tfvars files:
a file named exactly terraform.tfvars
a file named <anything>.auto.tfvars
a file named <anything>.tfvars which you reference using the -var-file CLI parameter.
The format of *.tfvars files is simply:
var1_name = var1_value
You can (must) use the usual HCL markup for strings, lists, maps, ...
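To tie that back to your Jenkins question, one possibility is to render the pipeline parameters into a tfvars file and pass it with -var-file. This is only a sketch: the writeFile step and the jenkins.tfvars name are my choices, and the variable names are assumed to match what variable.tf declares.
stage('OS PROVISION') {
    steps {
        dir("/root/terraform") {
            // Render Jenkins parameters into a *.tfvars file (leading whitespace is harmless in HCL)
            writeFile file: 'jenkins.tfvars', text: """
                datacenter      = "${params.DATA_CENTER}"
                cluster         = "${params.CLUSTER}"
                esxi            = "${params.ESXI}"
                datastore       = "${params.DATASTORE}"
                network         = "${params.NETWORK}"
                server_hostname = "${params.SERVER_HOSTNAME}"
                server_mac      = "${params.SERVER_MAC}"
            """
            sh 'terraform init'
            sh 'terraform apply -auto-approve -var-file=jenkins.tfvars'
        }
    }
}
If you instead name the file terraform.tfvars or something ending in .auto.tfvars, Terraform picks it up automatically and the -var-file flag is not needed.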
Related
agent { label 'jenkins-slave' }
environment {
    CRED_REPO_URL = 'gitcredentials.git'
    SERVICES_REPO_URL = 'gitrepo'
    ENV_NAME = 'dev_dev'
    REGION = 'ap-south-1'
}
parameters {
    choice name: 'ENV', choices: ['dev', 'qa'], description: 'Choose an env'
    choice name: 'DIRS', choices: choiceArray, description: 'Choose a dir from branch'
    gitParameter branchFilter: 'origin/(.*)',
        tagFilter: '*',
        defaultValue: 'main',
        name: 'BRANCH',
        type: 'BRANCHTAG',
        quickFilterEnabled: 'true',
        description: 'branch to execute',
        useRepository: 'gitrepo'
}
}
I am facing an issue and need some assistance: when a branch is selected in Jenkins, the directories of that repo should be shown as a drop-down.
I am trying to use a function's result in a Jenkinsfile declarative pipeline parameter's description, but I am struggling to make it work.
The idea is to have a Jenkins job specific to each environment, and I would like the parameter description to show the environment name.
My pipeline looks like this:
def check_env = app_env(ENVS, env.JOB_NAME)
pipeline {
    agent { label 'master' }
    options {
        disableConcurrentBuilds()
        buildDiscarder(logRotator(numToKeepStr: '20'))
        timestamps()
    }
    parameters {
        string(name: 'myVariable', defaultValue: "/", description: 'Enter Path To App e.g / OR /dummy_path for ' {check_env} )
    }
    stages {
        stage('Running App') {
            agent {
                docker {
                    image 'myApp:latest'
                }
            }
            steps {
                script {
                    sh label: 'App', script: "echo 'App is running in ${check_env}'"
                }
            }
        }
    }
}
I have tried multiple combinations for check_env, e.g. check_env, check_env(), and ${check_env}, but none of them worked.
String Interpolation
I believe this is simply a string interpolation issue. Notice my use of double quotes below:
parameters {
    string(name: 'myVariable', defaultValue: "/", description: "Enter Path To App e.g / OR /dummy_path for ${check_env}")
}
Your build page should then show the interpolated variable.
To test, I simply set def check_env = 'live', since I do not have the code for your method.
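For completeness, here is roughly what I tested (a sketch only: check_env is hard-coded because I do not have your app_env function):
def check_env = 'live'   // stand-in for app_env(ENVS, env.JOB_NAME)

pipeline {
    agent { label 'master' }
    parameters {
        // Double quotes let Groovy interpolate ${check_env} into the description
        string(name: 'myVariable', defaultValue: "/", description: "Enter Path To App e.g / OR /dummy_path for ${check_env}")
    }
    stages {
        stage('Show value') {
            steps {
                echo "check_env resolved to ${check_env}"
            }
        }
    }
}
Keep in mind that parameter definitions are refreshed when the pipeline runs, so the interpolated description only shows up on the build page from the next run onwards.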
I am on Jenkins 2.289.1 and am converting an existing job to a pipeline.
The same batch file that works in the job does not work in the pipeline.
Even when the batch file does not exist, no error is thrown and the task ends as completed.
Any idea what the issue is?
pipeline {
    agent any
    parameters {
        choice(name: 'RELEASE', choices: ['860', '859', '858', '857'], description: 'Pick something')
        string(name: 'SrcTestSetNameToCopy', defaultValue: '', description: 'Source ALM test Set Name')
        string(name: 'TestSetNameToBeCreated', defaultValue: '', description: 'Test Set Name to create')
        choice(name: 'Platform', choices: ['ORACLE', 'MICROSFT', 'DB2ODBC'], description: 'Pick something')
        string(name: 'BuildOverride', defaultValue: '', description: '4 digit build override value')
        choice(name: 'EnvnBuildType', choices: ['DEP', 'QAE'], description: 'Pick something')
        booleanParam(name: 'TOGGLE', defaultValue: true, description: 'Toggle this value')
    }
    stages {
        stage('Create ALM Test Set') {
            steps {
                // bat "\"C:\\JenKin_Jobs\\Test.bat\""
                // bat 'C:/JenKin_Jobs/Test1.bat'
                // bat 'wmic computersystem get name'
                // bat 'echo %PATH%'
                echo 'selva'
                echo "Current workspace is $WORKSPACE"
                // bat returnStatus: true, script: 'C:\\JenKin_Jobs\\Test.bat'
                bat script: 'C:\\JenKin_Jobs\\Test.bat'
            }
        }
    }
}
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                // below simple echo executed
                echo 'Hello World'
                // bat 'C:\\JenKin_Jobs\\NetUSeIDrive.bat'
                bat 'cmd.exe "/c C:\\JenKin_Jobs\\NetUSeIDrive.bat" '
                bat 'cmd.exe /c c:\\JenKin_Jobs\\SQAClnUp.bat "I:\\\\***\\\\SQA_CONFIG_FILES\\\\859 Stuff\\\\####\\\\P05\\\\P05B"'
            }
        }
    }
}
The pipeline works on one box but not on another. Is there some configuration missing between the boxes? Both are using the same Jenkins version.
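To narrow this down on the failing box, the returnStatus variant that is commented out above can be used to surface the exit code, e.g. (a sketch; the fileExists check and the messages are my own additions):
stage('Create ALM Test Set') {
    steps {
        script {
            // Fail fast if the batch file is missing on this agent
            if (!fileExists('C:\\JenKin_Jobs\\Test.bat')) {
                error('C:\\JenKin_Jobs\\Test.bat was not found on this agent')
            }
            // returnStatus: true returns the exit code instead of failing the step silently
            def rc = bat(returnStatus: true, script: 'C:\\JenKin_Jobs\\Test.bat')
            echo "Test.bat exited with code ${rc}"
            if (rc != 0) {
                error("Test.bat failed with exit code ${rc}")
            }
        }
    }
}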
I have the below pipeline script with string parameters. The Target parameter will fail if multiple comma-separated inputs (target1, target2) are provided in Jenkins. How can I restrict the Jenkins pipeline to accept just a single Target value and not multiple comma-separated values?
properties([
    parameters([
        string(defaultValue: '', description: '', name: 'ID'),
        string(defaultValue: '', description: '', name: 'Target')
    ])
])
What you could do in the first stage/step is validate the input:
if ((params.Target.split(',')).size() > 1) {
    error("Build failed because of this and that..")
}
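Wrapped into a validation stage it could look like this (a sketch; the stage name and messages are my own):
pipeline {
    agent any
    parameters {
        string(defaultValue: '', description: '', name: 'ID')
        string(defaultValue: '', description: '', name: 'Target')
    }
    stages {
        stage('Validate parameters') {
            steps {
                script {
                    // Reject input such as "target1, target2" before doing any real work
                    if (params.Target.split(',').size() > 1) {
                        error("Only a single Target value is allowed, got: ${params.Target}")
                    }
                }
            }
        }
    }
}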
I want to use the AWS::Include transform macro with some dynamic parameters for my file.
Resources:
  'Fn::Transform':
    Name: 'AWS::Include'
    Parameters:
      TestMacroVariable:
        Default: 2
        Type: Number
      Location: !Sub "s3://${InstallBucketName}/test.yaml"
test.yaml:
DataAutoScalingGroup:
  Type: AWS::AutoScaling::AutoScalingGroup
  Properties:
    LaunchConfigurationName:
      Ref: DataLaunchConfiguration
    MinSize: '1'
    MaxSize: '100'
    DesiredCapacity:
      Ref: TestMacroVariable
...
After calling: aws cloudformation describe-stack-events --stack-name $stack
I get:
"ResourceStatusReason": "The value of parameter TestMacroVariable
under transform Include must resolve to a string, number, boolean or a
list of any of these.. Rollback requested by user."
When I try to do it this way:
Resources:
  'Fn::Transform':
    Name: 'AWS::Include'
    Parameters:
      TestMacroVariable: 2
      Location: !Sub "s3://${InstallBucketName}/test.yaml"
I get:
"ResourceStatusReason": "Template format error: Unresolved resource
dependencies [TestMacroVariable] in the Resources block of the
template"
The error is the same when I don't provide TestMacroVariable at all.
I tried different types: String, Number, Boolean, List; none of them work.
As far as I know, you cannot have anything other than the Location key in the Parameters section of AWS::Include. Check the AWS documentation.
As an alternative, you can pass in the whole S3 path as a parameter and reference it in Location:
Parameters:
  MyS3Path:
    Type: String
    Default: 's3://my-cf-templates/my-include.yaml'
...
'Fn::Transform':
  Name: 'AWS::Include'
  Parameters:
    Location: !Ref MyS3Path
Building on what @BatteryAcid said, you can reference parameters of your CloudFormation template directly from the included file using the Fn::Sub function.
In your CF template:
Parameters:
  TableName:
    Type: String
    Description: Table Name of the Dynamo DB Users table
    Default: 'Users'
In the file you are including:
"Resource": [
{
"Fn::Sub": [
"arn:${AWS::Partition}:dynamodb:${AWS::Region}:${AWS::AccountId}:table/${tableName}",
{
"tableName": {
"Ref": "TableName"
}
}
]
}
Alternatively, it doesn't have to be a parameter; you can reference any resource from your template:
Fn::Sub: arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${QueryTelemtryFunction.Arn}/invocations