How to restrict only one parameter in jenkins pipeline? - jenkins-pipeline

I have the pipeline script below with string parameters. The build will fail if multiple comma-separated inputs (target1, target2) are provided for the Target parameter in Jenkins. How can I restrict the Jenkins pipeline to accept just a single value for the Target parameter rather than multiple comma-separated values?
properties([
    parameters([
        string(defaultValue: '', description: '', name: 'ID'),
        string(defaultValue: '', description: '', name: 'Target')
    ])
])

What you could do is validate the value in the first stage/step:
if ((params.Target.split(',')).size() > 1) {
    error("Build failed because of this and that..")
}
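In a declarative pipeline, the same check can live in a dedicated validation stage. The following is a minimal sketch assuming the ID and Target parameters shown above; the stage names and error message are illustrative:
pipeline {
    agent any
    parameters {
        string(defaultValue: '', description: '', name: 'ID')
        string(defaultValue: '', description: '', name: 'Target')
    }
    stages {
        stage('Validate parameters') {
            steps {
                script {
                    // Reject comma-separated input such as "target1,target2"
                    if (params.Target.split(',').size() > 1) {
                        error("Only a single Target value is allowed, got: ${params.Target}")
                    }
                }
            }
        }
        stage('Deploy') {
            steps {
                echo "Deploying to ${params.Target}"
            }
        }
    }
}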

Related

In Jenkins, display the directories from a GitLab repo as a drop-down in Build with Parameters

pipeline {
    agent { label 'jenkins-slave' }
    environment {
        CRED_REPO_URL = 'gitcredentials.git'
        SERVICES_REPO_URL = 'gitrepo'
        ENV_NAME = 'dev_dev'
        REGION = 'ap-south-1'
    }
    parameters {
        choice name: 'ENV', choices: ['dev', 'qa'], description: 'Choose an env'
        choice name: 'DIRS', choices: choiceArray, description: 'Choose a dir from branch'
        gitParameter branchFilter: 'origin/(.*)',
                     tagFilter: '*',
                     defaultValue: 'main',
                     name: 'BRANCH',
                     type: 'BRANCHTAG',
                     quickFilterEnabled: 'true',
                     description: 'branch to execute',
                     useRepository: 'gitrepo'
    }
}
I am facing some issues with this and need help: when a branch is selected in Jenkins, the repository's directories should be shown as a drop-down.
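One possible approach, sketched roughly below, is to compute choiceArray in a scripted preamble before the pipeline block by listing the top-level directories of a checked-out copy of the repo. The agent label and checkout call are assumptions based on the snippet above, and the drop-down only reflects the list computed on the previous run:
def choiceArray = []
node('jenkins-slave') {
    // Assumes the job is configured from SCM; otherwise use the git step
    // with the repository URL and credentials instead of 'checkout scm'.
    checkout scm
    def dirs = sh(returnStdout: true, script: 'git ls-tree -d --name-only HEAD').trim()
    choiceArray = dirs ? dirs.split('\n').toList() : ['<none>']
}

pipeline {
    agent { label 'jenkins-slave' }
    parameters {
        choice name: 'DIRS', choices: choiceArray, description: 'Choose a dir from branch'
    }
    stages {
        stage('Show selection') {
            steps {
                echo "Selected directory: ${params.DIRS}"
            }
        }
    }
}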

How can I pass variables from a Jenkinsfile to Terraform

When I run Terraform with local variables inside variable.tf, everything works like a charm.
I want to pass Jenkins parameters into the Terraform variable.tf file so that it becomes dynamic from Jenkins.
How can I achieve this?
pipeline {
    agent any
    options {
        skipDefaultCheckout true
    }
    environment {
        TF_VAR_datacenter="${DATA_CENTER}"
        TF_VAR_cluster="${CLUSTER}"
        TF_VAR_esxi="${ESXI}"
        TF_VAR_datastore="${DATASTORE}"
        TF_VAR_network="${NETWORK}"
        TF_VAR_server_hostname="${SERVER_HOSTNAME}"
        TF_VAR_server_mac="${SERVER_MAC}"
    }
    parameters {
        string(name: 'DATA_CENTER', defaultValue: 'xxx', description: 'vcenter data center')
        string(name: 'CLUSTER', defaultValue: 'xxx', description: 'data center cluster')
        string(name: 'ESXI', defaultValue: 'xxx', description: 'esxi hostname')
        string(name: 'DATASTORE', defaultValue: 'xxx', description: 'data center datastore')
        string(name: 'NETWORK', defaultValue: 'xxx', description: 'data center network')
        string(name: 'SERVER_HOSTNAME', defaultValue: 'xxx', description: 'server hostname')
        string(name: 'SERVER_MAC', defaultValue: 'xxx', description: 'server mac')
        string(name: 'SERVER_IP', defaultValue: 'xxx', description: 'server ip')
        string(name: 'SERVER_NETMASK', defaultValue: 'xxx', description: 'server netmask')
        string(name: 'SERVER_GATEWAY', defaultValue: 'xxx', description: 'server gateway')
        string(name: 'COBBLER_PROFILE', defaultValue: 'xxx', description: 'cobbler profile')
        choice(name: 'BUILD_DESTROY', description: '', choices: ['build', 'destroy'])
    }
    stages {
        stage('OS PROVISION') {
            steps {
                dir("/root/terraform") {
                    sh """
                        export TF_VAR_datacenter=${DATA_CENTER}
                        export TF_VAR_cluster=${CLUSTER}
                        export TF_VAR_esxi=${ESXI}
                        export TF_VAR_datastore=${DATASTORE}
                        export TF_VAR_network=${NETWORK}
                        export TF_VAR_server_hostname=${SERVER_HOSTNAME}
                        export TF_VAR_server_mac=${SERVER_MAC}
                        terraform init
                        terraform apply -auto-approve
                    """
                }
            }
        }
    }
    post {
        always {
            echo 'This will always run'
        }
    }
}
I prefer to use this format (Groovy interpolates the values when this is run inside an sh step):
terraform apply \
    -var 'vpc_id=${AWS_VPC_ID}' \
    -var 'subnet_id=${AWS_SUBNET_ID}' \
    -var 'aws_region=${AWS_REGION}' \
    -var 'ami_id=${AMI_ID}' \
    -var 'instance_type=${AWS_EC2_TYPE}' \
    -var 'key_pair=${KEY_PAIR_NAME}' \
    -var 'tags={ "Owner":"${OWNER}", "Service":"${SERVICE}", "Terraform":"true", "Env":"${ENV}" }'
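For context, here is a rough sketch of how such a command might sit inside a declarative stage. The parameter names (AWS_VPC_ID, AWS_SUBNET_ID, AWS_REGION) follow the answer above and are assumed to be defined as string parameters on the job:
stage('Terraform apply') {
    steps {
        dir('/root/terraform') {
            // Groovy interpolates ${params.*} before the shell sees the command
            sh """
                terraform init
                terraform apply -auto-approve -var 'vpc_id=${params.AWS_VPC_ID}' -var 'subnet_id=${params.AWS_SUBNET_ID}' -var 'aws_region=${params.AWS_REGION}'
            """
        }
    }
}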
Your question needs a bit more clarity, but I am going to make a few educated guesses.
with local variables inside variable.tf
Do you mean locals {, or what exactly do you mean by local variables?
pass Jenkins parameters inside terraform variable.tf file
This isn't the right file to 'pass parameters'. *.tf files are for declaring variables.
You are probably looking at *.tfvars files.
You have 3 options with *.tfvars files:
a file named exactly terraform.tfvars
a file named <anything>.auto.tfvars
a file named <anything>.tfvars which you reference using the -var-file CLI parameter.
The format of *.tfvars files is simply:
var1_name = var1_value
You can (and must) use the usual HCL markup for strings, lists, maps, and so on.
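To tie this back to Jenkins, here is a hedged sketch of one way to generate a tfvars file from the job's parameters and pass it with -var-file. The file name dev.tfvars and the variable names are illustrative and assume matching declarations in variable.tf:
stage('Terraform') {
    steps {
        dir('/root/terraform') {
            // Write plain HCL "name = value" assignments; leading whitespace is fine in HCL
            writeFile file: 'dev.tfvars', text: """
                datacenter      = "${params.DATA_CENTER}"
                cluster         = "${params.CLUSTER}"
                esxi            = "${params.ESXI}"
                datastore       = "${params.DATASTORE}"
                network         = "${params.NETWORK}"
                server_hostname = "${params.SERVER_HOSTNAME}"
                server_mac      = "${params.SERVER_MAC}"
            """
            sh 'terraform init'
            sh 'terraform apply -auto-approve -var-file=dev.tfvars'
        }
    }
}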

Cloudbees Jenkins: Triggering a downstream job in a different Jenkins instance

Objective: To trigger a downstream job from a different Jenkins instance and display the console output in the upstream job.
Job type: Pipeline scripts.
The complete code is below:
properties([
    parameters([
        string(name: 'var1', defaultValue: "value1", description: ''),
        string(name: 'var2', defaultValue: "value2", description: ''),
        string(name: 'var3', defaultValue: "value3", description: '')
    ])
])
node('unique tag') {
    stage("Trigger downstream") {
        // From Jenkins
        def remoteRunWrapper = triggerRemoteJob(
            mode: [$class: 'ConfirmStarted', timeout: [timeoutStr: '1h'], whenTimeout: [$class: 'StopAsFailure']],
            remotePathMissing: [$class: 'StopAsFailure'],
            parameterFactories: [
                [$class: 'SimpleString', name: 'var1', value: var1],
                [$class: 'SimpleString', name: 'var2', value: var2],
                [$class: 'SimpleString', name: 'var3', value: var3]
            ],
            remotePathUrl: 'jenkins://..'
        )
        print(remoteRunWrapper.toString())
        // would want to use other capabilities offered by remoteRunWrapper
    }
}
The triggerRemoteJob step is able to trigger the downstream job and returns an instance of RemoteRunWrapper after the job has started. The RemoteRunWrapper instance should provide capabilities that allow me to check on the downstream job and retrieve its logs. However, I could not find any documentation on RemoteRunWrapper. The methods described in the RunWrapper documentation cannot be used; the script fails with the error:
groovy.lang.MissingMethodException: No signature of method: com.cloudbees.opscenter.triggers.RemoteRunWrapper.getId() is applicable for argument types: () values: []
How can I find the capabilities offered by RemoteRunWrapper? Are there any better ways to achieve this?
Note:
1) The use of
mode: [$class: 'ConfirmStarted', timeout: [timeoutStr: '1h'], whenTimeout: [$class: 'StopAsFailure']],
remotePathUrl: 'jenkins://...'
is necessary because the alternative below:
remoteJenkinsUrl: 'https://myjenkins:8080/...'
job: 'TheJob'
from the triggerRemoteJob documentation fails to trigger the job and returns a null object, and the methods described here also cause the script to fail with a MissingMethodException.
2) [$class: 'RemoteBuildConfiguration'] provides an option 'enhancedLogging' that allows the console output of the remote job to also be logged. However, when used, a ClassNotFoundException is seen (the import statement was included).
3) It does not really matter whether the downstream job is triggered asynchronously or synchronously, as long as it is possible to log the console output of the downstream job in the console output of the upstream job.
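Since the class is undocumented, one pragmatic way to answer "what capabilities does RemoteRunWrapper offer?" is to introspect the returned object from the pipeline itself. This is a hedged sketch using plain Java reflection; in a sandboxed pipeline these calls may require script approval, and the resulting method list depends on the plugin version:
// Right after the triggerRemoteJob call in the stage above:
echo "Wrapper class: ${remoteRunWrapper.getClass().getName()}"
// List the public methods the wrapper actually exposes
def methodNames = remoteRunWrapper.getClass().getMethods().collect { it.name }.unique().sort()
echo "Available methods: ${methodNames.join(', ')}"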

How to declare variables in a Jenkinsfile

I am trying to fetch the repo details from a variable in the Jenkinsfile. Can someone explain why this is not working?
parameters {
    string(defaultValue: "develop", description: 'enter the branch name to use', name: 'branch')
    string(defaultValue: "repo1", description: 'enter the repo name to use', name: 'reponame')
}
stage('Branch Update') {
    dir("${param.reponame}") {
        bat """ echo branch is ${params.branch}"""
    }
}
When I run the above, I get the following error message:
groovy.lang.MissingPropertyException: No such property: param for class: groovy.lang.Binding
[Pipeline] }
You have a typo. Instead of param.reponame it should be params.reponame.
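For reference, the corrected stage looks like this (only the dir() argument changes):
stage('Branch Update') {
    dir("${params.reponame}") {
        bat """ echo branch is ${params.branch}"""
    }
}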

Jenkins declarative pipeline - User input parameters

I've looked for examples of user input parameters using the Jenkins declarative pipeline; however, all the examples use scripted pipelines. Here is a sample of the code I'm trying to get working:
pipeline {
    agent any
    stages {
        stage('Stage 1') {
            steps {
                input id: 'test', message: 'Hello', parameters: [string(defaultValue: '', description: '', name: 'myparam')]
                sh "echo ${env}"
            }
        }
    }
}
I can't seem to work out how I can access the myparam variable; it would be great if someone could help me out.
Thanks
When using input, it is very important to use agent none on the global pipeline level, and assign agents to individual stages. Put the input procedures in a separate stage that also uses agent none. If you allocate an agent node for the input stage, that agent executor will remain reserved by this build until a user continues or aborts the build process.
This example should help with using the Input:
def approvalMap // collect data from approval step
pipeline {
    agent none
    stages {
        stage('Stage 1') {
            agent none
            steps {
                timeout(60) { // timeout waiting for input after 60 minutes
                    script {
                        // capture the approval details in approvalMap
                        approvalMap = input(
                            id: 'test',
                            message: 'Hello',
                            ok: 'Proceed?',
                            parameters: [
                                choice(
                                    choices: 'apple\npear\norange',
                                    description: 'Select a fruit for this build',
                                    name: 'FRUIT'
                                ),
                                string(
                                    defaultValue: '',
                                    description: '',
                                    name: 'myparam'
                                )
                            ],
                            submitter: 'user1,user2,group1',
                            submitterParameter: 'APPROVER'
                        )
                    }
                }
            }
        }
        stage('Stage 2') {
            agent any
            steps {
                // print the details gathered from the approval
                echo "This build was approved by: ${approvalMap['APPROVER']}"
                echo "This build is brought to you today by the fruit: ${approvalMap['FRUIT']}"
                echo "This is myparam: ${approvalMap['myparam']}"
            }
        }
    }
}
When the input step returns, if it has only a single parameter to return, it returns that value directly. If there are multiple parameters in the input, it returns a map (hash, dictionary) of the values. To capture this value we have to drop into Groovy scripting.
It is good practice to wrap your input code in a timeout step so that builds don't remain in an unresolved state for an extended time.
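As an illustration of the single-parameter case described above, this small hedged snippet shows input returning the chosen value directly rather than a map (the message and choices are made up for the example):
script {
    // Only one parameter, so input returns the selected string itself
    def fruit = input(
        message: 'Pick a fruit',
        parameters: [choice(choices: 'apple\npear\norange', name: 'FRUIT')]
    )
    echo "You picked: ${fruit}"
}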
