I have a Jenkins pipeline where I need to run the ansiblePlaybook() step with an inventory file. The inventory file name contains the current date (my-instance-ips-(mm-dd-yyyy)). I am having trouble creating a currentDate variable and passing it into the pipeline script.
Jenkinsfile:
pipeline {
    agent any
    stages {
        stage('Executing shell script') {
            steps {
                script {
                    sh """
                    currentDate = "\$(date +'%m-%d-%Y')"
                    inventory_file = "my-instance-ips-{$currentDate}.yml"
                    ansiblePlaybook (
                        playbook: 'task.yml',
                        inventory: $inventory_file,
                        installation: 'ansible-2.6.5','
                        -e "DATE = $currentDate")
                    """
                }
            }
        }
    }
}
Error Message:
groovy.lang.MissingPropertyException: No such property: currentDate for class: groovy.lang.Binding
Could someone help me create the current date in the pipeline script and pass it over to the ansiblePlaybook command?
Looks like you are invoking the Ansible plugin. If that is what you are trying to achieve, then your ansiblePlaybook call should not be inside an sh step.
You need to get the command's output first and then invoke the plugin.
import java.time.format.DateTimeFormatter

pipeline {
    agent any
    stages {
        stage('Executing shell script') {
            steps {
                script {
                    /* Shell-based alternative. Note returnStdout (not returnStatus),
                       so the command's output rather than its exit code is returned,
                       and the comma between the named arguments:
                    currentDate = sh(
                        script: "date +'%m-%d-%Y'",
                        returnStdout: true
                    ).trim()
                    */
                    cDate = java.time.LocalDate.now()
                    currentDate = cDate.format(DateTimeFormatter.ofPattern("MM-dd-yyyy"))
                    inventory_file = "my-instance-ips-${currentDate}.yml"
                    println inventory_file
                    ansiblePlaybook([
                        playbook: 'task.yml',
                        // credentialsId: 'xxxx',
                        disableHostKeyChecking: true,
                        inventory: "${inventory_file}",
                        extraVars: [
                            DATE: "${currentDate}"
                        ]
                    ])
                }
            }
        }
    }
}
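One hedged refinement: LocalDate.now() uses the Jenkins controller's default timezone, so if your agents and controller disagree about the date near midnight, you may want to pin the zone explicitly (UTC below is just an example):

import java.time.LocalDate
import java.time.ZoneId
import java.time.format.DateTimeFormatter

// Pin the zone so the inventory file name does not depend on the
// controller's local timezone settings.
def currentDate = LocalDate.now(ZoneId.of('UTC'))
        .format(DateTimeFormatter.ofPattern('MM-dd-yyyy'))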
Related
How do I pass values to a shell script from Jenkins during the runtime of the pipeline job?
I have a shell script and want to pass the values dynamically.
#!/usr/bin/env bash
....
# some code
....
export USER=""      # <--- want to pass this value from pipeline
export password=""  # <--- possibly as a secret
The Jenkins pipeline that executes the above shell script:
node('abc') {
    stage('build') {
        sh "cd .."
        sh "./script.sh"
    }
}
You can do something like the following:
pipeline {
    agent any
    environment {
        USER_PASS_CREDS = credentials('user-pass')
    }
    stages {
        stage('build') {
            steps {
                // Each sh step runs in its own shell, so the cd and the script
                // invocation must be chained in a single step. The *_USR and
                // *_PSW variables are exported to the environment, so the
                // single-quoted string (shell expansion) is what you want.
                sh 'cd .. && ./script.sh "$USER_PASS_CREDS_USR" "$USER_PASS_CREDS_PSW"'
            }
        }
    }
}
The credentials() helper comes from the Credentials API and Credentials plugin. Your other option is the Credentials Binding plugin, which allows you to include credentials as part of a build step:
stage('build with creds') {
    steps {
        withCredentials([usernamePassword(credentialsId: 'user-pass', usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD')]) {
            // available as an env variable, but will be masked if you try to print it out any which way
            // note: single quotes prevent Groovy interpolation; expansion is by Bourne Shell, which is what you want
            sh 'echo $PASSWORD'
            // also available as a Groovy variable
            echo USERNAME
            // or inside double quotes for string interpolation
            echo "username is $USERNAME"
            sh './script.sh $USERNAME $PASSWORD'
        }
    }
}
Hopefully this helps.
I declared an environment variable in the pipeline's environment block and I'm trying to assign it a value by reading a file from the workspace. The assigned value is not reflected in the environment variable. My configuration looks like this:
pipeline {
    agent any
    environment {
        test = ''
    }
    stages {
        stage('Test') {
            steps {
                script {
                    writeFile(file: 'hello.txt', text: "hello world")
                    env.test = readFile(file: 'hello.txt')
                    echo 'test:'"${env.test}" // coming as null
                }
            }
        }
    }
}
Try removing test from the environment block; the entry declared there takes precedence over the value you assign to env.test at runtime.
Also, you have a problem with mixing '' and "" when you display env.test; use a single double-quoted string:
echo "test: ${env.test}"
I am stuck on a for loop condition in my pipeline:
pipeline {
    agent any
    stages {
        stage('Showing Working Space') {
            when {
                anyOf {
                    environment name: 'Test', value: 'ALL'
                    environment name: 'Test', value: 'IMAGE'
                }
            }
            steps {
                sh "echo Display ${Var1}"
                script {
                    sh 'for service in (echo "$Var1"|sed "s/,/ /g");do echo $service; done'
                }
            }
        }
    }
}
I am getting an error like "syntax error near unexpected token `('".
Var1 has multiple comma-separated values.
I need the for loop to pass each value to another script.
Please help with this.
I believe what you want is
pipeline {
    agent any
    stages {
        stage('Showing Working Space') {
            when {
                anyOf {
                    environment name: 'Test', value: 'ALL'
                    environment name: 'Test', value: 'IMAGE'
                }
            }
            steps {
                sh "echo Display ${Var1}"
                script {
                    sh 'for service in $(echo "$Var1"|sed "s/,/ /g"); do echo $service; done'
                }
            }
        }
    }
}
In essence, replace service in (echo with service in $(echo (note the $, which turns the parentheses into a command substitution).
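As an aside, a sketch of the same loop done on the Groovy side instead (assuming Var1 is available to the pipeline as an environment variable), which sidesteps the shell quoting entirely:

script {
    // split the comma-separated list in Groovy rather than with sed
    env.Var1.split(',').each { service ->
        sh "echo ${service}"
    }
}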
I am using Terraform to build my EC2 instances. As part of the instance bootstrap, I added a cloud-init config to run multiple user-data scripts, but the content_type = "text/x-shellscript" part is always executed first. I verified this in /var/log/cloud-init-output.log: it shows the shell script is invoked first. How do I configure the shell script to run last?
data "template_cloudinit_config" "myapp_cloudinit_config" {
gzip = false
base64_encode = false
# Main cloud-config configuration file.
part {
content_type = "text/cloud-config"
content = "${data.template_file.base_bootstrap_file.rendered}"
merge_type = "list(append)+dict(recurse_array)+str()"
}
part {
content_type = "text/cloud-config"
content = "${module.template_file_appsec_init.appsec_user_data_rendered}"
merge_type = "list(append)+dict(recurse_array)+str()"
}
part {
content_type = "text/x-shellscript"
content = "${module.template_file_beat_init.beat_user_data_rendered}"
}
}
The shell script looks like this:
module " template_file_beat_init" {
source = "url" #the source url contains the zip file which includes the below shell script
}
#!/bin/sh
deploy_the_app() {
//invoke ansible playbook execution
}
deploy_the_app
Cloud provider: AWS
OS : RHEL 8.3
cloud-init --version: /usr/bin/cloud-init 19.4
Terraform v0.11.8
I have a pipeline job which runs with the below pipeline Groovy script:
pipeline {
    agent none
    parameters {
        string(name: 'Unique_Number', defaultValue: '', description: 'Enter Unique Number')
    }
    stages {
        stage('Build') {
            agent { node { label 'Build' } }
            steps {
                script {
                    sh './build.sh'
                }
            }
        }
        stage('Deploy') {
            agent { node { label 'Deploy' } }
            steps {
                script {
                    sh './deploy.sh'
                }
            }
        }
        stage('Test') {
            agent { node { label 'Test' } }
            steps {
                script {
                    sh './test.sh'
                }
            }
        }
    }
}
I trigger this job multiple times with a different unique ID number as the input parameter, so I end up with multiple runs/builds of this job sitting at different stages.
Given that, I need to promote multiple runs/builds to the next stage (i.e., from Build to Deploy, or from Deploy to Test) as one single build, instead of promoting each and every run/build separately. Is there any way to do this?
I was also trying to do the same thing and found no relevant answers; maybe this helps someone.
The pipeline below reads a file that contains Jenkins job names and runs those jobs iteratively from one single job.
Please adapt the code to your Jenkins setup.
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                script {
                    git branch: 'Your Branch name', credentialsId: 'Your credentials', url: 'Your BitBucket Repo URL'
                    // read the file from the workspace that contains the Jenkins job names
                    def filePath = readFile "${WORKSPACE}/Your File Location"
                    // read the file line by line
                    def lines = filePath.readLines()
                    // iterate and run the Jenkins jobs one by one
                    for (line in lines) {
                        build(job: "$line/branchName",
                            parameters: [
                                string(name: 'vertical', value: "${params.vertical}"),
                                string(name: 'environment', value: "${params.environment}"),
                                string(name: 'branch', value: "${params.aerdevops_branch}"),
                                string(name: 'project', value: "${params.host_project}")
                            ]
                        )
                    }
                }
            }
        }
    }
}
You can start multiple jobs from one pipeline if you run something like:
build job:"One", wait: false
build job:"Two", wait: false
Your main job starts the child pipelines, and the child pipelines run in parallel.
You can read the Pipeline Build Step documentation for more information.
Also, you can read about parallel runs in a declarative pipeline.
Here you can find a lot of examples for parallel running.
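For illustration, a minimal declarative sketch (the job names One and Two are placeholders) that fans out to two downstream jobs in parallel:

pipeline {
    agent any
    stages {
        stage('Fan out') {
            parallel {
                stage('One') {
                    steps {
                        build job: 'One', wait: false
                    }
                }
                stage('Two') {
                    steps {
                        build job: 'Two', wait: false
                    }
                }
            }
        }
    }
}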