Cloudbees Jenkins: Triggering a downstream job in a different Jenkins instance - jenkins-pipeline

Objective: To trigger a downstream job in a different Jenkins instance and display its console output in the upstream job.
Job type: Pipeline scripts.
The complete code is below:
properties([
    parameters([
        string(name: 'var1', defaultValue: "value1", description: ''),
        string(name: 'var2', defaultValue: "value2", description: ''),
        string(name: 'var3', defaultValue: "value3", description: '')
    ])
])
node('unique tag') {
    stage("Trigger downstream") {
        // From Jenkins
        def remoteRunWrapper = triggerRemoteJob(
            mode: [$class: 'ConfirmStarted', timeout: [timeoutStr: '1h'], whenTimeout: [$class: 'StopAsFailure']],
            remotePathMissing: [$class: 'StopAsFailure'],
            parameterFactories: [[$class: 'SimpleString', name: 'var1', value: var1], [$class: 'SimpleString', name: 'var2', value: var2], [$class: 'SimpleString', name: 'var3', value: var3]],
            remotePathUrl: 'jenkins://..'
        )
        print(remoteRunWrapper.toString())
        // would want to use other capabilities offered by remoteRunWrapper
    }
}
triggerRemoteJob triggers the downstream job and returns an instance of RemoteRunWrapper after the job has started. That RemoteRunWrapper instance should provide capabilities for checking on the downstream job and retrieving its logs; however, I could not find any documentation for RemoteRunWrapper. The methods described in the RunWrapper documentation cannot be used, and the script fails with the error:
groovy.lang.MissingMethodException: No signature of method: com.cloudbees.opscenter.triggers.RemoteRunWrapper.getId() is applicable for argument types: () values: []
How can I find the capabilities offered by RemoteRunWrapper? Are there better ways to achieve this?
Note:
1) The use of
mode: [$class: 'ConfirmStarted', timeout: [timeoutStr: '1h'], whenTimeout: [$class: 'StopAsFailure']],
remotePathUrl: 'jenkins://...'
is necessary, because the alternative below:
remoteJenkinsUrl: 'https://myjenkins:8080/...'
job: 'TheJob'
from the triggerRemoteJob documentation fails to trigger the job and returns a null object, and the methods described there also cause the script to fail with a MissingMethodException.
2) [$class: 'RemoteBuildConfiguration'] provides an 'enhancedLogging' option that allows the console output of the remote job to be logged as well. However, when it is used, a ClassNotFoundException is thrown (the import statement was included).
3) It does not really matter whether the downstream job is triggered asynchronously or synchronously, as long as the console output of the downstream job can be logged in the console output of the upstream job.
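As for discovering what an undocumented wrapper object exposes, one unofficial approach is to list its methods via reflection. This is only a minimal sketch: the Groovy script security sandbox usually blocks reflection, so an administrator may need to approve the signatures (or the sandbox must be disabled) before it runs:
node('unique tag') {
    stage('Inspect wrapper') {
        def remoteRunWrapper = triggerRemoteJob(
            mode: [$class: 'ConfirmStarted', timeout: [timeoutStr: '1h'], whenTimeout: [$class: 'StopAsFailure']],
            remotePathUrl: 'jenkins://..'
        )
        // Print every method the wrapper actually declares, as
        // "returnType name(paramTypes)" lines in the console output
        remoteRunWrapper.getClass().getDeclaredMethods().each { m ->
            echo "${m.returnType.simpleName} ${m.name}(${m.parameterTypes*.simpleName.join(', ')})"
        }
    }
}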

Related

Pass value from job2 to job1 in Jenkins

I have a Jenkins job1 which triggers job2:
stage("trigger job2") {
    steps {
        build job: job2,
            parameters: [
                string(name: "test1", value: "test1"),
                string(name: "test2", value: "test2"),
            ]
    }
}
Job2 is triggered and runs. I need to know if job2's last stage fails; in my case, the last stage is "RESULT". If the RESULT stage in job2 is red/failed, that result should be passed back to job1, so that the "trigger job2" stage in job1 also displays in red.
I tried the cases below, but they do not work.
The Jenkins version is 2.346.1.
What I tried:
JOB1: modified
def p = build job: job2, propagate: true,
    parameters: [
        string(name: "test1", value: "test1"),
        string(name: "test2", value: "test2"),
    ]
script {
    build.waitForCompletion()
}
if (p.result == 'FAILURE') {
    currentBuild.result = "FAILURE"
    error 'Job2 failed, marking Job1 as failed in the current stage'
}
Try 2:
JOB1 modified:
def result = build.waitForCompletion()
if (result.result == 'FAILURE') {
    currentBuild.result = "FAILURE"
    error 'Job2 failed, marking Job1 as failed in the current stage'
}
I tried 3-4 different cases, but they didn't work. Could someone advise me?
I would say this is the default behaviour in Jenkins.
A Jenkins job fails by default if any of its stages fail, so job2 will fail if its last stage fails. As far as I know, and according to the documentation, it is enough to write this:
build job: job2,
    parameters: [
        string(name: "test1", value: "test1"),
        string(name: "test2", value: "test2"),
    ]
to make job1 fail when job2 fails. The stage that triggers job2 (in job1) will then be red, as you want.
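For completeness, if you want job1 to keep running and decide for itself how to react, the build step also accepts propagate: false; the returned wrapper then carries the downstream result. A minimal sketch (parameter names taken from the question):
def downstream = build job: job2,
    propagate: false, // do not fail job1 automatically when job2 fails
    parameters: [
        string(name: "test1", value: "test1"),
        string(name: "test2", value: "test2"),
    ]
echo "job2 finished with result: ${downstream.result}"
if (downstream.result == 'FAILURE') {
    // mark the current stage and build red explicitly
    error 'Job2 failed, marking Job1 as failed in the current stage'
}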

DataflowPythonOperator in Airflow 2 failing on get 404

I am trying to run a job in Airflow 2.1.2 which executes a Dataflow job. The Dataflow job reads data from a storage bucket and uploads it to BigQuery. The dataflow_default_options in the DAG define the region as europe-west1; however, the actual task in the DAG overrides it to us-central1. Because of this, the Dataflow job fails on the BigQuery upload, as the region is us-central1.
It was working fine before with the older version of Airflow (1.10.15).
Code below:
DEFAULT_DAG_ARGS = {
    'start_date': YESTERDAY,
    'email': models.Variable.get('email'),
    'email_on_failure': True,
    'email_on_retry': False,
    'retries': 0,
    'project_id': models.Variable.get('gcp_project'),
    'dataflow_default_options': {
        'region': 'europe-west1',
        'project': models.Variable.get('gcp_project'),
        'temp_location': models.Variable.get('gcp_temp_location'),
        'runner': 'DataflowRunner',
        'zone': 'europe-west1-d'
    }
}

with models.DAG(dag_id='GcsToBigQueryTriggered',
                description='A DAG triggered by an external Cloud Function',
                schedule_interval=None,
                default_args=DEFAULT_DAG_ARGS,
                max_active_runs=1) as dag:

    # Args required for the Dataflow job.
    job_args = {
        'input': 'gs://{{ dag_run.conf["bucket"] }}/{{ dag_run.conf["name"] }}',
        'output': models.Variable.get('bq_output_table'),
        'fields': models.Variable.get('input_field_names'),
        'load_dt': DS_TAG
    }

    # Main Dataflow task that will process and load the input delimited file.
    dataflow_task = dataflow_operator.DataFlowPythonOperator(
        task_id="data-ingest-gcs-process-bq",
        py_file=DATAFLOW_FILE,
        options=job_args)
If I change the region in the options of dataflow_task to europe-west1, the Dataflow job passes, but it then fails in Airflow with a 404 error code, because Airflow waits for the JOB_DONE status of the Dataflow job in the wrong region (us-central1).
Am I missing something? Any help would be highly appreciated.
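Not an authoritative fix, but in Airflow 2 the Google provider's Dataflow operators take an explicit location parameter that is used both when launching the job and when polling its status, which is where the 404 comes from. A sketch of how the task might look with the newer operator (assuming the apache-airflow-providers-google package is installed; variable names are taken from the question):
from airflow.providers.google.cloud.operators.dataflow import DataflowCreatePythonJobOperator

dataflow_task = DataflowCreatePythonJobOperator(
    task_id="data-ingest-gcs-process-bq",
    py_file=DATAFLOW_FILE,
    options=job_args,
    # location is used both to launch the job and to poll for JOB_DONE,
    # so the status check no longer looks in us-central1
    location='europe-west1')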

How to restrict only one parameter in jenkins pipeline?

I have the below pipeline script with string parameters. The Target parameter will fail if multiple comma-separated inputs (target1, target2) are provided in Jenkins. How can I restrict the Jenkins pipeline to accept just one value for the Target parameter and not multiple comma-separated values?
properties([
    parameters([
        string(defaultValue: '', description: '', name: 'ID'),
        string(defaultValue: '', description: '', name: 'Target')
    ])
])
What you could do in the first stage/step:
if ((params.Target.split(',')).size() > 1) {
    error("Build failed because of this and that..")
}
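For context, a minimal sketch of how that guard might sit at the top of a scripted pipeline; the stage name and error message are illustrative, and trim() handles inputs like "target1, target2" with spaces:
node {
    stage('Validate parameters') {
        // Reject anything that splits into more than one non-empty value
        def targets = params.Target.split(',').collect { it.trim() }.findAll { it }
        if (targets.size() > 1) {
            error("Exactly one Target is allowed, got ${targets.size()}: ${params.Target}")
        }
    }
    // ... remaining stages ...
}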

Jenkinsfile: Send mail to all users listed in "People"

I would like to send an e-mail notification to all users listed in the People tab in the job view:
The post Use Jenkins 'Mailer' inside pipeline workflow shows how to send e-mail notifications within a Jenkinsfile:
emailext(body: '${DEFAULT_CONTENT}', mimeType: 'text/html',
    replyTo: '$DEFAULT_REPLYTO', subject: '${DEFAULT_SUBJECT}',
    to: emailextrecipients([[$class: 'CulpritsRecipientProvider'],
                            [$class: 'RequesterRecipientProvider']]))
I modified it to send e-mails only if the build failed or was fixed, inspired by Justin Simons' comment in https://baptiste-wicht.com/posts/2017/06/jenkins-tip-send-notifications-fixed-builds-declarative-pipeline.html#comment-3478592834:
mailNotificationAlreadySend = false

pipeline {
    ...
    stages {
        ...
    }
    post {
        changed {
            sendMailNotification()
        }
        failure {
            sendMailNotification()
        }
    }
}

void sendMailNotification() {
    if (!mailNotificationAlreadySend) {
        emailext(body: '${DEFAULT_CONTENT}', mimeType: 'text/html',
            replyTo: '$DEFAULT_REPLYTO', subject: '${DEFAULT_SUBJECT}',
            recipientProviders: [[$class: 'DevelopersRecipientProvider'],
                                 [$class: 'CulpritsRecipientProvider']]
        )
        mailNotificationAlreadySend = true
    }
}
But this sends the e-mails only to the developer who caused the build failure and to all subsequent committers, until the build result is successful again.
How should the emailext method be configured to send e-mails to all users listed in the People tab in the job view?
I already tried all recipientProviders available in https://github.com/jenkinsci/email-ext-plugin/tree/master/src/main/java/hudson/plugins/emailext/plugins/recipients without any success.

Jenkins declarative pipeline - User input parameters

I've looked for examples of user input parameters using the Jenkins declarative pipeline; however, all the examples use scripted pipelines. Here is a sample of the code I'm trying to get working:
pipeline {
    agent any
    stages {
        stage('Stage 1') {
            steps {
                input id: 'test', message: 'Hello', parameters: [string(defaultValue: '', description: '', name: 'myparam')]
                sh "echo ${env}"
            }
        }
    }
}
I can't seem to work out how I can access the myparam variable. It would be great if someone could help me out.
Thanks
When using input, it is very important to use agent none at the global pipeline level and to assign agents to individual stages. Put the input steps in a separate stage that also uses agent none. If you allocate an agent node for the input stage, that agent executor will remain reserved by this build until a user continues or aborts the build.
This example should help with using the Input:
def approvalMap // collect data from approval step

pipeline {
    agent none
    stages {
        stage('Stage 1') {
            agent none
            steps {
                timeout(60) { // timeout waiting for input after 60 minutes
                    script {
                        // capture the approval details in approvalMap
                        approvalMap = input(
                            id: 'test',
                            message: 'Hello',
                            ok: 'Proceed?',
                            parameters: [
                                choice(
                                    choices: 'apple\npear\norange',
                                    description: 'Select a fruit for this build',
                                    name: 'FRUIT'
                                ),
                                string(
                                    defaultValue: '',
                                    description: '',
                                    name: 'myparam'
                                )
                            ],
                            submitter: 'user1,user2,group1',
                            submitterParameter: 'APPROVER'
                        )
                    }
                }
            }
        }
        stage('Stage 2') {
            agent any
            steps {
                // print the details gathered from the approval
                echo "This build was approved by: ${approvalMap['APPROVER']}"
                echo "This build is brought to you today by the fruit: ${approvalMap['FRUIT']}"
                echo "This is myparam: ${approvalMap['myparam']}"
            }
        }
    }
}
When the input function returns, if it has only a single parameter to return, it returns that value directly. If there are multiple parameters in the input, it returns a map (hash, dictionary) of the values. To capture this value we have to drop into Groovy scripting.
It is good practice to wrap your input code in a timeout step so that builds don't remain in an unresolved state for an extended time.
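To illustrate the single-parameter case mentioned above, a short sketch (the parameter name and choices are illustrative): with exactly one parameter, input returns the plain value rather than a map:
script {
    // Single parameter: input returns the value itself, not a map
    def fruit = input(
        message: 'Pick one',
        parameters: [choice(choices: 'apple\npear\norange', name: 'FRUIT')]
    )
    echo "Selected fruit: ${fruit}"
}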
