When triggering a Jenkins pipeline from Spinnaker, is it possible to pass pipelineParams? - jenkins-pipeline

When triggering jobs from Spinnaker, is there a way to pass pipelineParams? For example, I see:
{
    "continuePipeline": false,
    "failPipeline": true,
    "isNew": true,
    "job": "job123",
    "master": "master123",
    "name": "Jenkins",
    "parameters": {
        "mavenProfile": "FooBar" <-- ???
    },
    "type": "jenkins"
}
What purpose does the parameters field serve? Can it be used to pass parameters to Jenkins pipelines?
Has anyone successfully accomplished passing parameters to Jenkins pipelines?
When the above stage gets triggered, it immediately fails with the message:
job/master, passing params to a job which doesn't need them

The Jenkins integration in Spinnaker launches individual jobs.
The parameters field that you have highlighted maps to the job parameters defined when you select This project is parameterized in the Jenkins job config. The reason you're getting that error is that there are two different Jenkins endpoints for launching jobs, one with parameters and one without.
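For reference, these are the two endpoints in question (the Jenkins URL is a placeholder, and job123 is the job name from the stage above):
http://JENKINS_URL/job/job123/build
http://JENKINS_URL/job/job123/buildWithParameters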
As far as I know, there is not a way to launch Jenkins pipelines from Spinnaker, but I imagine it would look different from the Jenkins launch-job stage, since it would have to hit a different API endpoint.

As the answer from Thomas Lin states, the problem was that I was trying to pass parameters to a non-parameterized pipeline.
I went ahead and made my Jenkins pipeline parameterized:
https://github.com/jenkinsci/pipeline-model-definition-plugin/wiki/Parametrized-pipelines
parameters {
    string(defaultValue: "", description: 'What profile?', name: 'mavenProfile')
}
And as soon as I did that my Jenkins pipeline started receiving parameters from my Spinnaker pipeline.
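For completeness, a minimal declarative pipeline using that parameter might look like the following sketch (the stage body and the echo are only illustrative):
pipeline {
    agent any
    parameters {
        string(defaultValue: "", description: 'What profile?', name: 'mavenProfile')
    }
    stages {
        stage('Build') {
            steps {
                // The value supplied in the Spinnaker stage's "parameters" map
                // is available here as params.mavenProfile.
                echo "Building with Maven profile: ${params.mavenProfile}"
            }
        }
    }
}
Keep in mind that parameters declared in the Jenkinsfile are only registered on the Jenkins job after the pipeline has run at least once, so run one manual build before triggering it from Spinnaker.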

Related

Jenkins: Automate Notifications for a Jenkins Pipeline

I have a Jenkins pipeline which performs different steps during deployment. While the deployment is being performed, I would like Jenkins to send notifications about the status of each step to a channel in the communication tool Microsoft Teams.
Can someone provide a suggestion on the best route to achieve this?
You should be able to use the Office 365 Connector plugin for this.
stage('Upload') {
    steps {
        // some instructions here
        office365ConnectorSend webhookUrl: 'https://outlook.office.com/webhook/123456...',
            message: 'Application has been [deployed](https://uat.green.biz)',
            status: 'Success'
    }
}
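If you want a single notification per run that reflects the overall outcome rather than a per-stage message, a sketch of the same step used from a post block could look like this (the webhook URL is a placeholder; the real one comes from the incoming-webhook connector you add to the Teams channel):
pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                // deployment instructions here
            }
        }
    }
    post {
        success {
            office365ConnectorSend webhookUrl: 'https://outlook.office.com/webhook/123456...',
                message: 'Deployment finished successfully',
                status: 'Success'
        }
        failure {
            office365ConnectorSend webhookUrl: 'https://outlook.office.com/webhook/123456...',
                message: 'Deployment failed',
                status: 'Failure'
        }
    }
}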

How to create dataflow pipeline and auto deploy to google cloud?

I'm using Apache Beam and Maven to create pipelines and run Dataflow jobs. After writing the pipeline logic, I run the following command to upload the job/template to Google Cloud.
mvn compile exec:java -Dexec.mainClass=com.package.MyMainClass -Dexec.args="--runner=DataflowRunner --autoscalingAlgorithm=NONE --numWorkers=25 --project=<PROJEC> --subnetwork=regions/us-east1/subnetworks/default --zone=us-east1-b --network=default --stagingLocation=gs://<TBD> --templateLocation=gs://<TBD> --otherCustomOptions"
After that, I've seen two ways the job can start running:
1. I had to go to the Dataflow UI page, click to create a new job, use my own template, and then the job would start running.
2. The job started running right away.
I wonder how option 2 is implemented. I basically want to get rid of the hassle of going into the UI; I want to submit and start the job right from my laptop. Any insights will be appreciated!
It's important to make a distinction between traditional and templated Dataflow job execution:
If you use Dataflow templates (as in your case), staging and execution are separate steps. This separation gives you additional flexibility to decide who can run jobs and where the jobs are run from.
However, once your template is staged, you need to explicitly run your job from that template. To automate this process, you can make use of:
The API:
POST https://dataflow.googleapis.com/v1b3/projects/YOUR_PROJECT_ID/templates:launch?gcsPath=gs://YOUR_BUCKET_NAME/templates/TemplateName
{
    "jobName": "JOB_NAME",
    "parameters": {
        "inputFile": "gs://YOUR_BUCKET_NAME/input/my_input.txt",
        "outputFile": "gs://YOUR_BUCKET_NAME/output/my_output"
    },
    "environment": {
        "tempLocation": "gs://YOUR_BUCKET_NAME/temp",
        "zone": "us-central1-f"
    }
}
The gcloud command line tool:
gcloud dataflow jobs run JOB_NAME \
    --gcs-location gs://YOUR_BUCKET_NAME/templates/MyTemplate \
    --parameters inputFile=gs://YOUR_BUCKET_NAME/input/my_input.txt,outputFile=gs://YOUR_BUCKET_NAME/output/my_output
Or any of the client libraries.
Alternatively, if you don't want to create a Dataflow template and you just want to deploy and run the job directly (which is probably what you're referring to in point 2), you can simply remove the --templateLocation parameter, i.e. run the same mvn compile exec:java command without it, and the DataflowRunner will submit and start the job immediately. If you get any errors when doing this, make sure that your pipeline code can be executed for a non-templated job as well; for reference, take a look at this question.
Once the template is staged, in addition to the UI you can start it using the REST API or the gcloud command line, as shown above.

create consolidated build report from multiple jenkins instances

I have multiple Jenkins instances: Jenkins A, Jenkins B, and Jenkins C.
Now I am trying to make a report which has the details of all three Jenkins instances in one place,
a report of "Total Builds", "Success", and "Failed"
(from Jenkins A, Jenkins B, and Jenkins C).
Is there any shell script which runs against every Jenkins instance and combines the output in one place?
Make sure anonymous has read access to all your jobs in Jenkins. Use PowerShell to invoke the job's URL in the following format:
http://jenkinsA:8080/view/viewname/job/jobname/1/console
http://jenkinsA:8080/view/viewname/job/jobname/2/console
http://jenkinsB:8080/view/viewname/job/jobname/1/console
http://jenkinsC:8080/view/viewname/job/jobname/2/console
Keep a count of the number of possible builds for each job. Each time your count is incremented, create an HTTP request in PowerShell to invoke the URL. If the request returns 404, the build does not exist; if it returns HTTP 200, the build exists. Extract the line starting with 'Finished:', which will tell you either success or failure.
Based on the results, increment your success or failure count.
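The steps above assume PowerShell; as a rough sketch of the same approach in Groovy, using the plain-text console endpoint (the instance URLs, job name, and build-number ceiling are placeholders), it could look like this:
// Poll the console output of each build on each Jenkins instance and tally results.
def instances = ['http://jenkinsA:8080', 'http://jenkinsB:8080', 'http://jenkinsC:8080']
def jobName = 'jobname'
def maxBuilds = 50

def total = 0, success = 0, failed = 0

instances.each { base ->
    (1..maxBuilds).each { buildNumber ->
        def conn = new URL("${base}/job/${jobName}/${buildNumber}/consoleText").openConnection()
        if (conn.responseCode == 200) {
            // The build exists; check how it finished.
            total++
            if (conn.inputStream.text.contains('Finished: SUCCESS')) {
                success++
            } else {
                failed++
            }
        }
        // A 404 means this build number does not exist, so it is simply skipped.
    }
}

println "Total Builds: ${total}, Success: ${success}, Failed: ${failed}"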
Hope it helps!

hudson and jenkins parameterized trigger plugin - running the same job multiple times with different parameters

I'm trying to run the same job multiple times with different parameters via a parent job. However, only the first of the triggered jobs runs.
The parent job has the checkbox "Trigger parameterized build on other projects" checked, and there are two triggers created, each with a different parameter value for a parameter x on the downstream job. Job 1 has x=1, Job 2 has x=2. Only job 1 is run!?
What am I missing?
This is a bug in Hudson.
It was reported to Jenkins and fixed there, in both the core and this particular plugin, several months ago.
See also the Jenkins vs Hudson discussion on StackOverflow for further reasons to upgrade to Jenkins.

Jenkins/Hudson upstream job does not get the status "ball" color of the downstream jobs

I have an upstream job that executes 4 downstream jobs.
If the upstream job finishes successfully, the downstream jobs start their execution.
Since the upstream job finishes successfully, it gets a blue ball (build result = stable), but even though the downstream jobs fail (red ball) or are unstable (yellow ball), the upstream job keeps its blue color.
Is there any way to make the result of the upstream job depend on the downstream jobs? I mean, if three downstream jobs get a stable build but one of them gets an unstable build, the upstream build result should be unstable.
I found the solution. There is a plugin called the Groovy Postbuild plugin that lets you execute a Groovy script in the post-build phase.
By adding a simple snippet to the downstream jobs, you can modify the upstream job's overall status.
This is the code you need to add:
// Find the upstream build that triggered this downstream build
upstreamBuilds = manager.build.getUpstreamBuilds();
upstreamJob = upstreamBuilds.keySet().iterator().next();
lastUpstreamBuild = upstreamJob.getLastBuild();
// If the upstream result is better than this build's result, propagate the worse result upstream
if(lastUpstreamBuild.getResult().isBetterThan(manager.build.result)) {
    lastUpstreamBuild.setResult(manager.build.result);
}
You can find more info in the entry of my blog here.
Another option that might work for you is to use the parameterized build plugin. It allows you to have your 4 "downstream" builds as build steps. This means that your "parent" build can fail if any of the child builds fails.
We do this when we want to hide complexity for the build-pipeline plugin view.
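That answer describes a freestyle setup; if the parent is itself a pipeline, a rough sketch of the same idea (the child job names are placeholders) is to invoke the downstream jobs with the build step, which by default fails the parent when a child fails:
pipeline {
    agent any
    stages {
        stage('Downstream builds') {
            steps {
                // With the default propagate: true, a failing child job
                // marks this parent build as failed as well.
                build job: 'child-job-1'
                build job: 'child-job-2'
                build job: 'child-job-3'
                build job: 'child-job-4'
            }
        }
    }
}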
We had a similar sort of issue and haven't found a perfect solution. A partial solution is to use the Promoted Builds Plugin. Configure it for your upstream project to include some visual indicator when the downstream job finishes. It doesn't change the overall job status, but it does notify us when the downstream job fails.
Perhaps this plugin does what you are looking for?
Jenkins Prerequisite build step Plugin
The workaround for my project is to create a new job which is the downstream of the downstreams. We set a post-build step "Trigger parameterized build on other projects" in all three of the original downstream jobs. The parameter passed into the new job depends on each job's status, and that parameter causes the new job to react accordingly.
1. Create a new job which contains one simple class and one simple test, both parameter-dependent, i.e. the class fails if parameter "status" = fail, the class passes but the test fails if parameter "status" = unstable, etc. (a pipeline sketch of this follows the list).
2. Set "Trigger parameterized build on other projects" on the three original downstream jobs with the relevant configuration.
3. Set up notifications for the new job accordingly.
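If that aggregate job is written as a pipeline, a rough sketch of step 1 might look like this (only the "status" parameter and its fail/unstable values come from the description above; everything else is illustrative):
pipeline {
    agent any
    parameters {
        string(defaultValue: 'success', description: 'Status handed over by the original downstream jobs', name: 'status')
    }
    stages {
        stage('Aggregate status') {
            steps {
                script {
                    // React according to the status passed in by the triggering jobs.
                    if (params.status == 'fail') {
                        error 'One of the downstream jobs failed'
                    } else if (params.status == 'unstable') {
                        currentBuild.result = 'UNSTABLE'
                        echo 'One of the downstream jobs was unstable'
                    } else {
                        echo 'All downstream jobs were stable'
                    }
                }
            }
        }
    }
}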
