I have two Jenkins Pipelines, say Pipeline A and Pipeline B. Here, Pipeline B is downstream of Pipeline A, i.e. Pipeline A runs first and then invokes Pipeline B.
Pipeline A builds the Maven project using pom.xml.
Pipeline B is then invoked and deploys the .war of the Maven project to Artifactory.
I want to read the pom.xml in Pipeline B, which will be passed as a parameter from Pipeline A.
Can anyone help me with a way to read the pom.xml in Pipeline B?
Note: I am using declarative pipeline code.
You need to archive the pom.xml in Pipeline A (with the archiveArtifacts step), and then copy the archived file from Pipeline A into Pipeline B (using the Copy Artifact plugin).
Something like this :
Pipeline A :
stage('Archive pom.xml') {
    steps {
        archiveArtifacts artifacts: 'pom.xml'
    }
}
Pipeline B :
stage('Get pom.xml') {
    steps {
        copyArtifacts projectName: 'pipeline-A', filter: 'pom.xml'
    }
}
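Once the pom.xml is in Pipeline B's workspace, you can parse it with the readMavenPom step; a minimal sketch, assuming the Pipeline Utility Steps plugin is installed:
stage('Read pom.xml') {
    steps {
        script {
            // Parse the copied pom.xml (Pipeline Utility Steps plugin)
            def pom = readMavenPom file: 'pom.xml'
            echo "Deploying ${pom.artifactId}-${pom.version}.war"
        }
    }
}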
The correct answer, I believe, would be to use an artifact repository manager to store the pom from Pipeline A, and then fetch it from the repository manager during the Pipeline B execution.
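For example, with a generic repository and credentials stored in Jenkins; a minimal sketch, where the repository URL and the 'repo-creds' credentials ID are placeholders:
// Pipeline A: upload the pom to a generic repository
withCredentials([usernamePassword(credentialsId: 'repo-creds', usernameVariable: 'REPO_USER', passwordVariable: 'REPO_PASS')]) {
    sh 'curl -f -u "$REPO_USER:$REPO_PASS" -T pom.xml https://repo.example.com/generic-local/pipeline-a/pom.xml'
}
// Pipeline B: fetch the same file back into the workspace
withCredentials([usernamePassword(credentialsId: 'repo-creds', usernameVariable: 'REPO_USER', passwordVariable: 'REPO_PASS')]) {
    sh 'curl -f -u "$REPO_USER:$REPO_PASS" -O https://repo.example.com/generic-local/pipeline-a/pom.xml'
}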
Related
I have a Jenkins multibranch pipeline created. I have a single repository:
library-project (root project) - has a readme file and the sub-projects; no Jenkinsfile
profit (gradle project) - has its own Jenkinsfile, i.e. profit/Jenkinsfile
cost (gradle project) - has its own Jenkinsfile, i.e. cost/Jenkinsfile
I tried to set up a Jenkins multibranch pipeline. The first issue is that the Jenkinsfiles in the sub-projects are not scanned. For Build Configuration I chose the "by Jenkinsfile" mode and for Script Path I put Jenkinsfile as the path. In the Discover Branches section for the Git project I tried a regular-expression filter on branch names (i.e. the profit and cost branches), but it does not find the Jenkinsfile, so no build happens.
How can a single multibranch pipeline successfully scan the two sub-projects' Jenkinsfiles? Is there a way to trigger only the sub-project that changed?
In a declarative pipeline, I manually specify the pom.xml path in the Jenkinsfile and Jenkins locates it as expected at build time.
pipeline {
    agent any
    options {
        timestamps()
    }
    stages {
        stage('Compile') {
            steps {
                withMaven(maven: 'MAVEN_HOME') {
                    sh 'mvn -f /Users/jo/.jenkins/workspace/DeclarativePipelineDemo/Demo/pom.xml clean install' // hard-coded file path
                }
            }
        }
    }
}
Now, is there a more elegant way to tell Jenkins to resolve the workspace pom.xml path dynamically from my project, so I don't need to specify it manually?
If your Jenkinsfile is in the same repo as the pom.xml, you can use a relative path.
When Jenkins runs your pipeline, it automatically clones the repo that holds the Jenkinsfile onto the Jenkins agent.
If the pom.xml is in the base directory of the project, you can try
sh 'mvn -f pom.xml ...'
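Applied to the pipeline above, the Compile stage then needs no absolute path at all, since Maven picks up pom.xml from the workspace root by default; a minimal sketch:
stage('Compile') {
    steps {
        withMaven(maven: 'MAVEN_HOME') {
            // pom.xml is resolved relative to the workspace checkout
            sh 'mvn clean install'
        }
    }
}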
I am trying to integrate Artifactory into my Jenkins pipeline in order to push Maven artifacts to Artifactory.
My current understanding is that I should deploy the built Maven artifacts using the Jenkins pipeline rather than through a Maven plugin during Maven's deploy lifecycle phase.
Based on the documentation that I have read so far I think I need code similar to this in my pipeline:
stage('Build') {
    steps {
        /** Start a docker container with maven and run mvn clean install */
    }
}
stage('Deploy to Artifactory') {
    steps {
        script {
            def server = Artifactory.server 'my-server-id'
            def rtMaven = Artifactory.newMavenBuild()
            rtMaven.deployer.addProperty("status", "in-qa")
            def buildInfo = rtMaven.run pom: 'pom.xml', goals: 'clean install'
            server.publishBuildInfo buildInfo
        }
    }
}
However, I fail to completely understand what this is doing, and I am unable to find more detailed documentation, except for this JFrog blog entry and this JFrog Confluence page.
In particular, it seems that if I specify goals to the run directive it will run the Maven build again, which would not make much sense since the build already ran in the first stage (i.e. 'Build').
In addition, I should note that I am running the Maven build inside a Docker container, so it seems that the above setup is not quite enough for me.
Is there a best practice to approach this?
What I am looking for is a way to collect the artifacts that I built with Maven in my Docker container and upload them to Artifactory without running Maven again.
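One option, as a hedged sketch: skip rtMaven entirely and upload the already-built files with the Artifactory plugin's file-spec API. The repository name and target path below are assumptions about your layout:
stage('Deploy to Artifactory') {
    steps {
        script {
            def server = Artifactory.server 'my-server-id'
            // Upload the war produced by the Build stage; 'libs-release-local'
            // and the target path are placeholders for your repository layout
            def uploadSpec = """{
                "files": [{
                    "pattern": "target/*.war",
                    "target": "libs-release-local/my-app/"
                }]
            }"""
            def buildInfo = server.upload spec: uploadSpec
            server.publishBuildInfo buildInfo
        }
    }
}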
I'm brand new to Jenkins, and my searches are turning up so little that I think I just don't get the terminology yet.
I have a project I want to build with Jenkins Pipelines. It's a Java/Maven project in a Git repository. It depends on two other Java/Maven projects of mine, also in Git repositories.
How do I explain this relationship to Jenkins?
Let's simplify. Say I have ProjectA, which depends on ProjectB. I can get Jenkins to build ProjectB, no problem. I can even archive the jar if I want, so a compiled copy of ProjectB is stored on my Jenkins server.
But no matter what I do, ProjectA fails to build with
[ERROR] Failed to execute goal on project ProjectA: Could not resolve dependencies for project ProjectA: The following artifacts could not be resolved: ProjectB:jar:0.9: Failure to find ProjectB:jar:0.9 in https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced -> [Help 1]
This HAS to be super simple, I just can't figure out what I even need to search for.
My Jenkinsfile in ProjectA looks like this right now:
pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            args '-v /root/.m2:/root/.m2'
        }
    }
    stages {
        stage('Build') {
            steps {
                build 'ProjectB'
                sh 'mvn -B -DskipTests clean package'
            }
        }
    }
}
The Jenkins builds of ProjectA and ProjectB are independent of each other. Archiving the jars in Jenkins does not put them anywhere that Maven can recognize and use as a dependency. You have to make sure your Maven build knows how to find them. There are a couple of options.
You could publish to a repository manager like Artifactory. Then Maven just needs to be configured to look at that repo.
You could use the Jenkins REST API, or even just the artifact's URI, to find and download the artifact into the workspace of your new build.
You can use the Copy Artifact plugin to pull the artifact from another build into your workspace so you can use it.
Or, since it is a pipeline, you can build both pieces in different stages of the same pipeline (see the sketch below).
If the libraries you are building in job B are only used by job A, I would consider just building it all in the same pipeline. But sometimes it still makes more sense to use some kind of external repository to publish your libraries, and then configure Maven to look at that repo to find your dependencies. I would usually use that option, but it does take more software and more setup.
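As a rough sketch of the single-pipeline option (the ProjectB repository URL is a placeholder; mvn install publishes ProjectB's jar into the mounted local repository, where ProjectA's build can then resolve it):
stages {
    stage('Build ProjectB') {
        steps {
            dir('ProjectB') {
                // Placeholder URL; check out and install ProjectB first
                git url: 'https://example.com/your/ProjectB.git'
                sh 'mvn -B -DskipTests clean install'
            }
        }
    }
    stage('Build ProjectA') {
        steps {
            sh 'mvn -B -DskipTests clean package'
        }
    }
}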
In my Jenkins pipeline I am trying to use the Maven Surefire Plugin so I can run the Maven tests within the pipeline and ignore test failures (so that the pipeline can continue):
mvn clean deploy -Dmaven.test.failure.ignore=false
However if I try to use the command I get following error:
class org.codehaus.groovy.ast.expr.UnaryMinusExpression, with its value 'Dmaven.test.failure.ignore', is a bad expression as the left hand side of an assignment operator at line: 6 column: 51. File: WorkflowScript # line 6, column 51.
oy -Dmaven.test.failure.ignore=false
^
I don't quite understand why it won't work, can anyone explain?
Your command-line call of Maven is interpreted as a Groovy command, meaning there is a syntax error in your Groovy script.
You need to run the Maven call with a shell (sh) step, for example together with the Pipeline Maven plugin:
https://wiki.jenkins.io/display/JENKINS/Pipeline+Maven+Plugin
Example:
withMaven(
        // Maven installation declared in the Jenkins "Global Tool Configuration"
        maven: 'M3',
        // Maven settings.xml file defined with the Jenkins Config File Provider Plugin
        // Maven settings and global settings can also be defined in Jenkins Global Tools Configuration
        mavenSettingsConfig: 'my-maven-settings',
        mavenLocalRepo: '.repository') {
    // Run the maven build
    sh "mvn clean deploy -Dmaven.test.failure.ignore=false"
} // withMaven will discover the generated Maven artifacts, JUnit Surefire & FailSafe reports and FindBugs reports
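In a declarative pipeline the same call sits inside a stage; a minimal sketch, assuming a Maven installation named 'M3' (note that to actually ignore test failures the property must be true, not false):
stage('Deploy') {
    steps {
        withMaven(maven: 'M3') {
            // true makes Maven record test failures without failing the build
            sh 'mvn clean deploy -Dmaven.test.failure.ignore=true'
        }
    }
}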