Deploying Maven artifacts without rebuilding

We are using a declarative pipeline on the latest Jenkins. Builds are executed in a Docker slave container, which has Maven and other tools. Our current Jenkinsfile resembles this:
stage('build') {
    steps {
        sh 'mvn clean install'
    }
}
stage('test') { /* functional tests using a different tool */ }
stage('publish') {
    steps {
        // mvn clean deploy -DskipTests -DaltDeploymentRepository... (rebuilds, which we want to avoid, as it's a multi-module project)
        // withMaven(options: [artifactsPublisher()]) { } ... (from maven-pipeline-plugin)
    }
}
Jenkins classic mode has a Maven Integration plugin which provides a section "Deploy artifacts to Maven repository" that uses Maven RedeployPublisher to only publish artifacts. I am looking for a pipeline equivalent of this. I thought the maven-pipeline-plugin did this, but I can't find an example. Any pointers appreciated!

I stumbled upon your question looking for the same thing, and what worked for me was this:
stage('Deploy') {
    sh "'${mvnHome}/bin/mvn' war:war deploy:deploy"
}
Of course, you need to change war:war to match the type of artifact you want to deploy (so jar:jar or ear:ear). I found this based on this answer; it seems to apply to maven-war-plugin as well as to maven-jar-plugin, although there is no forceCreation option in the war goal.
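For the multi-module setup in the question above, a minimal publish stage built on the same trick might look like this (a sketch, assuming mvn is on the PATH in the build container and the artifacts were already compiled and tested in earlier stages):

stage('publish') {
    steps {
        // jar:jar repackages the already-compiled classes without recompiling;
        // deploy:deploy uploads the artifacts per the POM's distributionManagement.
        sh 'mvn jar:jar deploy:deploy'
    }
}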

Related

Is there a way to programmatically specify pom.xml path in jenkinsfile

In a declarative pipeline, I manually specify the pom.xml path in the Jenkinsfile, and Jenkins locates it as expected at build time.
pipeline {
    agent any
    options {
        timestamps()
    }
    stages {
        stage('Compile') {
            steps {
                withMaven(maven: 'MAVEN_HOME') {
                    sh 'mvn -f /Users/jo/.jenkins/workspace/DeclarativePipelineDemo/Demo/pom.xml clean install' // filepath
                }
            }
        }
    }
}
Now, is there a more elegant way to tell Jenkins to dynamically pick up the workspace pom.xml path directly from my project so I don't need to specify it manually?
If your Jenkinsfile is in the same repo as your pom.xml, you can use a relative path.
When Jenkins runs your pipeline, it automatically clones the repo that holds the Jenkinsfile to the Jenkins slave.
If pom.xml is in the base directory of the project, you can try:
sh 'mvn -f pom.xml ...'
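Applied to the pipeline above, the hard-coded absolute path simply becomes a path relative to the workspace root (a sketch, assuming the implicit checkout places pom.xml at the top level of the workspace):

withMaven(maven: 'MAVEN_HOME') {
    // The workspace is the cloned repo root, so a relative path resolves correctly.
    sh 'mvn -f pom.xml clean install'
}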

Integrate Artifactory into Jenkins Pipeline

I am trying to integrate Artifactory into my Jenkins pipeline in order to push Maven artifacts to Artifactory.
My current understanding is that I should deploy the built Maven artifacts using the Jenkins pipeline rather than through a Maven plugin during Maven's deploy lifecycle phase.
Based on the documentation that I have read so far I think I need code similar to this in my pipeline:
stage('Build') {
    steps {
        /** Start a docker container with maven and run mvn clean install */
    }
}
stage('Deploy to Artifactory') {
    steps {
        script {
            def server = Artifactory.server 'my-server-id'
            def rtMaven = Artifactory.newMavenBuild()
            rtMaven.deployer.addProperty("status", "in-qa")
            buildInfo = rtMaven.run pom: 'pom.xml', goals: 'clean install'
            server.publishBuildInfo buildInfo
        }
    }
}
However, I fail to completely understand what this is doing, and I am unable to find more detailed documentation, except for this JFrog blog entry and this JFrog Confluence page.
In particular, it seems that if I specify goals to the run directive, it will run the Maven build again, which would not make much sense, since the build already ran in the first stage (e.g. 'Build').
In addition, I should note that I am running the Maven build inside a Docker container, so it seems that the above setup is not quite enough for me.
Is there a best practice to approach this?
What I am looking for is a way to collect the artifacts that I built with maven in my docker container and upload them to Artifactory without running maven again.
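One possible direction (a sketch under assumptions; the repository name and file pattern are placeholders) is the Artifactory plugin's generic upload, which pushes files already present in the workspace without invoking Maven again:

stage('Deploy to Artifactory') {
    steps {
        script {
            def server = Artifactory.server 'my-server-id'
            // Upload spec: the pattern matches jars already built in the
            // 'Build' stage; the target repository name is an assumption.
            def uploadSpec = '''{
                "files": [{
                    "pattern": "*/target/*.jar",
                    "target": "libs-release-local/"
                }]
            }'''
            server.upload spec: uploadSpec
        }
    }
}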

SonarQube on Jenkins pipeline

How can I make the SonarQube server analyze a Maven project on Jenkins? I have a pipeline project from SCM with a Jenkinsfile which points to a Groovy file where all steps of the job are executed. All steps are working OK (mvn test, mvn package, mvn compile, etc.), but I don't know how to execute mvn sonar:sonar. It fails with the following error (screenshots showed how Sonar is configured in Jenkins and the job step where it fails).
And this is how I have the step described in groovy file of pipeline:
stage('SonarQube analysis') {
    withSonarQubeEnv('https://sonarqube.xxxxx.com') {
        sh 'mvn sonar:sonar'
    }
}
Try using the server installation name in withSonarQubeEnv; right now you are passing the URL. For example: withSonarQubeEnv('Grey Azure Sonarqube').
Documentation
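Put together, the stage would look like this (a sketch; the installation name is whatever you configured under Manage Jenkins, not the server URL):

stage('SonarQube analysis') {
    // Pass the SonarQube installation name configured in Jenkins, not its URL.
    withSonarQubeEnv('Grey Azure Sonarqube') {
        sh 'mvn sonar:sonar'
    }
}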

Configure settings.xml in jenkins slave created on the fly in AWS

I am creating Jenkins slaves on the fly, configured on AWS with spot instances.
In the global tool configuration, I have set it to use my own settings.xml, which works perfectly for the master.
But when the server starts slaves (without Maven installed), it auto-installs Maven (the Jenkinsfile is set to install this tool) but without providing any settings.xml.
* I know I can copy the settings.xml directly from the server, but that does not look like the appropriate way to do it.
* I already ran mvn -X in order to find the folder for the settings, but it is not used.
Here is one small slice of the Jenkinsfile:
pipeline {
    tools {
        maven 'maven default'
    }
    agent any
    stages {
        stage('Maven build') {
            steps {
                sh 'mvn clean install'
            }
        }
    }
}
You have to use withMaven() in the pipeline code, which would look like this:
withMaven(jdk: jdkName, maven: MavenInGlobalToolsName, mavenSettingsConfig: 'IdInConfigFileProvided', mavenLocalRepo: ".repository") {
    sh "mvn clean verify"
}
The IdInConfigFileProvided is the important part, which references the Config File Provider plugin...
The other solution could be to use the config file provider directly in the Jenkinsfile:
configFileProvider(
    [configFile(fileId: 'maven-settings', variable: 'MAVEN_SETTINGS')]) {
    sh 'mvn -s $MAVEN_SETTINGS clean package'
}
But this solution will not handle the global tools support for Maven itself, so I would suggest preferring the withMaven() solution.
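Applied to the declarative snippet above, that could look like this (a sketch; 'maven default' is the global tool name from the question, and the config-file id 'my-settings' is an assumption standing in for whatever id the Config File Provider plugin assigned):

pipeline {
    agent any
    stages {
        stage('Maven build') {
            steps {
                // withMaven injects the settings.xml stored in the Config File
                // Provider plugin and resolves the Maven global tool by name.
                withMaven(maven: 'maven default', mavenSettingsConfig: 'my-settings') {
                    sh 'mvn clean install'
                }
            }
        }
    }
}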

Jenkins Pipeline Plugin -- getting and building dependencies

I'm brand new to Jenkins, and my searches are turning up so little that I think I just don't get the terminology yet.
I have a project I want to build with Jenkins Pipelines. It's a Java/Maven project in a Git repository. It depends on two other Java/Maven projects of mine, also in Git repositories.
How do I explain this relationship to Jenkins?
Let's simplify. Say I have ProjectA, which depends on ProjectB. I can get Jenkins to build ProjectB, no problem. I can even archive the jar if I want, so a compiled copy of ProjectB is stored on my Jenkins server.
But no matter what I do, ProjectA fails to build with
[ERROR] Failed to execute goal on project ProjectA: Could not resolve dependencies for project ProjectA: The following artifacts could not be resolved: ProjectB:jar:0.9: Failure to find ProjectB:jar:0.9 in https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced -> [Help 1]
This HAS to be super simple, I just can't figure out what I even need to search for.
My Jenkinsfile in ProjectA looks like this right now:
pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            args '-v /root/.m2:/root/.m2'
        }
    }
    stages {
        stage('Build') {
            steps {
                build 'ProjectB'
                sh 'mvn -B -DskipTests clean package'
            }
        }
    }
}
The ProjectA build is independent of the ProjectB build. Archiving artifacts in Jenkins does not put them anywhere that Maven can recognize and use as a dependency. You have to make sure your Maven build knows how to find them. There are a couple of options.
You could publish to a repository like Artifactory. Then Maven just needs to be configured to look at that repo.
You could use the Jenkins REST API, or even just the URI, to find and download the artifact into the workspace of your new build.
You could use the Copy Artifact plugin to pull the artifact from another build into your workspace so you can use it.
Or, since it is a pipeline, you can build both pieces in different stages of the same pipeline (see the sketch below).
If the libraries you are building in ProjectB are only used by ProjectA, I would consider just building it all in the same pipeline. But sometimes it still makes more sense to use some kind of external repository to publish your libraries and then configure Maven to look at that repo to find your dependencies. I would usually use that option, but it does take more software and more setup.
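A minimal sketch of the single-pipeline option (the ProjectB repository URL and directory name are assumptions): install ProjectB into the shared local repository first, then build ProjectA against it:

pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            args '-v /root/.m2:/root/.m2' // shared local repo across stages
        }
    }
    stages {
        stage('Build ProjectB') {
            steps {
                dir('projectB') {
                    // Assumed URL; 'mvn install' puts ProjectB's jar into the
                    // mounted ~/.m2, where ProjectA's resolution will find it.
                    git url: 'https://example.com/ProjectB.git'
                    sh 'mvn -B -DskipTests clean install'
                }
            }
        }
        stage('Build ProjectA') {
            steps {
                sh 'mvn -B -DskipTests clean package'
            }
        }
    }
}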
