I am using a Jenkins pipeline to check out my code and build it inside the official Maven container:
pipeline {
    agent none
    stages {
        stage('Back-end build') {
            agent {
                docker {
                    image 'maven'
                    label 'master'
                }
            }
            steps {
                sh 'mvn -f de.vitasystems.qrcode.generator/pom.xml -s /usr/share/maven/ref/settings.xml clean package'
                stash includes: 'target/*.war', name: 'app'
            }
        }
    }
}
After doing that I realized that I need the settings.xml for the Nexus repository configuration, as well as the settings-security.xml, in order to be allowed to download from it.
Then I did this: I created another Dockerfile based on the Maven image (FROM maven) that copies the necessary settings.xml, to be used in the previous pipeline.
It refers to the correct repository, but it is not authorized to download files. Adding the settings-security.xml is needed to be able to download the data from my Nexus (it holds the encrypted Nexus password), but I cannot reference or use it. How can I use the settings-security.xml in this container? The Maven documentation says to put the file in the $HOME/.m2 folder, but that is not working.
Regards.
Try the Config File Provider plugin; it will copy the settings.xml from the master into your workspace:
stage('Build') {
    steps {
        configFileProvider(
            [configFile(fileId: 'your-id-xxx', variable: 'MAVEN_SETTINGS')]) {
            sh 'mvn -s $MAVEN_SETTINGS clean install -P integration-tests'
        }
    }
}
I have already found the solution: the settings-security.xml is not used unless you are the root user.
I wrote a Dockerfile to build a new Docker image that wraps everything up:
FROM maven
COPY settings.xml /root/.m2/settings.xml
COPY settings-security.xml /root/.m2/settings-security.xml
ENTRYPOINT ["/usr/local/bin/mvn-entrypoint.sh"]
CMD ["mvn"]
This image wraps up the security configuration, but to be usable from the Jenkinsfile, the container has to run as the root user.
Example
agent {
    docker {
        image 'maven'
        label 'master'
        args '-u root'
    }
}
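For reference, the settings-security.xml copied into /root/.m2 follows Maven's standard format; a minimal sketch (the encrypted master password below is a placeholder, not a real value):

```xml
<!-- settings-security.xml: holds the master password that Maven uses to
     decrypt the encrypted server passwords in settings.xml -->
<settingsSecurity>
  <master>{jSMOWnoPFgsHVpMvz5VrIt5kRbzGpI8u+9EF1iFQyJQ=}</master>
</settingsSecurity>
```

The encrypted value is generated with `mvn --encrypt-master-password`, and the per-server passwords in settings.xml with `mvn --encrypt-password`.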
In a declarative pipeline, I manually specify the pom.xml path in the Jenkinsfile, and Jenkins is able to locate it as expected at build time.
pipeline {
    agent any
    options {
        timestamps()
    }
    stages {
        stage('Compile') {
            steps {
                withMaven(maven: 'MAVEN_HOME') {
                    sh 'mvn -f /Users/jo/.jenkins/workspace/DeclarativePipelineDemo/Demo/pom.xml clean install' // filepath
                }
            }
        }
    }
}
Now, is there a more elegant way to tell Jenkins to resolve the workspace pom.xml path dynamically from my project, so I don't need to specify it manually?
If your Jenkinsfile is in the same repo as the pom.xml, you can use a relative path.
When Jenkins runs your pipeline, it automatically clones the repo that holds the Jenkinsfile onto the Jenkins slave.
If the pom.xml is in the base directory of the project, you can try:
sh 'mvn -f pom.xml ...'
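Putting that together, a sketch of the Compile stage from the question using a relative path (assuming the Jenkinsfile sits in the repo root next to the pom.xml):

```groovy
stage('Compile') {
    steps {
        withMaven(maven: 'MAVEN_HOME') {
            // The checkout places the repo root in the workspace, so a
            // relative path (or omitting -f entirely) resolves the pom.xml.
            sh 'mvn -f pom.xml clean install'
        }
    }
}
```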
I am trying to integrate Artifactory into my Jenkins pipeline in order to push Maven artifacts to Artifactory.
My current understanding is that I should deploy the built Maven artifacts from the Jenkins pipeline rather than through a Maven plugin during Maven's deploy lifecycle phase.
Based on the documentation that I have read so far I think I need code similar to this in my pipeline:
stage('Build') {
    steps {
        /* Start a docker container with maven and run mvn clean install */
    }
}
stage('Deploy to Artifactory') {
    steps {
        script {
            def server = Artifactory.server 'my-server-id'
            def rtMaven = Artifactory.newMavenBuild()
            rtMaven.deployer.addProperty("status", "in-qa")
            buildInfo = rtMaven.run pom: 'pom.xml', goals: 'clean install'
            server.publishBuildInfo buildInfo
        }
    }
}
However, I fail to completely understand what this is doing, and I am unable to find more detailed documentation, except for this JFrog blog entry and this JFrog Confluence page.
In particular, it seems that if I specify goals to the run directive, it will run the Maven build again, which would not make much sense since the pipeline already ran it in the first stage (e.g. 'Build').
In addition, I should note that I am running the Maven build inside a Docker container, so it seems that the above setup is not quite enough for me.
Is there a best practice to approach this?
What I am looking for is a way to collect the artifacts that I built with maven in my docker container and upload them to Artifactory without running maven again.
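One pattern documented for the Artifactory plugin comes close to this: run Maven with deployment disabled, then deploy the collected artifacts as a separate step. It still drives the build through rtMaven.run, but it cleanly separates building from deploying. A sketch (the repository names and tool name are assumptions that must match your Artifactory and Jenkins setup):

```groovy
stage('Build and Deploy') {
    steps {
        script {
            def server = Artifactory.server 'my-server-id'
            def rtMaven = Artifactory.newMavenBuild()
            rtMaven.tool = 'maven default'
            rtMaven.deployer server: server,
                releaseRepo: 'libs-release-local',
                snapshotRepo: 'libs-snapshot-local'
            // Disable deployment during the build; artifacts are only collected.
            rtMaven.deployer.deployArtifacts = false
            def buildInfo = rtMaven.run pom: 'pom.xml', goals: 'clean install'
            // Deploy the previously collected artifacts in a separate step.
            rtMaven.deployer.deployArtifacts buildInfo
            server.publishBuildInfo buildInfo
        }
    }
}
```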
I am creating Jenkins slaves on the fly, configuring them on AWS with Spot Instances.
In the global tool configuration, I have set my own settings.xml, which works perfectly for the master.
But when the server starts slaves (without Maven installed), it auto-installs Maven (the Jenkinsfile is set to install this tool) but without putting any settings.xml in place.
* I know I could copy the settings.xml directly from the server, but that does not look like the appropriate way to do it.
* I already ran mvn -X to find the folder used for the settings, but no settings file is picked up.
Here is a small slice of the Jenkinsfile:
pipeline {
    tools {
        maven 'maven default'
    }
    agent any
    stages {
        stage('Maven build') {
            steps {
                sh 'mvn clean install'
            }
        }
    }
}
You have to use withMaven() in the pipeline code, which has to look like this:
withMaven(jdk: jdkName, maven: MavenInGlobalToolsName, mavenSettingsConfig: 'IdInConfigFileProvided', mavenLocalRepo:".repository") {
sh "mvn clean verify"
}
The IdInConfigFileProvided part is the important one: it references the config file managed by the Config File Provider plugin.
The other solution would be to use the Config File Provider directly in the Jenkinsfile:
configFileProvider(
[configFile(fileId: 'maven-settings', variable: 'MAVEN_SETTINGS')]) {
sh 'mvn -s $MAVEN_SETTINGS clean package'
}
But this solution does not integrate with the global tool support for Maven itself, so I would suggest preferring the withMaven() solution.
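Applied to the pipeline slice from the question, a sketch (the config-file id 'my-settings-id' is an assumption; it and the tool name must match what is configured in Jenkins):

```groovy
pipeline {
    tools {
        maven 'maven default'
    }
    agent any
    stages {
        stage('Maven build') {
            steps {
                // mavenSettingsConfig fetches the settings.xml managed by the
                // Config File Provider plugin onto the slave for this build,
                // so the auto-installed Maven on the Spot Instance uses it.
                withMaven(maven: 'maven default', mavenSettingsConfig: 'my-settings-id') {
                    sh 'mvn clean install'
                }
            }
        }
    }
}
```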
I am trying to use Jenkins Pipelines to build a Maven Java project and deploy its artifacts into Nexus. The Nexus credentials are managed by Jenkins, so I need Jenkins to provide the Maven settings.xml file. My pipeline uses docker to run the build. My Jenkinsfile looks like:
node('docker') {
    git 'git@bitbucket.org:myorg/myproject.git'
    stage 'Build and Test'
    // This throws: java.lang.UnsupportedOperationException: Refusing to marshal org.codehaus.groovy.runtime.InvokerInvocationException for security reasons
    //configFileProvider([configFile(fileId: '7d6efcd2-ff3d-43dc-be40-898dab2bff64', variable: 'MYSETTINGS')]) {
    //    sh 'cp ${MYSETTINGS} mysettings.xml'
    //}
    docker.image('maven:3.3.9').inside {
        // sh 'mvn -s mysettings.xml -U -e clean deploy'
        // This fails trying to access AWS ECR (???)
        withMaven(mavenSettingsConfig: '7d6efcd2-ff3d-43dc-be40-898dab2bff64') {
            sh 'mvn -U -e clean deploy'
        }
    }
}
So far I am unable to provide the correct Maven settings. There are some 'Known limitations' of withMaven (Pipeline Maven Plugin, https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Maven+Plugin).
Is there a workaround? I tried to use the configFileProvider, but it throws UnsupportedOperationException for security reasons.
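One possible workaround for the marshalling issue is to avoid the environment variable entirely and have the Config File Provider write the file to a fixed location in the workspace via its targetLocation parameter; since .inside mounts the workspace into the container, the file is then visible to Maven. A sketch, untested against the plugin versions in the question:

```groovy
node('docker') {
    git 'git@bitbucket.org:myorg/myproject.git'
    stage 'Build and Test'
    // Write the managed settings.xml straight into the workspace instead of
    // binding it to a variable, sidestepping the marshalling exception.
    configFileProvider([configFile(fileId: '7d6efcd2-ff3d-43dc-be40-898dab2bff64',
                                   targetLocation: 'mysettings.xml')]) {
        docker.image('maven:3.3.9').inside {
            // The workspace (and thus mysettings.xml) is mounted in the container.
            sh 'mvn -s mysettings.xml -U -e clean deploy'
        }
    }
}
```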
I want to use the local Maven repository in addition to a remote Maven repository. I found the JIRA issue http://issues.gradle.org/browse/GRADLE-1173 for that, but after adapting my Gradle build file that way, some snapshot dependencies that are only available in the local Maven repository are still not found. I get an error that the snapshot dependency cannot be found.
Is it possible to have one local and one remote Maven repository?
Here is the relevant part of my gradle build file:
apply plugin: 'maven'

repositories {
    mavenLocal()
    maven {
        credentials {
            username "myusername"
            password "mypassword"
        }
        url "http://myremoterepository"
    }
}
I also needed a similar setup in my project, and I can verify that your build.gradle setup works, provided your Maven is set up correctly.
Gradle's mavenLocal() relies on the localRepository definition from the maven settings.xml file:
<localRepository>USER_HOME\.m2\repository</localRepository>
The settings.xml should be in either your M2_HOME/conf or your USER_HOME/.m2 directory. You should check that:
* Maven is installed correctly
* the M2_HOME environment variable exists
* settings.xml has the correct localRepository defined
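If mavenLocal() still does not pick up the expected directory, a fallback (an illustrative workaround rather than the idiomatic route) is to point a plain maven repository at the local repository path explicitly:

```groovy
repositories {
    // Explicitly expose the default local Maven repository location;
    // adjust the path if your settings.xml overrides localRepository.
    maven {
        url "${System.properties['user.home']}/.m2/repository"
    }
}
```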
Maven can only use a single local repository ("local" = on the hard disk of the computer on which Maven runs).
If you need more, your options are:
* Run a remote server (like a company-wide proxy) and deploy everything there. Put that server as a mirror into your settings.xml.
* Run mvn install to copy the artifacts into your local repo (obviously only when you have the sources)
* Run a local server
* Copy the artifacts manually into your local repo
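For the last option, Maven's install plugin can register a pre-built jar in the local repository without needing the sources; a sketch (all coordinates are placeholders):

```shell
# Install an existing jar into the local repository (~/.m2/repository)
# so it can be resolved like any other dependency. The -D values below
# are placeholders for your artifact's actual coordinates.
mvn install:install-file \
    -Dfile=path/to/library-1.0.jar \
    -DgroupId=com.example \
    -DartifactId=library \
    -Dversion=1.0 \
    -Dpackaging=jar
```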