Set Debian control file properties when deploying to Artifactory - maven

I'm building a binary Debian archive with Maven, using the org.vafer.jdeb plugin to create the .deb and the maven-deploy-plugin to deploy the archive to Artifactory.
The archive is built correctly (with the control file and its mandatory fields). Nevertheless, once it is deployed to Artifactory, properties such as deb.distribution, deb.component, deb.architecture, etc. are not set; as a result, the archive can't be found.
Any idea how to set these properties?

Can you please specify which repository type you are deploying to?
Since this is a .deb file, in order for Artifactory to produce the right metadata for it, you will need to use a Debian repository. Since you are deploying with Maven, it sounds like you are deploying the artifacts to a Maven repository, which would cause the metadata to be calculated differently.
I recommend adding a promotion step that copies/moves these files from the Maven repository to a Debian repository, and checking whether this step adds the properties.
I hope this helps and clarifies further.
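For what it's worth, here is a minimal sketch of two ways those properties can end up on the artifact in Artifactory (the host, repository name, credentials and file paths below are placeholders): the Debian layout coordinates can be attached as matrix parameters when uploading directly to a Debian repository, or set afterwards (e.g. after the promotion step) through the item properties REST API.

# upload straight to a Debian repository with the coordinates as matrix parameters
curl -u user:password -T target/myapp_1.0.0_amd64.deb \
  "https://artifactory.example.com/artifactory/debian-local/pool/myapp_1.0.0_amd64.deb;deb.distribution=jessie;deb.component=main;deb.architecture=amd64"

# or set the properties on an artifact that is already in the Debian repository
curl -u user:password -X PUT \
  "https://artifactory.example.com/artifactory/api/storage/debian-local/pool/myapp_1.0.0_amd64.deb?properties=deb.distribution=jessie;deb.component=main;deb.architecture=amd64"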

Related

Maven and dependencies NOT in repository

We have a third-party library dependency that is available online in jar form, but it is not in the central Maven repository, or known to be in any other repository.
How can we use pom.xml to auto-retrieve this dependency, based on a URL?
We don't want to store it in our Git repo, because that's A Bad Thing.
The idea here is that when people check out the project, they can use their IDE Maven integration (or just mvn command line tools) to download all the dependencies. So we would want to be able to also download this other third party dependency just like all the ones in Maven repo.
I have not been able to come up with an answer to this based on searches -- all solutions seem to be "download it first and create a local repo." Obviously Maven can download from the Internet, since that's how it connects to Maven Central and other repos. So I don't see why it cannot download arbitrary URLs that present packages in recognizable formats.
Long term, the best solution is to use your own artifact repository like Nexus, Artifactory or Archiva.
All of these have a manual upload function that you can use to set the groupId, artifactId and version, so you can then refer to the artifact as usual.
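If you'd rather script that than use the manual upload UI, deploy:deploy-file does the same thing from the command line. A sketch, reusing the hypothetical coordinates from the example further down; the URL and repositoryId are placeholders, and the repositoryId must match a <server> entry with credentials in your settings.xml:

mvn deploy:deploy-file -Dfile=your-artifact-1.0.jar \
    -DgroupId=org.some.group -DartifactId=your-artifact -Dversion=1.0 -Dpackaging=jar \
    -Durl=https://repo.example.com/repository/thirdparty \
    -DrepositoryId=thirdparty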
If you want to go really low tech, I think you can just put some machine's local repository behind an Apache, provided you grant read/write access.
Then you need to add your new repository in the Maven settings.xml file, as described here.
Maven uses the coordinates to navigate the repository (which has a specific layout) and verify artifact checksums for corruption/tampering using metadata files in specific locations of the repo.
AFAIK this is similar to other package management systems like APT and RubyGems that use repo manifests and don't allow arbitrary URL downloads.
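For instance, with coordinates like org.some.group:your-artifact:1.0 (the same ones used in the example further down), Maven expects the default repository layout to contain roughly:

org/some/group/your-artifact/1.0/your-artifact-1.0.jar
org/some/group/your-artifact/1.0/your-artifact-1.0.pom
org/some/group/your-artifact/1.0/your-artifact-1.0.jar.sha1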
Skipping the repository manager
If you really don't want or can't use a repository manager, you can always download the artifact and manually install it using the Maven Install Plugin:
mvn install:install-file -Dfile=your-artifact-1.0.jar -DgroupId=org.some.group -DartifactId=your-artifact -Dversion=1.0 -Dpackaging=jar
However, you'll have to do this on every machine that runs the build, every time that artifact needs to change.

How to download maven dependencies from Jenkins without a binary repository

Are there any plugins or ways to download the dependencies for a Maven project from Jenkins? I am using Jenkins for a multi-module desktop application. Although I know I could just archive all dependencies, I don't see why there isn't the ability to download dependencies using the Maven installation on the same machine as Jenkins. Preferably one would specify the location of a pom and then have the ability, with one click, to download all the dependencies for that pom. Can you do this? I do not need or want an entire binary repository for this feature.
Edit: I will try and rephrase this as I don't think people are understanding.
In Jenkins one has the ability to archive artifacts at the end of a build. Also, in Jenkins you have integration with Maven. When building a jar in Maven you arguably have 2 options:
1. You can use the assembly plugin, which zips the dependencies' .class files together with those produced from your source code, resulting in 1 jar.
2. You can create a jar from just your source code, which references the dependency jars located in a separate folder.
In Jenkins one also has the ability to download the latest artifact. Now if I am using Option 2, I can either archive just the jar which my sources produced, which I would say is more desirable for space and is the whole purpose of the archive functionality, or I can archive the libraries too.
Here is the PROBLEM!! If I don't archive the libraries then I cannot easily run this jar, as it is a desktop application and its dependencies cannot be obtained in the same manner as clicking on a link from Jenkins. So let's say my question is: what is the easiest way to obtain them? Extra info: assume Jenkins is running as a server and you can't use Artifactory or another server application; that seems to me to be massive overkill.
Use the Maven plugin and create a Maven job for your project. Jenkins will then use the Maven command you provide in the job configuration to build the project. This means Maven will download the project's dependencies and store them on the machine Jenkins is running on. Normally this would be <JENKINS_HOME>/.m2/repository. This way you get a local repository that only contains the dependencies of the projects you created Maven jobs for.
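If the point is to have the dependency jars end up next to the application jar so Jenkins can archive them together (the separate folder from Option 2), one simple approach, not Jenkins-specific, is to add the dependency:copy-dependencies goal to the Maven invocation in the job:

# builds the project jar and copies every dependency jar into target/dependency (the goal's default output folder)
mvn clean package dependency:copy-dependencies

Jenkins can then archive target/*.jar together with target/dependency/*.jar.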

Jenkins: deploying war files from artifactory

We are using Jenkins to build (with Maven) and deploy artifacts (JARs and WARs) to an in-house Artifactory server (both snapshots and releases).
For deployment, currently, we have Jenkins jobs that package the war file (from a release SCM tag) and deploy it to different environments/servers. We want to skip the package phase, as it seems unnecessary to package a released version again and again, because it's not possible to get a different copy of the war file even after trying 1000 times.
We are looking for a way in Jenkins to get the artifact (war) from Artifactory and deploy it to a container. I am sure other people would have faced this situation too but I am not able to find any online material regarding this.
Is there any Jenkins plugin that takes a war file from Artifactory (based on a version) and deploys it to a remote container?
If this is not the right way of doing it then what are the recommendations for any other approach?
Thanks
I don't know about a plugin which takes a version # and deploys that, but you can build a Jenkins job to deploy the last successful release from a previous environment (thus copying from DEV --> QA, for example).
To do this, you would use the copy-artifact-plugin.
Here's an easy to follow run-through of this kind of setup:
http://www.lordofthejars.com/2012/09/deploying-jee-artifacts-with-jenkins.html
Every artifact stored in Artifactory will have a unique URL that includes the version number. It will take the format
http://artifactory-server/repository-name/path-to-artifact/version/filename
e.g.
http://artifactory/apps-releases-local/com/yourorg/yourapp/1.5.67/webapp.war
(depending on how you do your packaging, the WAR file name may include the version number as well).
So your deployment job can construct the Artifactory URL and download the file. Depending on how you have security set up in Artifactory, you may need to authenticate the request.
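For example, a deployment job could fetch the released war with plain curl before pushing it to the container (the URL follows the example pattern above; the credentials are placeholders):

curl -u deployer:password -o webapp.war \
    "http://artifactory/apps-releases-local/com/yourorg/yourapp/1.5.67/webapp.war"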

Maven - Dependencies, static content from remote repository

I am a bit new to Maven, but I have some experience with Ant and the build process. I would like to do one thing that is kind of driving me nuts:
Given:
A remote repository (git, svn, hg,…) that holds static content (like images),
one Maven project that uses/manages the mentioned repository in the same way it does all other dependencies (checkout on install, update whenever updates occur); in other words, the Maven project depends on that repository
I finally want to be able to access the content (no *.svn or *.git) and copy it into my build, at build time*.
I want Maven to store a local copy of that repository in Maven's local repository (~/.m2/repository) only once on each machine and manage it like all other dependencies.
*I am not trying to build a Java project
Thanks for help!
From what I've seen, Maven projects don't use version control repositories as external artifacts. That's a little too fine-grained for what you want, I think.
I've done something similar, when Project A wanted to use resources from Project B.
Project B, as part of its build procedure, collected its resources into a ZIP file and deployed the ZIP file into a Maven repository.
Project A then references the ZIP file artifact, unpacking it when building to where it needs it.
Look into the Maven Dependency Plugin, especially the dependency:unpack and dependency:unpack-dependencies goals.
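As a rough sketch of the Project A side, assuming Project B deployed the static content as a zip artifact that Project A declares as a dependency of type zip, the unpack can even be run from the command line:

# unpacks all zip-type dependencies of the current project (into target/dependency by default)
mvn dependency:unpack-dependencies -DincludeTypes=zip

In a real build you would typically bind the same goal to a lifecycle phase (e.g. generate-resources) in the POM instead of invoking it by hand.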
Have fun

Trouble with maven in Netbeans

I want to create a Maven project in NetBeans. So I do File -> New Project -> Maven -> Java Application. After that I try to build the project and get an error:
The POM for org.apache.maven.plugins:maven-surefire-plugin:jar:2.10 is missing, no dependency information available.
But I can build this project from the command line with mvn compile. Could you tell me what the problem with NetBeans is?
NetBeans uses Maven 3.0.4 by default, unless you change that in the Tools/Options menu. Are you building with 3.0.4 on the command line as well, or are you using an earlier version (2.x)?
That would explain the behaviour, because 3.0.4 will not blindly rely on whatever artifact is in the local repository; some additional metadata is also consulted to make sure your project builds with the given set of defined repositories.
A common example of when the problem occurs for me: I use Central directly and everything downloads. When I later add a mirror, all artifacts are checked again through the mirror to make sure they are accessible. If the mirror doesn't actually mirror Central, I get an error that way.
Another common example: when building with 2.x, the additional metadata is not written; when later building with 3.0.4, everything is checked against the remote repositories no matter what is present in the local repo, and the additional metadata files are constructed.
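If anyone runs into this, a workaround that often helps (assuming the stale local-repository metadata described above is the cause) is to delete the cached plugin from the local repository and let the 3.0.4 build re-resolve it against the currently defined repositories/mirrors:

# remove the cached surefire plugin version named in the error, then force an update check
rm -rf ~/.m2/repository/org/apache/maven/plugins/maven-surefire-plugin/2.10
mvn -U clean compile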
