Maven deploying without uploading any pom file

I'm building a multiplatform JavaFX application. The final step is to create an installer (exe, dmg, deb, etc., with a JRE bundled) and upload it to a special "product release" repository. Given how the JavaFX build needs to be done, it runs as a Jenkins matrix job on 3 different platforms. The last step is the deployment. I attach the installer with the build-helper-maven-plugin.
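For illustration, a minimal sketch of such an attach-artifact configuration (file path, type, and classifier are placeholders, not taken from the original setup):

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>attach-native-installer</id>
      <phase>package</phase>
      <goals>
        <goal>attach-artifact</goal>
      </goals>
      <configuration>
        <artifacts>
          <!-- hypothetical installer produced earlier in the build -->
          <artifact>
            <file>${project.build.directory}/jfx/native/myapp-installer.exe</file>
            <type>exe</type>
            <classifier>windows</classifier>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>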
I'm able to upload the installer correctly for one platform, but because the deploy also uploads the pom file, it cannot be uploaded again from another Jenkins slave.
At first I had the problem of Maven uploading the "main jar", but I managed to disable that by binding the jar plugin to the 'none' phase (I use the maven-javafx-plugin, which creates its own main jar). However, I'm unable to disable the pom generation and upload. I have set
<generatePom>false</generatePom>
for the maven-deploy-plugin, but it seems to have no effect (I assume it applies to the main jar, which I have already disabled).
Is it possible to disable the pom generation/upload completely (similar to Gradle's 'uploadDescriptor false' option) and only upload 'attached artifacts'?
EDIT/NOTE: I will probably try the deploy-file option next, https://maven.apache.org/plugins/maven-deploy-plugin/deploy-file-mojo.html, but it would be nice to know whether this can be done at the whole-project level.

I managed to solve this with the deploy-file option:
I bound the default-deploy execution to the 'none' phase in order to disable it. Then I added a new "native-deploy" execution bound to the deploy phase. In its configuration I set generatePom to false and pointed the file parameter to the same installer I had previously attached as a side artifact (via the build helper plugin). I did have to specify the coordinates again (i.e. pass them through from project.*), but it worked and uploaded the installers from all machines without any pom files. I also had to re-enable the default jar creation because Maven complained that there was no artifact bound to the project (I guess I could change the packaging to pom; in any case the jar is not uploaded because the default deploy is disabled, so that is fine).
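A rough sketch of the configuration described above (repository id, URL, file path, and classifier are placeholders, not the original values):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-deploy-plugin</artifactId>
  <executions>
    <!-- disable the default deploy so neither the main jar nor the pom is uploaded -->
    <execution>
      <id>default-deploy</id>
      <phase>none</phase>
    </execution>
    <!-- upload only the platform installer, without generating a pom -->
    <execution>
      <id>native-deploy</id>
      <phase>deploy</phase>
      <goals>
        <goal>deploy-file</goal>
      </goals>
      <configuration>
        <file>${project.build.directory}/jfx/native/myapp-installer.exe</file>
        <repositoryId>product-releases</repositoryId>
        <url>https://repo.example.com/product-releases/</url>
        <groupId>${project.groupId}</groupId>
        <artifactId>${project.artifactId}</artifactId>
        <version>${project.version}</version>
        <classifier>windows</classifier>
        <packaging>exe</packaging>
        <generatePom>false</generatePom>
      </configuration>
    </execution>
  </executions>
</plugin>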
I guess a more general solution would be a more correct answer, but for me this is enough.

Related

Why is refreshing the Maven repository not enough for IntelliJ?

I had a NoClassDefFoundError problem in some tests launched from IntelliJ. To repair the situation, I had to make several changes in many POMs of the project: adding new packages and excluding some old ones to avoid overlaps between them. I also fixed the issues with different versions. But the situation did not improve: again, some package declared in a POM was not found where it should have been.
I refreshed the Maven repository with
mvn -e clean install -U
as advised in https://stackoverflow.com/a/9697970/715269 - an answer so old and so upvoted that it surely looks like Santa.
The problem remained unchanged.
I printed the Maven dependency tree. It was correct and contained everything needed.
Then I looked at the list of External Libraries of the project. It was still the old, uncorrected list of overlapping jars with the same names and different versions, missing the new packages I had just added, even though those showed up fine in the Maven dependency tree output!
Already hapless, I reimported the Maven projects in IntelliJ via:
Ctrl+Shift+A, Reimport All Maven Projects.
Ho! The list of libraries got repaired, and the problem mentioned in the subject disappeared.
The question is: how can it happen that the same project, with the very same POMs for everything, ends up with different packages when launched from Maven and from IntelliJ?
I know about the "delegate IDE build/run actions to Maven" feature, and I keep it turned off. But I am NOT talking about different software doing the building. Whether they are different or not, both should reflect the actual POMs. And whereas Maven, when not building automatically, won't know about changes in the POMs, IntelliJ DOES know about them. It could keep its jars in sync with the POMs, or in sync with Maven - either would make sense - but instead it simply keeps some old rubbish. Was there some deep thought behind that design?
Every time you manually change the pom.xml file, including the dependencies, you need to load these changes into the IDE. The IDE does this on the Reload from Maven action. See also Import Maven dependencies.
IntelliJ doesn't use Maven to build and run a project unless you are delegating the build and run actions to Maven.
Since IDEA doesn't really use Maven to run and build, it uses the pom.xml to import the project structure and "tries" to build the project the same way Maven does.
Actually, there are quite a few differences between these two build processes.
Generating sources or filtering resources (I don't know if this is still an issue) is not done when building the project with IntelliJ IDEA.
In case you are using code generation, you have to build the project via Maven first, and then - when all the resources are filtered and the additional sources are generated - you are able to run, debug and so on in IntelliJ IDEA.
That's an important thing to be aware of, and it's the reason why Maven and IntelliJ IDEA project structures can get out of sync.
You can enable the "Reload project after changes in build scripts" feature and select the Any changes checkbox to keep your project structure updated.
Why you might want to disable this feature anyway:
If you are working on a build file (Gradle or Maven, it doesn't matter), reloading the structure on every change can be very annoying: it's CPU intensive, dependencies get fetched, and so on.
Therefore, I prefer to reload the project structure only in case of an external change, for example when pulling an updated version of the build file.

Tweak generation of IntelliJ configuration for building artifacts (derived from build.gradle)

Having a working setup to build my WAR file using Gradle, I'd like to tweak the generation of IntelliJ's own configuration for building artifacts (which is derived from build.gradle) in a more persistent way.
For now, I am able to change IntelliJ's build configuration (e.g. for "Gradle : org.example.servlet : Servlet.war (exploded)") via Build => Build artifacts... => Edit... as desired (in my case, additionally adding the content of a certain directory to the exploded war). However, whenever I refresh the Gradle project or do an "Import Changes" after, say, a change to build.gradle, IntelliJ regenerates the configuration for building this artifact and my manual changes are lost.
This is expected; IntelliJ even warns about it (when making changes after clicking Edit... above).
Question: Is there a way to tweak the generation of IntelliJ's own configuration for building artifacts in such a way that the changes survive the mentioned refresh? Most likely by adding meta-information to build.gradle?
Looking around I came across the gradle-idea-ext-plugin but was unable to put it to any use. If someone thinks this is the way to go, please give an example of adding a folder's content to an artifact.

How to skip a maven build step without modifying the pom itself?

We have a Maven-based Java EE project controlled by the customer. For internal reasons, we cannot execute one of the build steps, but the rest works fine and produces the jar we want.
Since editing the pom file would require care when committing to the customer's SVN, and copying the pom file would require care to sync changes coming from there, we are looking for a way to skip this specific step of the build section during the Maven call itself, something like mvn clean install but-leave-out-this-build-plugin-step. Is there any such option?
Edit:
The plugin in question is the rpm-maven-plugin, which prevents the build from running on Windows. We found information on how to make it work, but it wouldn't really fit into our current setup. And since we cannot modify the customer's pom, I was looking for a way to trigger the skipping externally. But maybe there are other ways to simply ignore/skip/fake this step?
It depends on which plugin you want to skip. Many plugins can be skipped via a system property (-Dblabla).
For the deploy plugin it is -Dmaven.deploy.skip=true, for Surefire it is -DskipTests=true.
Read the plugin's documentation; maybe you can find a skip property.
The rpm plugin has a 'disabled' parameter; unfortunately it cannot be set via a property. So, if setting this parameter in the customer's pom (or asking for it to be edited) with a default value of false is an option, this may be the solution.
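If adjusting the customer's pom is acceptable, a minimal sketch of that approach might look like this (the property name rpm.disabled is made up for this example):

<properties>
  <!-- defaults to false so the customer's normal build is unchanged -->
  <rpm.disabled>false</rpm.disabled>
</properties>
...
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>rpm-maven-plugin</artifactId>
  <configuration>
    <disabled>${rpm.disabled}</disabled>
  </configuration>
</plugin>

On the Windows machines the step could then be skipped with mvn clean install -Drpm.disabled=true.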

Deploy zip artifact from another build action to Nexus

Is it possible to deploy arbitrary zip archive artifacts to Nexus through Maven as snapshots?
We have a build step that is not supported through any application-specific Maven plugin. Instead, our full build and deployment process is as follows:
1) A Maven POM compiles the Java component of the build, using Jenkins.
2) Shell scripts create the deployable artifact: the scripts wrap a call to a code-generation application, and the output is zipped up into an archive by the application itself. I need these artifacts deployed to Nexus both as snapshots and as releases, as appropriate.
I tried using the maven-assembly-plugin; however, this assumes that the plugin itself is creating the zip archive, not simply deploying an archive that was produced by some other method.
I would prefer to do this within Maven since our Nexus settings and credentials are already within the environment and do not need to be passed manually on the command line. Using the Nexus UI for this is not a viable option since this needs to be part of a standard build-deploy-test process, which may happen many times per day, for a couple dozen applications.
For completeness, I'm answering my own question (oh bother...).
I resolved this issue using the maven-assembly-plugin, which allows you to define arbitrary artifacts and deploy them (as snapshots or releases) to Nexus. The assembly plugin uses a descriptor file (e.g. src.xml) that defines the exact contents of the artifact (including or excluding files and directories, changing file permissions, etc.). It can also be used for creating Java uber-jars, although the Maven Shade Plugin appears to be the preferred method for that.
Maven Assembly Plugin main webpage
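A hedged sketch of that setup, assuming the generated output ends up in a directory the descriptor can point at (ids, paths, and file names below are placeholders):

<!-- src.xml: assembly descriptor defining the contents of the zip -->
<assembly>
  <id>generated-code</id>
  <formats>
    <format>zip</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <directory>${project.build.directory}/generated-output</directory>
      <outputDirectory>/</outputDirectory>
    </fileSet>
  </fileSets>
</assembly>

<!-- pom.xml: bind the assembly plugin so the zip is built and attached during package -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <executions>
    <execution>
      <id>package-generated-zip</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
      <configuration>
        <descriptors>
          <descriptor>src.xml</descriptor>
        </descriptors>
      </configuration>
    </execution>
  </executions>
</plugin>

Because the zip is attached to the project, a regular mvn deploy pushes it to the snapshot or release repository according to the project version.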

Maven deploy: forcing the deploy even if artifact already exists

I'm building a project which is made up of several (sometimes unrelated) Maven modules and some non-standard Java modules (built with Ant).
Each Maven module is deployed to the releases repository on completion.
If the build fails in the middle, I might have some modules already deployed, so if I try to rebuild, the new attempt to deploy will fail since the artifacts are already deployed.
Is it possible to force a deploy or instead, remove the deployed artifact before I deploy again?
It sounds like the middleware admins have configured your remote repository (Nexus, Artifactory, or whatever) to disallow artifact redeployment, and as @khmarbaise says there are good reasons for that. Nexus can be configured to allow artifact deletion by users in a particular role or with artifact-deletion privileges. If your admins have it set up that way, perhaps you can request the delete privilege and remove the offending artifacts. Or perhaps the Nexus admin will agree to do it for you.
If neither of these is possible, here are some things to try which might keep this from happening in the future:
1) If you are using the release plugin, do a dry run first (-DdryRun=true on the release:prepare command line). Maven should report any errors without committing to SCM.
2) Try running mvn install on your group of projects first. This installs the artifacts to your local repo, not the remote one. If there's a problem you can whack the artifacts out of your local repo and start from scratch, repeating until you get a complete build.
3) If you are running a multi-module build, there are command line options that allow resuming a Maven build from a particular project onwards.
4) Define -Dmaven.deploy.skip=true on the Maven command line. This is similar to suggestion 2, except that Maven will actually load and configure the deploy plugin; it just won't do the actual deploy to the remote repo. Once everything works, remove the skip property (see the profile sketch below).
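As a rough sketch only (the profile id is made up here), the skip property from the last suggestion can be wired into a profile so the flag doesn't have to be typed every time:

<profiles>
  <profile>
    <id>no-deploy</id>
    <properties>
      <!-- user property read by the maven-deploy-plugin's deploy goal -->
      <maven.deploy.skip>true</maven.deploy.skip>
    </properties>
  </profile>
</profiles>

Then run mvn clean deploy -Pno-deploy while stabilizing the build; once everything works, drop -Pno-deploy so the artifacts are actually uploaded.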
I know it might be late, but in Nexus there's an option that allows the redeployment of artifacts.
Just select Repositories on the left, choose the repository whose policy you want to change, and set its deployment policy to Allow Redeploy.
The possible options have since increased ;)
Use the parameter deployAtEnd (more information: here). With this parameter, the artifacts are only deployed if all modules were built successfully.
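A minimal sketch of enabling this in a parent pom (plugin version omitted; deployAtEnd needs a reasonably recent maven-deploy-plugin):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-deploy-plugin</artifactId>
  <configuration>
    <!-- defer all uploads until every module has built successfully -->
    <deployAtEnd>true</deployAtEnd>
  </configuration>
</plugin>

The same setting can also be passed on the command line as -DdeployAtEnd=true.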
