Best practice for using Maven or Gradle without internet access

My company has a policy that software deployed into production has to be built on a specific machine that has no access to the internet.
We're currently using Maven. When running builds on development machines, Maven automatically downloads the dependencies from the central Maven repository without a problem. Before going to production, we put all the files in the local Maven repository (.m2/repository) into source control and then run an offline build with
mvn -o -Dmaven.repo.local=<local repo dir> package
This method works, but managing thousands of files in source control is a real pain, particularly the dependencies for the Maven plugins. Hence my question: how can I improve this workflow so that the dependencies kept in source control are easier to maintain?
I'm considering switching to Gradle, mainly because it's more flexible and doesn't depend on plugins downloaded from a repository, but then I found out that the Gradle local cache directory is not portable between computers, which means I cannot check it into source control.
Suggestions and recommendations are all appreciated.

Use an internal repository manager like Nexus or Artifactory. Always put released artifacts into production.
But building the project on the production machine is not a good idea. It is better to use a complete artifact such as an EAR or WAR with all dependencies included, or something like a jar-with-dependencies or another assembled distribution. Build the project on your CI server and deploy the complete package to the production server with one click.
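As a rough illustration of the jar-with-dependencies approach (the phase binding shown is just the usual convention, not something stated in the answer above), the maven-assembly-plugin can be configured in the build section of the pom.xml like this:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- predefined descriptor that bundles all runtime dependencies into one jar -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>

Running mvn package then produces an additional *-jar-with-dependencies.jar that can be shipped as a single self-contained artifact.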

Related

Does "build with local dependencies" exist in Maven without multi-module?

I have a set of applications, all of which use Maven and the local repository. The applications form a dependency tree using <dependency> in their pom.xml. All of these projects have -SNAPSHOT in their version.
Is it possible for Maven (or some compatible dependency manager) to build an application together with all of its local dependencies whose source changed?
I do not want to create a multi-module project, because:
the projects really are libraries, not modules;
I do not want additional complexity just to get a form of build that is already precisely defined;
I want the process to be dynamic: if a library is mature enough to be put into a remote repository, it will no longer be rebuilt with the main project, and that's fine.
For now, there is a lot of refactoring, moving code from one library to another, etc., and it often happens that substantial parts of the dependency tree need to be rebuilt. I thus have to manually run mvn install in several projects to make sure there is no stale code.
No, it doesn't work. Even with a multi-module project, Maven does not detect which modules have changed sources and which do not.
There was a (flaky) implementation in Maven 2, but it was not continued in 3.x, see How to get maven 3.0 to only build modules with local scm changes
I hoped they would include it again in maven 4, but I didn't see it yet: https://maarten.mulders.it/2020/11/whats-new-in-maven-4/
I once did a similar setup, but had to use shell scripts with some git magic to get it working.
You can also decide to put your libraries in separate repos from the start, and use the repo tool that Google uses for Android development: https://github.com/GerritCodeReview/git-repo/blob/main/README.md
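For the shell-script-plus-git approach mentioned above, a rough sketch of what such a helper could look like (the libs.txt file listing library directories and the .last-build marker files are assumptions for illustration, not part of the original setup):

# rebuild-changed.sh -- hypothetical helper script
# Rebuild every library whose sources changed since the commit recorded at its last build.
while read -r lib; do
  last=$(cat "$lib/.last-build" 2>/dev/null || true)
  if [ -z "$last" ] || ! git diff --quiet "$last" -- "$lib"; then
    (cd "$lib" && mvn -q clean install) || exit 1
    git rev-parse HEAD > "$lib/.last-build"
  fi
done < libs.txt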
Once you run mvn install on a particular Maven project, it becomes accessible to all other Maven projects on the same workstation during dependency resolution (before the compile phase).
Official Maven Build Lifecycle description:
install - install the package into the local repository, for use as a dependency in other projects locally
It's not necessary to keep the libraries as part of the same project (or turn it into a multi-module project). But once you want to share those libraries with your teammates, you would need either to have them install the libraries locally (as you did) or to store those libraries in some external repository, like Artifactory or Nexus.
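For illustration (the coordinates below are made up), the library is installed once and then referenced from any other project on the same machine:

cd my-library
mvn install

and in the consuming project's pom.xml:

<dependency>
  <groupId>com.example</groupId>  <!-- hypothetical coordinates -->
  <artifactId>my-library</artifactId>
  <version>1.0.0-SNAPSHOT</version>
</dependency>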

How to configure Maven to work offline? Complete solution

I need to configure Maven to download the dependencies into a directory within my project so that I can copy the project to another PC without internet access. I have found the -o option and the dependency:copy-dependencies goal, but nobody explains how to consume those dependencies later. What would be the way to download the dependencies and then consume them on a PC without an internet connection?
Maven caches downloaded dependencies (and plugins -- just having the project's dependencies won't necessarily be enough, depending on the POM structure) in ~/.m2/repository. If you build your project and then copy the ~/.m2/repository directory as well as your project to another machine, you should be able to build in offline mode with all dependencies available.
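A minimal sketch of that workflow, assuming the cache is carried over on removable media (the /path/to/usb paths are placeholders):

# on the machine with internet access: build once so everything is cached, then copy the cache
mvn package
rsync -a ~/.m2/repository/ /path/to/usb/repository/

# on the offline machine: restore the cache and build offline
rsync -a /path/to/usb/repository/ ~/.m2/repository/
mvn -o package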
dependency:copy-dependencies is of little use for the task you are trying to solve: it only copies the project's dependencies, and you usually need much more (plugins and their own dependencies) to successfully build a project.
You can use a dedicated local repository for your project (this can be set on the command line), so that you can copy just that directory, without the content coming from all your other projects.
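For example, roughly (the directory name project-repo is arbitrary; note that dependency:go-offline resolves most, but not necessarily all, plugin dependencies):

# while online: resolve everything into a project-specific repository
mvn -Dmaven.repo.local=project-repo dependency:go-offline package

# later, without internet access, build against that same directory
mvn -o -Dmaven.repo.local=project-repo package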
But if you are in a company, the recommended way is to set up a Nexus/Artifactory server that manages your dependencies. Then you don't need internet access to build, but just access to that server.
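With such a repository manager in place, a typical ~/.m2/settings.xml mirror entry routing all downloads through it might look like this (the URL is a placeholder):

<mirrors>
  <mirror>
    <id>internal-repo</id>
    <!-- route requests for every repository through the internal manager -->
    <mirrorOf>*</mirrorOf>
    <url>https://nexus.example.com/repository/maven-public/</url>
  </mirror>
</mirrors>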

Building and deploying native code using Maven

I've spent years trying to deploy libraries that use native code to Maven Central. I've run into the following problems:
There weren't any good plugins for building native code using Maven. native-maven-plugin was a very rigid build system that, among other things, made it difficult to debug the resulting binaries. You'd have to manually synchronize the native-maven-plugin build system with the native IDE you use for debugging.
Maven did not replace variables in deployed pom.xml files: MNG-2971, MNG-4223. This meant that libraries had to declare platform-specific dependencies once per Maven profile (as opposed to declaring the dependency once and setting a different classifier per profile); otherwise, anyone who depended on your library had to re-define those same properties in their project file in order to resolve transitive dependencies. See Maven: Using inherited property in dependency classifier causes build failure.
Jenkins had abysmal support for running similar logic across different platforms (e.g. "shell" vs "batch" tasks, and coordinating a build across multiple machines)
Running Windows, Linux and Mac in virtual machines was way too slow and fragile. Even if you got it working, attempting to configure the VMs as Jenkins slaves was a lesson in frustration (you'd get frequent intermittent build errors).
Maven Central requires a main jar for artifacts that are platform-specific: OSSRH-975
Sonatype OSS Repository Hosting and maven-release-plugin assumed that it would be possible to release a project in an atomic manner from a single machine but I need to build the OS-specific bits on separate machines.
I'm going to use this Stackoverflow question to document how I've managed to overcome these limitations.
Here is how I overcame the aforementioned problems:
I used CMake for building native code. The beauty of this system is that it generates project files for your favorite (native) IDE. You use the same project files to compile and debug the code. You no longer need to synchronize the two systems manually.
Maven didn't support CMake, so I built my own plugin: https://github.com/cmake-maven-project/cmake-maven-project
I manually hard-coded platform-specific dependencies into each Maven profile, instead of defining the dependency once with a different classifier per profile. This was more work, but it doesn't look like they will be fixing this bug in Maven anytime soon.
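A sketch of what that duplication looks like (coordinates, classifiers and profile ids are invented for illustration): each platform profile repeats the same dependency with its own classifier hard-coded.

<profiles>
  <profile>
    <id>windows-x86_64</id>  <!-- hypothetical profile -->
    <activation>
      <os><family>windows</family></os>
    </activation>
    <dependencies>
      <dependency>
        <groupId>com.example</groupId>  <!-- placeholder coordinates -->
        <artifactId>native-lib</artifactId>
        <version>1.0.0</version>
        <classifier>windows-x86_64</classifier>
      </dependency>
    </dependencies>
  </profile>
  <profile>
    <id>linux-x86_64</id>
    <activation>
      <os><family>unix</family></os>
    </activation>
    <dependencies>
      <dependency>
        <groupId>com.example</groupId>
        <artifactId>native-lib</artifactId>
        <version>1.0.0</version>
        <classifier>linux-x86_64</classifier>
      </dependency>
    </dependencies>
  </profile>
</profiles>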
I plan to investigate http://www.mojohaus.org/flatten-maven-plugin/ and https://github.com/mjiderhamn/promote-maven-plugin as alternatives in the near future.
Jenkins pipeline does a good job of orchestrating a build across multiple machines.
Running Jenkins slaves on virtual machines is still very error-prone, but I've managed to work around most of the problems. I've uploaded my VMware configuration steps and Jenkins job configuration to help others get started.
I now create an empty JAR file for platform-specific artifacts in order to suppress the Sonatype error. This was actually recommended by Sonatype's support staff.
It turns out that maven-release-plugin delegates to other plugins under the hood. Instead of invoking it, I do the following:
Use mvn versions:set to change the version number from SNAPSHOT to a release and back.
Tag and commit the release myself.
Use nexus-staging:rc-open, nexus-staging:deploy -DstagingProfileId=${stagingProfileId} -DstagingRepositoryId=${stagingRepositoryId}, and nexus-staging:rc-close to upload artifacts from different platforms into the same repository. This is called a Staging Workflow (referenced below).
Upon review, release the repository to Maven Central.
Important: do not enable <autoReleaseAfterClose> in the nexus-staging plugin because it closes the staging repository after each deploy instead of waiting for all deploys to complete.
Per https://issues.sonatype.org/browse/NEXUS-18753 it isn't possible to release SNAPSHOT artifacts atomically (there is no workaround). When releasing SNAPSHOTs, you need to skip rc-open, rc-close and invoke nexus-staging:deploy without -DstagingProfileId=${stagingProfileId} -DstagingRepositoryId=${stagingRepositoryId}. Each artifact will be uploaded into a separate repository.
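Put together, the release sequence described in the steps above looks roughly like this (an outline only; version numbers are placeholders, and the exact goals and required parameters depend on your nexus-staging plugin configuration):

# 1. switch from SNAPSHOT to the release version, tag and commit
mvn versions:set -DnewVersion=1.2.3
git commit -am "Release 1.2.3" && git tag v1.2.3

# 2. open one shared staging repository, deploy into it from every platform build machine, then close it
mvn nexus-staging:rc-open
mvn nexus-staging:deploy -DstagingProfileId=${stagingProfileId} -DstagingRepositoryId=${stagingRepositoryId}
mvn nexus-staging:rc-close

# 3. after review, release the staging repository to Maven Central and move back to a SNAPSHOT
mvn versions:set -DnewVersion=1.2.4-SNAPSHOT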
See my Requirements API for a real-life example that works.
Other quirks to watch out for:
skipNexusStagingDeployMojo must be false in the last reactor module (otherwise no artifacts will be deployed): https://issues.sonatype.org/browse/NEXUS-12365. The best workaround is to use Maven profiles to omit whatever modules you want when deploying (don't use skipNexusStagingDeployMojo at all).
skipLocalStaging prevents deploying multiple artifacts into the same repository: https://issues.sonatype.org/browse/NEXUS-12351

How to deploy an already locally installed artifact with Maven

Story
I know that the Maven deploy command runs through the whole lifecycle. My problem is that it takes too much time in my case. Let me explain:
There is an application built up from a server and a single-sourced Eclipse RAP & RCP client.
The communication is defined by shared API projects, which are built together with the server but are also needed by the GUI projects.
The GUI projects are built by Tycho, so it's impossible to build both of them in one build (in one reactor; EDIT: since the P2 artifacts are different for RCP and RAP).
I build a release with a multi-step Jenkins build. To make sure that everything is fine, I first run a clean install for the server and the GUI variants one by one, and then deploy them if nothing fails.
Question
Building everything twice takes a lot of time. Is there anything like "please simply deploy all already-built artifacts as they are from my local repository to the POM-defined repository, skipping the whole lifecycle"?
If you already have the artifact from the previous build, you may consider deploy:deploy-file, following the Guide to deploying 3rd party JARs to a remote repository. I always use this goal to publish a stable artifact to the public developer remote repository so that other teams can test and use it.
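A sketch of such an invocation (the URL, repository id and file path are placeholders; the repositoryId has to match a <server> entry in settings.xml that holds the credentials):

mvn deploy:deploy-file \
    -Dfile=target/my-app-1.0.0.jar \
    -DpomFile=pom.xml \
    -Durl=https://repo.example.com/releases \
    -DrepositoryId=company-releases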
I hope this may help.
I don't think that there is a pure Maven solution to this. The problem is that your deploy-only build won't know which artifacts to deploy; AFAIK this information exists only in the in-memory Maven model and is not persisted to the target folder.
The problem can be solved with a Maven repository manager that supports staging, like the (commercial) Nexus Pro. Then, your build would deploy straight away into a staging repository, and only promote the artifacts to the (main) repository if everything succeeded.

How to enable access to a Maven repository from inside GlassFish?

I have the following problem. We have a central Maven repository hosted on our company server. Our team is working on a project, and everyone uses that repository to get the required artifacts. If something is missing at the moment and is required for the task a developer is currently dealing with, he installs the artifact manually into the central repository so that his commits don't break the automated builds.
Now, each developer also has GlassFish v2 installed on his machine for testing and debugging purposes. Before committing changes, the developer builds the .ear for the project with Maven's help. However, after the developer deploys the EAR to his local GlassFish, frequent errors arise, because the set of GlassFish libraries may not contain all the latest dependencies from the central company repository.
Right now, in case of an error, the developer simply reads the log and sees what exactly is missing. After that, he manually copies the required JAR into his local $GLASSFISH_HOME$/lib directory. But that is rather frustrating. How can this be done automatically?
Right now we are trying to implement the following solution: the developer has to synchronize his local Maven repository, gathering all the artifacts from the central one that the project requires. This local repository has to be placed on the Java classpath so that GlassFish also sees it. Is that a correct approach? Maybe there is a way to install all the required artifacts from the central repository directly into a $GLASSFISH_HOME$ directory, and can this be done automatically during deploy?
About having to install dependencies: if developers need to install dependencies that are missing from public Maven repositories, take into account that Maven proxies usually have the ability to cache public repositories. For instance, Archiva has a proxying cache. If the dependencies are your own project deliverables, you should consider releasing them and deploying with Maven to your company repository.
About latest versions: you need to tell Maven which versions of the dependencies to use. I prefer editing my POMs manually; in any case, there is a variety of ways to achieve that.
The libraries should be part of the project, I think. If they are not standard GlassFish libraries, they should be included, for instance, in your WAR file as part of your project. If they are not standard but also not part of your project (not the regular approach), consider managing this GlassFish installation as a project of its own (its own git/svn repo, its own POM, its own versions, its own everything).
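As a reminder of the default behaviour this relies on (the coordinates below are placeholders): with <packaging>war</packaging>, every compile- or runtime-scoped dependency declared in the POM is packaged into WEB-INF/lib of the WAR, so nothing has to be copied into $GLASSFISH_HOME$/lib by hand; provided scope is reserved for libraries the container itself supplies.

<dependency>
  <groupId>com.example</groupId>  <!-- placeholder coordinates -->
  <artifactId>some-shared-api</artifactId>
  <version>1.2.3</version>
  <!-- default (compile) scope: ends up in WEB-INF/lib of the built WAR -->
</dependency>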
Good luck.
