How to Configure Maven to Work Offline? Complete Solution

I need to configure Maven to download the dependencies to a directory within my project so that I can copy the project to another PC without internet access. I have found the -o option and the dependency:copy-dependencies goal, but nobody explains how to consume those dependencies later. What is the way to download the dependencies and then consume them on a PC without an internet connection?

Maven caches downloaded dependencies (and plugins -- just having the project's dependencies won't necessarily be enough, depending on the POM structure) in ~/.m2/repository. If you build your project and then clone the ~/.m2/repository directory along with your project to another machine, you should be able to build in offline mode with all dependencies available.
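A minimal sketch of that workflow, assuming the default local repository location:
mvn package          # on the machine with internet access; populates ~/.m2/repository
# copy ~/.m2/repository and the project to the other machine, then build there with
mvn -o package       # -o (offline) tells Maven not to contact any remote repository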

The dependency:copy-dependencies goal is pretty useless for the task you are trying to solve; you usually need much more than that to build a project successfully.
You can use a dedicated local repository for your project (this can be set on the command line), so that you can copy just that repository without the content coming from all your other projects.
But if you are in a company, the recommended way is to set up a Nexus/Artifactory server that manages your dependencies. Then you don't need internet access to build, only access to that server.
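For illustration (the path and the server URL below are placeholders): a dedicated local repository can be selected on the command line with
mvn -Dmaven.repo.local=./project-repo package
and a repository manager is typically wired in through a mirror entry in settings.xml:
<settings>
  <mirrors>
    <mirror>
      <id>company-repo</id>
      <mirrorOf>*</mirrorOf>
      <url>https://nexus.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>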

Related

Maven: Create Clean .m2 from Dirty .m2 to Support Offline Builds

I have a Maven project. One of the repositories is no longer available online.
My local environment has what it needs to build the project offline with -o.
What's a good way to capture only the files this project needs, to provide a copy of the artifacts to others to develop the project in the same offline manner?
I'd like to avoid providing my entire .m2 directory for efficiency.
It would be great if there were a way to copy all of the build artifacts a particular Maven project needs from a bloated working .m2 into a clean .m2 containing only what's necessary, so it can easily be shared with others.
Thanks!

Does "build with local dependencies" exist in Maven without multi-module?

I have a set of applications that all use Maven and the local repository. The applications form a dependency tree using <dependency> in their pom.xml, and all of these projects have -SNAPSHOT in their version.
Is it possible for Maven (or some compatible dependency manager) to build an application together with all of its local dependencies whose source changed?
I do not want to create a multi-module project, because:
the projects are exactly libraries, not modules;
I do not want additional complexity just to get a form of the build that is already precisely defined;
I want the process to be dynamic: once a library is mature enough to be put into a remote repository, it is no longer rebuilt with the main project, and that's OK.
For now there is a lot of refactoring, moving code from one library to another, etc., and it often happens that substantial parts of the dependency tree need to be rebuilt. I thus have to run mvn install manually in several projects to make sure there is no stale code.
No, it doesn't work that way. Even within a multi-module project, Maven does not detect which modules have changed sources and which do not.
There was a (flaky) implementation in Maven 2, but it was not continued in 3.x, see How to get maven 3.0 to only build modules with local scm changes
I hoped they would include it again in maven 4, but I didn't see it yet: https://maarten.mulders.it/2020/11/whats-new-in-maven-4/
I once did a similar setup, but had to use shell scripts with some git magic to get it working.
You can also decide to put your libraries in separate repos from the start and use the repo tool that Google uses for Android development: https://github.com/GerritCodeReview/git-repo/blob/main/README.md
Once you run mvn install on a particular Maven project, it becomes available to all other Maven projects on the same workstation during dependency resolution (before the compile phase).
Official Maven Build Lifecycle description:
install - install the package into the local repository, for use as a dependency in other projects locally
It's not necessary to keep the libraries as part of the same project (or to turn it into a multi-module project). But once you want to share those libraries with your teammates, you will either need to have them install the libraries locally as well (as you did), or store the libraries in an external repository such as Artifactory or Nexus.
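For example, after mvn install has been run in the library's own project, any other project on the same workstation can reference it like a normal dependency (the coordinates here are made up):
<dependency>
  <groupId>com.example</groupId>
  <artifactId>my-library</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>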

Best practice for using Maven or Gradle without internet access

My company has a policy that software deployed into production has to be built on a specific machine that has no access to the internet.
We're currently using Maven. When running builds on development machines, Maven automatically downloads the dependencies from the central Maven repository without any problem. Then, before going to production, we put all the files in the local Maven repository (.m2/repository) into source control and run an offline build with
mvn -o -Dmaven.repo.local=<local repo dir> package
This method works, but managing thousands of files in source control is a real pain, particularly the dependencies of the Maven plugins. Hence my question: how can I improve the workflow to make it easier to maintain the dependencies in source control?
I'm considering switching to Gradle, mainly because it's more flexible and doesn't depend on plugins downloaded from a repository. But then I found out that the Gradle local cache directory is not portable between computers, which means I cannot check it into source control.
Suggestions and recommendations are all appreciated.
Use an internal repository manager like Nexus or Artifactory, and only ever put released artifacts into production.
But building the project on the production machine is not a good idea in the first place. It is better to use a complete artifact such as an EAR or WAR with all dependencies included, or something like a jar-with-dependencies or another assembled distribution. Build the project on your CI server and deploy the complete package to the production server with one click.
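As a rough sketch, a jar-with-dependencies can be produced by the assembly plugin during the package phase (configuration trimmed to the essentials):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>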

How to extract all the dependencies from a Maven project for standalone offline building?

I know that I am not the only person who might need to send a Maven project to someone who doesn't have access to my private remote repository and only needs to build the project in a standalone fashion.
In my case I need to send my Mavenized project to a customer that doesn't have access to our internal Archiva instance where we host all of our dependencies.
How can I create a standalone Maven project with all the dependencies needed to build it in a standalone fashion?
NOTE: I don't want to just export the dependencies; I need an automated way to add them to the standalone local repository as well.
You should be able to configure your settings.xml to use
<localRepository>${some.location.in.your.project}</localRepository>
The Maven settings reference describes how to configure this. After that, run an online build and package your project together with that repository. Unpack it on the target machine and you should be able to build in offline mode.
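A minimal sketch of such a settings.xml, with an example path:
<settings>
  <localRepository>/path/to/your-project/repository</localRepository>
</settings>
An online build then fills that directory, and the copy shipped with the project can be used for the offline build on the other side.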
See the dependency plugin's go-offline goal:
http://maven.apache.org/plugins/maven-dependency-plugin/go-offline-mojo.html
It can help you download the whole internet :-)
Run the goal with -Dmaven.repo.local=<path to where you want everything>.
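For example (the repository path here is arbitrary):
mvn dependency:go-offline -Dmaven.repo.local=./offline-repo
mvn -o -Dmaven.repo.local=./offline-repo package
The first command resolves the project's dependencies and plugins into ./offline-repo; the second builds against it without touching the network.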

How to enable access to the Maven repository from inside GlassFish?

I have the following problem. We have a central Maven repository hosted on our company server. Our team is working on a project, and everyone uses that repository to get the required artifacts. If something is missing and is needed for the task a developer is currently working on, he installs that artifact manually into the central repository so that his commits don't break the automated builds.
Each developer also has GlassFish v2 installed on his machine for testing and debugging. Before committing changes, the developer builds the .ear for the project with Maven. However, after deploying the EAR to his local GlassFish, errors frequently arise, because the set of GlassFish libraries may not contain all the latest dependencies from the central company repository.
Right now, when an error occurs, the developer simply reads the log, sees exactly what is missing, and manually copies the required jar into his local $GLASSFISH_HOME$/lib directory. That is rather frustrating. How can this be done automatically?
We are currently trying the following solution: the developer synchronizes his local Maven repository, gathering from the central one all the artifacts the project requires. This local repository is then placed on the Java classpath so that GlassFish also sees it. Is that a correct approach? Or maybe there is a way to install all the required artifacts from the central repository directly into $GLASSFISH_HOME$/dir automatically during deployment?
About having to install dependencies: if developers need to install dependencies that are missing from public Maven repositories, keep in mind that Maven proxies can usually cache public repos; Archiva, for instance, has a proxying cache. If the dependencies are your own project deliverables, you should consider releasing them and deploying them with Maven to your company repo.
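As a sketch, deploying your own deliverables to the company repository only needs a distributionManagement section in the POM (the ids and URLs below are placeholders) plus mvn deploy:
<distributionManagement>
  <repository>
    <id>company-releases</id>
    <url>https://repo.example.com/repository/releases/</url>
  </repository>
  <snapshotRepository>
    <id>company-snapshots</id>
    <url>https://repo.example.com/repository/snapshots/</url>
  </snapshotRepository>
</distributionManagement>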
About latest versions: you need to tell Maven which versions of the dependencies to use. I would prefer editing my POMs manually, but there is a variety of ways to achieve that.
The libraries should be part of the project, I think. If they are not standard GlassFish libraries, they should be included in your WAR file, for instance, as part of your project. If they are not standard but also not part of your project (not the regular approach), consider managing this GlassFish installation as a project of its own (its own git/svn repo, its own POM, its own versions, its own everything).
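For instance (the artifact names are placeholders), a library that GlassFish does not ship goes into the WAR with the default compile scope, while container-provided APIs are marked provided so they are not packaged:
<dependency>
  <groupId>com.example</groupId>
  <artifactId>internal-lib</artifactId>
  <version>1.2.3</version>
  <!-- default compile scope: packaged into WEB-INF/lib of the WAR -->
</dependency>
<dependency>
  <groupId>javax.servlet</groupId>
  <artifactId>servlet-api</artifactId>
  <version>2.5</version>
  <scope>provided</scope>
  <!-- supplied by GlassFish at runtime, not packaged -->
</dependency>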
Good luck.
