I have a project that contains two subprojects:
A. a common library for an external API
B. a program that depends on the above library
They are inside the same directory. How do I make B refer to A with Maven?
Normally you share artifacts through a Maven repository. That is Maven's way of ensuring a consistent, correct solution that all developers can share.
First check whether project A already exists in a public Maven repository (e.g. http://search.maven.org or http://mvnrepository.com) and, if so, include it in your pom.
If it is not publicly available (because it is proprietary in some way or other), consider using an enterprise-wide repository manager such as Nexus or Artifactory and deploying it there.
Finally, some developers resort to simply installing the artifact into their local Maven repository, which only works if you are only ever going to build on one particular workstation.
If you still prefer file-based access, it is possible to define a file-based Maven repository and reference it in your pom. Heroku, for example, uses this for bundling extra dependencies into their system.
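A minimal sketch of such a file-based repository definition in the pom (the repository id and the lib path are just placeholders) would be something like:

<repositories>
  <repository>
    <!-- placeholder id; the directory must hold artifacts in standard Maven repository layout -->
    <id>project-local-repo</id>
    <url>file://${project.basedir}/lib</url>
  </repository>
</repositories>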
Declare A as a dependency in B's pom.xml. Make sure A has a valid pom.xml and is deployed to your repository (local or Nexus). We do that all the time. Take care to assign a SNAPSHOT version if you always want the latest build to be pulled from the repository.
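For example, assuming A were installed with the (made-up) coordinates com.example:library-a, B's pom.xml would contain something like:

<dependency>
  <!-- hypothetical coordinates of library A -->
  <groupId>com.example</groupId>
  <artifactId>library-a</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>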
Related
I have a set of applications that all use Maven and the local repository. The applications form a dependency tree using <dependency> entries in their pom.xml files. All of these projects have -SNAPSHOT in their version.
Is it possible for Maven (or some compatible dependency manager) to build an application together with all of its local dependencies whose source has changed?
I do not want to create a multi-module project, because:
the projects really are libraries, not modules;
I do not want additional complexity just to get a form of build that is already precisely defined;
I want the process to be dynamic: once a library is mature enough to be put into a remote repository, it would no longer be rebuilt with the main project, and that's ok.
For now, there is a lot of refactoring, moving code from one library to another, etc., and it often happens that substantial parts of the dependency tree need to be rebuilt. I therefore have to run mvn install manually in several projects to ensure that there is no stale code.
No, it doesn't work. Even with a multi-module project, Maven does not detect which modules have changed sources and which do not.
There was a (flaky) implementation in Maven 2, but it was not continued in 3.x; see "How to get maven 3.0 to only build modules with local scm changes".
I hoped they would include it again in Maven 4, but I haven't seen it yet: https://maarten.mulders.it/2020/11/whats-new-in-maven-4/
I once did a similar setup, but had to use shell scripts with some git magic to get it working.
You can also decide to put your libraries in separate repos from the start and use the repo tool that Google uses for Android development: https://github.com/GerritCodeReview/git-repo/blob/main/README.md
Once you run mvn install on a particular Maven project, it becomes accessible to all other Maven projects on the same workstation during dependency collection (before the compile phase).
Official Maven Build Lifecycle description:
install - install the package into the local repository, for use as a dependency in other projects locally
It's not necessary to keep the libraries as part of the same project (or to turn it into a multi-module project). But once you want to share those libraries with your teammates, you either have to make each of them install the libraries locally (as you did) or store the libraries in some external repo, like Artifactory or Nexus.
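If you go the external-repo route, the library's pom typically tells mvn deploy where to publish; a rough sketch (the ids and URLs below are placeholders for your own Nexus/Artifactory):

<distributionManagement>
  <!-- placeholder repository manager URLs; point these at your own Nexus/Artifactory -->
  <repository>
    <id>company-releases</id>
    <url>https://repo.example.com/repository/releases</url>
  </repository>
  <snapshotRepository>
    <id>company-snapshots</id>
    <url>https://repo.example.com/repository/snapshots</url>
  </snapshotRepository>
</distributionManagement>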
I am working on a Maven project where the build is done through Jenkins, and one particular JAR has recently been removed from the corporate repository.
So my build is failing, because the parent pom.xml refers to a JAR that is no longer available in the repo.
But I have the old certified copy downloaded in my local repo. I want to copy that JAR into the project folder and have pom.xml use this local copy of the dependency, embedded in the project structure, instead of downloading it from the corporate repository.
How can I do this?
Frankly, this does not sound like a good idea.
While it is possible to reference jars with the <systemPath> entry, it is generally considered bad practice.
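For reference, the discouraged construct looks roughly like this (coordinates and path are made up):

<dependency>
  <!-- hypothetical coordinates for the locally stored jar -->
  <groupId>com.example</groupId>
  <artifactId>legacy-lib</artifactId>
  <version>1.0</version>
  <scope>system</scope>
  <!-- system scope requires an explicit path to the jar file -->
  <systemPath>${project.basedir}/lib/legacy-lib-1.0.jar</systemPath>
</dependency>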
By "corporate repository", do you mean a repository of your own company or of some other company? In the first case, you should request that the jar is put back in. In the second case, it would be better not to use the external corporate repository directly, but to set up your own Nexus/Artifactory through which you use (different) external repositories. This Nexus/Artifactory can then host additional artifacts like the one you need (you can e.g. upload them through the UI).
We have a large custom artifact repository which is used by our old internal ant builds.
It stores jars in much the same way that a Maven repository does, i.e.
http://repo/root/<group>/<artifact>/<version>/<artifact>-<version>.jar
But this repository does not contain pom files, just jars and source jars.
We are now migrating a whole lot of projects to Maven/Gradle; these use an Artifactory installation that we have. But the projects still have a lot of dependencies on artifacts stored in the old repository.
I was wondering if anyone knows a way of accessing this old-style repo (which does not have poms) using Maven/Gradle.
We could synthesize and insert a whole lot of simple poms that just have group/artifact/version etc. and no dependencies, but I was wondering whether there might be a simpler way.
After all, the group/artifact/version is in the path itself. The poms would never contain dependencies, so in this situation they wouldn't (as far as I can see) provide any additional info.
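Something like this, I imagine (coordinates are just an example):

<project>
  <modelVersion>4.0.0</modelVersion>
  <!-- example coordinates; in practice they would come from the path -->
  <groupId>com.example</groupId>
  <artifactId>some-artifact</artifactId>
  <version>1.2.3</version>
  <packaging>jar</packaging>
</project>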
Any advice/help would be greatly appreciated.
When you transform the projects to Maven, you need to touch the dependency definitions: you need to replace the old, file-based accesses with Maven coordinates.
Therefore, I would suggest the following (we did something very similar, only with a Windows network drive instead of an HTTP-based repository):
Write a script that uploads all your artifacts from the old repository to your Artifactory. If you use mvn deploy:deploy-file, Maven will create stub poms for you.
Write scripts for the developers that translate the references to the old repository into the respective Maven coordinates for the pom.
As a side note: In our company, the old "repository" and the Maven repository were actively used (and written to) at the same time, so we developed a two-way synchronisation job between our Nexus and the old "repository".
Gradle doesn't need pom files; if they aren't available, it should just reference the jars directly. So this should "just work":
repositories {
    maven {
        url "http://repo/root"
    }
}
If, for some reason, there are slight differences, you could use an Ivy repository. See custom ivy repositories and IvyArtifactRepository. E.g.:
repositories {
    ivy {
        url "http://repo/root"
        layout "pattern", {
            artifact "[organisation]/[module]/[revision]/[artifact]-[revision](-[classifier])(.[ext])"
        }
    }
}
In my multi-module Maven project, suppose I have two modules, car and horse. They both depend on a JAR file, transport.jar, a file not available in any online Maven repositories. As such, I need to find a way to make these modules depend on a file found somewhere in the project folder structure.
From what I understand, the default Maven solution would be to manually register the JAR file in the local repository. While this would work on a development machine, it breaks on the build server, which clears its local repository before each build.
I've been searching online on how to do this on and off for a while and found some helpful things, but nothing that completely works.
For instance, a common answer is to add a dependency on the file using <scope>system</scope>. However, not only do others claim that this is extremely bad practice, it also doesn't work on the build server. (On a side note, using absolute paths to the JAR is also out of the question since, again, the project is built on several different machines.)
A more useful method I found was to define a local repository in the POM file, pointing to the path file:${project.basedir}/lib (such as in this article). Unfortunately, if I place the JAR and the repository definition in the car POM, I cannot successfully add a dependency on the JAR in horse. I've tried both with and without an additional reference to car in horse, as well as defining a second repository in horse pointing to file:${project.basedir}/../car/lib. This problem would also remain if I made a third module, transport-lib, specifically for wrapping the JAR dependency.
I could most likely add the JAR file to both modules and define two separate module-local repositories, but I really don't want to unless I have to, due to the need to keep the two (often updated) JARs in sync, etc.
So, my question is as follows: Can someone give me a confirmed-to-work method to have two modules depend on the same JAR file inside the project, given the parameters and restrictions mentioned?
The best solution is to use a repository manager like Archiva, Artifactory or Nexus and install the artifact into the repository manager. Afterwards you can use this artifact directly in your pom files without any issue.
Don't use the system scope, because it will cause other problems later on, for example after a release.
I have the following problem. We have a central Maven repository hosted on our company server. Our team is working on a project, and everyone here uses that repository to get the required artifacts. If something is missing and is required for the task a developer is currently working on, he installs the artifact manually into the central repository so that his commits don't break the automated builds.
Now, each developer also has GlassFish v2 installed on his machine, for testing and debugging purposes. Before committing changes, the developer builds the project's .ear with Maven's help. However, after the developer deploys the ear to his local GlassFish, errors frequently arise, because the set of GlassFish libraries may not contain all the latest dependencies from the central company repository.
Right now, in case of an error, the developer simply reads the log and sees what exactly is missing. After that he manually copies the required jar into his local $GLASSFISH_HOME$/lib dir. But that seems a bit frustrating. How can this be done automatically?
Right now we are trying to implement the following solution: the developer has to synchronize his local Maven repository, gathering all the artifacts from the central one that are required by the project. This local repository has to be placed on the Java classpath so that GlassFish also sees it. Is that a correct approach? Or maybe there is a way to install all the required artifacts from the central repository directly into the $GLASSFISH_HOME$ dir, automatically, during deployment?
About having to install dependencies: if the developers need to install dependencies that are missing from public Maven repositories, take into account that Maven proxies usually have the ability to cache public repos. For instance, Archiva has a proxying cache. If the dependencies are your own project deliverables, you should consider releasing them and deploying them with Maven to your company repo.
About latest versions: you need to tell Maven which versions of the dependencies to use. I would prefer editing my poms manually, but there are a variety of ways to achieve that.
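One common way to pin those versions, for example in a parent pom, is a dependencyManagement block (the coordinates here are placeholders):

<dependencyManagement>
  <dependencies>
    <!-- hypothetical shared library whose version is fixed for all modules -->
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>shared-library</artifactId>
      <version>2.1.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>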
The libraries should be part of the project, I think. If they are not standard GlassFish libraries, they should be included, for instance, in your war file as part of your project. If they are neither standard nor part of your project (not the regular approach), consider managing this GlassFish as a project of its own (its own git/svn repo, its own pom, its own versions, its own everything).
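If the libraries belong to your application, the usual approach would be to give them the default compile scope so that the war/ear bundles them, and to reserve provided for what GlassFish itself supplies; a rough sketch with placeholder coordinates:

<!-- bundled into WEB-INF/lib of the war (default compile scope) -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>app-library</artifactId>
  <version>1.0</version>
</dependency>
<!-- supplied by the container, so not packaged into the war -->
<dependency>
  <groupId>javax</groupId>
  <artifactId>javaee-api</artifactId>
  <version>6.0</version>
  <scope>provided</scope>
</dependency>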
Good luck.