I know that Maven houses the outcome of the build in the local repository (artifacts are installed under ~/.m2/repository/), but it also outputs the compiled classes in the target folder next to src.
Is there any difference between what goes in the local repository and what goes in the target folder?
They are completely different and shouldn't be mixed up.
target represents the build directory. That is to say, every temporary file generated during the build from the sources ends up there. Most notably, you'll find the compiled classes of the main and test Java sources, but you'll also find lots of other things in there (generated source files, filtered files, etc.). What matters is that everything contained in this folder is inherently temporary. You can delete it at any time by running mvn clean, and be assured that the next build will (or at least should) work just fine. All the files and folders generated under target serve a single purpose: creating the artifacts of the project. A Maven project with jar packaging, for example, will have a single main artifact, named from its final name plus a jar extension, which will contain the compiled Java classes. The final name can be a custom name, set within the POM, or the default one derived from the Maven coordinates of the project. Such a project can also have additional attached artifacts, like a test JAR or a sources JAR.
The local repository only contains the artifacts. There are no temporary files in there. What is installed when running mvn install is strictly the generated artifacts of the Maven project, i.e. the end products, plus the POM file of the project. Everything that served to create them isn't put in the local repository, and the build of a project must never put temporary things in there. Keep in mind that the local repository is a Maven repository, and, as such, follows a strict naming scheme: a project with a group id of my.groupid, an artifact id of my-artifactid and a version of 1.0 will get installed in the folder my/groupid/my-artifactid/1.0, in which you'll find the POM file and all the other artifacts. The names of the artifacts themselves cannot be overridden: it will be my-artifactid-1.0.jar for a JAR project (possibly with a classifier added).
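To make the naming scheme concrete, here is a small sketch (the coordinates are the hypothetical ones from the example above) of how a path in the local repository is derived from the Maven coordinates: dots in the group id become directory separators, followed by the artifact id, the version, and the artifact file name.

```shell
# Compute the local-repository path for given coordinates
# (a sketch; real resolution is done by Maven itself).
repo_path() {
  local groupId="$1" artifactId="$2" version="$3"
  # Dots in the groupId become directory separators.
  echo "${groupId//.//}/${artifactId}/${version}/${artifactId}-${version}.jar"
}

repo_path my.groupid my-artifactid 1.0
# -> my/groupid/my-artifactid/1.0/my-artifactid-1.0.jar
```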
This is generally a source of confusion: the name of the main artifact file that is generated under the target folder is completely distinct from the name it will have in the local repository when installed, or in remote repositories when deployed. The former can be controlled, but the latter is dictated by the naming scheme of the repository, which is derived from the project's coordinates.
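As an illustration of that distinction, a minimal POM fragment (the name shown is hypothetical) customizing the file name under target:

```xml
<!-- Sketch: this only renames the file produced under target/
     (e.g. target/my-custom-name.jar). When installed or deployed,
     the artifact is still named <artifactId>-<version>.jar. -->
<build>
  <finalName>my-custom-name</finalName>
</build>
```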
To recap: target contains all the gory temporary details during the build which creates the artifacts of a project (main JAR, sources, Javadoc... i.e. everything that is supposed to be deployed and released by that project), while the local repository (and remote repositories) will contain only the artifacts themselves.
Not much, in terms of the generated module.jar, if that's what you are really concerned about. The generated .jar is the same; note also that recompiling the code would clean your /target folder but not the .m2 one.
The /target folder would generally contain the compiled source classes (/target/classes), generated sources (/target/generated-sources), etc., along with the module.jar.
On the other hand, the local ~/.m2/repository would contain the module.jar along with the pom.xml for that module and all the configuration (repositories, dependencies, etc.) needed to rebuild that module if required.
Currently we have one main POM file which builds the code for multiple modules using the module tag. When I use the "mvn clean package deploy" command (which references the main POM file and performs these actions for all the modules), the packaged file (war/jar) for each module is placed in its respective target directory. Since there are different modules, each with its own group-id, artifact-id, etc., the packaged files are spread across different folders.
My application consists of all of these modules, and I need all the packaged files under one single folder. Until now we have used an Ant script to copy the relevant files from all of these modules into a single folder.
Apart from copying/aggregating all the packaged files and then uploading them as part of deploy:deploy-file, is there any way I can deploy all the files to the same folder?
No, this is not possible.
The directory structure in Artifactory is always of the form
org/something/artifact/1.2.3/artifact-1.2.3.jar
assuming groupId org.something, artifactId artifact and version 1.2.3.
This structure cannot be changed.
I'm working on a sort of deployment script for a Java project using Python/shell. The script currently can copy jars either from a Sonatype Nexus repository or from the project's target folder. The remote/Nexus setup seems all good, but I'm interested in instead copying from the local Maven repository, because it allows me to always know the location of the jar regardless of where the project is installed.
I guess my question is: Am I overlooking anything by just copying the first jar from the folder ~/.m2/repository/{groupid}/{artifactid}/{version}? Or is this totally a good way to go about this?
If the groupId consists of more than one part, for example org.apache.httpcomponents, then the folder structure reflects this: org/apache/httpcomponents/...
There may exist more than one jar file inside the version directory.
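So "the first jar in the folder" is not reliable: the version directory can also contain sources and javadoc jars (attached with classifiers). A small sketch of selecting only the main jar by its exact expected name, using hypothetical httpcore coordinates:

```shell
# Pick the main jar out of a list of files from a version directory,
# skipping classifier jars like -sources.jar and -javadoc.jar.
pick_main() {
  local artifactId="$1" version="$2"
  shift 2
  local f
  for f in "$@"; do
    # The main artifact is exactly <artifactId>-<version>.jar.
    [ "$f" = "${artifactId}-${version}.jar" ] && echo "$f"
  done
}

pick_main httpcore 4.4.13 \
  httpcore-4.4.13.jar httpcore-4.4.13-sources.jar httpcore-4.4.13-javadoc.jar
# -> httpcore-4.4.13.jar
```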
Currently I'm working on integrating a Grunt build process into our Maven build process.
Here are two options I can think of:
Pointing some folder under resources as the target build dir for the Grunt project.
Building the Grunt project elsewhere and listing these non-standard folders as included resources in the pom.xml.
It looks like there is still some room for improvement. Basically, since the Grunt subproject
does not depend on any external resources, it would be nice to learn how to avoid rebuilding a war file that has already been compiled, and instead modify it after rebuilding the Grunt project.
So, the question is:
What are the best practices for adding generated resources to an existing war file?
The approach of adding resources to a war file amounts to modifying a maven-built artifact after maven builds it. That runs counter to the maven philosophy of tightly controlling the entire build of every artifact. You really have three choices:
Include the grunt generated source in the war's source and build a single artifact with maven. In this case you rebuild the war every time grunt resources change, or
Put the grunt generated sources in a second maven artifact and make that artifact a dependency of the war artifact. Maven will still rebuild the war every time, but you get the separation of builds you seem to be implying in your question, or
Make the dependency in (2) a runtime dependency, if possible. You basically make the scope of the grunt artifact dependency "provided" so you don't have to rebuild the war every time your grunt artifacts change. You only have to rebuild your grunt artifact.
It sounds like you want to go with option (3).
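A minimal sketch of option (3), with hypothetical coordinates for the Grunt-output artifact; with provided scope, Maven resolves the artifact at build time but changes to it don't force a repackage of the war on every Grunt rebuild:

```xml
<!-- In the war module's pom.xml: depend on the artifact that packages
     the Grunt output (coordinates are illustrative). -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>grunt-resources</artifactId>
  <version>1.0-SNAPSHOT</version>
  <scope>provided</scope>
</dependency>
```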
A war file is just a zip file with a certain layout, so you could simply add to the archive using a zip tool. For example, my Linux zip command has a -g option.
-g, --grow
Grow (append to) the specified zip archive, instead of creating a new one. If this operation fails, zip attempts to restore the archive to its original state. If the restoration fails, the archive might become corrupted. This option is ignored when there's no existing archive or when at least one archive member must be updated or deleted.
I am a bit new to maven, but I have some experiences with ant and the build process. I would like to do one thing that is kind of driving me nuts:
Given:
A remote repository (git, svn, hg,…) that holds static content (like images),
one Maven project that uses/manages the mentioned repository the same way it does all other dependencies (checkout on install, update whenever updates occur); in other words: the Maven project depends on that repository
I finally want to be able to access the content (no *.svn or *.git) and copy it into my build, at build time*.
I want Maven to store a local copy of that repository in Maven's local repository (~/.m2/repository) only once per machine and manage it like all other dependencies.
*I am not trying to build a Java project
Thanks for help!
From what I've seen, Maven projects don't use version control repositories as external artifacts. That's a little too fine-grained for what you want, I think.
I've done something similar, when Project A wanted to use resources from Project B.
Project B, as part of its build procedure, collected its resources into a ZIP file and deployed the ZIP file into a Maven repository.
Project A then references the ZIP file artifact, unpacking it when building to where it needs it.
Look into the Maven dependency plugin, especially the dependency:unpack and dependency:unpack-dependencies goals.
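A sketch of what Project A's POM could look like (coordinates and paths are hypothetical), unpacking Project B's ZIP artifact early in the build:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-shared-resources</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <!-- Project B's deployed ZIP of resources -->
            <groupId>com.example</groupId>
            <artifactId>project-b-resources</artifactId>
            <version>1.0</version>
            <type>zip</type>
            <outputDirectory>${project.build.directory}/shared-resources</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
```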
Have fun
Forgive me if this is remedial, but I am still new to Maven and its functionality.
In my project, when it "builds" and gets to the compile phase, it will create a target directory with just compiled libraries and update (or create if not there) the local .m2 directory.
When I get to the "test" phase, I want it to build against the target directory's library files, and not the local .m2 directory.
Any hints, recommendations, or suggests would be greatly appreciated. Thanks!
Maven has this concept of “the reactor”, which is just a fancy term for the list of projects being built. At the start of a Maven build, and at the end, Maven prints out this list of projects (using /project/name if defined or groupId:artifactId otherwise).
For each project in the reactor, Maven maintains a list of artifacts that have been attached. By default, each module's pom.xml is attached, and as each plugin runs, it has the option of attaching additional artifacts. Most plugins do not attach artifacts; here are some plugins that do:
jar:jar creates a .jar and attaches it
war:war creates a .war and attaches it
source:jar creates a .jar of the Java source code and attaches it with a classifier of sources
javadoc:jar creates a .jar of the JavaDocs and attaches it with a classifier of javadoc
There is also a default primary artifact (the one that gets replaced by jar:jar) which is actually a directory and not a file; as such it will not get installed or deployed to the local repository cache or a remote repository.
So while in the reactor, if a plugin that attaches the primary artifact has not run yet and another plugin asks for the primary artifact, it will be given the directory ${project.build.outputDirectory}. If asked after the primary artifact has been attached, that primary artifact will be provided instead.
The test phase happens before the package phase, so it will use the directory and not the .jar. The integration-test phase happens after, so it will always use the .jar.
Things get more complex in a multi-module project (which is where my long intro should help you out).
Maven has to build the test classpath. If one of the dependencies is within the reactor, Maven will use the artifact attached to the reactor. Otherwise it will use the local cache (populating from the remote repositories if necessary).
When you run
mvn test
in a multi-module project from the root, there is no replacement of the default (directory-based) artifact, so intra-module classpath entries will point to the target/classes directories.
When you run
mvn package
in the same project, however, because each module completes its lifecycle sequentially, all the dependent modules will have swapped in their .jar files as their attached artifacts.
All of this should show you that Maven is doing the sensible thing. Hope this has helped.
The test phase is going to execute tests on your project. The project won't reference itself via the dependency mechanism. Only dependencies are referenced via your local repository, i.e. .m2/repository.
Also, it's not the compile phase that installs the artifact to the local repository; it's the install phase. Then there's a later phase, called deploy, that will deploy the artifact to a remote repository, provided you have a remote repository configured as the deploy target. Note that install and deploy are nearly identical phases, except install is a local-only thing; thus, it's the common build phase to run when doing dev-environment work. Normally the build server will do the deploy stuff.