My need is pretty basic but I could not find any clean answer to it: I simply need to be able to distribute a resource in a multi-module project.
Let us consider, for example, the LICENSE file, which I assume here to be the same for all modules. I prefer not to manually copy it into each and every module, because the file could change over time. I also prefer not to statically link to resources outside the project folder (even using relative paths), because the modular structure may well change too.
Is there a plugin that can be used to robustly guarantee that each module is given the required file? It would be equally acceptable for the copy to be driven by the POM of the parent project or performed directly by the super project in the module hierarchy.
You could use the assembly and dependency plugins. Did you stumble over this link?
http://www.sonatype.com/people/2008/04/how-to-share-resources-across-projects-in-maven/
It describes that option. It's from 2008, but Maven has been around for quite some time, so I guess it's more or less up to date.
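In short, the approach from that article packages the shared files as their own tiny artifact. A minimal sketch, with made-up coordinates, where the LICENSE sits under src/main/resources:

    <!-- pom.xml of a hypothetical shared-resources module -->
    <project xmlns="http://maven.apache.org/POM/4.0.0">
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.example</groupId>
      <artifactId>shared-resources</artifactId>
      <version>1.0</version>
      <!-- default jar packaging: everything under src/main/resources,
           e.g. the LICENSE file, ends up inside the jar -->
    </project>

The consuming modules then declare a dependency on that jar and unpack it with the dependency plugin before packaging.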
Edit regarding the comment:
Another option is the maven-remote-resources-plugin.
For a more detailed example see:
http://maven.apache.org/plugins/maven-remote-resources-plugin/examples/sharing-resources.html
Since their intro actually speaks for itself, I quote (maven.apache.org):
This plugin is used to retrieve JARs of resources from remote repositories, process those resources, and incorporate them into JARs you build with Maven. A very common use-case is the need to package certain resources in a consistent way across your organization: at Apache it is required that every JAR produced contains a copy of the Apache license and a notice file that references all used software in a given project.
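A minimal sketch of that setup for the LICENSE case (coordinates are made up): the bundle goal runs in the module that owns the shared files, and the process goal runs in every consuming module.

    <!-- in the POM of the module that owns the shared resources -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-remote-resources-plugin</artifactId>
      <executions>
        <execution>
          <goals><goal>bundle</goal></goals>
        </execution>
      </executions>
      <configuration>
        <!-- files are picked up from src/main/resources by default -->
        <includes>
          <include>**/LICENSE*</include>
        </includes>
      </configuration>
    </plugin>

    <!-- in the consuming modules (or a shared parent POM) -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-remote-resources-plugin</artifactId>
      <executions>
        <execution>
          <goals><goal>process</goal></goals>
          <configuration>
            <resourceBundles>
              <resourceBundle>com.example:license-resources:1.0</resourceBundle>
            </resourceBundles>
          </configuration>
        </execution>
      </executions>
    </plugin>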
I know you can access the different modules (included using include) in a project via org.gradle.api.Project#getSubprojects(), and I know you can get the names and directories of separate builds that have been included (using includeBuild) via org.gradle.api.invocation.Gradle#getIncludedBuilds().
But how can my plugin get information such as the locations of Java source files and class files for projects included using includeBuild?
My goal here is to determine which files have changed in the current git branch (which I can do), and then collect their corresponding class files into a jar that is used by our patching mechanism, which inserts patch jars at the front of the classpath rather than redeploying the whole application.
I don’t think it is a goal of Gradle to provide including builds with detailed information on included builds. Currently, the Gradle docs basically only state two goals for such composite builds:
combine builds that are usually developed independently, […]
decompose a large multi-project build into smaller, more isolated chunks […]
Actually, isolation between the involved builds seems to be an important theme in general:
Included builds do not share any configuration with the composite build, or the other included builds. Each included build is configured and executed in isolation.
For that reason, it also doesn’t seem to be possible or even desired to let an including build consume any build configurations (like task outputs) of an included build. That would only couple the builds and hence thwart the isolation goal.
Included builds interact with other builds only via dependency substitution:
If any build in the composite has a dependency that can be satisfied by the included build, then that dependency will be replaced by a project dependency on the included build.
So, if you’d like to consume specific parts of an included build from the including build, then you have to do multiple things:
Have a configuration in the included build which produces these “specific parts” as an artifact.
Have a configuration in the including build which consumes the artifact as a dependency.
Make sure that both configurations are compatible with regard to their capabilities so that dependency substitution works.
Let some task in the including build use the dependency artifact in whatever way you need.
Those things happen kind of automatically when you have a simple dependency between two Gradle projects, like a Java application depending on a Java library. But you can define your own kinds of dependencies, too.
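Sketched in the Gradle Kotlin DSL, with every name being a made-up placeholder, those steps could look roughly like this:

    // build.gradle.kts of the included build — a hypothetical sketch
    val partsAttribute = Attribute.of("com.example.parts", String::class.java)

    val specificParts by configurations.creating {
        isCanBeConsumed = true
        isCanBeResolved = false
        attributes.attribute(partsAttribute, "classes")
    }

    // expose the jar task's output as the artifact of that configuration
    artifacts.add(specificParts.name, tasks.named("jar"))

    // build.gradle.kts of the including build — a hypothetical sketch
    val partsAttribute = Attribute.of("com.example.parts", String::class.java)

    val partsClasspath by configurations.creating {
        isCanBeConsumed = false
        isCanBeResolved = true
        attributes.attribute(partsAttribute, "classes")
    }

    dependencies {
        // replaced by a project dependency if the included build's coordinates match
        partsClasspath("com.example:included-lib:1.0")
    }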
The question is: would that really be worth the effort? Can't you maybe reach your goal more easily, or at least without relying on programmatically retrieved information about included builds? For example: if you know that your included build produces class files under build/classes/java/main, then maybe just take the classes of interest from there via org.gradle.api.initialization.IncludedBuild#getProjectDir().
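A rough sketch of that simpler route in the Kotlin DSL, assuming a hypothetical included build registered as app whose classes land in the conventional directory (the class filter is made up, too):

    // build.gradle.kts of the including build — a hypothetical sketch
    tasks.register<Jar>("patchJar") {
        archiveClassifier.set("patch")
        val app = project.gradle.includedBuild("app") // name passed to includeBuild(...)
        // make sure the included build has compiled before we package
        dependsOn(app.task(":compileJava"))
        from(app.projectDir.resolve("build/classes/java/main")) {
            // hypothetical filter: only the classes whose sources changed on the branch
            include("com/example/ChangedService.class")
        }
    }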
I know, this may not be the answer you had hoped to get. I still hope it’s useful.
I found this note in Maven's documentation:
You can add elements to this classloader by extensions. These are loaded into the same place as ${maven.home}/lib and hence are available to the Maven core and all plugins for the current project and subsequent projects (in future, we plan to remove it from subsequent projects).
I couldn't understand what they mean by "subsequent projects" here. As far as I understand, extensions are enhancements to Maven's lifecycle phases and are not project specific, so it makes sense for them to apply to all Maven projects.
Question: Can anyone explain what the statement "in future, we plan to remove it from subsequent projects" means?
First, an extension can extend the lifecycle, but it does not need to. You can also implement an extension as an EventSpy, for example.
This documentation is related to the core classloader, which is available within such extensions and which an extension can also enhance. This classloader contains the files from ${maven.home}/lib, which is not a good idea and not necessary. It would be better to have only the Maven plugin API there, plus the instances that are actually in use, and nothing more...
There are extensions, such as Wagon providers, which are used to enable transports for special cases and which can be project specific.
Starting with Maven 3.3.1, the core extensions mechanism has been improved to make loading project-specific extensions simpler: they are declared in the ${maven.projectBasedir}/.mvn/extensions.xml file and can also be loaded from a repository. Before 3.3.1, you had to do that manually via mvn -Dmaven.ext.class.path=extension.jar.
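For illustration, such a project-specific extensions file looks roughly like this (the coordinates are made up):

    <!-- ${maven.projectBasedir}/.mvn/extensions.xml -->
    <extensions xmlns="http://maven.apache.org/EXTENSIONS/1.0.0"
                xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                xsi:schemaLocation="http://maven.apache.org/EXTENSIONS/1.0.0 http://maven.apache.org/xsd/core-extensions-1.0.0.xsd">
      <extension>
        <groupId>com.example</groupId>
        <artifactId>my-build-extension</artifactId>
        <version>1.0.0</version>
      </extension>
    </extensions>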
I've inherited a few Maven projects which have added a /dependencies directory to capture Java jar libraries that aren't part of the project war and must be installed by a DevOps engineer into a Tomcat installation.
The libraries in this directory seem to fall into four categories:
"provided" scope libraries,
downstream dependencies of those provided libraries, and
discoverable implementations of api jars
"mystery" libraries, i.e., not available in an external repository, and maybe unsure where they ever came from.
Is there a strategy to get Maven to help manage these dependencies and perhaps fetch them for external install?
There are probably several strategies to choose from.
Number one: leave it as it is. If it works and the build is reproducible (across different environments), that seems like one valid solution.
The "mystery" part of the build might not be more of an issue for new people working with it.
I think it is valid to create a dedicated Maven module to be delivered to the infrastructure team. This module can contain the jars currently in the /dependencies folder.
What you would need to do is create a pom.xml and add all the dependencies currently in that directory (of course not the transitive ones). The "mystery" ones would need to go into a repository proxy (Nexus, Artifactory, ...). If you don't have a Maven repository yet: you want one! (It's easy to set up and it helps a lot!)
I would then use the assembly plugin or some Ant task to build the zip to be delivered. That way the infrastructure team is able to just unzip / copy the files to where they need to be. This step can then even be scripted (so the upload / unzip is done through SSH or something like that).
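For illustration, a minimal sketch of such a delivery module, with all coordinates made up: the module lists the former /dependencies jars as normal dependencies, and an assembly descriptor zips them (plus their transitive dependencies) via a dependencySet.

    <!-- pom.xml of a hypothetical "tomcat-libs" delivery module -->
    <project xmlns="http://maven.apache.org/POM/4.0.0">
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.example</groupId>
      <artifactId>tomcat-libs</artifactId>
      <version>1.0.0</version>
      <packaging>pom</packaging>
      <dependencies>
        <!-- one entry per jar formerly tracked in /dependencies -->
        <dependency>
          <groupId>com.example</groupId>
          <artifactId>some-provided-lib</artifactId>
          <version>2.3</version>
        </dependency>
      </dependencies>
      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-assembly-plugin</artifactId>
            <configuration>
              <descriptors>
                <descriptor>src/assembly/zip.xml</descriptor>
              </descriptors>
            </configuration>
            <executions>
              <execution>
                <phase>package</phase>
                <goals><goal>single</goal></goals>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
    </project>

    <!-- src/assembly/zip.xml -->
    <assembly>
      <id>tomcat-libs</id>
      <formats><format>zip</format></formats>
      <includeBaseDirectory>false</includeBaseDirectory>
      <dependencySets>
        <dependencySet>
          <outputDirectory>/</outputDirectory>
          <!-- the pom module itself produces no jar -->
          <useProjectArtifact>false</useProjectArtifact>
        </dependencySet>
      </dependencySets>
    </assembly>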
This is probably only one way to do it. I would assume that resolving the jars in the /dependencies directory may be a bit of a pain.
The advantage is obviously that you document and simplify the management of those libraries. I would also assume that if you update some of them, merging across branches becomes easier since there are no binary files around. So it may be worth the effort.
In my multi-module Maven project, suppose I have two modules, car and horse. They both depend on a JAR file, transport.jar, a file not available in any online Maven repositories. As such, I need to find a way to make these modules depend on a file found somewhere in the project folder structure.
From what I understand, the default Maven solution would be to manually register the JAR file in the local repository. While this would work on a development machine, it breaks on the build server, which clears its local repository before each build.
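For reference, that manual registration boils down to something like this (coordinates invented for illustration):

    mvn install:install-file -Dfile=lib/transport.jar \
        -DgroupId=com.example -DartifactId=transport \
        -Dversion=1.0 -Dpackaging=jar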
I've been searching online on how to do this on and off for a while and found some helpful things, but nothing that completely works.
For instance, a common answer is to add a dependency to the file using <scope>system</scope>. However, not only do others claim that it's extremely bad practice to do so, it also doesn't work on the build server. (On a side note, I would also like to point out that using absolute paths to the JAR is out of the question because, again, the project is built on several different machines.)
A more useful method I found was to define a local repository in the POM file, pointing to the path file:${project.basedir}/lib (such as in this article). Unfortunately, if I place the JAR and the repository definition in the car POM, I cannot successfully add a dependency on the JAR in horse. I've tried both with and without an additional reference to car in horse, as well as defining a second repository in horse pointing to file:${project.basedir}/../car/lib. This problem would also remain if I tried to make a third module, transport-lib, specifically for wrapping the JAR dependency.
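For completeness, the repository definition followed the article's pattern, roughly like this (the id is arbitrary; note that the lib folder has to mirror the standard repository layout):

    <!-- in car/pom.xml -->
    <repositories>
      <repository>
        <id>project-local</id>
        <url>file:${project.basedir}/lib</url>
      </repository>
    </repositories>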
I could most likely add the JAR file to both modules and define two separate module-local repositories, but I really don't want to unless I have to due to the need to keep the two (often updated) JARs in sync etc.
So, my question is as follows: Can someone give me a confirmed-to-work method to have two modules depend on the same JAR file inside the project, given the parameters and restrictions mentioned?
The best solution is to use a repository manager like Archiva, Artifactory, or Nexus and install the artifact there. Afterwards you can use it directly in your POM files without any issue.
Don't use the system scope, because it will cause other problems later, for example after a release.
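A sketch of that workflow, with made-up coordinates and repository URL: upload the jar once via deploy:deploy-file, then let car and horse depend on it like on any other artifact.

    mvn deploy:deploy-file -Dfile=transport.jar \
        -DgroupId=com.example -DartifactId=transport \
        -Dversion=1.0 -Dpackaging=jar \
        -DrepositoryId=internal-releases \
        -Durl=https://repo.example.com/releases/

    <!-- afterwards, in both car/pom.xml and horse/pom.xml -->
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>transport</artifactId>
      <version>1.0</version>
    </dependency>

(The repositoryId has to match a server entry with credentials in settings.xml.)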
I would like to create an assembly descriptor defining how my applications are uploaded to Nexus, and in particular I would like those final packages to include a couple of startup scripts that are handy for running the applications on different platforms. Until now I have always copied the assembly descriptor and the startup script templates from project to project, but now I would like to find a cleverer solution to the problem.
In the documentation of the maven-assembly-plugin I found an example showing how to share descriptors across multiple projects; unfortunately it does not cover the case where one wants to include a common resource in the distribution package, and after a couple of experiments I came to the conclusion that it is impossible: to include a resource in the distribution file one has to specify the path to the resource (apparently it is not possible to reference a resource contained within a jar).
Additional solutions I have found so far try to use the Ant or dependency plugins to unpack the jar containing the assembly descriptor and the script templates into the build folder before packaging the application.
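Concretely, the dependency-plugin variant of those workarounds looks roughly like this (coordinates invented); it unpacks the shared jar into the build folder during prepare-package, before the assembly runs:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-dependency-plugin</artifactId>
      <executions>
        <execution>
          <id>unpack-shared-scripts</id>
          <phase>prepare-package</phase>
          <goals><goal>unpack</goal></goals>
          <configuration>
            <artifactItems>
              <artifactItem>
                <groupId>com.example</groupId>
                <artifactId>shared-assembly-resources</artifactId>
                <version>1.0</version>
                <outputDirectory>${project.build.directory}/shared-resources</outputDirectory>
              </artifactItem>
            </artifactItems>
          </configuration>
        </execution>
      </executions>
    </plugin>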
Even if I can stick with the solutions of the last paragraph, I am wondering if there is a clearer way to achieve this: do I really need to use additional plugins for such a task?