Continuous integration and Gradle: external project as dependency

I have a Gradle-based project with an external project as a dependency.
settings.gradle
include ':app'
include ':mylibrary'
project(':mylibrary').projectDir = new File('/path/to/my/library')
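and, presumably, a project dependency in app's build.gradle along these lines (the exact configuration name is an assumption):
app/build.gradle
dependencies {
    compile project(':mylibrary')
}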
I'm developing mylibrary at the same time as my app, which is why I include it in my project this way instead of as a published dependency (otherwise I would have to upload the artifact on every change, or copy the jar into the libs folder, and that is tedious).
My problem is that when I commit changes to my app, the Jenkins build fails because it can't find the module mylibrary (obviously, because it is on my local file system).
How should I handle that?

I just solved the same problem this way.
Add mylibrary as a proper dependency in app. The app build will try to pull the library from your configured remote Maven or Ivy repo(s).
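A minimal sketch of that dependency declaration in app's build.gradle (the coordinates are made-up placeholders):
dependencies {
    // resolved from the configured Maven/Ivy repositories
    compile 'com.example:mylibrary:1.0.0'
}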
When you change mylibrary, you may want to make that change visible to app immediately, without publishing to the binary repo. You can accomplish this by using the :mylibrary:publishToMavenLocal task, which will put the binary lib in your ~/.m2/repository. This only takes one extra command, so it's not too bad, I think. This assumes the new Maven publisher is being used in mylibrary:
apply plugin: 'maven-publish'
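With that plugin applied, a minimal publication in mylibrary's build.gradle could look like this (a sketch; the publication name is an assumption, and the coordinates come from the project's group and version):
publishing {
    publications {
        // publish the jar produced by the java component
        mavenJava(MavenPublication) {
            from components.java
        }
    }
}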
Next, app needs to be instructed to fetch dependencies from there:
repositories {
mavenLocal()
//other repos
}
The app:compileJava task should now check your local Maven repo first, before checking remote repos.
I find this approach preferable to the answer by taringamberini, because you don't have to publish changes to the library that are not ready for customer release. You can even test that your changes to mylibrary don't break app before you merge the library changes to master.
(You don't want to publish changes to the binary repo that are not in master. That would be a recipe for disaster...)

You should work with 2 gradle projects in your workspace:
mylibrary
myapp (with a dependency on mylibrary)
When mylibrary is modified you should test it and, if the tests have passed, commit it to the source repository and to the binary repository. (If you don't want to do that manually, you might configure 3 Jenkins jobs, build <-- test <-- deploy to binary repository, setting up a little continuous-delivery pipeline.)
Next, when myapp is modified you should test it and, if the tests have passed, commit it to the source repository. Such a commit triggers a Jenkins build, and because mylibrary exists in both the source and binary repositories, the build will compile successfully.
You may version mylibrary by adopting Semantic Versioning, and manage the latest version of mylibrary with Gradle's dynamic versions or changing modules feature.
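For example, a dynamic version in app's build.gradle could always pick up the latest 1.x release of the library (coordinates are hypothetical):
dependencies {
    // '1.+' resolves to the newest available 1.x version in the repository
    compile 'com.example:mylibrary:1.+'
    // or depend on a changing module (e.g. a snapshot) that Gradle periodically re-checks
    // compile('com.example:mylibrary:1.1-SNAPSHOT') { changing = true }
}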

Related

How to trigger IntelliJ to reimport single Maven dependency?

I have a workflow for working on an application and one of its libraries that looks somewhat like this:
Make changes to library -> Push library jar to remote Maven repository with no version change -> Pull updated library jar from the remote repo to the downstream app -> Test and make changes to the app and library
But it seems like the way IntelliJ indexes and/or caches Maven dependencies is not affected by me running a clean install from the Maven interface. Is there a surefire way to force IntelliJ to discard any cached dependency and reimport, or possibly do it only for a desired library?
Very likely this has nothing to do with IntelliJ. Since the version number is the same, Maven won't re-download your dependency. Try just deleting the dependency locally from the Maven repository:
rm -rf ~/.m2/repository/<..path to your library package..>
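For a hypothetical library with the coordinates com.example:mylibrary:1.0.0, that path follows Maven's groupId/artifactId/version layout:
rm -rf ~/.m2/repository/com/example/mylibrary/1.0.0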
You could also avoid pushing the library to the remote repository, and test completely locally, by using the library as a local dependency. For this approach, see answers here: How to add local jar files to a Maven project?
Or, since you are not effectively changing the library version, the right approach would be to use the library project's sources as a direct dependency of the IDE's Maven project. For this, add the Maven library project as a new module to the existing Maven project: File | New... | Module from Existing Sources... and select the pom.xml file of the library project.

Maven dependency resolving in multi module project

I have a question about how the Maven dependency resolving mechanism is working in a multi module project.
Normally I only use 'mvn clean install' when I build my multi-module projects, and my assumption was that if any module in the project needs a previous module, the dependency will be resolved by going to the local repository and loading the corresponding jar.
For project-internal reasons, I have to use 'mvn clean compile'. This command naturally does not create any jar, since 'install' is not run. So I started wondering how dependency resolution for a multi-module project works when no jar is created but the project is still able to see the changes from previous builds. Are the target directories used for dependency management?
Or is the target directory used for 'mvn clean compile', but the local repository for 'mvn clean install'?
Can anybody explain to me how dependency resolution works in a multi-module project?
Thanks for any answers.
I think you will understand better if you look at https://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html
There is a lifecycle in the process of building the jar. The compile phase will compile the code and create a complete classes folder in your target directory. This phase will also resolve all the dependencies in your POMs and download to your local repo any that are not already there.
The install phase will create the jar from the classes directory and install it in your local repository.
I really think you will need to run the install phase to get anything useful.
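A minimal sketch of the difference, assuming a two-module build where module-b depends on module-a:
# compiles the sources of both modules into their target/classes directories,
# but installs nothing into ~/.m2/repository
mvn clean compile
# additionally packages each module and copies its jar (and pom) into
# ~/.m2/repository, where later builds can resolve it
mvn clean install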
Maven is made of separate components.
There is a component that deals with a given module and, among other things, tries to get its dependencies. It ALWAYS gets the dependencies from the local repository, possibly after having downloaded them. If the dependencies are not there and can't be downloaded, it will fail. Eventually the module will create its own artifact, which it will publish to the local repo.
Then there is a component that, when you ask it to build several Maven modules (for example by calling mvn at the root of a project), orders the various modules, using the dependencies to find the best ordering for the build, so that if a given module depends on another, it will be built after the module it depends on. It then calls the previous component I described, building each module in order.
In all cases, a given module's dependencies are always taken from the local repo. The expectation is that the modules built before it actually pushed their artifacts to the local repo, typically with mvn install, although you could force that to happen at another step with the proper configuration (which may not be a good idea).
In any case, if the previous module's jar was not built and put into the repo, there is no way that jar can be added to the classpath of the next module to be compiled.
Doing compile only on multiple projects isn't going to be very useful.
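As a sketch (module names are made up), the reactor would build module-a before module-b in a layout like this:
<!-- parent pom.xml -->
<modules>
    <module>module-a</module>
    <module>module-b</module>
</modules>
<!-- module-b/pom.xml: the inter-module dependency that drives the ordering -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>module-a</artifactId>
    <version>1.0.0</version>
</dependency>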

Maven: Dependencies, static content from remote repository

I am a bit new to Maven, but I have some experience with Ant and the build process. I would like to do one thing that is kind of driving me nuts:
Given:
A remote repository (git, svn, hg,…) that holds static content (like images),
one Maven project that uses/manages the mentioned repository in the same way as it does all other dependencies (checkout on install, update whenever updates occur); in other words, the Maven project depends on that repository
I finally want to be able to access the content (no *.svn or *.git) and copy it into my build, at build time*.
I want Maven to store a local copy of that repository in Maven's local repository (~/.m2/repository) only once on each machine and manage it like all other dependencies.
*I am not trying to build a Java project
Thanks for help!
From what I've seen, Maven projects don't use version control repositories as external artifacts. That's a little too fine-grained for what you want, I think.
I've done something similar, when Project A wanted to use resources from Project B.
Project B, as part of its build procedure, collected its resources into a ZIP file and deployed the ZIP file to a Maven repository.
Project A then references the ZIP file artifact, unpacking it when building to where it needs it.
Look into the dependency plugin for Maven, especially the dependency:unpack and dependency:unpack-dependencies goals.
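A hedged sketch of the unpacking side in Project A's pom.xml (the artifact id, type, and output directory are assumptions):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack-static-content</id>
            <phase>generate-resources</phase>
            <goals>
                <goal>unpack-dependencies</goal>
            </goals>
            <configuration>
                <!-- only unpack the hypothetical ZIP artifact coming from Project B -->
                <includeArtifactIds>project-b-resources</includeArtifactIds>
                <includeTypes>zip</includeTypes>
                <outputDirectory>${project.build.directory}/static-content</outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>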
Have fun

Maven: Change the "test" phase directory from local .m2 to target?

Forgive me if this is remedial, but I am still new to Maven and its functionality.
In my project, when it "builds" and gets to the compile phase, it will create a target directory with the just-compiled libraries and update (or create, if not there) the local .m2 directory.
When I get to the "test" phase, I want it to build against the target directory's library files, and not the local .m2 directory.
Any hints, recommendations, or suggestions would be greatly appreciated. Thanks!
Maven has this concept of “the reactor”, which is just a fancy term for the list of projects being built. At the start of a Maven build, and at the end, Maven prints out this list of projects (using /project/name if defined or groupId:artifactId otherwise).
For each project in the reactor, Maven maintains a list of artifacts that have been attached. By default, each module's pom.xml is attached, and as each plugin runs, it has the option of attaching additional artifacts. Most plugins do not attach artifacts; here are some that do:
jar:jar creates a .jar and attaches it
war:war creates a .war and attaches it
source:jar creates a .jar of the Java source code and attaches it with a classifier of sources
javadoc:jar creates a .jar of the JavaDocs and attaches it with a classifier of javadoc
There is also a default primary artifact (this is the one that gets replaced by jar:jar) which is actually a directory and not a file; as such, it will not get installed or deployed to the local repository cache or a remote repository.
So when, in the reactor, a plugin that attaches the primary artifact has not run yet and another plugin asks for the primary artifact, it will be given the directory ${project.build.outputDirectory}. If it asks after the primary artifact has been attached, then that primary artifact will be provided.
The test phase happens before the package phase, so it will use the directory and not the .jar. The integration-test phase happens after, so it will always use the .jar.
Things get more complex in a multi-module project (which is where my long intro should help you out).
Maven has to build the test classpath. If one of the dependencies is within the reactor, Maven will use the artifact attached to the reactor. Otherwise it will use the local cache (populating from the remote repositories if necessary).
When you run
mvn test
In a multi-module project, from the root, there is no replacement of the default (directory-based) artifact, so the inter-module classpath will point at the target/classes directories.
When you run
mvn package
In the same project, however, because each module completes its life cycle sequentially, all the dependent modules will have swapped in their .jar files as their attached artifact.
All of this should show you that Maven is doing the sensible thing. Hope this has helped.
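As a rough illustration, assuming a reactor in which app depends on lib:
mvn test      # app's test classpath contains lib/target/classes
mvn package   # app's test classpath contains lib's packaged .jar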
The test phase is going to execute the tests of your project. The project won't reference itself via the dependency mechanism. Only dependencies will be referenced via your local repository, i.e. .m2/repository.
Also, it's not the compile phase that installs the artifact to the local repository, it's the install phase. And then there's a later phase, called deploy, that will deploy the artifact to a remote repository, provided you have a remote repository configured as the deploy target. Note that install and deploy are nearly identical phases, except install is a local-only thing; thus, it's the common build phase to hit when doing dev-environment work. Normally the build server will do the deploy stuff.
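In command form, for a typical project:
mvn install   # compiles, tests, packages, then copies the artifact into ~/.m2/repository
mvn deploy    # does all of the above and also uploads the artifact to the configured remote repository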

Maven without (remote) repository?

I have a Maven 2 multi-module project and want to be sure everything is taken from my local checked-out source.
Is it possible to tell Maven to never download anything for the modules it has the source of? Do I have to disable the remote repositories?
Does Maven always have to go the expensive way of installing a module into the local repository, and then extracting it again for each of its dependents?
Does Maven automatically first recompile dependencies for a module if their local source changed, and then compile the dependent?
Is it possible to tell Maven to never download anything for the modules it has the source of?
No. Maven 2 only "sees" the current module while it builds. On the plus side, you can build part of the tree by running Maven in a module.
Do I have to disable the remote repositories?
Yes, use the "offline" option -o or --offline. Or use a settings.xml with a proxy that doesn't have any files. This isn't what you want, though.
Does Maven always have to go the expensive way of installing a module into the local repository, and then extracting it again for each of its dependents?
Yes, but it's not expensive. During the build, the file is copied (that was expensive ten years ago). When a dependency is used, Maven just adds the path of the file to the Java process. So the file isn't copied or modified again. Maven assumes that files in the local repository don't change (or change only once, when a download/install happens).
Does Maven automatically first recompile dependencies for a module if their local source changed?
No. There were plans for Maven 3 but I can't find an option to enable something like that.
To solve your issues, you should install a local proxy (like Nexus).
Maven downloads things (dependencies) only if they are not available in your local repository ($USER_HOME/.m2/repository). If you do not want anything to be downloaded, use offline mode. This can be done by using the -o switch. E.g.
mvn -o clean install
There is nothing expensive about it. If you are building the complete parent project, it will build all the modules and then copy the artifacts to your local repository. Then, when you build a project that has dependencies on those projects, Maven will just copy them from the local repository on your hard disk into the package that is going to be created for the current project.
No. I have been burnt: Maven does not compile dependencies automatically. There is a plugin called the Maven Reactor Plugin; it enables you to build a project's dependencies before the project is built.
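For completeness: with the --projects/--also-make command-line options, plain Maven can build a selected module together with the reactor modules it depends on (the module name here is hypothetical):
# build module-b plus the local modules it depends on
mvn -pl module-b -am clean install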
