I have several projects with dependencies between them and in the production build all the projects are packaged into jars, published and the main project simply adds dependency on these jar files.
During development and CI however I would like the main project to have dependencies on the related projects and build them too as part of the process (not compile, deploy and then build next, but rather compile all together).
This means that in production I would have:
compile group:'com.me.aaa', name:'myCommonProj', version: '1.0.0'
while for the development builds I would like to have:
compile project(':myCommonProj')
Option #2, however, requires the project to be included in the settings.gradle file.
How can I accomplish this setup?
I'd apply gitflow to the project and work in the following way:
in the dev branch I'd keep the compile project(':myProject') dependency along with the appropriate settings.gradle file.
in master branch I'd keep only fixed dependencies.
This scenario isn't very comfortable, because the dev branch would only be partially merged (fixed vs. local dependencies), but it can be done quite easily. On the other hand, working this way keeps you from putting strange conditional logic in the build script.
The second option is to do it all in build.gradle and settings.gradle. The CI server should expose build numbers and other environment variables, and these can be used to include the appropriate dependencies depending on the environment.
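For example, a minimal sketch of that second option, assuming the dev machines and CI expose an environment variable (the USE_LOCAL_PROJECTS name below is only an illustration):

// settings.gradle
if (System.getenv('USE_LOCAL_PROJECTS') == 'true') {
    include ':myCommonProj'
    project(':myCommonProj').projectDir = new File(settingsDir, '../myCommonProj')
}

// build.gradle
dependencies {
    if (System.getenv('USE_LOCAL_PROJECTS') == 'true') {
        // development/CI: build the related project as part of this build
        compile project(':myCommonProj')
    } else {
        // production: depend on the published jar
        compile group: 'com.me.aaa', name: 'myCommonProj', version: '1.0.0'
    }
}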
I have the following modules, with the top one depending on the next, which depends on the next, and so on (these links have VERY SIMPLE build.gradle and settings.gradle files):
https://github.com/deanhiller/webpieces/tree/master/core/core-ssl
https://github.com/deanhiller/webpieces/tree/master/core/core-datawrapper
https://github.com/deanhiller/webpieces/tree/master/core/core-util
https://github.com/deanhiller/webpieces/tree/master/core/core-logging
I temporarily added a throw new RuntimeException to a test in core-datawrapper and core-util and built the core-ssl project (in the repo, ../../gradlew build).
The settings.gradle of core-ssl (found in the link above and pasted here) is
includeBuild '../core-datawrapper'
includeBuild '../core-mock'
The settings.gradle of core-datawrapper (again in the links above) is
includeBuild '../core-util'
I cleared out core-util/build and I see these tasks run:
> Task :core-util:compileJava
> Task :core-util:classes
> Task :core-util:jar
That is it. Why are the tests not running? I thought build depended on assemble and test separately?
The same happens for ../../gradlew clean and ../../gradlew publish.
Ideally, I want my task to affect all transitive projects as well. As developers add projects, I don't want to have to add code to each gradle project in the transitive deps list either.
Yes, that’s correct.
This is most probably due to the original use-case of composite builds.
You have some binary dependency on some library, and then you want to change something in the library and test it in your project before even committing or publishing the library.
So the composite build result replaces the binary dependency, and only as much work as strictly necessary is done, meaning the jar is built.
If you use composite builds as a normal structuring element, you either need to wire up the lifecycle tasks you want yourself, or search for a plugin that perhaps does it.
As developers add projects, I don't want to have to add code to each gradle project in the transitive deps list either.
You can programmatically iterate through the included builds using gradle.includedBuilds, so you can do the wiring in a generic way.
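For example, a minimal sketch in the composing build's build.gradle, assuming the java plugin is applied so that build, clean and test exist (add publish as well if every included build applies maven-publish):

// wire this build's lifecycle tasks to the same-named task of every included build
['build', 'clean', 'test'].each { name ->
    tasks.named(name).configure {
        dependsOn gradle.includedBuilds.collect { it.task(":${name}") }
    }
}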
I have a set of applications, all use Maven and the local repository. The applications form a dependency tree using <dependency> in their pom.xml. All of these projects have -SNAPSHOT in their version.
Is it possible for Maven (or some compatible dependency manager) to build an application together with all of its local dependencies whose source changed?
I do not want to create a multi-module project, because:
the projects really are standalone libraries, not modules of one application;
I do not want additional complexity just to get a form of build which is already precisely defined;
I want the process to be dynamic: if a library is mature enough to be put into a remote repository, it would no longer be rebuilt with the main project, and that's OK.
For now, there is a lot of refactoring, moving code from one library to another, etc., and it often happens that substantial parts of the dependency tree need to be rebuilt. I thus need to manually run mvn install in several projects to ensure that there is no stale code.
No, it doesn't work that way. Even with a multi-module project, Maven does not detect which modules have changed sources and which do not.
There was a (flaky) implementation in Maven 2, but it was not continued in 3.x, see How to get maven 3.0 to only build modules with local scm changes
I hoped they would include it again in Maven 4, but I haven't seen it yet: https://maarten.mulders.it/2020/11/whats-new-in-maven-4/
I once did a similar setup, but had to use shell scripts with some git magic to get it working.
You can also decide to put your libraries in separate repos from the start, and use the repo tool that Google uses for Android development: https://github.com/GerritCodeReview/git-repo/blob/main/README.md
Once you run mvn install on a particular Maven project, its artifact becomes accessible to all other Maven projects on the same workstation during dependency collection (before the compile phase).
Official Maven Build Lifecycle description:
install - install the package into the local repository, for use as a dependency in other projects locally
It's not necessary to keep the libraries as part of the same project (or make it a multi-module project). But once you want to share those libraries with your teammates, you would need either to have them install the libraries locally (as you did), or to store those libraries in some external repository, like Artifactory or Nexus.
We have a directory structure like so
java
  build/build.gradle (This does NOT exist yet, but we want this)
  servers
    server1/build.gradle
    server2/build.gradle
  libraries
    lib1/build.gradle
    lib2/build.gradle
We have 11 servers and 14 libraries with varying dependencies between them. EACH server is a composite build that depends ONLY on libraries (we don't allow servers to depend on each other). In this way, as our mono-repo grows, opening up server1 does NOT get slower and slower as more and more gradle code is added (i.e. gradle only loads server1 and all its libraries, and none of the other libraries OR servers are loaded, keeping things FAST).
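For context, here is roughly what one server's composite wiring looks like in this layout (the exact paths are illustrative):

// servers/server1/settings.gradle
rootProject.name = 'server1'
// only the libraries this server needs are included; no other server or library is ever loaded
includeBuild '../../libraries/lib1'
includeBuild '../../libraries/lib2'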
Ok, so one problem we are running into now is duplication, which is why we need the build/build.gradle file AND we want EVERY module in our mono-repo to include it somehow, for a few goals (each goal may need a different solution).
GOAL 1: To have an ext { … } section containing a Map of Strings to gradle dependencies much like so
deps = [
    'web-webserver': "org.webpieces:http-webserver:${webpiecesVersion}",
    'web-webserver-test': "org.webpieces:http-webserver-test:${webpiecesVersion}",
    'web-devrouter': "org.webpieces:http-router-dev:${webpiecesVersion}"
]
In this way, we want ALL our projects to then import dependencies like so:
compile deps['web-webserver']
GOAL 2: We want to 'include' a standard list of plugins so that all gradle plugins are versioned the same across the repo. While the above keeps all jar versions consistent to avoid jar hell in a mono-repo, we would like to do the same for plugins with just this section:
plugins {
    id 'com.github.sherter.google-java-format' version '0.9'
}
Of course, each project may also want to add a few more plugins OR even not depend on this section at all (in case of an emergency when we're just trying to get the job done).
GOAL 3: We want the checkstyle configuration (or any plugin config) to be defined the SAME way for all projects (eventually!!!). We would like the checkstyle gradle code to live in a common area but have all libraries somehow pull it in. Again, it would be nice for this to be optional, in that I can pull the shared gradle section into my build.gradle OR create a new one in case of emergencies, so I don't have to fix every project in the monorepo right away.
IDEALLY, perhaps I kind of want configuration injection, where when I run server1/build.gradle, it actually uses java/build/build.gradle as its parent somehow, but with overrides (IF I declare 'extends xxx.gradle', maybe), and then all the libraries it uses also use java/build/build.gradle as their parent. I am not sure this is possible or feasible. I am pretty sure 'extends xxx' doesn't exist in gradle.
Are any of these GOALS possible?
thanks,
Dean
I have been working on a monorepo with the exact same requirements as you, also using gradle composite builds. The way we have solved this problem is by using precompiled script plugins.
You create a new gradle project containing only the build logic you want to share. This produces a plugin that you can add as a composite build and apply in the other projects.
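A minimal sketch of such a setup, assuming the Groovy DSL and a reasonably recent Gradle (the conventions project name and the com.example.java-conventions plugin id are made up):

// conventions/build.gradle -- the shared build-logic project
plugins {
    id 'groovy-gradle-plugin'   // turns *.gradle files under src/main/groovy into precompiled script plugins
}
repositories {
    gradlePluginPortal()
}
dependencies {
    // third-party plugins applied by the convention script; their versions are pinned once, here
    implementation 'com.github.sherter.google-java-format:com.github.sherter.google-java-format.gradle.plugin:0.9'
}

// conventions/src/main/groovy/com.example.java-conventions.gradle -- the shared script plugin
plugins {
    id 'java'
    id 'checkstyle'
    id 'com.github.sherter.google-java-format'   // no version here, it comes from the plugin project above
}
checkstyle {
    maxWarnings = 0   // shared checkstyle settings live here, once (GOAL 3)
}
def webpiecesVersion = '0.0.0'   // placeholder; pin the real version once for the whole repo
ext.deps = [
    'web-webserver': "org.webpieces:http-webserver:${webpiecesVersion}"   // GOAL 1
]

// servers/server1/settings.gradle -- make the plugin build available to this composite
pluginManagement {
    includeBuild '../../conventions'
}

// servers/server1/build.gradle -- each project just applies the convention plugin
plugins {
    id 'com.example.java-conventions'
}
dependencies {
    implementation deps['web-webserver']   // 'compile' in the older DSL used in the question
}

Every server and library then shares the same plugin versions, the same deps map and the same checkstyle settings, and a project can still opt out in an emergency by simply not applying the convention plugin.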
I'm a bit confused by why you don't just use a "standard" gradle top level build file and compose the others as subprojects.
This solves all 3 of your goals
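For example, something along these lines, reusing the directory names from the question (a sketch, not a drop-in file):

// java/settings.gradle -- one build containing every server and library as a subproject
rootProject.name = 'java'
include ':servers:server1', ':servers:server2'
include ':libraries:lib1', ':libraries:lib2'

// java/build.gradle -- configuration shared by every subproject (plugins, deps map, checkstyle, ...)
subprojects {
    apply plugin: 'java'
}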
If you are concerned by build speed, you can target each server individually simply by running
./gradlew :server1:build
But if you are not able to do this for some reason, you can use the apply from: syntax as described here.
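Something along these lines (the common.gradle file name and its location are just an example):

// java/build/common.gradle -- the shared snippet
ext.webpiecesVersion = '0.0.0'   // placeholder; pin the real version here, once
ext.deps = [
    'web-webserver': "org.webpieces:http-webserver:${webpiecesVersion}"
]

// in each project's build.gradle
apply from: "${rootDir}/../../build/common.gradle"   // relative path depends on where the project sits
dependencies {
    compile deps['web-webserver']   // or implementation on current Gradle versions
}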
I was trying to structure my Maven POMs in something similar to the following hierarchical form:
root
+-- A-POM
+-- B-POM
+-- C-POM
+---D-POM
I was hoping that this could take care of my changed module problem. That is, if C is changed, then A must be rebuilt, etc.
But I ran into the issue that the packaging at the root is "pom", and after that I can't give A the packaging "war" and then continue to drill in and have A include B and C as its modules. It seems to me that any POM that does not have "pom" as its <packaging> cannot have child modules. Is my understanding correct? Is there a way to do what I wanted to do?
In addition, I don't seem able to chain the "changed" mechanism in Maven (most likely due to my lack of knowledge). I would like Maven to detect that a dependency project has changed and rebuild all the affected projects.
Thanks so much!
The reactor project (the root of the multi-module project) must have pom packaging, and so must any module that itself declares child modules. So your nested structure is invalid, since A is not of packaging pom, and I'm pretty sure you won't get it to work this way.
The second point is that Maven is a modularized build system and uses repository mechanisms to locate pre-built artifacts, instead of checking out all modules from version control and building them in a monolithic way like in the old days ;) This means that Maven cannot know what to rebuild when you change something in one module, since it simply does not have all the other modules available at that time.
I think this is more of a CI task than something that should be handled by the build system itself. I know that you can achieve such behavior with an appropriate build/CI server like Jenkins, which supports upstream and downstream projects. This means it is able to detect dependencies between the projects and trigger other builds as soon as a dependency has been built, which comes close to the behavior you are trying to achieve.
By the way, rebuilding other projects is only required for SNAPSHOT dependencies. Jenkins with the Maven plugin supports this behavior, but depending on the number of SNAPSHOT dependencies in your project, this can cause long chains of project builds on the server. Some folks are of the opinion that SNAPSHOT versions in general are hell for CI, since these artifacts can change over time and are not reproducible. You could consider omitting SNAPSHOT versions entirely and building final versions each time. This would also remove your requirement to rebuild other modules as soon as a module changes: there are simply no changes until you upgrade dependency versions.
The entire Java project has an Ant build; however, a couple of modules have a Maven build too.
My new module (Maven-built, say A) has a dependency on an existing module (or simply a folder?, say B), which is built using Ant and which just packages the source into a jar and drops it inside the project.
The Maven build for module A fails (unable to locate module B's files). Options:
1. Package module B using Maven and push it to m2_repo.
I do not want to go with this option.
Please let me know what other options are available.
If you have control over the source of all your modules, and if you have decided that Maven is the way you want to go forward, then I recommend going all-in as soon as possible. If you do not, you will have continuous problems getting the two build strategies to play along, and not only at build time but also at deploy time, when it is time to collect all your runtime dependencies. Unless you build one uber-JAR as your only deployable artifact.
If your modules will (almost) always release in sync, then consider using a multi-module project setup.
When I say "all-in", I do not mean that you have to give up Ant completely. You can use the maven-antrun-plugin to kick off your existing Ant builds.
You should also consider running your own repository server, e.g. Nexus, to take full advantage of your maven builds.