How to implement a monorepo with Gradle and multiple Gradle projects

We have a few full Gradle projects A, B, C, D. These are microservices that are going to start sharing the protobuf-generated Java files. We are thinking of a structure like this:
A
  build.gradle (this is a full-on Gradle build)
B
  build.gradle (this is B's full-on Gradle build)
common
  build.gradle (builds the protobuf code that is used by A and B)
Now, the question is: how do we make sure that when a developer builds A, it also builds common in case it changed on their git pull? The same goes for B. The settings.gradle file didn't seem to have a ../../:project mechanism or anything like that.
I do remember Gradle came out with a way to build multiple Gradle projects together as well.
Ideally, when someone changes common, multiple Jenkins builds would be kicked off as well, verifying that changing core code didn't break any of the services that use it. I am not quite sure how to
1. document the things that depend on common
2. use that documentation to kick off builds of all things depending on common
Then, if this were to grow and you have D depends on C depends on common, each build needs to be kicked off, feeding the binary upstream from common to C, and then C's jar and common's jar to D. I know 'pants' is used at Twitter to do this, and Google is using Bazel. Perhaps I should look into those instead of Gradle? Or can we intermingle them?

Simply declaring a dependency on common should be sufficient:
// Project A's build.gradle
dependencies {
    implementation(project(":common"))
}
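For the project(":common") reference to resolve, all of the projects need to be part of one multi-project build, declared in the root settings.gradle. A minimal sketch (the project names follow the example output below):

// settings.gradle at the repository root
rootProject.name = 'example-multi-project'
include ':project-a'
include ':project-b'
include ':common'

If a project does not live directly under the root, its directory can be remapped, e.g. project(':common').projectDir = file('../common').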
In order to build A, the build of common needs to succeed. If the build of common fails for whatever reason, then the build of A will also fail. Example:
$ ./gradlew project-a:build
> Task :common:compileJava FAILED
/Users/cisco/code/example-multi-project/common/src/main/java/common/ExampleCommon.java:6: error: incompatible types: int cannot be converted to String
return 1;
^
1 error
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':common:compileJava'.
> Compilation failed; see the compiler error output for details.
You can see in the above that when I tried to build project A (project-a:build), common's build tasks were invoked (:common:compileJava).
Both projects (A, B, etc.) should have thorough tests (unit, integration, smoke, etc.) to make sure that any incompatible changes are detected early and often.
You can read more about multi-project builds in the official guide: https://docs.gradle.org/current/userguide/multi_project_builds.html

Related

How can a Gradle plugin access information about included builds?

I know you can access different modules (included using include) in a project via org.gradle.api.Project#getSubprojects(), and I know you can get the names and directories of separate builds that have been included (using includeBuild) via org.gradle.api.invocation.Gradle#getIncludedBuilds().
But how can my plugin get information such as the locations of Java source files and class files for projects included using includeBuild?
My goal here is to determine which files have changed in the current git branch (which I can do), and then collect their corresponding class files into a jar file that's used for our patching mechanism that inserts the patch jars at the front of the classpath rather than redeploying the whole application.
I don’t think it is a goal of Gradle to provide including builds with detailed information on included builds. Currently, the Gradle docs basically only state two goals for such composite builds:
combine builds that are usually developed independently, […]
decompose a large multi-project build into smaller, more isolated chunks […]
Actually, isolation between the involved builds seems to be an important theme in general:
Included builds do not share any configuration with the composite build, or the other included builds. Each included build is configured and executed in isolation.
For that reason, it also doesn’t seem to be possible or even desired to let an including build consume any build configurations (like task outputs) of an included build. That would only couple the builds and hence thwart the isolation goal.
Included builds interact with other builds only via dependency substitution:
If any build in the composite has a dependency that can be satisfied by the included build, then that dependency will be replaced by a project dependency on the included build.
So, if you’d like to consume specific parts of an included build from the including build, then you have to do multiple things:
1. Have a configuration in the included build which produces these “specific parts” as an artifact.
2. Have a configuration in the including build which consumes the artifact as a dependency.
3. Make sure that both configurations are compatible wrt. their capabilities so that dependency substitution works.
4. Let some task in the including build use the dependency artifact in whatever way you need.
Those things happen kind of automatically when you have a simple dependency between two Gradle projects, like a Java application depending on a Java library. But you can define your own kinds of dependencies, too.
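As a rough sketch of those four steps (the attribute name, configuration names, group, and coordinates are all made up for this example, and a real build may need more careful attribute and capability matching):

// build.gradle of the included build ('lib'), which produces the classes
plugins {
    id 'java'
}

def usageAttr = Attribute.of('example.usage', String)

configurations {
    classesElements {
        canBeConsumed = true
        canBeResolved = false
        attributes { attribute(usageAttr, 'compiled-classes') }
    }
}

// Expose the compiled classes directory as the outgoing artifact; the
// task dependency on compileJava is carried implicitly by the provider.
artifacts {
    add('classesElements', compileJava.destinationDirectory)
}

// build.gradle of the including build, which consumes the classes
def usageAttr = Attribute.of('example.usage', String)

configurations {
    patchClasses {
        canBeConsumed = false
        canBeResolved = true
        attributes { attribute(usageAttr, 'compiled-classes') }
    }
}

dependencies {
    // Replaced by a project dependency on the included build via dependency
    // substitution, assuming settings.gradle has includeBuild '../lib' and
    // the included build's group really is 'com.example'.
    patchClasses 'com.example:lib:1.0'
}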
The question is: would that really be worth the effort? Can’t you maybe solve your goal more easily or at least without relying on programmatically retrieved information on included builds? For example: if you know that your included build produces class files under build/classes/java/main, then maybe just take the classes of interest from there via org.gradle.api.initialization.IncludedBuild#getProjectDir().
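A sketch of that pragmatic route, assuming the default Java layout and an included build named 'my-lib' (both are assumptions for the example, not something Gradle guarantees):

// build.gradle of the including build
def includedLib = gradle.includedBuild('my-lib')
def classesDir = new File(includedLib.projectDir, 'build/classes/java/main')

tasks.register('patchJar', Jar) {
    archiveClassifier = 'patch'
    // Make sure the included build has compiled before we package.
    dependsOn includedLib.task(':compileJava')
    from(classesDir) {
        // In the real use case, narrow this down to the class files that
        // correspond to the sources changed on the current git branch.
        include '**/*.class'
    }
}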
I know, this may not be the answer you had hoped to get. I still hope it’s useful.

Gradle 7.1.1 not running tests of includedBuilds nor transitive ones?

I have the following modules, with the top one depending on the next, which depends on the next (these links have VERY SIMPLE build.gradle and settings.gradle files):
https://github.com/deanhiller/webpieces/tree/master/core/core-ssl
https://github.com/deanhiller/webpieces/tree/master/core/core-datawrapper
https://github.com/deanhiller/webpieces/tree/master/core/core-util
https://github.com/deanhiller/webpieces/tree/master/core/core-logging
I temporarily added a throw new RuntimeException to a test in core-datawrapper and core-util and built the project core-ssl (in the repo, ../../gradlew build).
The settings.gradle of core-ssl (found in the above link and pasted here) is
includeBuild '../core-datawrapper'
includeBuild '../core-mock'
The settings.gradle of core-datawrapper (again in the above links) is
includeBuild '../core-util'
I cleared out core-util/build and I see these tasks run:
> Task :core-util:compileJava
> Task :core-util:classes
> Task :core-util:jar
That is it. Why are the tests not running? I thought build depended on assemble and test separately?
The same goes for ../../gradlew clean and ../../gradlew publish.
Ideally, I want my target to affect all transitive projects as well. As developers add projects, I don't want to have to add code to each gradle project in the transitive deps list either.
Yes, that’s correct.
This is most probably due to the original use case of composite builds: you have a binary dependency on some library, and you want to change something in the library and test it in your project before even committing or publishing the library.
So the composite build result replaces the binary dependency, and only as much work as really necessary is done, meaning the jar is built.
If you use composite builds as a normal structuring element, you either need to wire up the lifecycle tasks you want wired yourself, or search for a plugin that maybe does it.
As developers add projects, I don't want to have to add code to each gradle project in the transitive deps list either.
You can programmatically iterate through the included builds using gradle.includedBuilds, so you can do the wiring in a generic way.
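For example, a generic sketch that wires this build's lifecycle tasks to the same-named tasks of every included build (it assumes those tasks exist on both sides, e.g. via the java plugin):

// build.gradle of the composing build
['build', 'test', 'clean'].each { name ->
    tasks.named(name).configure {
        // Run the included builds' task of the same name as well.
        dependsOn gradle.includedBuilds.collect { it.task(":$name") }
    }
}

Note that gradle.includedBuilds only contains the directly included builds; but if every build in the chain applies the same snippet, the wiring cascades transitively.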

Gradle monobuild and map of jar files for all gradle composite builds

We have a directory structure like so
java
  build/build.gradle (this does NOT exist yet, but we want it)
  servers
    server1/build.gradle
    server2/build.gradle
  libraries
    lib1/build.gradle
    lib2/build.gradle
We have 11 servers and 14 libraries with varying uses of dependencies. EACH server is a composite build ONLY depending on libraries (we don’t allow servers to depend on each other). In this way, as our mono-repo grows, opening up server1 does NOT get slower and slower as more and more Gradle code is added (i.e. Gradle only loads server1 and all its libraries, and none of the other libraries OR servers are loaded, keeping things FAST).
Ok, so one problem we are running into is duplication, which is why we need the build/build.gradle file, AND we want EVERY module in our mono repo to include it somehow for a few goals (each goal may need a different solution).
GOAL 1: To have an ext { … } section containing a Map of Strings to Gradle dependencies, much like so:
deps = [
    'web-webserver'     : "org.webpieces:http-webserver:${webpiecesVersion}",
    'web-webserver-test': "org.webpieces:http-webserver-test:${webpiecesVersion}",
    'web-devrouter'     : "org.webpieces:http-router-dev:${webpiecesVersion}"
]
In this way, we want ALL our projects to then import dependencies like so:
compile deps['web-webserver']
GOAL 2: We want to 'include' a standard list of plugins so we are versioning all Gradle plugins the same across the repo. While the above keeps all jars consistent to avoid jar hell in a mono-repo, we would like to do the same with just this section:
plugins {
    id 'com.github.sherter.google-java-format' version '0.9'
}
Of course, each project may also want to add a few more plugins OR even not depend on this section (in case of an emergency, when just trying to get the job done).
GOAL 3: We want checkstyle configuration (or any plugin config) to be defined the SAME way for all projects (eventually!!!). We would like the checkstyle Gradle config to live in a common area but have all libraries somehow pull it in. Again, it would be nice for it to be optional, in that I can pull the shared section into my build.gradle OR create a new one in case of emergencies, so I don't have to fix all projects in the monorepo right away.
IDEALLY, perhaps I kind of want configuration injection, where when I run server1/build.gradle, it actually runs java/build/build.gradle as its parent somehow, but with overrides (IF I declare 'extends xxx.gradle', maybe); then all libraries it uses would also use java/build/build.gradle as their parent. I am not sure this is possible or feasible. I am pretty sure 'extends xxx' doesn't exist in Gradle.
Are any of these GOALS possible?
thanks,
Dean
I have been working on a monorepo with the exact same requirements as you, using Gradle composite builds as well. The way we have solved this problem is by using precompiled script plugins.
You need to create a new Gradle project containing only the build logic you want to share. This produces a plugin that you can add as a composite build and apply in the other projects.
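For example (a sketch; the project name, plugin id, and version constant are all hypothetical): create a build-logic project that packages a precompiled script plugin, then include it from each server or library and apply the plugin.

// build-logic/build.gradle
plugins {
    id 'groovy-gradle-plugin' // enables precompiled script plugins
}

// build-logic/src/main/groovy/shared-conventions.gradle
// This file becomes a plugin with id 'shared-conventions'.
plugins {
    id 'java'
    id 'checkstyle'
}

def webpiecesVersion = '4.1.98' // hypothetical version

ext.deps = [
    'web-webserver'     : "org.webpieces:http-webserver:${webpiecesVersion}",
    'web-webserver-test': "org.webpieces:http-webserver-test:${webpiecesVersion}",
]

// server1/settings.gradle (requires a reasonably recent Gradle)
pluginManagement {
    includeBuild '../../build-logic'
}

// server1/build.gradle
plugins {
    id 'shared-conventions'
}

dependencies {
    implementation deps['web-webserver']
}

This covers all three goals: the deps map (GOAL 1), a single place to version plugins (GOAL 2), and shared checkstyle config (GOAL 3), while each project remains free to add its own plugins or skip the convention plugin in an emergency.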
I'm a bit confused about why you don't just use a "standard" Gradle top-level build file and compose the others as subprojects.
This solves all 3 of your goals.
If you are concerned about build speed, you can target each server individually simply by running
./gradlew :server1:build
But if you are not able to do this for some reason, you can use the apply from: syntax as described here.
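For reference, a sketch of the apply from: variant, using the java/build/build.gradle path from the question:

// server1/build.gradle
plugins {
    id 'java'
}

// Pulls in the shared ext.deps map from GOAL 1 (path per the layout above).
apply from: '../../build/build.gradle'

dependencies {
    implementation deps['web-webserver']
}

One limitation: a script applied with apply from: cannot itself use the plugins { } block, which is why the precompiled-plugin approach above handles GOAL 2 more cleanly.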

TeamCity Conditional Build Execution

I want to create a one-click release build. I am fine creating multiple build configurations and linking them together or building everything in one build configuration.
Module Dependencies are mentioned below:
Module C depends upon modules A and B. Module D depends upon module C.
A > C > D
B > C
There could be instances where nothing has been changed in module A or B. If nothing has been changed in a module, I do not want to build and release it, as it was already released in the past. This would have been simple if I needed to build all the modules every time, which is not the case.
Let us consider a scenario where there is a change in module B. In this case, I only want to build modules B, C & D (but not A).
Is there a way this could be achieved with build steps or parameters or by any other means in TeamCity?
In the Snapshot Dependencies, enable the checkboxes:
- Do not run new build if there is a suitable one
- Only use successful builds from suitable ones
I recommend using Artifact Dependencies together with Snapshot Dependencies.
Enable the following settings:
- Build from the same chain
- Do not run new build if there is a suitable one
- Only use successful builds from suitable ones
Then TeamCity will not rebuild the parts of the build chain that have no changes.

How to create a maven assembly with transitive dependencies for different deployment scenarios?

I'm having a problem reconciling building a project for use within an application server and for use as a stand-alone application.
To give an overall simplified context, say I have three projects: A, B, C.
Project A depends on Project B which depends on Project C.
Project C has a dependency X which is marked as provided, since it was expected that it would be available as a JEE library within, say, an application server (e.g. jms.jar).
So if I perform an assembly build of Project A, I get all the transitive dependencies save for those marked as provided as expected.
Now I have a new deployment scenario where Project A needs to be used in a standalone environment i.e. outside an application server.
So now I need the jms jar to be a compile dependency. Does this mean that I should explicitly add a compile dependency for X in Project A? Doesn't this violate the Law of Demeter (i.e. don't talk to strangers), in the sense that Project A shouldn't explicitly know about Project C but only about Project B?
This is a simple example, but in reality I have multiple dependencies which have been marked as provided but now need to be compile or runtime dependencies so that they end up in the artifact produced by the maven-assembly-plugin.
Is this a fundamental problem with Maven or am I not using the tools correctly?
Thanks in advance for any guidance.
If you need your build to have variations in it for different scenarios, you need to use profiles and keep certain things (such as some of the dependencies) in the various profiles.
http://maven.apache.org/pom.html#Profiles
Different dependencies for different build profiles in maven answers a similar question; just swap "release" and "debug" for "Project A" and "Project C".
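A sketch of what that could look like in Project A's pom.xml (the profile id and the jms coordinates are illustrative):

<!-- pom.xml of Project A -->
<profiles>
  <profile>
    <id>standalone</id>
    <dependencies>
      <!-- pull in what the app server would otherwise provide -->
      <dependency>
        <groupId>javax.jms</groupId>
        <artifactId>jms</artifactId>
        <version>1.1</version>
        <scope>compile</scope>
      </dependency>
    </dependencies>
  </profile>
</profiles>

The standalone assembly would then be built with mvn -Pstandalone package.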
Provided dependencies are a difficult subject. First of all: provided dependencies are not transitive in the following sense: if your project C has a provided dependency on X, then A will not get the dependency; it is silently ignored. This fits with the following meaning of "provided", which I propose:
Only the artifacts that are actually deployed should mark dependencies as "provided". Libraries or other jars that are not individually deployed to a specific server should not have provided dependencies; instead, they should declare their dependencies as compile dependencies. In your example: Project C should have a compile dependency on X. If project A knows that X is provided, it sets X to provided in "dependencyManagement". As project A should know the environment in which it runs, it should decide what is provided and what is not, and "dependencyManagement" is the right place to declare this.
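Following that proposal, C declares X as a plain compile dependency, and the deployed artifact A overrides its scope; a sketch (coordinates illustrative):

<!-- pom.xml of Project A (the deployed artifact) -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>javax.jms</groupId>
      <artifactId>jms</artifactId>
      <version>1.1</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</dependencyManagement>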
If your project A should be able to run both within and without a given server, you probably need to make a lot of adjustments, even change the packaging type from ear to jar. So you either use build profiles for this, which then have different dependencyManagement entries, or you split A into two projects which depend on some other project that contains the common elements.
If some given project C already has a provided dependency on X and you cannot change that, this is effectively the same as a missing dependency in C. This has to be repaired at some point, and this could be project A itself.
