Gradle - Compile submodules in parallel

I have a project with two submodules.
Client - A UI based on Google Web Toolkit (GWT)
Server - A Spring Boot based server.
Now, in the server's Gradle build file, I'm creating a jar from the client and then including it in the server through the snippet below. Lastly, I create a .war file based on the server config.
dependencies {
    compile project(':client')
}
The architecture is similar to Spring Boot's proposed ways of resource handling.
Now, when I run the Gradle build, because the server depends on the client, server compilation doesn't start until the client's compilation and tests are done.
I feel that I'm not making use of Gradle's parallelism with this way of compiling client and server.
Is there a way to compile and run test cases in parallel, and then create a .war file only when both submodules' tasks are complete? How do I access the configurations of the client and server modules and then create a new war file in the rootProject?

You can try adding the --parallel flag to your Gradle command. However, this is still an incubating feature. I noticed a significant improvement in build time when running the Gradle daemon, so you can try that as well.
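If the flag helps, both options can also be switched on per project instead of per invocation; a minimal sketch of gradle.properties (these two property names are the documented ones):

# gradle.properties in the project root
# incubating: build decoupled projects in parallel
org.gradle.parallel=true
# keep a warm JVM between builds
org.gradle.daemon=true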

No, this level of parallelism is not currently available. I think the team are slowly working towards general parallel task execution, as described in their spec. That should allow for the kind of behaviour you're requesting.
That said, you can run tests in parallel if they're independent, via the maxParallelForks and forkEvery options. MrHaki gives a short how-to for this on his blog. Note that this only applies to a single Test task instance.
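For example, a minimal sketch of what that looks like on a Test task (the numbers are illustrative, not recommendations):

test {
    // run up to four test JVMs concurrently
    maxParallelForks = 4
    // recycle each forked JVM after 100 test classes to cap memory use
    forkEvery = 100
}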

Related

Running jacoco report where integration tests are in one code base and source code is in another code base

I recently started creating JaCoCo reports for Maven projects, including unit and integration tests, and they seem to work correctly.
Now I have encountered a different scenario that I am not sure how to approach.
I have one workspace that contains the integration test cases - application A - but the source code does not live in the same workspace/code base. The code that actually runs when these integration tests are invoked is in a different workspace/code base - application B. The tests call it through REST API calls against localhost URLs; the JBoss server is started for application B so that the localhost context is up.
The aim is to invoke the integration tests from application A, which in turn exercise the code in application B, generating a JaCoCo report of the code coverage for application B.
I am not actually sure how to achieve this.
Can someone provide some input?
Thanks.
If I understand you correctly, you actually have two different processes in your scenario:
The "client" process that runs the integration tests, to which JaCoCo can easily be applied (but that's not the coverage you need)
The "server" process that runs the actual JBoss server and executes the actual code.
The client process contacts the server via HTTP.
In this case, I'm afraid JaCoCo won't be able to provide coverage for you if you're only running the tests from Maven/Gradle, because JaCoCo instruments bytecode only on the JVM it is attached to. So you have to be "creative" here :)
I'll list some possible approaches here.
Disclaimer: I haven't tried them myself (I haven't worked with JBoss/Java EE), but maybe you'll be able to at least borrow some ideas
The first approach would be to run the tests together with the application somehow, like it's done, for example, in Spring tests (I'm not sure whether JBoss provides similar capabilities).
The idea is simple:
You run the integration test, it runs JBoss "embedded in the same JVM", and you can inject beans / EJB session beans into the test (like autowiring with Spring).
The advantage of this method is that you can just use the JaCoCo Maven plugin and it will instrument everything for you.
I don't know how easy it will be to achieve this architecture technically. I know that recent JBoss versions support an embedded mode, so maybe you'll find this link to be a useful foundation.
Another direction is to take a look at the Arquillian project. It has a JaCoCo extension that will probably help, but I've never tried it.
And the last approach I can think of is to run the JBoss server with the JaCoCo agent directly, instead of relying on the build system to run JaCoCo for you.
The idea here is to stream the coverage results of the server code to a file / TCP endpoint. You run JBoss with -javaagent:[yourpath/]jacocoagent.jar and it streams the results wherever you need them. After the tests you gather these results and prepare a report. You can find more information about this approach here.
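A minimal sketch of that last approach (paths, the port, and the server script are illustrative; the agent options and jacoco-maven-plugin goals are the standard documented ones):

# start JBoss with the JaCoCo agent attached, publishing coverage over TCP
export JAVA_OPTS="$JAVA_OPTS -javaagent:/opt/jacoco/jacocoagent.jar=output=tcpserver,address=*,port=6300"
./standalone.sh

# after the integration tests, pull the execution data and build a report
mvn jacoco:dump -Djacoco.address=localhost -Djacoco.port=6300 -Djacoco.destFile=target/jacoco-it.exec
mvn jacoco:report -Djacoco.dataFile=target/jacoco-it.exec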

How to load a WAR module into Spring's built-in Tomcat running in a standalone?

I have three modules in my Maven project:
parent
rest-api # Spring REST API
web-client # AngularJS web client
application # Project to bundle it all up for a standalone
I am not sure if I have an "elegant" solution here, so please hit me with a stick if this is complete garbage, but this is how it works - or how it's supposed to work:
rest-api
The rest-api module simply offers the REST API and other core functionality - basically it is pure server code. It is a jar artifact.
web-client
To separate client and server code I have the module web-client. It is a war project that holds the client webapp.
application
This module is supposed to glue it all up. It depends on rest-api and web-client. It does two important things:
Its pom.xml uses the spring-boot-maven-plugin to build a standalone runnable jar - my ultimate goal
It provides the main(String[] args) method that starts the @SpringBootApplication with SpringApplication.run(EasyModelAccessServer.class, args);
What I am currently able to do is tell Eclipse to run this in a servlet container. The server boots up and my two resources, the rest-api and the web-client, are working. Everything is fine so far.
The issue
What is not fully working is the standalone. If I package up the whole thing and run the server:
$ path/to/application: mvn clean package
$ path/to/application: java -jar target/application.jar
Only the REST API will work. The reason is that the web-client is not added or introduced as a web app to the Spring built-in Tomcat.
The question
is how I can make this work. Two options come to my mind:
Somehow sneak the web-client.war file into the application.jar such that it is available as a resource, and programmatically call tomcat.addWebapp("/web-client", "path/to/web-client.war") (or something like that) to load the additional service
Hope that the spring-boot-maven-plugin or another Spring Maven plug-in can do that for me, and find somebody that links me to an example.
I've tried option 1, but I didn't succeed in moving web-client.war into application.jar, and I'm also not sure if I should actually do that.
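For illustration, this is the kind of hook I have in mind for option 1 (just a sketch, assuming Spring Boot 1.x, and assuming the war can somehow be made available on disk next to the jar at runtime - which is exactly the part I haven't solved):

import java.io.File;
import org.apache.catalina.startup.Tomcat;
import org.springframework.boot.context.embedded.EmbeddedServletContainerFactory;
import org.springframework.boot.context.embedded.tomcat.TomcatEmbeddedServletContainer;
import org.springframework.boot.context.embedded.tomcat.TomcatEmbeddedServletContainerFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class WebClientConfig {

    @Bean
    public EmbeddedServletContainerFactory servletContainer() {
        return new TomcatEmbeddedServletContainerFactory() {
            @Override
            protected TomcatEmbeddedServletContainer getTomcatEmbeddedServletContainer(Tomcat tomcat) {
                try {
                    // register the extracted war as a second webapp under /web-client
                    tomcat.addWebapp("/web-client", new File("web-client.war").getAbsolutePath());
                } catch (Exception e) {
                    throw new IllegalStateException("Could not register web-client.war", e);
                }
                return super.getTomcatEmbeddedServletContainer(tomcat);
            }
        };
    }
}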
FAQ
Q: "Why do you separate everything instead of merge all those modules into a server module where the Spring Maven plug-in would do everything for you out of the box?"
A: I really want to separate the client code from the server code. I could merge web-client into application, but the last time I tried that I ran into ten other issues, so I decided to keep it this way; it shouldn't actually be so hard to load an additional server resource.
Q: "Can I take a look at the project?"
A: Yes, you can. Just take a look: https://github.com/silentsnooc/easy-model-access Please forgive me that I am currently using tabs instead of spaces - I am going to change that as soon as I have everything up and running.

Maven Multi Module benefits over simple dependency

I have some years of experience with Maven projects, even with multi-module ones (which have made me hate Maven's multi-module feature - so the disclaimer is now done), and even though I really like Maven, there is something I cannot get a clear answer about:
What is a typical use case for a multi-module Maven project? What is the added value of such a structure compared to simple dependencies and a parent pom?
I have seen a lot of multi-module project configurations, but all of them could clearly have been addressed by a simple structure of dependency libraries living their own lives as deliverables (even with a parent pom as a separate deliverable, factoring out dependencies and configuration), and I haven't found any use case where I can clearly see the added value of the multi-module structure.
I have always found that this kind of structure brings overkill complexity with no real benefit: what am I missing? (To be truly honest, I can see that some EARs can benefit from this kind of structure, but apart from that particular use case, is there any other real use and benefit?)
Here's a real-life case.
I have a multi-module project (and to your rant... I haven't seen any complications with it). The end result is a webapp, but I have different modules for api, impl, and webapp.
12 months after creating the project I find that I have to integrate with Amazon S3 using a stand-alone process run from a jar. I add a new module which depends on api/impl and write my integration code in the new module. I use the assembly plugin (or something like it) to create a runnable jar, and now I have a war I can deploy in Tomcat and a process I can deploy on another server. I have no web classes in my S3 integration process and no Amazon dependencies in my webapp, but I can share all the stuff in api and impl.
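A rough sketch of that packaging step, using the assembly plugin's stock jar-with-dependencies descriptor (the main class name is made up):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <configuration>
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
        <archive>
            <manifest>
                <mainClass>com.example.s3.S3SyncMain</mainClass>
            </manifest>
        </archive>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>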
3 months after that, we decide to create a REST webapp. We want it as a separate app instead of just new URL mappings in the existing webapp. Simple: one more module, another webapp created as the result of the Maven build with no special tinkering. Business logic is shared easily between webapp and rest-webapp, and I can deploy them as needed.
The major benefits of multi-modules are:
one single Maven command to build all your modules at once,
and, most importantly: Maven takes care of the build order for you.
Configuring your CI server is also very easy: one single Jenkins job to build everything.
I have already worked on a project with about 30 submodules. Sometimes you need to change something in more than one module, and running one single command, being sure that everything that needs to be compiled is compiled in the correct order, is a must.
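For the record, the single command in question, run once from the parent (Maven computes the inter-module build order itself):

# builds all ~30 submodules in the correct dependency order
$ mvn clean install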
EDIT
Why 30 submodules?
A huge framework with lots of features, lots of developers, and separation of features on a module basis. It's a real-life use case, and the separation of the code into modules was really meaningful.
I think you are correct in that most projects that use multi-modules actually don't need them.
Where I work we use multi-module projects (and I think for good reason). We have something similar to a service-oriented architecture, so each application has:
a client module
an interface module (which holds the objects shared between the client and the implementation)
an implementation module
a war module
I agree that putting the implementation and the war module into the same actual module would be OK, but the (arguable) benefit of this is a very clear division between the classes that solve the problem and the way the application communicates with the external world.
In previous projects that involved just a web application, I've tried putting everything in the same module, as it made testing easier, given the modules I was using.
Multi-modules can help you reuse your code.
It's one of the best benefits you'll feel at work.
Imagine you have three web projects, each with a security layer: you'd have to copy-paste your code three times and try to wire it into each project.
But what if you create a security module, a project with one specific job?
It'll be easy to use it: inject it into your app and, boom, it works.
Also, as mentioned in @ben75's answer, there is the single Maven build command and the correct build order for all the jars you use. You'll think no more about which depends on which.
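A minimal sketch of that layout (coordinates and module names are made up):

<!-- parent pom.xml: one build for the shared module and its consumers -->
<modules>
    <module>security</module>
    <module>web-project-a</module>
    <module>web-project-b</module>
    <module>web-project-c</module>
</modules>

<!-- in each web project's pom.xml: the shared layer is a normal dependency -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>security</artifactId>
    <version>${project.version}</version>
</dependency>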
I find Maven modules extremely useful, for the following reasons:
Architecture layering and boundaries
For example, I make a Maven module application-contract which contains the interfaces my presentation layer sees. So I have UI -> Presenter -> application-contract <- application-impl <- infrastructure -> domain. This way, I know that my presentation/UI layer has no access to classes from my domain/application layers. If the domain classes are not on the classpath when I code in the UI, I can't use them. And I like it this way (utilizing the classpath restrictions). Perhaps Java 9 modules can solve this problem too, but (unfortunately) I have to work with Java 8.
Running tests in one module each time
When I change code in a layer which is a module (as mentioned previously), I can run only that module's tests, without re-running tests for code I did not change. This gives me speed. My presentation layer tests need ~3 seconds (for 300 tests). Every time I change code in a Presenter, or whatever lies below the application layer, I don't want my H2 database integration tests to run, or my image processing tests to run, because those do IO and they are slow.
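For instance (a sketch; the module name is made up):

# run only this module's ~300 tests instead of the whole reactor
$ cd presentation && mvn test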
Building
Pretty much the same thing. When I change code in the UI, I only have to build and deploy the UI stuff (my UI is in Java).

Bamboo doesn't recognize tests in my Spring project

I have a Spring project (Apache CXF, Spring, Hibernate, Maven ...) hosted on Bitbucket, and I'm trying to use Bamboo as my CI server. My idea is to deploy the code directly to Heroku from Bamboo so that deployment is automated.
I made a plan with a couple of tasks to achieve this. First I have a Source Code Checkout task and a builder task. Both of them are working; the code compiles and the tests pass, as I can see in the task log. The problem is that Bamboo doesn't seem to recognize the tests (it marks the task as testless).
I have also tried to create a new JUnit test task, and it's even worse. The log shows that everything is working properly, but Bamboo marks the plan as a failure after the test task is executed.
Any ideas?
Not sure which version of Bamboo you're using, but in the version that we have, you have to turn on unit test result evaluation on the Builder tab. Please see the attached screenshot, and make sure that this is enabled, and the directory setting is pointing to the directory where Maven Surefire creates the test results (in XML format).
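In case it helps: with a standard Maven layout, the Surefire XML results end up in each module's target/surefire-reports directory, so (assuming your Bamboo version takes an Ant-style pattern like ours does) something like **/target/surefire-reports/*.xml should pick them up.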

What is the "maven way" for this ant development workflow?

How can Maven be configured to support this type of workflow:
One Time Setup - Invoke Maven to do a one-time setup of a developer's machine, such as:
Create a custom version of Tomcat configured for this application
Create a local Postgres database on the developer's machine
Load sample data into the database
Run a JUnit test to configure other resources needed to run the application
Integration Tests - Invoke Maven to run integration tests, which should do the following:
Create an integration test db
Set up the db
Run command-line integration tests against the db
Run a test version of Tomcat with the application in it
Run command-line JUnit tests that test the RESTful services exposed by the application
Release Build - Invoke Maven to do a release build of the system:
Do all the steps for an integration test
Generate resources and configurations that are used on the production server
Deposit the end result in a git repo, commit, and push the changes to production
Test Build - Invoke Maven to do a test build of the system:
Do all the steps of a release build, but configure the test release package with the test server configuration
The main thing I am struggling with is that Maven has a single build lifecycle with a well-defined sequence of phases; I'm not sure if the workflow I want to build is a good fit for Maven.
Can Maven be configured for this type of workflow? If yes, what are the key features of Maven that allow for the different configurations of the four main ways I want to use it?
Update: What I mean by this workflow is that I want to be able to do something like
mvn setup
mvn integration
mvn prod-release
mvn test-release
I know the above examples look like Ant; I am a long-time Ant user and a total noob with Maven.
You could set up Maven to do all that...
You probably would use (shock horror) profiles to achieve some of this...
BUT you don't want to do that.
You are following Ant-style thinking... if you like that style of thinking, then use Ant or Gradle and be happy.
If you want to follow the Maven way, then you will solve the problem differently.
Coming from the Maven way, here are my thoughts:
Why do you need a one-time setup? I usually have a run profile that dynamically provisions the correct application server and starts it with the app deployed, tearing everything down afterwards when I hit ^C. Typically this involves starting up a database server or two... hence things I have developed like the cassandra-maven-plugin. That way, when I am working on a different project (which could be in 10 minutes' time), I don't have to worry about background database servers eating up all my laptop's RAM.
Integration tests are actually trivial when you have the above working... in fact I created the Maven Failsafe Plugin to make it easy to tie plugin executions to the appropriate phases for integration testing. The Maven convention is to have a profile called run-its for running integration tests, as sketched below.
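A minimal sketch of such a profile (the plugin coordinates and goals are the standard Failsafe ones; the provisioning plumbing is elided):

<profile>
    <id>run-its</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-failsafe-plugin</artifactId>
                <executions>
                    <execution>
                        <goals>
                            <!-- bound by default to the integration-test and verify phases -->
                            <goal>integration-test</goal>
                            <goal>verify</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>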
Release builds being different from test builds... ugh! You should be building environment-agnostic artifacts. Have them pick up their configuration from the environment they are deployed in. That removes the worry that something has changed between the "test" build and the "production" build. If you really need to bundle the config, then I would usually resort to a separate module that takes the agnostic artifact and rebundles it with the required configuration. That way it is easy to prove that you have a reproducible transformation and that nothing has changed in between what went to QA and what is going to Ops.
I always make the release builds include the integration testing.
So typically I have my projects such that
$ mvn -Prun
will fire up the application starting from zero. Hitting ^C will tear everything back down again, and mvn clean (or, in extreme situations where I have a more complex setup process and need some caching, mvn post-clean; think really clean) will remove anything that the run profile put into play.
To run the integration tests I typically do
$ mvn -Prun-its verify
To make a release I typically do
$ mvn release:prepare release:perform -B
That is (in my view) the ideal way of handling the above steps you need.
HTH.
BTW, I have not had to use PostgreSQL specifically (typically my integration tests and run profile can get away with a pure Java database such as Derby or HSQLDB, and because the artifacts are environment agnostic it is easy to have the integration-test/dev flyweight app server inject the correct JDBC URL), so you may hit some issues with regard to PostgreSQL.
