Multiple Gradle bootJar Configurations - Different mainClass Attributes? - spring-boot

I have a Gradle/Spring Boot build using a .kts (Kotlin DSL) Gradle build file. (That last part might not matter.)
My Spring Boot application has two different classes that can be used to start it, one for "client" mode and one for "server" mode.
I have a single bootJar task that looks like:
tasks.bootJar {
    mainClassName = "com.me.ClientApplication"
}
So by default when I run ./gradlew build bootJar, I get a client version of the JAR that, when run with java -jar theJar.jar, executes the ClientApplication class.
However, I would also like to publish a second JAR that has a different mainClassName and runs com.me.ServerApplication instead.
How should I approach this? I'm fairly new to Gradle - should I be providing the mainClassName to the ./gradlew command to override it? Or can I define separate tasks like clientBootJar and serverBootJar that will produce separate artifacts?

Since your configurations are mutually exclusive (a single JAR can't have two main classes), I would say that having two tasks would be optimal:
tasks.bootJarClient {
    mainClassName = "com.me.ClientApplication"
}
tasks.bootJarServer {
    mainClassName = "com.me.ServerApplication"
}
Another approach is to use properties, but that mostly makes sense when you have much greater variance.
Now to make a custom task, in your build.gradle.kts add something along these lines (I didn't test it with Spring Boot specifically):
// import org.springframework.boot.gradle.tasks.bundling.BootJar at the top of the file
open class BootJarClient : BootJar() {
    init {
        mainClassName = "YourClientClassName"
    }
}
tasks.register<BootJarClient>("bootJarClient") {
    group = "Other"
}
// Server is basically the same
The property names are taken from the GitHub sources.
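For reference, here is a minimal sketch of the same idea without a custom class, registering a second BootJar task directly. This assumes Spring Boot 2.x (where BootJar still exposes mainClassName) and a Gradle version with archiveClassifier; the task name and classifier are illustrative:
import org.springframework.boot.gradle.tasks.bundling.BootJar

// A second boot jar that starts ServerApplication instead of ClientApplication.
val bootJarServer by tasks.registering(BootJar::class) {
    mainClassName = "com.me.ServerApplication"
    // A distinct classifier keeps its file name apart from the default bootJar output.
    archiveClassifier.set("server")
    // Reuse the main source set's runtime classpath so the jar is complete.
    classpath(sourceSets["main"].runtimeClasspath)
}
Running ./gradlew bootJarServer should then produce the server-flavoured jar alongside the default client one.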

Related

Create separate shadowJars for code and dependencies

Using the Gradle shadowJar plugin version 4.0.2 and Gradle 4.10, I wanted to create two separate shadowJars, one for my source code and another for my dependencies (since the dependencies are large and rarely change, I don't want to repackage them every time I change my source code). What I have in mind is a Gradle plugin that adds two separate tasks, takes the same configuration supplied by the user for shadowJar, and overrides the configurations/sources used to create each shadowJar.
Below is what I have so far; I'm still trying to figure out a clean way to pass the shadow configs only once and whether there are other gotchas I need to worry about (e.g. whether having two mergeServiceFiles() calls will break things).
import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar
task dependencyShadowJar(type: ShadowJar) {
mergeServiceFiles()
zip64 = true
relocate 'com.google.common', 'com.shadow.com.google.common'
classifier = 'dependencies'
configurations = [project.configurations.runtime]
}
task userCodeShadowJar(type: ShadowJar) {
mergeServiceFiles()
zip64 = true
relocate 'com.google.common', 'com.shadow.com.google.common'
classifier = 'mycode'
from sourceSets.main.output
}
task splitShadowJar {
doLast {
println "Building separate src and dependency shadowJars"
}
}
splitShadowJar.dependsOn dependencyShadowJar
dependencyShadowJar.dependsOn userCodeShadowJar
Ideally I would like to have the shadowJar settings specified once and have both tasks copy the same settings; does that require creating a custom plugin or task in Groovy?
Can I copy the settings from the existing shadowJar that the user specifies and just override the from or configurations part for my purpose?
Has anybody attempted something similar?
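One possible direction for the "specify the settings once" part (a sketch only, not tested against the Shadow plugin; the closure name is made up): keep the shared settings in a closure and apply it to each task with configure:
import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar

// Shared Shadow settings, written once and applied to both tasks.
def commonShadowSettings = {
    mergeServiceFiles()
    zip64 = true
    relocate 'com.google.common', 'com.shadow.com.google.common'
}

task dependencyShadowJar(type: ShadowJar) {
    configure commonShadowSettings
    classifier = 'dependencies'
    configurations = [project.configurations.runtime]
}

task userCodeShadowJar(type: ShadowJar) {
    configure commonShadowSettings
    classifier = 'mycode'
    from sourceSets.main.output
}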

A code generator task in a multi-project gradle build

I have studied a thousand similar questions on SO and I am still lost. I have a simple multi-project build, with this settings.gradle:
rootProject.name = 'mwe'
include ":Generator"
include ":projectB"
include ":projectC"
with a top-level build.gradle as follows:
plugins { id "java" }
allprojects { repositories { jcenter() } }
and with two kinds of project build.gradle files. The first one (Generator) exposes a run task that runs the generator with a command-line argument:
plugins {
id "application"
id "scala"
}
dependencies { compile "org.scala-lang:scala-library:2.12.3" }
mainClassName = "Main"
ext { cmdlineargs = "" }
run { args cmdlineargs }
The code generator is to be called from projectB (and an analogous projectC, and many others). I am trying to do this as follows (projectB/build.gradle):
task TEST {
project (":Generator").ext.cmdlineargs = "Hurray!"
println ("Value set:" + project(":Generator").ext.cmdlineargs )
dependsOn (":Generator:run")
}
Whatever I try (I'm a Gradle newbie here), I am not getting what I need. I have two problems:
The property cmdlineargs is not set at the point that task :projectB:TEST is run. The println sees the right value, but the argument passed to the executed main method is the one configured in Generator/build.gradle, not the one in projectB/build.gradle. As pointed out in responses, this can be worked around using lazy property evaluation, but it does not solve the second problem.
The generator is only run once, even if I build both projectB and projectC. I need to run Generator:run for each of projectB and projectC separately (to generate different sources for each dependent project).
How can I get this to work? I suppose a completely different strategy is needed. I don't have to use command line and run; I can also try to run the main class of the generator more directly and pass arguments to it, but I do find the run task quite convenient (the complex classpath is set up automatically, etc.). The generator is a Java/Scala project itself that is compiled within the same multi-project build.
Note: tasks aren't like methods in Java. A task will execute either 0 or 1 times per Gradle invocation; it will never execute twice (or more) in a single invocation.
I think you want two or more tasks. Eg:
task run1(type:xxx) {
args 'foo'
}
task run2(type:xxx) {
args 'bar'
}
Then you can depend on run1 or run2 in your other projects.
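Concretely, with the Generator project from the question, that could look roughly like this (a sketch: the task names and arguments are made up, and JavaExec is used so each task still gets the generator's classpath and main class, much like run does; newer Gradle versions use mainClass instead of main):
// Generator/build.gradle
task runForB(type: JavaExec) {
    classpath = sourceSets.main.runtimeClasspath
    main = 'Main'
    args 'argumentsForProjectB'
}
task runForC(type: JavaExec) {
    classpath = sourceSets.main.runtimeClasspath
    main = 'Main'
    args 'argumentsForProjectC'
}

// projectB/build.gradle
task TEST {
    dependsOn ':Generator:runForB'
}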

How to use Gretty integrationTestTask with a war file?

Is it possible to use gretty integrationTestTask with a project that uses a war folder?
It seems from the documentation that appBeforeIntegrationTest does not have access to the war. Is there another way to run test cases so that they use the war folder?
Ideally, I want jettyStart -> test -> jettyStop to run, although when I run that directly, jettyStart hangs indefinitely until jettyStop is run. Is there a way to run jettyStart in Gradle in the background or something?
Regardless of what file structure your application has, integrationTestTask is supposed to be configured with the name of an existing Gradle task to execute when gradle integrationTest is run:
gretty {
// ...
integrationTestTask = 'integrationTest' // name of existing gradle task
// ...
}
What you want to achieve is this:
gretty {
integrationTestTask = 'test'
}
Gretty's workflow when calling integrationTest is, roughly: it starts the app before the tests run (appBeforeIntegrationTest), runs the task you named in integrationTestTask, and then stops the app again (appAfterIntegrationTest), so you do not have to orchestrate jettyStart and jettyStop yourself.

Gradle `tasks.withType()` strange behavior for tasks added afterwards

I am using Gradle 2.14.1 and the https://github.com/unbroken-dome/gradle-testsets-plugin to add an integration test task. I would like to configure the location of the HTML reports.
The default test task uses:
<project>/build/reports/tests
The testsets plugin configures:
<project>/build/intTest
for the intTest task, and I want:
<project>/build/reports/test
<project>/build/reports/intTest
Here is my configuration:
buildscript {
repositories {
jcenter()
}
dependencies {
classpath 'org.unbroken-dome.gradle-plugins:gradle-testsets-plugin:1.2.0'
}
}
apply plugin: 'java'
apply plugin: 'org.unbroken-dome.test-sets'
defaultTasks = ['clean', 'build']
tasks.withType(Test) {
reports.html.destination = new File(project.reportsDir, name)
println(it.name + '\t' + reports.html.destination)
}
task wrapper(type: Wrapper) {
description = 'Defines the common gradle distribution for this project.'
gradleVersion = '2.14.1'
}
testSets {
intTest
}
intTest.dependsOn test
check.dependsOn intTest
repositories {
jcenter()
}
dependencies {
testCompile 'junit:junit:4.12'
intTestCompile 'junit:junit:4.12'
}
println('===== final config =====')
println(test.name + '\t' + test.reports.html.destination)
println(intTest.name + '\t' + intTest.reports.html.destination)
(Please ignore the println statements for the time being.)
After a full build, the intTest task's reports are in the wrong place (the default), and the configuration for the standard test task is applied:
$ ls build/
classes dependency-cache intTest intTest-results libs reports test-results tmp
$ ls build/reports/
test
I added some output to see what is going on, and it seems strange (the project's root is 'blob'):
test /home/wujek/blob/build/reports/test
intTest /home/wujek/blob/build/reports/intTest
===== final config =====
test /home/wujek/blob/build/reports/test
intTest /home/wujek/blob/build/intTest
So, in the tasks.withType() block the location is reported to be correct, but in the end, it is not.
Please note that moving the tasks.withType() block after the testSets block works for this project, but my real setup is more complex: I have multiple modules, and the root build.gradle uses the subprojects block with the tasks.withType() block to configure the report locations for all modules, and then one of the submodules adds a new test set whose test task's HTML report ends up in the wrong location. To fix this, I have to repeat the configuration in the submodules that add test sets.
What is going on here? Why does the tasks.withType() block say the config works, but in reality it doesn't?
This is due to the peculiarities of configuration ordering in Gradle. Let's walk through your code as Gradle would process it to see what happens (skipping over lines that aren't relevant):
apply plugin: 'org.unbroken-dome.test-sets'
This executes the apply method of the test-sets plugin, which includes creating a class that listens for test sets to be added. It adds a whenObjectAdded action to the testSets container. You haven't added any test sets yet, so let's move back to your build.gradle.
tasks.withType(Test) {
reports.html.destination = new File(project.reportsDir, name)
println(it.name + '\t' + reports.html.destination)
}
You've now added an action that will apply to all existing Test tasks, and to new ones as they are created. Now where it all unwinds:
testSets {
intTest
}
This creates a testSet called intTest. The whenObjectAdded action in the test-sets plugin now fires:
The intTest test set's Test task is created.
Your withType action fires, because there's now a new Test task. This sets the report to go where you want.
The whenObjectAdded action now continues, getting to this line, which also sets the HTML report location, overriding what you just set.
When you change this to declare the testSet first it goes:
whenObjectAdded - Create the Test task
whenObjectAdded - Set the test task's HTML report location
withType is registered by you
withType fires setting the HTML report location to your desired destination
There aren't hard and fast rules to avoid this, since plugins can and do take wildly different approaches to how/when they register their configuration actions. Generally, I try to break my build.gradle down in this order:
Buildscript block (if needed)
apply plugins
set project level properties (group, version, sourceCompatibility, etc)
Configure extensions (sourceSets, testSets, configurations, dependencies)
Configure tasks (either direct or withType)
Usually this lets plugins that fire configuration register their default values before my own configuration comes in to change things.
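Applied to the build in the question, that ordering means declaring the test set before the task configuration, roughly like this (the same configuration as above, just reordered):
testSets {
    intTest
}

// Registered after the plugin has configured the new Test task,
// so the report destination is no longer overridden afterwards.
tasks.withType(Test) {
    reports.html.destination = new File(project.reportsDir, name)
}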
FYI, Gradle's new, and still incubating, model space is intended to help solve this configuration ordering problem. It won't be perfect, but allows the order to be more explicit.

Gradle - can I include task's output in project dependencies

I have a task that generates java sources and a set of jars from these sources (say, project a). I would like to export these jars to dependent projects (say, project b). So here's roughly what I have right now:
//a.gradle
configurations{
generatedJars
}
task generateJars(type: JavaExec) {
//generate jars ...
outputs.files += //append generated jars here
}
dependencies{
generatedJars generateJars.outputs.files
}
//b.gradle
dependencies{
project(path: ':a', configuration: 'generatedJars')
}
It works OK, except that adding generateJars.outputs.files as a dependency does not tell Gradle that it has to run the generateJars task when no jars have been generated yet. I have tried adding the task itself as a dependency, hoping that it would work the same way as when you add a jar/zip task to an artifact configuration (e.g. artifacts{ myJarTask }), but it throws an error telling me that I cannot do that. Of course I can inject the generateJars task somewhere in the build process before :b starts evaluating, but that's clumsy and brittle, so I would like to avoid it.
I feel like I should be adding the generated jars to artifacts{ ... } of the project, but I am not sure how to make them then visible to dependent projects. Is there a better way of achieving this?
Dependent projects (project b) will need to set up their IntelliJ IDEA module classpath to point to project a's generated jars. Something rather like this (pseudo-code):
//b.gradle
idea{
module{
scopes.COMPILE.plus += project(path: ':a', configuration: 'generatedJars').files
}
}
So far I have tried simply adding a project dependency on :a's generatedJars in :b, but the Idea plugin simply adds module :a as a module dependency and assumes that it exports its generated jars (which is probably a correct assumption), therefore not adding the generated jars to :b's classpath.
Any help would be greatly appreciated!
First, do you need a separate configuration? That is, do you have clients of a that should not see the generated Jars? If not, you can add the generated Jars to the archives configuration, which will simplify things.
Second, the correct way to add the generated Jars to the configuration is (instead of the dependencies block):
artifacts {
generatedJars generateJars
}
This should make sure that the generateJars task gets run automatically when needed.
Third, I'd omit the += after outputs.files, although it might not make a difference. You should also add the necessary inputs.
Fourth, why do you need a JavaExec task to generate the Jars? Can you instead add the generated sources to some source set and let Gradle build them?
Fifth, IDEA doesn't have a concept corresponding to Gradle's project configuration dependencies. Either an IDEA module fully depends on another module, or not at all. You have two options: either use a module dependency and make the generated sources a source folder of the depended-on module (preferably both in the Gradle and the IDEA build), or pass the generated Jars as external dependencies to IDEA. In either case, you should probably add a task dependency from ideaModule to the appropriate generation task. If this still doesn't lead to a satisfactory IDEA setup, you could think about moving the generation of the Jars into a separate subproject.
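For completeness, the consuming side would then attach that configuration to one of b's own configurations, roughly like this (a sketch; the old compile configuration is assumed, matching the Gradle era of the question):
// b.gradle
dependencies {
    compile project(path: ':a', configuration: 'generatedJars')
}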
For my use case, I had a C++ project which generated some native libraries that my Java project needed to load in order to run.
In the project ':native' build.gradle:
task compile(type: Exec, group: 'build') {
dependsOn ...
outputs.files(fileTree('/some/build/directory') {
include 'mylib/libmy.so'
})
...
}
In project java application build.gradle:
configurations {
nativeDep
}
// Add dependency on the task that produces the library
dependencies {
nativeDep files(project(':native').tasks.findByPath('compile'))
}
// Unfortunately, we also have to do this because Gradle will only
// run the ':native:compile' task if another task needs its outputs
// as inputs
tasks.withType(JavaCompile) {
dependsOn ':native:compile'
}
run {
doFirst {
// Use the configuration to add our library to java.library.path
def libDirs = files(configurations.nativeDep.files.collect {it.parentFile})
systemProperty "java.library.path", libDirs.asPath
}
}
