I’ve written a plugin (which currently just lives in buildSrc) that creates several tasks whose names are based on values provided by the user. How can I make it so that they execute whenever the build script that applies the plugin is run? It doesn't need to run at any specific point in the execution phase.
To start off with, you are working against a basic Gradle concept. A Gradle task is not designed to run on every Gradle invocation. If you really need code to run on each invocation, execute it directly during the configuration phase instead of wrapping it inside a task.
However, there are only two ways a task ends up running in a Gradle build:
direct selection (via command line or settings.startParameter.taskNames modification)
via one or more task dependencies (dependsOn / finalizedBy)
Of course you can use one of these mechanisms to circumvent Gradle and execute your task on each build (@mkobit used the second one), but since your plugin would then break basic Gradle principles, the solution may fail at some future time or in a more complex project (plugins are supposed to be reusable).
In summary, I would recommend bundling all your generated tasks into one task with a constant name, so that your users can easily run it on each Gradle invocation by putting a single line in their settings.gradle file:
startParameter.taskNames.add '<bundleTask>'
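For completeness, here is a minimal sketch of what that bundling could look like inside the plugin. The names (MyGeneratingPlugin, runGeneratedTasks, the 'alpha'/'beta' values) are illustrative only, not taken from the original plugin:

import org.gradle.api.Plugin
import org.gradle.api.Project

class MyGeneratingPlugin implements Plugin<Project> {
    void apply(Project project) {
        // Constant, documented entry point that users can add to startParameter.taskNames
        def bundle = project.tasks.create('runGeneratedTasks')
        ['alpha', 'beta'].each { name ->                  // stands in for the user-provided values
            def generated = project.tasks.create("generated${name.capitalize()}")
            bundle.dependsOn(generated)
        }
    }
}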
One way you could accomplish this is to use the all method on the TaskCollection to add a dependsOn/finalizedBy relationship to all (or some) tasks in the project.
Example that creates a single myTask with every task in all projects depending on it:
class MyPlugin implements Plugin<Project> {
    void apply(final Project project) {
        final myTask = project.tasks.create('myTask')
        project.allprojects.each { proj ->
            proj.tasks.all {
                // Make sure not to add a circular dependency
                if (it != myTask) {
                    it.dependsOn(myTask)
                }
            }
        }
    }
}
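If the plugin class lives in buildSrc, it can be applied by class directly in the consuming build script; a minimal sketch (MyPlugin refers to the class above):

// build.gradle of the root project
apply plugin: MyPlugin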
I have a Gradle-based library that is imported as a dependency into consuming applications. In other words, an application that consumes my library will have a build.gradle file with a list of dependencies that includes both my library as well as any other dependencies they wish to import.
From within my library's build.gradle file, I need to write a Gradle task that can access the full set of dependencies declared by the consuming application. In theory, this should be pretty straightforward, but hours of searching have not yielded a working solution yet.
The closest I've come is to follow this example and define an additional task in the library's build.gradle file that runs after the library is built:
build {
    doLast {
        project.getConfigurations().getByName('runtime')
            .resolvedConfiguration
            .firstLevelModuleDependencies
            .each { println(it.name) }
    }
}
I keep getting an error message that the 'runtime' configuration (passed to getByName and referenced in the Gradle forum post I linked) cannot be found. I have tried the other common Gradle configurations I can think of, but I never get any dependencies back from this code.
So: what is the best way to access the full set of dependencies declared by a consuming application from within the build file of one of those dependencies?
Okay, I mostly figured it out. The code snippet is essentially correct, but the configuration I should have been accessing was 'compileClasspath' or 'runtimeClasspath', not 'runtime'. This page helped me understand the configuration I was looking for.
The final build task in the library looks roughly like this:
build {
    doLast {
        // ...
        def deps = project.getConfigurations().getByName('compileClasspath')
            .resolvedConfiguration
            .firstLevelModuleDependencies
            .each {
                // it.name gives you the dependency in the standard Gradle format
                // (e.g. "org.springframework.boot:spring-boot:1.5.22.RELEASE")
            }
    }
}
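If you prefer not to hang this logic off the build task, a dedicated task works the same way. The following is a sketch with an illustrative task name (printResolvedDeps), assuming a Java project so that the runtimeClasspath configuration exists:

task printResolvedDeps {
    doLast {
        // runtimeClasspath holds the dependencies resolved for runtime use
        configurations.getByName('runtimeClasspath')
            .resolvedConfiguration
            .firstLevelModuleDependencies
            .each { dep ->
                println "${dep.moduleGroup}:${dep.moduleName}:${dep.moduleVersion}"
            }
    }
}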
As can be seen at the top of the documentation of the class org.gradle.api.tasks.testing.Test, test tasks can be configured using the following piece of code:
test {
    // configuration here. For example:
    useJUnitPlatform()
}
From the usage of the method useJUnitPlatform we can assume that the method test is called with a Closure whose delegate is an instance of the aforementioned class Test.
In Gradle, there are other similar methods which take a Closure, for example afterEvaluate. The documentation of afterEvaluate is readily available in the documentation of the class Project. It is also mentioned in the user guide:
This example uses method Project.afterEvaluate() to add a closure which is executed after the project is evaluated.
Where is the documentation of the method test? I could not find it. Or maybe this isn't a method of a class at all, but something inserted into the Project class at runtime via reflection or some similar DSL magic?
test in this context is not a method per se, but rather a task named test. Figuring out what exactly is going on here requires diving into the internals of Gradle itself, which is not part of any public documentation because, well, it's not part of the public API.
The only way to figure out exactly what is going on is to debug Gradle during its execution. The easiest way to do that is to generate a plugin project via gradle init. Write a simple Gradle build file such as the following (build.gradle; I am assuming you are using the Groovy DSL):
plugins {
    id("java")
}

test {
    useJUnitPlatform()
}
Then write a basic functional test and start debugging. I was curious myself what is going on and did just that.
In the following screenshot, you can see the stack trace in the bottom left corner. As you can see, a lot of methods are called.
There is a mixture of Groovy-specific methods and Gradle-specific methods. Digging further in, you will come to:
In the bottom right, you can see that the list of objects is:
Project (root project)
Extra properties
Extensions
Tasks
This aligns with what I mentioned earlier: Gradle will go out of its way to match what is being asked for. This is also explained in "A Groovy Build Script Primer" in the official documentation (starting from "If the name is unqualified [...]").
Eventually, you will land in some of the public API methods:
getByName is part of NamedDomainObjectContainer, which is documented. However, what actually implements that interface is not documented, as you can see from the Javadoc here. The implementation, found via debugging, is DefaultTaskContainer.
The rest I will leave to you as an exercise. Hopefully this gives you an idea as to what is going on behind the scenes.
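As a rough illustration of where that dynamic lookup ends up, the test { ... } block behaves like an explicit lookup on the task container. This is a sketch of the equivalent call, not the literal code path Gradle executes:

// Roughly what 'test { useJUnitPlatform() }' resolves to once the dynamic
// lookup reaches the task container:
tasks.getByName('test') {
    useJUnitPlatform()
}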
Indeed, test { ... } in this case is not calling a method named test. This block is a feature of the Gradle API called a "Groovy dynamic task configuration block". Per the Gradle documentation (version 6.1):
// Configure task using Groovy dynamic task configuration block
myCopy {
    from 'resources'
    into 'target'
}
myCopy.include('**/*.txt', '**/*.xml', '**/*.properties')
This works for any task. Task access is just a shortcut for the tasks.named() (Kotlin) or tasks.getByName() (Groovy) method. It is important to note that blocks used here are for configuring the task and are not evaluated when the task executes.
As such, per this shortcut convention, the test { ... } block is used to configure a task registered in the project, in this case the task named test.
Although nowadays I'd recommend using Gradle's Configuration Avoidance API to configure a task lazily instead of eagerly:
project.tasks.named('test', Test).configure {
    it.useJUnitPlatform()
}
See getByName replacement in the table "Existing vs New API overview".
We use our custom plugin and define the script this way (this is approximate pseudocode):
// This is the common part of every script (1)
environments {
    "env1" {
        server mySettings("host1", "port1", "etc")
    }
    "env2" {
        server mySettings("host2", "port2", "etc")
    }
    // ... other common scopes
}
and
def defaultSettings(def envHost, def envPort = "15555", ...) {
    return {
        // Specific settings for the current script (package names, versions, etc.)
    }
}
So in all my scripts (which are separate projects and are in separate git repositories) the common part (1) is repeated.
Is there a correct way to define the common part as a separate project? (It cannot be part of the plugin, since the common part also changes periodically.)
I want to refer to this part when creating a new project and describe only the project-specific settings.
It looks like Gradle multi-project builds, but the common part should live in another git repository/Nexus.
Important clarification: the common part could also live in Nexus and have a version (i.e. a POM descriptor).
It's quite common to have an "opinionated" plugin and a "base" plugin. Gradle uses this concept quite often.
One example is the java plugin, which automatically applies the java-base plugin. The java-base plugin contains all of the logic but doesn't actually create or configure anything on its own. The java plugin adds the tasks and configures them (e.g. it adds the src/main/java and src/test/java conventions). So the java-base plugin is not opinionated, while the java plugin is.
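As a small illustration of that split (a sketch; the generated source set name is just an example), applying only java-base gives you the source set machinery without the main/test conventions:

plugins {
    id 'java-base'
}

// No src/main/java or src/test/java conventions are applied; source sets
// have to be declared explicitly.
sourceSets {
    generated {
        java.srcDir 'src/generated/java'
    }
}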
So, you could do the same: have a base plugin and an opinionated plugin which (a minimal sketch follows the list)
Applies the base plugin
Configures the environments specific for your use case
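A minimal sketch of that structure, with illustrative plugin names (MyBasePlugin, MyOpinionatedPlugin) that are not from your build:

import org.gradle.api.Plugin
import org.gradle.api.Project

// Contains the machinery (task types, extensions) but applies no conventions.
class MyBasePlugin implements Plugin<Project> {
    void apply(Project project) {
        // register extensions/task types here, without configuring them
    }
}

// Applies the base plugin and adds the shared, opinionated configuration.
class MyOpinionatedPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.pluginManager.apply(MyBasePlugin)
        // configure the common environments here
    }
}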
Note also that you can move logic from build.gradle to a plugin if you put the logic within a project.with { ... } closure. Eg:
class MyPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.with {
            subprojects { ... }
            configurations { ... }
            dependencies { ... }
            task foo(type: Bar) { ... }
        }
    }
}
There is another solution to your problem. The approach may be less clean than using an opinionated plugin, but it allows you to manage simple Gradle scripts independently from your projects:
The apply from: mechanism for including Gradle scripts is not limited to file paths; it can also handle URLs. This way, you can simply manage your scripts in a standalone repository and serve the newest version via a web server.
To test this way of script distribution and access, you can even use the raw file view feature provided by various repository platforms like GitHub or Bitbucket:
apply from: 'https://raw.githubusercontent.com/<user>/<repo>/<branch>/<file>'
The biggest disadvantage of this approach is that you need access to the local or even global web server for each build. If you need to support company-external or offline builds, you should stick to @LanceJava's solution and use a custom plugin.
I am using the native-artifacts plugin which defines a number of tasks that extract the Nar dependencies of dependencies that are built elsewhere (and stored in Nexus/Maven). I need to ensure these tasks are called before the build of a binary, otherwise the headers that these Nars include are not found.
My question is, how do I define a system/plugin-defined task as a dependency of one of my tasks?
I'd like something like:
binaries.all { binary ->
    dependencies {
        // this next line names the plugin-defined task I want to have called
        // before the build takes place
        compile "extractNarDeps${binary.name.capitalize()}"
    }
}
Sadly, this doesn't build. How can I achieve this please? I have a component called unitTests that is a C++ component and is used to create unitTestsExecutable. I want extractNarDepsxxx called before compileUnitTests is called.
First of all, do you have any list named binaries? If so, you can try something like the following in your build.gradle file, as dependencies will be executed before your defined task:
dependencies {
    binaries.each { binary ->
        // this next line names the plugin-defined task I want to have called
        // before the build takes place
        compile "extractNarDeps${binary.name.capitalize()}"
    }
}

// Your defined task here
The way I did it was to specify my tasks as dependencies of the plugin's relevant task. The missing piece of syntax was using tasks as a means of referencing the tasks in the project.
The answer isn't mine; it was very well explained here: tasks._applied_plugin_task_name_here.dependsOn(myTask)
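Concretely, for the setup described in the question this would look roughly like the following. The task names are hypothetical (derived from the question's compileUnitTests and extractNarDeps<binaryName> pattern); substitute the real names reported by gradle tasks for your build:

// Make the plugin-defined compile task depend on the plugin-defined
// NAR-extraction task, so the headers are extracted first.
tasks.compileUnitTests.dependsOn(tasks.extractNarDepsUnitTestsExecutable)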
We’re using liquibase, gradle and the com.augusttechgroup:gradle-liquibase-plugin. We’re currently facing the bug
https://liquibase.jira.com/browse/CORE-1803
which kills our continuous integration server. Until this problem is resolved, we would like to use a workaround so that we can run the "dropAll" task in Gradle twice.
gradle dropAll dropAll
doesn’t work and
gradle dropAll && gradle dropAll
is not an option because our continuous integration servers can't manage this.
Is there a way to make a task run twice, or do I have to work around the Gradle plugin and write my own dropAll task as @judoole proposed here: Liquibase 3.0.1 Gradle integration?
The simplest solution I found was just to declare two databases.
liquibase {
    activities {
        main {
            url 'jdbc:postgresql://localhost:5432/db'
            username 'user'
            password 'passwd'
            ...
        }
        secondRun {
            url 'jdbc:postgresql://localhost:5432/db'
            username 'user'
            password 'passwd'
            ...
        }
    }
    runList = 'main'
}

dropAll.doFirst { liquibase.runList = 'main,secondRun' }
dropAll.doLast { liquibase.runList = 'main' }
By default, the runList contains only the main activity. But for dropAll, both activities are set active, so dropAll will run on both ... which are actually the same database.
I would consider using JavaExec, like the linked StackOverflow question does, especially if you're only using XML and not Groovy to write your changelog. The beauty of the liquibase-gradle-plugin is exactly these strange cases you describe. From the GitHub repo:
Let's suppose that for each deployment, you need to update the data model for your application's database, and you also need to run some SQL statements in a separate database used for security.
Now, I haven't used the plugin, because I didn't need those features. But I did run into bugs in Liquibase and needed to use a specific version, namely 3.0.4, as generateChangelog has had a strange bug since 3.0.5. Using the JavaExec approach I was able to track down a version that suited my needs.
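For reference, a hedged sketch of that JavaExec approach. The configuration name, task name, and placeholder values are illustrative; liquibase.integration.commandline.Main is the Liquibase 3.x CLI entry point:

configurations {
    liquibaseRuntime
}

dependencies {
    liquibaseRuntime 'org.liquibase:liquibase-core:3.0.4'
    liquibaseRuntime '<your-jdbc-driver>'   // the JDBC driver must be on the classpath too
}

task dropAllViaLiquibase(type: JavaExec) {
    classpath = configurations.liquibaseRuntime
    main = 'liquibase.integration.commandline.Main'
    args '--url=<jdbc-url>',
         '--username=<user>',
         '--password=<password>',
         'dropAll'
}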
Now, the liquibase-gradle-plugin doesn't need to be your only weapon of choice. Gradle has plenty of room for writing your own little tasks, including ones that run some SQL. Maybe try something along these lines and see if that works:
import groovy.sql.Sql

configurations {
    driver
}

dependencies {
    driver '<your-sql-driver>'
}

// "Bug" in Gradle: Groovy classes are loaded first, and they need to know
// about the SQL driver (or at least I think that is still the case), so the
// driver jar is added to the Groovy classloader here.
URLClassLoader loader = GroovyObject.class.classLoader
configurations.driver.each { File file ->
    loader.addURL(file.toURI().toURL())
}

task deleteFromTables(description: 'Deletes everything.') {
    doLast {
        def props = [user: "<username>", password: "<password>", allowMultiQueries: 'true'] as Properties
        def sql = Sql.newInstance("<url>", props, "<driver-classname>")
        try {
            // Here you can do your magic: delete something, or simply drop the database.
            // After dropping it, you'd probably want another task for creating
            // it back up again.
            sql.execute("DELETE ...")
        } finally {
            sql.close()
        }
    }
}
Gradle makes sure that each task gets executed at most once per Gradle invocation. If you need to do some work twice, you need to declare two tasks. Chances are that the plugin provides a suitable task type for you to declare a second dropAll task.
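For example, if the plugin exposes its dropAll logic as a reusable task type, a second task could be declared and ordered after the first. The type name LiquibaseTask below is an assumption, not verified against the plugin; check its documentation for the real one:

// Assumed task type name; replace with the type the plugin actually provides.
task dropAllAgain(type: LiquibaseTask) {
    // configure it the same way as the built-in dropAll task
}
dropAllAgain.mustRunAfter dropAll

// Then run both on the CI server:
//   gradle dropAll dropAllAgain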