We use our custom plugin and define the script in this way (this is approximate pseudocode):
// Common part for every script (1)
environments {
    "env1" {
        server mySettings("host1", "port1", "etc")
    }
    "env2" {
        server mySettings("host2", "port2", "etc")
    }
    // ... other common scopes
}
and
def defaultSettings(def envHost, def envPort = "15555", ...) {
    return {
        // Specific settings for the current script (package names, versions, etc.)
    }
}
So in all my scripts (which are separate projects and are in separate git repositories) the common part (1) is repeated.
Is there any correct way to define the common part as a separate project (this cannot be part of the plugin; the common part also changes periodically)?
I want to refer to this part when creating a new project and describe only the project-specific settings.
It looks like Gradle multi-project builds, but the common part should live in another git repository/Nexus.
Important clarification: the common part can also live in Nexus and have a version (i.e. a POM descriptor).
It's quite common to have an "opinionated" plugin and a "base" plugin; Gradle itself uses this concept often.
One example: the java plugin automatically applies the java-base plugin. The java-base plugin contains all of the task types and logic but doesn't actually create or configure anything on its own. The java plugin adds the tasks and configures them (e.g. it adds the src/main/java and src/test/java conventions). So the java-base plugin is not opinionated, while the java plugin is.
So, you could do the same: have a base plugin and an opinionated plugin which (see the sketch below)
- applies the base plugin
- configures the environments specific to your use case
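A minimal sketch of that split, where MyBasePlugin, MyCompanyPlugin and the simplified EnvironmentsExtension are hypothetical stand-ins for your real DSL:
import org.gradle.api.Plugin
import org.gradle.api.Project

// Simplified stand-in for the real environments DSL
class EnvironmentsExtension {
    Map<String, Map<String, String>> environments = [:]
    void environment(String name, Map<String, String> settings) {
        environments[name] = settings
    }
}

// Base plugin: provides the mechanics but makes no decisions
class MyBasePlugin implements Plugin<Project> {
    void apply(Project project) {
        project.extensions.create('envConfig', EnvironmentsExtension)
    }
}

// Opinionated plugin: applies the base plugin and adds the company-wide environments
class MyCompanyPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.plugins.apply(MyBasePlugin)
        def env = project.extensions.getByType(EnvironmentsExtension)
        env.environment('env1', [host: 'host1', port: '15555'])
        env.environment('env2', [host: 'host2', port: '15555'])
    }
}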
Note also that you can move logic from build.gradle to a plugin if you put the logic within a project.with { ... } closure. E.g.:
class MyPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.with {
            subprojects { ... }
            configurations { ... }
            dependencies { ... }
            // the "task foo(type: Bar)" shorthand only works in build scripts;
            // inside a plugin class, use the method form with a string name
            task('foo', type: Bar) { ... }
        }
    }
}
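For completeness, a hedged usage sketch: if a class like this lives in buildSrc, a build script can apply it directly by type, no plugin id required:
// build.gradle
apply plugin: MyPlugin   // assumes MyPlugin is in the default package or imported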
There is another solution to your problem. The approach may be less clean than using an opinionated plugin, but it allows you to manage simple Gradle scripts independently from your projects:
The apply from: syntax for including Gradle scripts is not limited to file paths; it can also handle URLs. This way, you can simply manage your scripts in a standalone repository and provide the newest version via a web server.
To test this way of script distribution and access, you can even use the raw file view feature provided by various repository platforms like GitHub or Bitbucket:
apply from: 'https://raw.githubusercontent.com/<user>/<repo>/<branch>/<file>'
The biggest disadvantage of this approach is that you need access to the local or even global web server for each build. If you need to support company-external or offline builds, you should stick to @LanceJava's solution and use a custom plugin.
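For illustration, a sketch of what that could look like; the file name common-environments.gradle and the plugin id my-custom-plugin are hypothetical:
// common-environments.gradle -- the repeated block (1), maintained in its own repository
// (assumes the custom plugin that provides the environments/mySettings DSL is already applied)
environments {
    "env1" {
        server mySettings("host1", "port1", "etc")
    }
    "env2" {
        server mySettings("host2", "port2", "etc")
    }
}
An individual project then applies it and only keeps its project-specific settings:
// build.gradle of one project
apply plugin: 'my-custom-plugin'   // hypothetical id of your existing plugin
apply from: 'https://raw.githubusercontent.com/<user>/<repo>/<branch>/common-environments.gradle'
// only project-specific settings remain here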
Related
Can I create a Gradle plugin that adds a dependency based on an extension value?
I have a convention plugin that I use for libraries in various projects; it brings in various dependencies, takes care of boilerplate configuration, configures other plugins, etc. I want to add an extension to the plugin that can tell it whether or not to add a certain dependency; in this case that dependency is Spock, as not every library module needs it.
So far, my plugin looks like this:
interface BasePluginExtension {
    Property<Boolean> getUseSpock()
}

class BasePlugin implements Plugin<Project> {
    @Override
    void apply(Project project) {
        BasePluginExtension basePluginExtension = project.extensions.create('basePluginConfig', BasePluginExtension)
        // If a value was supplied, use it, otherwise assume we want Spock
        if (basePluginExtension?.useSpock?.get() ?: true) {
            // Printing for debugging purposes
            println "I'm using spock! ${basePluginExtension.useSpock.get()}"
            // Currently apply a plugin that applies Spock but could also just add a dependency
            project.plugins.apply("test-config")
        }
    }
}
Then in the build.gradle file that I want to pull my plugin into, I have
plugins {
    id 'base-plugin'
}

basePluginConfig {
    useSpock = true
}
I'm following the docs on configuring an extension but I am getting the following error:
Cannot query the value of extension 'basePluginConfig' property 'useSpock' because it has no value available.
I've also tried the method of making an abstract class for the extension but I want the ability to have multiple configurable parameters in the future.
Is adding a dependency after plugin extension values have been configured not allowed/out of order for how Gradle works? Or am I possibly missing something obvious?
I have a Gradle-based library that is imported as a dependency into consuming applications. In other words, an application that consumes my library will have a build.gradle file with a list of dependencies that includes both my library as well as any other dependencies they wish to import.
From within my library's build.gradle file, I need to write a Gradle task that can access the full set of dependencies declared by the consuming application. In theory, this should be pretty straightforward, but hours of searching have not yielded a working solution yet.
The closest I've come is to follow this example and define an additional task in the library's build.gradle file that runs after the library is built:
build {
    doLast {
        project.getConfigurations().getByName('runtime')
                .resolvedConfiguration
                .firstLevelModuleDependencies
                .each { println(it.name) }
    }
}
I keep getting an error message that the 'runtime' configuration (passed into getByName and referenced in the Gradle forum post I linked) cannot be found. I have tried other common Gradle configurations that I can think of, but I never get any dependencies back from this code.
So: what is the best way to access the full set of dependencies declared by a consuming application from within the build file of one of those dependencies?
Okay, I mostly figured it out. The code snippet is essentially correct, but the configuration I should have been accessing was 'compileClasspath' or 'runtimeClasspath', not 'runtime'. This page helped me understand the configuration I was looking for.
The final build task in the library looks roughly like this:
build {
    doLast {
        // ...
        def deps = project.getConfigurations().getByName('compileClasspath')
                .resolvedConfiguration
                .firstLevelModuleDependencies
                .each {
                    // it.name will give you the dependency in the standard Gradle format
                    // (e.g. "org.springframework.boot:spring-boot:1.5.22.RELEASE")
                }
    }
}
I’ve written a plugin (which currently just lives in buildSrc) that creates several tasks whose names are based on values provided by the user. How can I make it so that they execute whenever the build script that applies the plugin is run? It doesn't need to run at any specific point in the execution phase.
To start off with, you are working against a basic Gradle concept: a Gradle task is not designed to run on every Gradle invocation. If you really need code to run on each invocation, execute it directly during the configuration phase instead of wrapping it inside a task.
However, there are two causes for a task to run on a Gradle build:
- direct selection (via command line or settings.startParameter.taskNames modification)
- via one or more task dependencies (dependsOn / finalizedBy)
Of course you can use one of these methods to circumvent Gradle and execute your task on each build (@mkobit used the second method), but since your plugin would break basic Gradle principles, your solution may fail at some future time or for a more complex project (since plugins are supposed to be reusable).
In summary, I would recommend bundling all your generated tasks into one task with a constant name, so that your user can easily run it on each Gradle invocation by putting a single line in their settings.gradle file:
startParameter.taskNames.add '<bundleTask>'
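A minimal sketch of that bundling idea; the plugin class, the generateAll task name and the generatedTaskNames property are hypothetical:
import org.gradle.api.Plugin
import org.gradle.api.Project

class GeneratingPlugin implements Plugin<Project> {
    void apply(Project project) {
        // constant-name aggregate task that settings.gradle (or the user) can always reference
        def bundleTask = project.tasks.create('generateAll')

        project.afterEvaluate {
            // user-provided names; 'generatedTaskNames' is a hypothetical comma-separated project property
            def names = (project.findProperty('generatedTaskNames') ?: '').tokenize(',')
            names.each { String taskName ->
                def generated = project.tasks.create("generate${taskName.capitalize()}") {
                    doLast { println "Generating ${taskName}" }
                }
                bundleTask.dependsOn(generated)
            }
        }
    }
}
With that in place, the single line from above becomes startParameter.taskNames.add 'generateAll', or the user simply runs gradle generateAll.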
One way you could accomplish this is to use the all method on the TaskCollection to add a dependsOn/finalizedBy relationship to all (or some) tasks in the project.
Example to create a single myTask with every task in allprojects depending on it:
class MyPlugin implements Plugin<Project> {
    void apply(final Project project) {
        final myTask = project.tasks.create('myTask')
        project.allprojects.each { proj ->
            proj.tasks.all {
                // Make sure to not add a circular dependency
                if (it != myTask) {
                    it.dependsOn(myTask)
                }
            }
        }
    }
}
Our build uses a custom plugin extension in Gradle that has dynamic methods. This worked fine in Gradle 2.1, but methodMissing is no longer called in 2.2 and I get the following exception (here's the "caused by" part):
Caused by: org.gradle.api.internal.MissingMethodException: Could not find method common() for arguments [api] on org.gradle.api.internal.artifacts.dsl.DefaultComponentModuleMetadataHandler_Decorated@1bef1304.
at org.gradle.api.internal.AbstractDynamicObject.methodMissingException(AbstractDynamicObject.java:68)
at org.gradle.api.internal.AbstractDynamicObject.invokeMethod(AbstractDynamicObject.java:56)
at org.gradle.api.internal.CompositeDynamicObject.invokeMethod(CompositeDynamicObject.java:172)
at org.gradle.api.internal.artifacts.dsl.DefaultComponentModuleMetadataHandler_Decorated.invokeMethod(Unknown Source)
...
How do I get dynamic functions working in our build system with gradle 2.2?
The Background:
These dynamic methods are used for several things, but one is to simplify how projects depend on other projects (it is a very large system with over 80 subprojects that each may have multiple named APIs (public, internal, add-on, etc)).
This is my Plugin's apply:
void apply(Project project) {
    project.subprojects.each { subproject ->
        subproject.extensions.create("modules", ModuleExtension.class)
    }
}
ModuleExtension has no variables or functions other than methodMissing:
def methodMissing(String name, args) {
    // return project dependency based on name/args. This no longer gets called in 2.2!
}
Sample usage in a Gradle file:
dependencies {
    compile modules.nameOfModule("name of api")
}
I've also overridden the following in ModuleExtension just to see if they are getting called, but they are not:
def invokeMethod(String name, args)
def propertyMissing(String name)
def propertyMissing(String name, value)
I'm actually unable to reproduce this issue in Gradle 2.2. However, this is somewhat of a misuse of Gradle extensions. If you simply want a globally available object, I would add it as a project extra property instead. This has the added benefit of not having to be created for every subproject, since projects inherit properties from their parent.
ext {
    modules = new ModuleExtension()
}
Edit: This is due to the new support for module replacements introduced in Gradle 2.2. The symbol modules within a dependencies block now delegates to a ComponentModuleMetadataHandler rather than your extension. You'll either have to rename your extension to something other than modules, or qualify the call as project.modules.nameOfModule.
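For illustration, the qualified form inside a dependencies block would look roughly like this (nameOfModule mirrors the sample above; adjust to your real method names):
dependencies {
    // qualifying with project. makes the call reach your extension instead of the
    // ComponentModuleMetadataHandler that the bare 'modules' now resolves to in this block
    compile project.modules.nameOfModule("name of api")
}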
I'm setting up a multi-module Gradle build for a legacy system at work (replacing the current Ant build). However, I'm new to Gradle, and I'm not sure what the best way to do it is. And I want to do it right, because this build script will be around for a long time. I have found a way of doing things that works, but when I google around and read answers on Stack Overflow, I see people using a different approach, which, in my case, doesn't work. But maybe I'm doing something wrong. I've also been reading the Gradle in Action book, but haven't found this particular issue there.
I have a build.gradle file in the root of my project, with a bunch of subdirectories that each contain a sub-project. Most of these are regular Java projects, but there are some Wars and Ears in there, too, which require some special packaging love. Each sub-project has its own build.gradle file which, at this point, only contains a list of dependencies, nothing more.
The build.gradle file in the root of my project looks something like this (I left out the War stuff for brevity):
configure(javaProjects()) {
    jar.doFirst {
        manifest {
            ...
        }
    }
}

configure(earProjects()) {
    apply plugin: 'ear'
    ear.doFirst {
        manifest {
            ...
        }
    }
}

Set<String> javaProjects() {
    subprojects - earProjects()
}

Set<String> earProjects() {
    subprojects.findAll { it.name.endsWith(".ear") }
}
The only reason I'm doing things this way is that it was the first solution I tried that I could get to work in my situation. Now that the script is growing, though, it starts to feel a little clunky. Also, the doFirst thing seems a little awkward.
But when I look on StackOverflow, I see recommendations of using constructs like this:
allprojects {
    tasks.withType(Jar) {
        manifest {
            ...
        }
    }
    tasks.withType(Ear) {
        manifest {
            ...
        }
    }
}
This seems much nicer, but I don't seem to be able to rewrite my script in that way. I get errors like this one:
Cannot change configuration ':some.subproject:compile' after it has been resolved.
I don't know what to do about this error, and I can't seem to google it either, for some reason.
So, my question is: have I indeed been doing things the wrong way, or rather, in a way that is not idiomatic Gradle? And if so, what can I do about the error message? For the sake of maintainability, I'd like to do things as idiomatically as possible.
In general you should do things as described in your second snippet:
allprojects {
    tasks.withType(Jar) {
        manifest {
            ...
        }
    }
}
But there are some limitations where this isn't sufficient. The error message you get means that you modify the compile configuration AFTER the configuration has already been resolved. That happens, for example, when you do something like
configurations.compile.files.each...
during the configuration phase (e.g. in your manifest block as seen above) and in another place (e.g. in one of your subprojects' build.gradle files):
dependencies {
    compile "org.acme:somelib:1.2.3"
}
The other problem with this is that you resolve the compile dependencies every time you invoke your build script, even when no jar task is triggered.
The suggested workaround is to replace
tasks.withType(Jar) {
    manifest {
        ...
    }
}
with
tasks.withType(Jar) {
    doFirst {
        manifest {
            ...
        }
    }
}
That means that resolving the configuration is postponed to Gradle's execution phase and is only triggered when actually needed.
When you configure a project in a multi-project build, you can think of each snippet as part of the whole configuration. You're not configuring the project 'twice'; you configure different aspects of the project in different places.
This is a known limitation of the current Gradle configuration model.
You can still use
configure(earProjects()) {
}
That doesn't matter here; IMO it is just a matter of personal preference. The Gradle project itself uses 'configure'.
Personally, I prefer to apply plugins like ear or war in the project's own build.gradle file to mark a project as an ear/war project.
To share common configurations among all ear projects, you could have something like this in your root build.gradle file:
allprojects {
    plugins.withType(EarPlugin) {
        // only applied if it is an ear project
        // put ear-specific logic here
    }
}
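For example, a sketch (assuming you still want manifest entries per ear project, as in the question) of what that ear-specific logic could look like:
allprojects {
    plugins.withType(EarPlugin) {
        // only runs for projects that actually apply the 'ear' plugin
        tasks.withType(Ear) {
            doFirst {
                manifest {
                    // hypothetical shared attribute
                    attributes 'Implementation-Version': project.version
                }
            }
        }
    }
}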