How to tell tasks and other code blocks apart in Gradle?

I have a build.gradle file cobbled together from examples online:
apply plugin: "java"
sourceSets {
    java {
        srcDirs = ['src']
    }
}
repositories {
    flatDir {
        name "fileRepo"
        dirs "repo"
    }
}
uploadArchives {
    repositories {
        add project.repositories.fileRepo
    }
}
When I run gradle tasks --all, I can see that "uploadArchives" is a task. How can I tell what is a task by looking at the build.gradle file? If "repositories" and "sourceSets" aren't considered tasks, what are they?

You simply can't.
But knowing whether a closure configures a task or something else won't get you far on its own. To understand a build script, you need to understand the basic concepts of Gradle and the plugins it uses, whether built-in or third-party.
Each build.gradle script is executed against a Project instance. Everything you can access from the build script belongs to one of the following scopes:
The Project object itself. This scope includes any property getters and setters declared by the Project implementation class. For example, getRootProject() is accessible as the rootProject property. The properties of this scope are readable or writable depending on the presence of the corresponding getter or setter method.
The extra properties of the project. Each project maintains a map of extra properties, which can contain any arbitrary name -> value pair. Once defined, the properties of this scope are readable and writable. See extra properties for more details.
The extensions added to the project by the plugins. Each extension is available as a read-only property with the same name as the extension.
The convention properties added to the project by the plugins. A plugin can add properties and methods to a project through the project's Convention object. The properties of this scope may be readable or writable, depending on the convention objects.
The tasks of the project. A task is accessible by using its name as a property name. The properties of this scope are read-only. For example, a task called compile is accessible as the compile property.
The extra properties and convention properties are inherited from the project's parent, recursively up to the root project. The properties of this scope are read-only.
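To make these scopes concrete, here is a small Groovy snippet (assuming the java plugin from the question is applied; deployTarget is just an illustrative property name):
// Property of the Project object itself (getRootProject() is available as rootProject)
println rootProject.name
// Extra property: an arbitrary name -> value pair, readable and writable once defined
ext.deployTarget = 'staging'
println deployTarget
// Extension added by a plugin: sourceSets comes from the java plugin
sourceSets.each { set -> println set.name }
// Task accessed by its name as a read-only property
println compileJava.description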
For your specific example, uploadArchives is a task, repositories belongs to the original Project object (it is available in each build script) and sourceSets is an extension of the java plugin.
Note that many plugins do not require or even expect direct task configuration. They provide a DSL extension for configuration and then generate and configure their tasks based on it.
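For illustration, a minimal sketch of that pattern (the plugin, extension, and task names are hypothetical, not taken from the question): users configure a DSL extension, and the plugin derives tasks from it after the build script has been evaluated.
import org.gradle.api.Plugin
import org.gradle.api.Project

class GreetingExtension {
    List<String> names = []
}

class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        def greeting = project.extensions.create('greeting', GreetingExtension)
        // Wait until the build script has been evaluated, then register one task per configured name
        project.afterEvaluate {
            greeting.names.each { n ->
                project.tasks.register("greet${n.capitalize()}") {
                    doLast { println "Hello, $n" }
                }
            }
        }
    }
}
// Consumers configure the extension rather than the generated tasks:
// greeting { names = ['alice', 'bob'] }   // yields greetAlice and greetBob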

Related

Can I create a Gradle plugin that adds a dependency or another plugin based on a Gradle extension value?

I have a convention plugin that I use for library modules across various projects; it brings in various dependencies, takes care of boilerplate configuration, configures other plugins, and so on. I want to add an extension to the plugin that can tell it whether or not to add a certain dependency, in this case Spock, as not every library module needs the Spock dependency.
So far, my plugin looks like this:
interface BasePluginExtension {
    Property<Boolean> getUseSpock()
}

class BasePlugin implements Plugin<Project> {
    @Override
    void apply(Project project) {
        BasePluginExtension basePluginExtension = project.extensions.create('basePluginConfig', BasePluginExtension)
        // If a value was supplied, use it, otherwise assume we want Spock
        if (basePluginExtension?.useSpock?.get() ?: true) {
            // Printing for debugging purposes
            println "I'm using spock! ${basePluginExtension.useSpock.get()}"
            // Currently apply a plugin that applies Spock but could also just add a dependency
            project.plugins.apply("test-config")
        }
    }
}
Then in the build.gradle file that I want to pull my plugin into, I have
plugins {
    id 'base-plugin'
}
basePluginConfig {
    useSpock = true
}
I'm following the docs on configuring an extension but I am getting the following error:
Cannot query the value of extension 'basePluginConfig' property 'useSpock' because it has no value available.
I've also tried the method of making an abstract class for the extension but I want the ability to have multiple configurable parameters in the future.
Is adding a dependency after plugin extension values have been configured not allowed/out of order for how Gradle works? Or am I possibly missing something obvious?
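For what it's worth, one likely explanation (my reading, not stated in the question): apply() runs before the consuming build script's basePluginConfig block is evaluated, so the property has no value yet at that point. A sketch of how the plugin could defer the read and supply a default, assuming the same BasePluginExtension interface as above:
import org.gradle.api.Plugin
import org.gradle.api.Project

class BasePlugin implements Plugin<Project> {
    @Override
    void apply(Project project) {
        def config = project.extensions.create('basePluginConfig', BasePluginExtension)
        // Default so consumers that never set the flag still get Spock
        config.useSpock.convention(true)
        // Read the value only after the consuming build script has been evaluated
        project.afterEvaluate {
            if (config.useSpock.get()) {
                project.plugins.apply('test-config')
            }
        }
    }
}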

How can I access the dependencies of an application from within the build file of a dependency embedded in the application?

I have a Gradle-based library that is imported as a dependency into consuming applications. In other words, an application that consumes my library will have a build.gradle file with a list of dependencies that includes both my library as well as any other dependencies they wish to import.
From within my library's build.gradle file, I need to write a Gradle task that can access the full set of dependencies declared by the consuming application. In theory, this should be pretty straightforward, but hours of searching have not yielded a working solution yet.
The closest I've come is to follow this example and define an additional task in the library's build.gradle file that runs after the library is built:
build {
    doLast {
        project.getConfigurations().getByName('runtime')
            .resolvedConfiguration
            .firstLevelModuleDependencies
            .each { println(it.name) }
    }
}
I keep getting an error message that the 'runtime' configuration (passed into getByName and referenced in the Gradle forum post I linked) cannot be found. I have tried other common Gradle configurations that I can think of, but I never get any dependencies back from this code.
So: what is the best way to access the full set of dependencies declared by a consuming application from within the build file of one of those dependencies?
Okay, I mostly figured it out. The code snippet is essentially correct, but the configuration I should have been accessing was 'compileClasspath' or 'runtimeClasspath', not 'runtime'. This page helped me understand the configuration I was looking for.
The final build task in the library looks roughly like this:
build {
    doLast {
        // ...
        def deps = project.getConfigurations().getByName('compileClasspath')
            .resolvedConfiguration
            .firstLevelModuleDependencies
            .each {
                // it.name will give you the dependency in the standard Gradle format
                // (e.g. "org.springframework.boot:spring-boot:1.5.22.RELEASE")
            }
    }
}
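Each element of firstLevelModuleDependencies is a ResolvedDependency, so if you want the coordinates individually instead of parsing it.name, something like this should also work (a small sketch under the same assumptions as the snippet above):
project.getConfigurations().getByName('runtimeClasspath')
    .resolvedConfiguration
    .firstLevelModuleDependencies
    .each { dep ->
        // e.g. group 'org.springframework.boot', name 'spring-boot', version '1.5.22.RELEASE'
        println "${dep.moduleGroup}:${dep.moduleName}:${dep.moduleVersion}"
    }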

Gradle Custom String Notation in dependency

I'm creating a global repository for all the artifacts that I'm producing. Right now I've created an S3 bucket as repository where I'm storing the artifacts.
As there is no group (artifacts are global), I ended up duplicating the name as the group so the current notation for a dependency is Name:Name:Version. For this, I've created an extension function that takes two arguments, a name (the dependency that I want to add to the project) and the version.
For example, I would add implementation("Name", "1.0") if I wanted the dependency Name in my project. This is translated to implementation("Name:Name:1.0") and works fine, but I feel it is a little ugly and can be confusing: the External Libraries tree in IntelliJ shows the dependency as Gradle: Name:Name:1.0, and the longer the name is, the uglier it gets.
The question is: is it possible to write a custom notation that lets me just do implementation("Name", "1.0") without an extension function, so that it only shows Gradle: Name:1.0 and everything else is handled in the background?
I have looked at the class ParsedModuleStringNotation and it seems to be the thing I would need to change (or create my own version of), but the creation of these objects is hardcoded, so I am unsure how to proceed from there.
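For context, the kind of helper described above might look roughly like this in a Groovy build script (the question uses a Kotlin extension function; globalArtifact is just an illustrative name):
// Hypothetical helper that duplicates the artifact name as the group
def globalArtifact(String name, String version) {
    "$name:$name:$version"
}

dependencies {
    implementation globalArtifact('Name', '1.0')   // becomes Name:Name:1.0
}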
If you declare the S3 repository as a flatDir repository, there is an option for what you are asking:
https://docs.gradle.org/current/dsl/org.gradle.api.artifacts.dsl.RepositoryHandler.html
flatDir(args)
Adds a resolver that looks into a number of directories for artifacts. The artifacts are expected to be located in the root of the specified directories. The resolver ignores any group/organization information specified in the dependency section of your build script. If you only use this kind of resolver you might specify your dependencies like ":junit:4.4" instead of "junit:junit:4.4".
I was able to get it working as below:
repositories {
    def s = project.rootDir.toString() + "/lib/"
    flatDir dirs: s
    mavenCentral()
}
//compileOnly 'javax.jms:jms-api:1.1-rev-1'
compileOnly ':my-jms:1.2'

How to make Kotlin `internal` objects accessible to tests?

My project uses several Gradle source sets for its production code base instead of just main:
domain
dal
rest
test
dbUnitTest
This has proven very useful for limiting dependencies and enforcing separation of concern.
It comes with one downside however: we cannot access classes or methods with visibility internal from within test classes. The reason for this is that the Kotlin compiler places every source set in its own "module":
$ find . -name '*.kotlin_module'
./classes/kotlin/domain/META-INF/contact-management_domain.kotlin_module
./classes/kotlin/dal/META-INF/contact-management_dal.kotlin_module
./classes/kotlin/rest/META-INF/contact-management_dal.kotlin_module
./classes/kotlin/test/META-INF/contact-management.kotlin_module
./classes/kotlin/dbUnitTest/META-INF/contact-management_dbUnitTest.kotlin_module
I would like all source sets to use the same module name "contact-management", as the main source set would by default.
I tried to override the name with the compiler option -module-name:
tasks.withType<KotlinCompile> {
    kotlinOptions {
        // place all source sets in the same Kotlin module, making objects with 'internal' visibility
        // available to every source set of this project
        freeCompilerArgs += listOf("-module-name \"contact-management\"")
    }
}
Upon running gradlew build, I get
> Task :contact-management:compileDomainKotlin FAILED
e: Invalid argument: -module-name "contact-management"
The reason seems to be that -module-name "contact-management_domain" is already set by the Gradle code invoking the Kotlin compiler, and apparently this option is only accepted once.
In a Gradle build, how can I control what is being considered "one module" by the Kotlin compiler?
A related question where the test source set is to be split has no satisfactory answers so far.
You can do that using kotlin compilations. (As far as I understand, a compilation is simply a block of files that are compiled together. A good explanation can be found here)
When you create a sourceset in gradle, the kotlin plugin creates a compilation under the hood (with the same name as the sourceset).
What you can do now with compilations is create associations. If a compilation A is associated with another compilation B, source code in A gets access to internal code units of B.
So in your case, if the test sourceset should get access to the dal sourceset you can simply associate the test compilation with the dal compilation:
kotlin.target.compilations.getByName("test").associateWith(kotlin.target.compilations.getByName("dal"))
PS: It also works the other way around. If you create compilations explicitly, the corresponding sourcesets are created under the hood. So for custom sourcesets you can create compilations and associate them:
val domainCompilation = kotlin.target.compilations.create("domain")
val dalCompilation = kotlin.target.compilations.create("dal") {
    associateWith(domainCompilation)
}
In the above example, the dal sourceset will have access to internal code units of the domain sourceset, since dal is the compilation being associated with domain.

Gradle - common part of the DSL in a separate (other) git repository

We use our custom plugin and define the script in this way (this is approximate pseudocode):
// This is the common part for every script (1)
environments {
    "env1" {
        server mySettings("host1", "port1", "etc")
    }
    "env2" {
        server mySettings("host2", "port2", "etc")
    }
    // ... other common scopes
}
and
def defaultSettings(def envHost, def envPort = "15555" ...) {
    return {
        // Specific settings for the current script (package names, versions etc)
    }
}
So in all my scripts (which are separate projects and are in separate git repositories) the common part (1) is repeated.
Is there any correct way to define the common part as a specific project (this can not be part of the plugin - the common part also changes periodically)?
I want to refer to this part when creating a new project and describe only the project-specific settings.
This looks like a Gradle multi-project build, but the common part should live in another git repository/Nexus.
Important clarification: the common part can also be stored in Nexus and have a version (i.e. have a POM descriptor).
It's quite common to have an "opinionated" plugin and a "base" plugin; Gradle itself uses this concept often.
One example is that the java plugin automatically applies the java-base plugin. The java-base plugin contains all of the task types and logic but doesn't actually wire anything up; the java plugin adds the tasks and configures them (e.g. it adds the src/main/java and src/test/java conventions). So the java-base plugin is not opinionated, while the java plugin is.
You could do the same: have a base plugin, plus an opinionated plugin which applies the base plugin and configures the environments specific to your use case (a rough sketch follows below).
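A rough sketch of that split (class names and the environments container are assumptions modeled on the pseudocode in the question, not a definitive implementation):
import org.gradle.api.Plugin
import org.gradle.api.Project

// Base plugin: adds the capability (the 'environments' DSL and task logic) without any defaults
class EnvironmentsBasePlugin implements Plugin<Project> {
    void apply(Project project) {
        project.extensions.add('environments', project.container(Environment))
        // ... register task types / shared logic here
    }
}

class Environment {
    final String name
    Environment(String name) { this.name = name }
    // server/host/port settings etc. (elided)
}

// Opinionated plugin: applies the base plugin and contributes the shared configuration
class CompanyEnvironmentsPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.plugins.apply(EnvironmentsBasePlugin)
        def environments = project.extensions.getByName('environments')
        environments.create('env1') { /* host1, port1, ... */ }
        environments.create('env2') { /* host2, port2, ... */ }
    }
}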
Note also that you can move logic from build.gradle to a plugin if you put the logic within a project.with { ... } closure. Eg:
class MyPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.with {
            subprojects { ... }
            configurations { ... }
            dependencies { ... }
            task foo(type: Bar) { ... }
        }
    }
}
There is another solution to your problem. The approach may be less clean than using an opinionated plugin, but it allows you to manage simple Gradle scripts independently from your projects:
The apply from: mechanism for including Gradle scripts is not limited to file paths; it can also handle URLs. This way, you can simply manage your scripts in a standalone repository and serve the newest version via a web server.
To test this way of script distribution and access, you can even use the raw file view feature provided by various repository platforms like GitHub or Bitbucket:
apply from: 'https://raw.githubusercontent.com/<user>/<repo>/<branch>/<file>'
The biggest disadvantage of this approach is that you need access to that local (or even public) web server for each build. If you need to support company-external or offline builds, you should stick to @LanceJava's solution and use a custom plugin.
