Currently, I have a few utility functions defined in the top-level build.gradle of a multi-project setup, for example:
def utilityMethod() {
    doSomethingWith(project) // project is magically defined
}
I would like to move this code into a plugin, which would make utilityMethod available within any project that applies the plugin. How do I do that? Is a project extension what I need?
This seems to work using:
import org.gradle.api.Plugin
import org.gradle.api.Project

class FooPlugin implements Plugin<Project> {
    void apply(Project target) {
        target.extensions.create("foo", FooExtension)
        // GreetingTask is the custom task class from the Gradle docs example
        target.task('sometask', type: GreetingTask)
    }
}

class FooExtension {
    def sayHello(String text) {
        println "Hello " + text
    }
}
Then in the client build.gradle file you can do this:
task HelloTask {
    // the old '<<' shorthand was removed in Gradle 5; use doLast instead
    doLast {
        foo.sayHello("DOM")
    }
}
c:\plugintest>gradle -q HelloTask
Hello DOM
https://docs.gradle.org/current/userguide/custom_plugins.html
I implemented this recently; a full example is available on GitHub.
The injection basically boils down to
target.ext.utilityMethod = SomeClass.&utilityMethod
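For reference, a minimal sketch of that approach (the class and method names here are placeholders, not taken from the linked example):

import org.gradle.api.Plugin
import org.gradle.api.Project

class UtilityPlugin implements Plugin<Project> {
    void apply(Project target) {
        // Expose the static method as an extra property on the project;
        // any build script that applies the plugin can then call utilityMethod(...) directly.
        target.ext.utilityMethod = Utilities.&utilityMethod
    }
}

class Utilities {
    static void utilityMethod(Project project) {
        println "utilityMethod called for project ${project.name}"
    }
}

A build script that applies the plugin can then call utilityMethod(project); note that project has to be passed in explicitly, since it is no longer magically available inside the static method.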
Beware:
This method could potentially conflict with some other plugin, so you should consider whether to use static imports instead.
Based on Answer 23290820.
Plugins are not meant to provide common methods; they are meant to provide tasks.
When it comes to extensions, they should be used to gather input for the applied plugins:
Most plugins need to obtain some configuration from the build script.
One method for doing this is to use extension objects.
More details here.
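As a rough sketch of that pattern (the plugin and extension names here are made up for illustration), an extension object collects input from the build script, and the plugin's tasks read it:

import org.gradle.api.Plugin
import org.gradle.api.Project

// Holds input supplied by the build script.
class GreetingExtension {
    String message = 'Hello'
}

class GreetingPlugin implements Plugin<Project> {
    void apply(Project project) {
        def greeting = project.extensions.create('greeting', GreetingExtension)
        project.task('greet') {
            doLast {
                // Read the configured value at execution time.
                println greeting.message
            }
        }
    }
}

A build script applying this plugin would then configure it with greeting { message = 'Hi there' }.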
Have a look at Peter's answer; using closures carried via ext might be what you are looking for.
Related
I'm working on a Gradle (Groovy, not Kotlin) library using a bunch of external libraries, and we have a case where we want to implement double-inheritance in our code off of a custom Task provided by an external library. (For reference, that library is specifically the MarkLogic DataHub, and we're extending RunFlowTask, but I've generalized a bit for this example. There are a few restrictions that introduces, but I'm fairly certain all of them can be worked around.)
What I want is the following:
ClassA.gradle
class ClassA extends com.external.plugin.TaskA {}
ClassB.gradle
import com.fasterxml.jackson.databind.ObjectMapper
class ClassB extends ClassA {}
...with no restrictions on where things need to be, just that it works. Worth noting that we have a bunch of examples like ClassA that work on their own; I just need to extend ClassA again.
I've detailed a few attempts I made to get this to work below; any feedback on what I did wrong with any of them is more than welcome, or if there's any advice on an entirely new way to build things, that's totally appreciated as well.
First attempt:
apply from: './ClassA'
apply from: './ClassB'
=>
'unable to resolve class ClassA' in ClassB.gradle, which makes enough sense given what I know about how Gradle compilation works. I tried replacing class ClassB... with println ClassB just to see, and that printed without an issue. This made me think that ClassA needed to be compiled in advance, so I'm fairly sure there's nothing I'm doing wrong here; it just won't work.
Second attempt:
buildSrc/src/main/groovy/ClassA.groovy exists and is the same as above.
buildSrc/build.gradle
import com.fasterxml.jackson.databind.ObjectMapper

buildscript {
    repositories { }
    dependencies {
        classpath fileTree(dir: '/path/to/dependencies', include: '*.jar') // includes jackson
    }
}

println ObjectMapper
The println worked, but I get:
/path/to/src/main/groovy/ClassA.groovy: 1: unable to resolve class com.fasterxml.jackson.databind.ObjectMapper
 @ line 1, column 1.
   import com.fasterxml.jackson.databind.ObjectMapper
...and the same thing if I duplicate the buildscript block into the ClassA submodule, or if I remove the import from the ClassA submodule. My question for this attempt is: is there anything I'm doing wrong with the imports? It seems like this should just work, but it doesn't.
Third attempt:
Sparing a few code examples here: I got everything very close to working by copying buildSrc into an includeBuild folder, and ClassA is accessible as a TaskReference in the top-level project, but I don't know how to actually extend ClassA from there.
gradle.includedBuild('subbuild').task(':ClassA') => org.gradle.composite.internal.IncludedBuildTaskReference
gradle.includedBuild('subbuild').task(':ClassA').resolveTask() => Task with path ':ClassA' not found in project ':subbuild'.
My question for this attempt is: is there a way to dig back to the class reference so it's actually extendable? I tried digging into setting up an include './subbuild' subproject and ran into similar issues.
Any help/advice is welcome - thanks!
The answer was in the second attempt. In buildSrc/build.gradle, it needed to be:
repositories {}

dependencies {
    implementation fileTree(dir: '/path/to/deps', include: ['*.jar'])
}
...and nothing needed to be in the sub-files.
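Putting that together, a complete buildSrc/build.gradle along these lines should work (the path is the placeholder from the question):

// buildSrc/build.gradle
// buildSrc compiles classes under buildSrc/src/main/groovy out of the box;
// the key point is that 'implementation' puts the jars on that compile classpath,
// whereas a buildscript { } block only affects the build script's own classpath.
repositories {}

dependencies {
    implementation fileTree(dir: '/path/to/deps', include: ['*.jar'])
}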
I'm trying to migrate a build script from the Groovy Gradle DSL to the Kotlin Gradle DSL.
The function:
def Group(Closure closure) {
    closure.delegate = dependencies
    return closure
}

ext {
    aGroup = Group {
        implementation xyz
        kapt xpto
        ...
    }
    ...
}
What I managed to do:
object Group {
    operator fun <T> invoke(project: Project, closure: Closure<T>): Closure<T> {
        closure.delegate = project.dependencies
        return closure
    }
}
Notice that I've added the extra Project argument.
Groovy accesses the project property directly; I would like to know how I can get this directly from Kotlin too, so I can remove that extra argument and keep the previous syntax.
Short answer: You can't
Long answer:
You can't due to the strong/statically typed nature of Kotlin (and Java). Groovy is a dynamic language and lets you get away with many things, to an extent. In addition to the dynamic nature of Groovy, Gradle implements a lot of the metaprogramming that Groovy offers.
So even though you do not explicitly pass in a Project in your Groovy script, behind the scenes it is being looked up dynamically.
You could make Group an extension function on Project to have a reference to the project.
I'm a little bit confused about the correct way to create custom tasks in Gradle. In the tutorial on the creation of custom tasks, they use tasks.register like this:
def check = tasks.register("check")
def verificationTask = tasks.register("verificationTask") {
    // Configure verificationTask
}
check.configure {
    dependsOn verificationTask
}
Here instead (still in the official Gradle documentation), they create new tasks this way:
task('hello') {
    doLast {
        println "hello"
    }
}

task('copy', type: Copy) {
    from(file('srcDir'))
    into(buildDir)
}
and
tasks.create('hello') {
    doLast {
        println "hello"
    }
}

tasks.create('copy', Copy) {
    from(file('srcDir'))
    into(buildDir)
}
Finally, according to the document https://docs.gradle.org/current/userguide/task_configuration_avoidance.html, they suggest moving from the second/third form to the first one. Does that mean the second/third forms are obsolete? If so, why does Gradle still make such heavy use of the old API in its documentation?
Which variant should a user use?
The Gradle API has many ways to define tasks. There is no "right" or "wrong" way for application developers so long as you are consistent, but it does matter for Gradle plugin authors.
The Task Configuration Avoidance doc you linked states (emphasis mine):
As of Gradle 5.1, we recommend that the configuration avoidance APIs be used whenever tasks are created by custom plugins.
So if you are a plugin author, use the task configuration avoidance APIs wherever possible.
For everyone else (application developers), it doesn't particularly matter, to an extent, so long as you are consistent across your entire build.
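To illustrate the difference (the task names here are arbitrary), a block passed to tasks.register only runs when the task is actually needed, while tasks.create and task(...) configure eagerly on every invocation:

// Eager: this configuration block runs during the configuration phase of every build.
tasks.create('eagerHello') {
    println 'configuring eagerHello'
    doLast { println 'hello (eager)' }
}

// Lazy: this configuration block only runs if lazyHello is required by the requested tasks.
tasks.register('lazyHello') {
    println 'configuring lazyHello'
    doLast { println 'hello (lazy)' }
}

Running an unrelated task (for example gradle help) should print only 'configuring eagerHello'.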
I am trying to write a shared library consisting of global variables and shared functions to automate build and deployment tasks for our project.
The project layout is as below:
The project has two major parts:
Global shared variables, which are placed in the vars folder
Supporting Groovy scripts that abstract logic, which in turn are called from the global variables.
Inside the Groovy class, I am using println to log debugging information, but it never gets printed out when it is invoked through the Jenkins pipeline job.
The log console of the Jenkins job is as below:
Can someone show me how to propagate logs from the Groovy class to the Jenkins job console? Only the println output from the global shared script shows up in the log console.
I just found a way to do it, by calling the println step that is available in the Jenkins job.
Basically I created a wrapper function, as below, in the Groovy class PhoenixEurekaService:
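(The original snippet was posted as an image; this is a rough reconstruction of the idea, and the method name is an assumption.)

class PhoenixEurekaService implements Serializable {
    def steps

    PhoenixEurekaService(steps) {
        this.steps = steps
    }

    // Wrapper so the class can print to the Jenkins job console.
    def log(String message) {
        steps.println message
    }
}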
steps is actually the Jenkins job environment, passed into the Groovy class via the constructor. This way we can call any step available in the Jenkins job from within the Groovy class.
In the global Groovy script PhoenixLib.groovy:
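(Again a reconstruction, since the original script was shown as an image; the point is that the pipeline context is passed in as this.)

// vars/PhoenixLib.groovy
def call() {
    // 'this' carries the pipeline steps into the class constructor.
    def service = new PhoenixEurekaService(this)
    service.log('Hello from the shared library')
}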
I am not sure whether there is another way to do that...
Pipeline commands/DSL steps, e.g. println, sh, bat, checkout, etc., can't be accessed directly from a shared library class.
ref: https://jenkins.io/doc/book/pipeline/shared-libraries/.
You can access steps by passing them into the shared library.
// your library
package org.foo

class Utilities implements Serializable {
    def steps
    Utilities(steps) { this.steps = steps }
    def mvn(args) {
        steps.println "Hello world"
        steps.echo "Hello echo"
        //steps.sh "${steps.tool 'Maven'}/bin/mvn -o ${args}"
    }
}
Jenkinsfile
@Library('utils') import org.foo.Utilities

def utils = new Utilities(this)
node {
    utils.mvn '!!! so this is how println can be worked out, or any other step !!'
}
I am not 100% sure if this is what you are looking for, but printing things in a shared library can be achieved by passing steps and using echo. See "Accessing steps" in https://jenkins.io/doc/book/pipeline/shared-libraries/
This answer addresses those who need to log something from deep in the call stack. It might be cumbersome to pass the pipeline "steps" object all the way down the stack of the shared library, especially when the call hierarchy gets complex. One might therefore create a helper class with a static Closure that holds the logic to print to the console. For example:
class JenkinsUtils {
    static Closure<Void> log = { throw new RuntimeException("Logger not configured") }
}
In the steps Groovy script, this needs to be initialized (ideally in a @NonCPS method). For example, in your Jenkinsfile (or vars file):
@NonCPS
def setupLogging() {
    JenkinsUtils.log = { String msg -> println msg }
}

def call() {
    ...
    setupLogging()
    ...
}
And then, from any arbitrary shared library class, one can print to the console simply like:
class RestClient {
    void doStuff() {
        JenkinsUtils.log("...")
    }
}
I know this is still a hacky workaround, but I could not find any better working solution even though I spent quite some time researching.
I also posted this as a gist on my GitHub profile.
I'm having trouble figuring out what META-INF/gradle-plugins/plugin.properties has to match. There are two issues here: one is the name of the properties file, and the other is the value of implementation-class set inside it. So here is my question, as succinctly as I can put it.
If I have:
META-INF/gradle-plugins/com.my.plugin.properties
What portion of the plugin source does the file name map to? What are the implications of this name, its source and destination? Am I naming it correctly, or should it not be so qualified?
If inside the file I have:
implementation-class=build.WeirdoPlugin
What are the source and destination of this name?
implementation-class should point to your main plugin class, the one that implements Plugin<Project>.
For example, in jmeter-gradle-plugin/src/main/resources/META-INF/gradle-plugins/net.foragerr.FooPlugin.properties:
implementation-class=net.foragerr.soanswers.FooPluginImpl
I will then implement a class:
package net.foragerr.soanswers

class FooPluginImpl implements Plugin<Project> {
    ...
}
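For illustration, the elided body of that class might look something like this (the task it registers is purely hypothetical):

package net.foragerr.soanswers

import org.gradle.api.Plugin
import org.gradle.api.Project

class FooPluginImpl implements Plugin<Project> {
    @Override
    void apply(Project project) {
        // Whatever the plugin contributes goes here; registering a task is typical.
        project.tasks.register('fooHello') {
            doLast { println 'Hello from FooPluginImpl' }
        }
    }
}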
To use the plugin, you apply it as follows (in the new Gradle plugins DSL):
plugins {
    id "net.foragerr.FooPlugin" version "xxx"
}