I have been looking around for a way to include test fixtures in my Gradle publications.
https://developer.android.com/studio/publish-library/configure-test-fixtures#kts suggests that it should work automatically as long as the project name is set correctly, which I have done in the settings.gradle file. That appears to be what resolved https://github.com/slackhq/EitherNet/issues/44.
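For reference, the relevant part of my settings file looks roughly like this (a minimal sketch; the root project and module names are placeholders):
// settings.gradle.kts
rootProject.name = "my-library"
include(":model", ":core")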
For context, my project is built from several submodules, and I have defined a custom publication for each (I suspect this is the root of the issue), as shown here:
subprojects {
    // ... some repos and unimportant plugin applications
    tasks {
        register("prepareKotlinBuildScriptModel") {}
        withType<BootJar> {
            enabled = false // this is enabled in the jar I wish to be bootable
        }
        withType<Test> {
            useJUnitPlatform()
        }
        getByName<Jar>("jar") {
            enabled = true
            archiveClassifier.set("")
        }
    }
    publishing {
        publications {
            create<MavenPublication>(project.name) {
                version = projectVersion
                artifactId = tasks.jar.get().archiveBaseName.get()
                groupId = "${projectGroup}.${rootProject.name}"
                from(components["kotlin"])
            }
        }
    }
}
For reference, this is what the module structure and build.gradle.kts currently look like for the module in question:
[module structure screenshot]
plugins {
    id("java-test-fixtures")
    id("java-library")
}
dependencies {
    testFixturesApi(project(":model"))
    // ... unrelated stuff
}
The test fixtures work fine as internal dependencies within the project itself, but they do not get published, so they cannot be used in external projects.
So my question is: is there a way to bake the test fixtures into my submodule jars so they can be used in external projects?
Any input would be highly appreciated.
What I tried, what I expected, and the result:
I published to a local repository and expected the test fixtures to be bundled; they were not.
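For comparison, here is a minimal sketch of a publication that does carry the test-fixtures variants, assuming a plain JVM subproject with java-test-fixtures applied: the Gradle docs attach those variants to the java component, so publishing that component publishes the -test-fixtures jar as well. Whether the kotlin component exposes them too is exactly the open question here.
publishing {
    publications {
        create<MavenPublication>(project.name) {
            // The java component includes the testFixturesApiElements/RuntimeElements
            // variants, so the <artifact>-test-fixtures.jar is published next to the main jar.
            from(components["java"])
        }
    }
}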
Related
I'd like an init script that lets me take arbitrary Gradle projects and change the Maven repository location that they publish artifacts to.
Adding a repository is easy enough when you edit the build file directly: just add a maven {} block inside publishing { repositories { } }. However, trying to do this generically leads to frustration and failure. I tried this:
allprojects {
    beforeEvaluate {
        pluginManager.withPlugin("maven-publish") {
            extensions.getByType<PublishingExtension>().publications {
                repositories {
                    maven {
                        url = uri("file:///my/path")
                        name = "myrepo"
                    }
                }
            }
        }
    }
}
but, no luck. No such repository appears. I suspect there is a timing issue here: although my code does run, it presumably runs after the publishing plugin has created these tasks. What I want to do is register a callback that is run before the publish plugin gets a chance to do that, but I don't know how.
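One likely culprit: the repositories {} block nested inside publications {} resolves against the project (i.e. dependency repositories) rather than against the publishing extension. A timing-safe shape could look like the sketch below (not a verified init script; the repository name and path are placeholders):
// init.gradle.kts -- react to the plugin instead of racing it
import org.gradle.api.publish.PublishingExtension

allprojects {
    pluginManager.withPlugin("maven-publish") {
        // Runs as soon as maven-publish is applied to this project, however late that happens.
        extensions.configure<PublishingExtension> {
            repositories {
                maven {
                    name = "myrepo"
                    url = uri("file:///my/path")
                }
            }
        }
    }
}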
I am very excited about Gradle's incubating version catalogs and have been experimenting with them. I've found that the information in my gradle/libs.versions.toml is accessible in the build.gradle.kts scripts for my app and utility-lib projects.
However, I am unable to use the content of the toml file for buildSrc/build.gradle.kts or the convention files.
The only way that I could build was to hard-code the dependencies into those files, as I did before the version catalog feature.
In the buildSrc folder, I created a settings.gradle.kts file and inserted the dependencyResolutionManagement code for versionCatalogs, which is pointing to the same file as for my app and utility-lib projects.
Based on the Gradle 7 docs, it seems that sharing a version catalog with buildSrc and modules is possible… I'd appreciate a nudge toward getting it to work with buildSrc, if possible.
Here is a simple sample project, which I created via gradle init: my-version-catalog
Thank you for your time and help,
Mike
With Gradle 7.3.3, it is possible. Note that version catalogs have been GA since Gradle 7.4.
The code snippets assume Gradle is at least 7.4; if you need them prior to that version, insert enableFeaturePreview("VERSION_CATALOGS") at the beginning of each settings.gradle.kts.
Using buildSrc
buildSrc/settings.gradle.kts
dependencyResolutionManagement {
    versionCatalogs {
        create("libs") {
            from(files("../gradle/libs.versions.toml"))
        }
    }
}
buildSrc/build.gradle.kts
dependencies {
    implementation(libs.gradleplugin.intellij) // <- the lib reference
}
You can even use the version catalog for plugins
gradle/libs.versions.toml
...
[plugins]
kotlin-jvm = { id = "org.jetbrains.kotlin.jvm", version.ref = "kotlin" }
jetbrains-changelog = { id = "org.jetbrains.changelog", version.ref = "changelog-plugin" }
jetbrains-intellij = { id = "org.jetbrains.intellij", version.ref = "intellij-plugin" }
hierynomus-license = { id = "com.github.hierynomus.license", version.ref = "license-plugin" }
nebula-integtest = { id = "nebula.integtest", version.ref = "nebula-integtest-plugin" }
build.gradle.kts
plugins {
    id("java")
    alias(libs.plugins.kotlin.jvm)
    alias(libs.plugins.nebula.integtest)
    alias(libs.plugins.jetbrains.intellij)
    alias(libs.plugins.jetbrains.changelog)
    alias(libs.plugins.hierynomus.license)
}
Note: for accessing the catalog within convention-plugin scripts, please refer to the section below; the trick is the same.
Using convention plugins and included build
In the main project, include the build that holds the convention plugins.
settings.gradle.kts
includeBuild("convention-plugins") // here it's a subfolder
convention-plugins/settings.gradle.kts
dependencyResolutionManagement {
    repositories {
        gradlePluginPortal()
    }
    versionCatalogs {
        create("libs") {
            from(files("../gradle/libs.versions.toml"))
        }
    }
}
rootProject.name = "convention-plugins"
The trick that lets convention plugins access the version catalog comes in two parts. First, add an ugly implementation dependency that locates where the generated version-catalog classes live:
libs.javaClass.superclass.protectionDomain.codeSource.location
Then in the convention plugin refer to the libs extension via Project::the.
val libs = the<LibrariesForLibs>()
This is tracked by gradle/gradle#15383.
convention-plugins/build.gradle.kts
plugins {
    `kotlin-dsl`
}
dependencies {
    implementation(libs.gradleplugin.kotlin.jvm)
    // https://github.com/gradle/gradle/issues/15383
    implementation(files(libs.javaClass.superclass.protectionDomain.codeSource.location))
}
And in the actual convention plugin
import org.gradle.accessors.dm.LibrariesForLibs
plugins {
    id("org.jetbrains.kotlin.jvm")
}
// https://github.com/gradle/gradle/issues/15383
val libs = the<LibrariesForLibs>()
dependencies {
    detektPlugins(libs.bundles.kotlinStuff) // access catalog entries
}
The org.gradle.accessors.dm.LibrariesForLibs class is generated by Gradle somewhere in the local .gradle folder, under .gradle/<version>/dependency-accessors/<hash>/classes.
Quick note: older IntelliJ IDEA versions (as of 2022.3) report alias(libs.gradleplugin.thePlugin) as an error in the editor,
although the dependencies are resolved correctly.
This is tracked by KTIJ-19369; the ticket indicates it is actually a bug in the Gradle Kotlin DSL (gradle/gradle#22797), and someone made a simple IntelliJ IDEA plugin to hide this error until it is resolved.
Brice, it looks like a can of worms to go down that path, particularly in my situation, where I'm trying to use a libs.versions.toml file from an Android project while the custom plugin is of course a Java/Kotlin project. I tried creating the libs file by hard-wiring the path to the toml file in the custom plugin. It might work if both were Java projects, but I never tried that, since that's not what I'm after. The ideal solution would be for the plugin to use the libs file from the project it is applied to, but it looks like the version catalog needs to be created in the settings file, before you even have access to a Project, which is why you would have to hard-wire the path.
Short answer: no, but there are other techniques for a custom plugin to get version data from the project it is applied to.
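For instance, one such technique is to look up whatever catalog the consuming project declares through the VersionCatalogsExtension API. A minimal sketch (the catalog name "libs" and the "kotlin" alias are assumptions about the consuming project):
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.artifacts.VersionCatalogsExtension

// Sketch: reads version data from the catalog of the project the plugin is applied to,
// instead of hard-wiring a path to a libs.versions.toml file.
class CatalogAwarePlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Present on projects that declare at least one version catalog.
        val catalogs = project.extensions.findByType(VersionCatalogsExtension::class.java)
        val libs = catalogs?.find("libs")?.orElse(null) // "libs" is the conventional name
        // "kotlin" is a placeholder alias; adjust to whatever the consumer declares.
        val kotlinVersion = libs?.findVersion("kotlin")?.map { it.requiredVersion }?.orElse(null)
        project.logger.lifecycle("Kotlin version from consumer catalog: ${kotlinVersion ?: "not declared"}")
    }
}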
I have the following setup:
val sourcePath by configurations.creating

tasks.register<Copy>("fetchSources") {
    from(zipTree(sourcePath.files.first()))
    into("${project.buildDir}/synced")
}

dependencies {
    sourcePath(group = "shabunc.org.app.runtime", name = "runtime", version = "0.0.1", classifier = "sources")
}

kotlin {
    ...
    sourceSets {
        val commonMain by getting {
            kotlin.srcDir(tasks.named<Copy>("fetchSources").get().destinationDir)
        }
    }
}
This is working perfectly well apart from one thing: Gradle does not know that the Kotlin configuration depends on the existence of the folder populated by the "fetchSources" task.
In other words, I'm supposed to call fetchSources every time before building. I could, of course, add fetchSources as a dependency of one of the build tasks, say jsJar. However, I want to ask:
Is there a way to guarantee that specific files will be present by the time I build the project, without explicitly introducing a task dependency, but rather by relying on task outputs somehow?
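One way to get that guarantee is to hand the task provider itself to srcDir: the path is evaluated as per Project.files(), so the provider is resolved to the task's outputs and carries the task dependency with it, and no explicit dependsOn is needed. A sketch of the changed parts, assuming the same plugins and sourcePath configuration as above:
val fetchSources = tasks.register<Copy>("fetchSources") {
    from(zipTree(sourcePath.files.first()))
    into(layout.buildDirectory.dir("synced"))
}

kotlin {
    sourceSets {
        val commonMain by getting {
            // Passing the TaskProvider (rather than a resolved File) lets Gradle
            // track fetchSources as a dependency of every task that compiles commonMain.
            kotlin.srcDir(fetchSources)
        }
    }
}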
Say I'm using the palantir/gradle-git-version Gradle plugin, and have the following code in build.gradle.kts to determine the project version:
// If release branch, return after incrementing patch version.
// Else, return $lastTag-SNAPSHOT.
val projectVersion: String by lazy {
    val versionDetails: groovy.lang.Closure<VersionDetails> by extra
    with(versionDetails()) {
        if (!lastTag.matches("^(?:(?:\\d+\\.){2}\\d+)\$".toRegex())) {
            throw GradleException("Tag '$lastTag' doesn't match 'MAJOR.MINOR.PATCH' format")
        }
        // If in detached HEAD state, get the branch name from the GitLab CI env var
        val branch = branchName ?: System.getenv("CI_COMMIT_REF_NAME")
        if (branch?.startsWith("release/") == true) {
            val tokens = lastTag.split('.')
            "${tokens[0]}.${tokens[1]}.${tokens[2].toInt() + commitDistance}"
        } else "$lastTag-SNAPSHOT"
    }
}
This works, but the code is duplicated across all the projects, which is hard to maintain for anything beyond a very small number of projects.
This is just one example; the same applies to other Gradle tasks that encode conventions within the company/team, like creating a Dockerfile.
What is a good way to centralize such code so that all projects can use it? Note that code like this doesn't usually stand on its own, but relies on Gradle plugins.
What is a good way to centralize such code so that all projects can use them?
You'll want to create a custom Gradle plugin to hold your project's conventions.
You can use the Build Init Plugin to create a skeleton plugin project: with Gradle installed locally, simply run gradle init in a new project directory and follow the prompts to create the plugin project.
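For example, something along these lines (the exact type name and prompts vary by Gradle version, so treat this as an assumption):
gradle init --type kotlin-gradle-plugin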
As a concrete example (assuming you generated a plugin project as mentioned earlier), to apply your versioning conventions, a plugin could be:
// Plugin's build.gradle.kts
dependencies {
    // Add dependency for plugin, GAV can be found on the plugins page:
    // https://plugins.gradle.org/plugin/com.palantir.git-version
    implementation("com.palantir.gradle.gitversion:gradle-git-version:0.12.3")
}
Then a versioning conventions plugin could be:
import com.palantir.gradle.gitversion.VersionDetails
import groovy.lang.Closure
import org.gradle.api.GradleException
import org.gradle.api.Plugin
import org.gradle.api.Project

class VersioningConventionsPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        // Apply plugin to project as you would in the main Gradle build file.
        project.pluginManager.apply("com.palantir.git-version")
        // Configure version conventions
        val projectVersion: String by lazy {
            // Gradle generates some Kotlin DSL code on the fly, in a plugin implementation we don't have that.
            // So we must convert the DSL to the Gradle API.
            val versionDetails: Closure<VersionDetails> = project.extensions.extraProperties.get("versionDetails") as Closure<VersionDetails>
            with(versionDetails.call()) {
                if (!lastTag.matches("^(?:(?:\\d+\\.){2}\\d+)\$".toRegex())) {
                    throw GradleException("Tag '$lastTag' doesn't match 'MAJOR.MINOR.PATCH' format")
                }
                val branch = branchName ?: System.getenv("CI_COMMIT_REF_NAME")
                if (branch?.startsWith("release/") == true) {
                    val tokens = lastTag.split('.')
                    "${tokens[0]}.${tokens[1]}.${tokens[2].toInt() + commitDistance}"
                } else "$lastTag-SNAPSHOT"
            }
        }
        // Set the version as an extra property on the project
        // Accessible via extra["projectVersion"]
        project.extensions.extraProperties["projectVersion"] = projectVersion
    }
}
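A consuming build script could then apply the plugin and read the computed version, e.g. (the plugin id here is a placeholder):
// build.gradle.kts of a consuming project
plugins {
    id("com.example.versioning-conventions") // hypothetical plugin id
}

version = extra["projectVersion"] as String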
I gave a Kotlin example since your sample used the Kotlin DSL. Once you've finished development of your conventions plugin, you would publish it to a repository such as the Gradle Plugin Portal. If it's an internal company plugin, publish it to an internal Nexus repository or similar.
Follow the docs for the maven-publish plugin for more details on publishing. Gradle plugins can be published like any other artifact/JAR.
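For the internal case, the publishing configuration of the plugin project might look roughly like this (a sketch; the repository URL is a placeholder, and the java-gradle-plugin/maven-publish combination also publishes the plugin marker artifacts):
// Plugin project's build.gradle.kts
plugins {
    `java-gradle-plugin`
    `maven-publish`
}

publishing {
    repositories {
        maven {
            name = "internal"
            url = uri("https://nexus.example.com/repository/maven-releases") // placeholder
        }
    }
}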
I have project-wide settings in a plugin called parent, which applies the maven-publish plugin and then programmatically configures the publishing extension. This seems to work, but when I apply this plugin in a build.gradle script I cannot configure the publishing extension to set the project-specific publications.
I receive the error:
Cannot configure the 'publishing' extension after it has been accessed.
My intent was to set up the publishing repository in the parent plugin and then let each build.gradle script add the appropriate publications.
Is there a way to do this?
Currently ParentPlugin.groovy looks like:
void apply(Project project) {
    project.getProject().apply plugin: 'maven-publish'
    def publishingExtension = project.extensions.findByName('publishing')
    publishingExtension.with {
        repositories {
            maven {
                mavenLocal()
                credentials {
                    username getPropertyWithDefault(project.getProject(), 'publishUserName', 'dummy')
                    password getPropertyWithDefault(project.getProject(), 'publishPassword', 'dummy')
                }
            }
        }
    }
}
My client build.gradle fails when it tries to configure the publishing extension.
apply plugin: 'parent'
publishing {
    publications {
        mavenJava(MavenPublication) {
            groupId 'agroup'
            artifactId 'anartifactid'
            version '1.0.0-SNAPSHOT'
            from components.java
        }
    }
}
Is this possible? Is there another way I should be approaching this?
NOTE regarding repositories {} and publications {} for the maven-publish plugin:
Topic: how to work around this perplexing Gradle fatal error message:
Cannot configure the 'publishing' extension after it has been accessed
First thing to try (deep magic):
(note: the "project." prefix is optional)
Configure publications and repositories not like this:
project.publishing {publications {...}}
project.publishing {repositories {...}}
but instead like this recommended style:
project.publishing.publications {...}
project.publishing.repositories {...}
It would be instructive for a Gradle guru to explain why this trick works.
Another known workaround is to make sure that each apply of plugin
maven-publish is in the same project code block as
project.publishing.repositories and project.publishing.publications.
But that is more complex and harder to do than the first thing to try,
since by default the CBF applies maven-publish and a second apply of it
may itself cause the same error.
maven-publish is normally applied in pub/scripts/publish-maven.gradle,
unless PUB_PUBLISH_MAVEN is set to override that file location,
in which case the caller should apply plugin maven-publish.
See https://orareview.us.oracle.com/29516818 for how this not-preferred
workaround can be done (for project emcapms) while still using the CBF.
P.S. Someday I'll write this up with minimal code examples. But I'm putting this hard-won knowledge out there now to save other folks from wasting days on this common maven-publish issue.
To deal with this, I wrote another plugin that can delay modifications to the publication while also avoiding a "read" of the extension, which would put it in the "configured" state. The plugin is called nebula-publishing-plugin; the code for the "lazy" block can be found in the GitHub repo. It looks like this:
/**
 * All Maven Publications
 */
def withMavenPublication(Closure withPubClosure) {
    // New publish plugin way to specify artifacts in resulting publication
    def addArtifactClosure = {
        // Wait for our plugin to be applied.
        project.plugins.withType(PublishingPlugin) { PublishingPlugin publishingPlugin ->
            DefaultPublishingExtension publishingExtension = project.getExtensions().getByType(DefaultPublishingExtension)
            publishingExtension.publications.withType(MavenPublication, withPubClosure)
        }
    }
    // It's possible that we're running in someone else's afterEvaluate, which means we need to run this immediately
    if (project.getState().executed) {
        addArtifactClosure.call()
    } else {
        project.afterEvaluate addArtifactClosure
    }
}
You would then call it like this:
withMavenPublication { MavenPublication t ->
    def webComponent = project.components.getByName('web')
    // TODO Include deps somehow
    t.from(webComponent)
}
The plugin is available in jcenter() as 'com.netflix.nebula:nebula-publishing-plugin:1.9.1'.
A little bit late, but I found a solution that does not require an additional plugin:
(This has been taken from one of my internal plugins that can work with both the old and the new publishing mechanism, hence the ...withType... stuff.)
Instead of:
project.plugins.withType(MavenPublishPlugin) {
    project.publishing {
        publications {
            myPub(MavenPublication) {
                artifact myJar
            }
        }
    }
}
do this:
project.plugins.withType(MavenPublishPlugin) {
    project.extensions.configure PublishingExtension, new ClosureBackedAction({
        publications {
            myPub(MavenPublication) {
                artifact myJar
            }
        }
    })
}
This will not resolve the extension immediately, but will apply the configuration when the extension first gets resolved by someone.
Of course, it would make perfect sense to use this style of configuration in your project-wide plugin to configure the repositories, and then use the publishing extension in the build scripts as usual. That would avoid confusion for build-script authors.
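For reference, a roughly equivalent shape in the Kotlin DSL might look like the sketch below (assuming a recent Gradle version, where extensions.configure replaces the ClosureBackedAction indirection; "myJar" is a placeholder task name):
plugins.withType<MavenPublishPlugin> {
    extensions.configure<PublishingExtension> {
        publications.create<MavenPublication>("myPub") {
            artifact(tasks.named("myJar")) // placeholder archive-producing task
        }
    }
}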