I haven't done anything with Gradle for a while, so it appears I've forgotten how configuration resolution works.
I'm trying to use the gretty plugin (instead of the deprecated core jetty plugin), but I cannot seem to create a custom configuration.
I've boiled it down to a very short, simple script (using Gradle 3.4):
buildscript {
    repositories {
        maven {
            url 'https://plugins.gradle.org/m2/'
        }
    }
    dependencies {
        classpath 'org.akhikhl.gretty:gretty:1.4.0'
    }
}

plugins {
    id 'org.akhikhl.gretty' version '1.4.0'
}

configurations {
    fooTest
}

configurations.fooTest.each {
    println it.toString()
}
Gradle doesn't seem to like me iterating over the fooTest configuration. Assume I need to know the dependencies of that configuration (I stripped that part from the code above).
What am I doing wrong here?
The script above gives me this:
org.gradle.api.InvalidUserDataException: Cannot change strategy of configuration ':fooTest' after it has been resolved.
The key point here was that I needed an unresolved configuration to loop over. Admittedly, this information was missing from the initial description, as I didn't realize it was critical: we needed to loop over the files in the dependency and copy/unzip them into certain locations.
However, we cannot do that with a resolved configuration. That said, we can copy the configuration into an unresolved one and loop over that instead:
configurations.fooTest.copy().each {
    println it.toString()
}
This will successfully print out the files involved in the dependency (or unzip them, as my case needs).
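For reference, a minimal sketch of the copy/unzip use case mentioned above; the task name and target directory are made up for illustration:
task unpackFooTest(type: Copy) {
    // Iterate over a copy so the original fooTest configuration is not resolved here.
    from configurations.fooTest.copy().collect { zipTree(it) }
    into "$buildDir/fooTest-unpacked"   // hypothetical target directory
}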
Related
I have a project that builds into an SDK library.
Before building a release, I would like a Gradle task that verifies that all configured dependencies are available. The current build happens to pass because Gradle manages to resolve a different version of the misconfigured dependency, pulled in by some other library. That is not guaranteed to keep working, however, and therefore I'd like a task that checks whether all configured dependencies are actually available and fails otherwise.
I started a Task like this:
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

abstract class CheckDependenciesTask : DefaultTask() {
    @TaskAction
    fun checkDependencies() {
        project.configurations.forEach { config ->
            if (config.isCanBeResolved) {
                println("Resolving ${config.name}")
                config.resolve()
            } else {
                println("Not resolving ${config.name}")
            }
        }
    }
}
But that really aims at resolving configurations rather than dependencies, and I haven't been able to figure out how to check whether the dependencies themselves are available. When running lintAnalyzeDebug --info, Gradle neatly prints that they are missing, so there must be some mechanism that performs this check but then fails instead of trying to resolve a substitute.
Any ideas or pointers on how to achieve this would be appreciated. Thank you.
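For what it's worth, here is a hedged sketch (Groovy DSL rather than the Kotlin task above) of one approach that is sometimes used: resolving the dependency graph via incoming.resolutionResult does not fail the build outright, but reports missing modules as UnresolvedDependencyResult, so a task can collect and report them. Note this checks module metadata, not artifact downloads, and the task name is just an example.
import org.gradle.api.artifacts.result.UnresolvedDependencyResult

task verifyDependencies {
    description = 'Fails if any declared dependency cannot be resolved.'
    doLast {
        def unresolved = []
        configurations.matching { it.canBeResolved }.each { config ->
            config.incoming.resolutionResult.allDependencies.each { dep ->
                if (dep instanceof UnresolvedDependencyResult) {
                    unresolved << "${config.name}: ${dep.requested.displayName}"
                }
            }
        }
        if (!unresolved.isEmpty()) {
            throw new GradleException("Unresolvable dependencies:\n" + unresolved.join('\n'))
        }
    }
}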
I'm trying to factor out common Gradle tasks in a reusable file. Here is an excerpt of a build-root.gradle file:
buildscript {
    // Repository declaration
    ext {
        isSnapshot = version.endsWith("-SNAPSHOT")
        repos = {
            def mavenRepo = { repoName, repoUrl ->
                maven {
                    credentials {
                        username System.env.<some env var>
                        password System.env.<some env var>
                    }
                    name repoName
                    url repoUrl
                }
            }
            mavenLocal()
            mavenRepo('repo1', 'https://repo1.url')
            mavenRepo('repo2', 'https://repo2.url')
            mavenRepo('repo3', 'https://repo3.url')
        }
    }

    // Versions and libraries declaration
    ext {
        versions = [
            ... some stuff
            // Gradle
            gradleRelease : '2.8.1',
            ... more stuff
        ]
        libs = [
            ... some stuff
            // Gradle
            gradleRelease : "net.researchgate:gradle-release:$versions.gradleRelease",
            ... more stuff
        ]
    }

    repositories repos
    dependencies {
        classpath libs.gradleRelease
    }

    apply plugin: 'net.researchgate.release'
}
... more common stuff
The idea is for subprojects to apply from that file and get all the goodies from it.
On the "apply plugin" line I get the following error - > Plugin with id 'net.researchgate.release' not found.
I printed the libs.gradleRelease string, it looks fine: net.researchgate:gradle-release:2.8.1
We are currently using Gradle 5.2.1, but I also tried 6.0.1 - same error. Any ideas why it can't find the plugin? BTW, this is not exclusive to this particular plugin, I tried others and still get the same error.
After pulling whatever was left of my hair and banging my head against the wall, I came across this => https://discuss.gradle.org/t/how-do-i-include-buildscript-block-from-external-gradle-script/7016
Relevant comment from @Peter_Niederwieser:
"Secondly, externalizing a build script block into a script plugin isn’t supported. (It’s a tough problem, and can’t think of a good way to implement this.) You may have to live with some duplication, at least for the time being. Remember that dependencies specified in a project’s ‘buildscript’ block are visible to all subprojects. Hence, as long as you don’t need dependencies to be available in a script plugin, you just need to declare them in the root project’s build script."
Which is exactly what I was trying to do. I'm not going to curse here...
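For anyone landing here, a minimal sketch of what the quoted advice boils down to, using the same plugin as above (the repository choice is just an example):
// Root project's build.gradle
buildscript {
    repositories {
        maven { url 'https://plugins.gradle.org/m2/' }
    }
    dependencies {
        classpath 'net.researchgate:gradle-release:2.8.1'
    }
}

// Any subproject's build.gradle (the classpath declared above is visible here)
apply plugin: 'net.researchgate.release'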
I'm adding an upload archive directory for all subprojects via
uploadArchives {
    repositories {
        flatDir { dirs '../REPO' }
    }
}
Now I need to specify a different directory for one subproject.
I've found that repeating the block in the subproject adds a second directory to the list, but I'd like to replace the one from the root project. I know I could use subprojects.findAll, but I'll also need the ability to override a setting elsewhere.
Disclaimer: My question may sound stupid, but I've only been using Gradle for a few weeks and must confess I know hardly anything about it. I like it and it works fine, but reading the whole manual is not an option (I'm just an ordinary user and I'd rather switch back to makefiles than read it all).
Often, the cleanest solution is to configure things the right way from the start. There are several ways to do this. For example:
rootProject/build.gradle:
configure(subprojects - project(":foo")) {
    uploadArchives {
        repositories {
            flatDir { dirs '../REPO' }
        }
    }
}

project(":foo") {
    uploadArchives {
        repositories {
            flatDir { dirs '../OTHER' }
        }
    }
}
Alternatively, you could have two auxiliary scripts gradle/repoA.gradle and gradle/repoB.gradle, and each subproject build script would apply the appropriate script with apply from: "$rootDir/gradle/repoX.gradle".
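For illustration, such a script plugin could be as small as this (file name and directory taken from the examples above):
// gradle/repoA.gradle
uploadArchives {
    repositories {
        flatDir { dirs '../REPO' }
    }
}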
Finally, overriding the value might work too (untested):
rootProject/subproject/build.gradle:
uploadArchives.repositories[0].dirs = ["../OTHER"]
PS: Now go and learn some Gradle. :-)
I have project-wide settings in a plugin called parent, which applies the maven-publish plugin and then programmatically configures the publishing extension. That part seems to work, but when I apply this plugin in a build.gradle script I cannot configure the publishing extension to add the project-specific publications.
I receive the error:
Cannot configure the 'publishing' extension after it has been accessed.
My intent was to set up the publishing repository in the parent plugin and then let each build.gradle script add the appropriate publications.
Is there a way to do this?
Currently ParentPlugin.groovy looks like:
void apply(Project project) {
    project.getProject().apply plugin: 'maven-publish'
    def publishingExtension = project.extensions.findByName('publishing')
    publishingExtension.with {
        repositories {
            maven {
                mavenLocal()
                credentials {
                    username getPropertyWithDefault(project.getProject(), 'publishUserName', 'dummy')
                    password getPropertyWithDefault(project.getProject(), 'publishPassword', 'dummy')
                }
            }
        }
    }
}
My client build.gradle fails when it tries to configure the publishing extension.
apply plugin: 'parent'

publishing {
    publications {
        mavenJava(MavenPublication) {
            groupId 'agroup'
            artifactId 'anartifactid'
            version '1.0.0-SNAPSHOT'
            from components.java
        }
    }
}
Is this possible? Is there another way I should be approaching this?
NOTE regarding repositories{} and publications{} for plugin maven-publish:
Topic: how to work around this perplexing Gradle fatal error message:
Cannot configure the 'publishing' extension after it has been accessed
First thing to try (deep magic; note that the "project." prefix is optional): configure publications and repositories not like this:
project.publishing {publications {...}}
project.publishing {repositories {...}}
but instead like this recommended style:
project.publishing.publications {...}
project.publishing.repositories {...}
It would be instructive for a gradle guru to explain why this trick works.
Another known workaround is to make sure that each apply of the maven-publish plugin is in the same project code block as project.publishing.repositories and project.publishing.publications.
But that is more complex and harder to do than the first suggestion, since by default the CBF applies maven-publish itself, and a second apply of it may cause the same error.
maven-publish is normally applied in pub/scripts/publish-maven.gradle,
unless PUB_PUBLISH_MAVEN is set to override that file location,
in which case the caller should apply plugin maven-publish.
See https://orareview.us.oracle.com/29516818 for how this not-preferred
workaround can be done (for project emcapms) while still using the CBF.
P.S. Someday I'll write this up with minimal code examples, but I'm putting this hard-won knowledge out there now to save other folks from wasting days on this common maven-publish issue.
To deal with this, I wrote another plugin which can delay modifications to the publication while also avoiding a "read" of the extension, which would put it in the "configured" state. The plugin is called nebula-publishing-plugin, and the code for the "lazy" block can be found in the GitHub repo. It looks like this:
/**
 * All Maven Publications
 */
def withMavenPublication(Closure withPubClosure) {
    // New publish plugin way to specify artifacts in resulting publication
    def addArtifactClosure = {
        // Wait for our plugin to be applied.
        project.plugins.withType(PublishingPlugin) { PublishingPlugin publishingPlugin ->
            DefaultPublishingExtension publishingExtension = project.getExtensions().getByType(DefaultPublishingExtension)
            publishingExtension.publications.withType(MavenPublication, withPubClosure)
        }
    }
    // It's possible that we're running in someone else's afterEvaluate, which means we need to run this immediately
    if (project.getState().executed) {
        addArtifactClosure.call()
    } else {
        project.afterEvaluate addArtifactClosure
    }
}
You would then call it like this:
withMavenPublication { MavenPublication t ->
    def webComponent = project.components.getByName('web')
    // TODO Include deps somehow
    t.from(webComponent)
}
The plugin is available in jcenter() as 'com.netflix.nebula:nebula-publishing-plugin:1.9.1'.
A little bit late, but I found a solution that does not require an additional plugin:
(This has been taken from one of my internal plugins that can work with both old and new publishing, hence the ...withType... stuff.)
instead of:
project.plugins.withType(MavenPublishPlugin) {
    project.publishing {
        publications {
            myPub(MavenPublication) {
                artifact myJar
            }
        }
    }
}
do this:
project.plugins.withType(MavenPublishPlugin) {
    project.extensions.configure PublishingExtension, new ClosureBackedAction({
        publications {
            myPub(MavenPublication) {
                artifact myJar
            }
        }
    })
}
This will not resolve the extension immediately, but will apply the configuration when the extension first gets resolved by someone.
Of course, it would make perfect sense to use this style of configuration in your project-wide plugin to configure the repositories, and then use the publishing extension in the build scripts as usual. That would avoid confusion for build script authors.
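To make that concrete, here is a sketch of what such a project-wide plugin might look like when it defers the repository configuration. The class name and repository contents are placeholders; ClosureBackedAction is used as in the snippet above.
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.publish.PublishingExtension
import org.gradle.api.publish.maven.plugins.MavenPublishPlugin
import org.gradle.util.ClosureBackedAction

class ParentPublishingPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.plugins.apply('maven-publish')
        project.plugins.withType(MavenPublishPlugin) {
            // Deferred: the extension is not "accessed" here, so build scripts
            // can still add publications via publishing { ... } as usual.
            project.extensions.configure(PublishingExtension, new ClosureBackedAction({
                repositories {
                    mavenLocal()   // placeholder; real repositories/credentials go here
                }
            }))
        }
    }
}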
I'm using Gradle to build a jar containing an xml file in META-INF. This file has a row like
<property name="databasePlatform" value="${sqlDialect}" />
to allow for different SQL databases for different environments. I want to tell gradle to expand ${sqlDialect} from the project properties.
I tried this:
jar {
    expand project.properties
}
but it fails with a GroovyRuntimeException; it looks as if the Jar task attempts to expand properties in the .class files as well. So then I tried
jar {
    from(sourceSets.main.resources) {
        expand project.properties
    }
}
which does not throw the above exception, but instead results in all resources being copied twice - once with property expansion and once without. I managed to work around this with
jar {
    eachFile {
        if (it.relativePath.segments[0] in ['META-INF']) {
            expand project.properties
        }
    }
}
which does what I want, since in my use case I only need to expand properties of files in the META-INF directory. But this feels like a pretty ugly hack, is there a better way to do this?
I stumbled across this post in a thread about a different but closely related issue. Turns out you want to configure the processResources task, not the jar task:
processResources {
    expand project.properties
}
For some reason, though, I did have to clean once before Gradle noticed the change.
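A guess about the clean issue: expand() pulls its values out of project.properties at execution time, and the up-to-date check does not know about them. Declaring the relevant value as a task input (a sketch below, assuming the sqlDialect property from the question) should make processResources re-run when it changes:
processResources {
    // Make the up-to-date check sensitive to the property value.
    inputs.property('sqlDialect', project.findProperty('sqlDialect') ?: '')
    expand project.properties
}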
In addition to @emil-lundberg's excellent solution, I'd limit the resource processing to just the desired target file:
build.gradle
processResources {
    filesMatching("**/applicationContext.xml") {
        expand(project: project)
    }
}
An additional note: if the ${...} braces are causing "Could not resolve placeholder" errors, you can alternatively use <%=...%>. N.B. tested with a *.properties file, not sure how this would work for an XML file.
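For instance, in a properties resource processed with expand(), that alternative syntax would look like this (file name hypothetical, property name from the question):
# src/main/resources/db.properties
databasePlatform=<%= sqlDialect %>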
I've had similar problems migrating a build from Maven to Gradle, and so far the simplest/easiest solution was to do the filtering yourself, for example:
processResources {
    def buildProps = new Properties()
    buildProps.load(file('build.properties').newReader())
    filter { String line ->
        line.findAll(/\$\{([a-z,A-Z,0-9,\.]+)\}/).each {
            def key = it.replace("\${", "").replace("}", "")
            if (buildProps[key] != null) {
                line = line.replace(it, buildProps[key])
            }
        }
        line
    }
}
This will load all the properties from the specified properties file and filter all the "${some.property.here}" type placeholders. Fully supports dot-separated properties in the *.properties file.
As an added bonus, it doesn't clash with $someVar type placeholders like expand() does. Also, if the placeholder could not be matched with a property, it's left untouched, thus reducing the possibility of property clashes from different sources.
Here is what worked for me (Gradle 4.0.1) in a multi-module project.
In /webshared/build.gradle:
import org.apache.tools.ant.filters.*

afterEvaluate {
    configure(allProcessResourcesTasks()) {
        filter(ReplaceTokens,
                tokens: [myAppVersion: MY_APP_VERSION])
    }
}

def allProcessResourcesTasks() {
    sourceSets*.processResourcesTaskName.collect {
        tasks[it]
    }
}
and my MY_APP_VERSION variable is defined in the top-level build.gradle file:
ext {
    // Application release version.
    // It is used in the ZIP file name and is shown in the "About" dialog.
    MY_APP_VERSION = "1.0.0-SNAPSHOT"
}
and my resource file is in /webshared/src/main/resources/version.properties:
# Do NOT set application version here, set it in "build.gradle" file
# This file is transformed/populated during the Gradle build.
version=#myAppVersion#
I took your first attempt and created a test project. I put a pom file from a Jenkins plugin in ./src/main/resources/META-INF/; I assume it is a good enough XML example. I changed the artifactId line to look like the following:
<artifactId>${artifactId}</artifactId>
My build.gradle:
apply plugin: 'java'

jar {
    expand project.properties
}
When I ran gradle jar for the first time, it exploded because I forgot to define a value for the property. My second attempt succeeded with the following command line:
gradle jar -PartifactId=WhoCares
For testing purposes I just defined the property using -P. I'm not sure how you are trying to define your property, but perhaps that is the missing piece. Without seeing the stacktrace of your exception it's hard to know for sure, but the above example worked perfectly for me and seems to solve your problem.