Let's say I have the following in my gradle build script:
configurations.all {
    resolutionStrategy {
        failOnVersionConflict()
        force 'com.google.guava:guava:18.0'
    }
}
This will fail if more than one version of a jar is found, except for Guava, where the version is forced to 18.0.
Now let's imagine that I want failOnVersionConflict() for all external jars, so that I'm forced to add an explicit force clause for each conflict (and therefore know what I'm doing), but I want to use the default resolution strategy (newest version) for one specific group, such as com.mycompany.
Is such a thing possible?
I was looking at this documentation page:
https://gradle.org/docs/current/dsl/org.gradle.api.artifacts.ResolutionStrategy.html
I found my own answer... It involves a bit of "hacking", but, after all, that's just what Gradle offers...
def dependencyErrors = 0
configurations.all {
    resolutionStrategy {
        def thirdPartyPackages = [:]
        def forced = [
            'com.google.guava:guava' : '18.0'
            //Here all forced dependencies...
        ]
        eachDependency { DependencyResolveDetails details ->
            if (!details.requested.group.startsWith('com.mycompany')) {
                def key = details.requested.group + ":" + details.requested.name
                if (!thirdPartyPackages.containsKey(key)) {
                    if (forced.containsKey(key)) {
                        details.useVersion forced.get(key)
                    } else {
                        thirdPartyPackages.put(key, details.requested.version)
                    }
                } else {
                    def existing = thirdPartyPackages.get(key)
                    if (existing != details.requested.version) {
                        logger.error "Conflicting versions for [$key]"
                        logger.error "  [$existing]"
                        logger.error "  [$details.requested.version]"
                        dependencyErrors++
                    }
                }
            }
        }
    }
}
myTask.doFirst {
    //here it might also be doLast, or whatever you need. I just put it before a war task, but it might depend on each need.
    if (dependencyErrors > 0) {
        ant.fail 'There are ' + dependencyErrors + ' conflicting jar versions in the build.'
    }
}
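As an aside, instead of ant.fail the build can also be failed with a GradleException, which is the more idiomatic Gradle way; a minimal variant of the same check:
myTask.doFirst {
    if (dependencyErrors > 0) {
        // GradleException is auto-imported in build scripts and fails the build with this message
        throw new GradleException("There are ${dependencyErrors} conflicting jar versions in the build.")
    }
}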
Consider this toy build.gradle file:
plugins {
    id "java"
}
import org.gradle.internal.os.OperatingSystem
apply plugin: 'java'
repositories {
    mavenLocal()
    mavenCentral()
}
def pathExists(pathname) {
    // Returns true iff pathname is an existing file or directory
    try {
        // This may throw an error for a Windows pathname, c:/path/to/thing
        if (file(pathname).exists()) {
            return true
        }
    } catch (GradleException e) {
        // I don't care
    }
    if (OperatingSystem.current().isWindows()) {
        try {
            // If we're on Windows, try to make c:/path/to/thing work
            if (file("file:///${pathname}").exists()) {
                return true
            }
        } catch (GradleException e) {
            // I don't care
        }
    }
    return false
}
def someVariable = "absent"
if (pathExists("/tmp")) {
    someVariable = "present"
}
task someCommonTask() {
    doLast {
        println("Did some setup stuff: ${someVariable}")
    }
}
task someATask(dependsOn: ["someCommonTask"]) {
    doLast {
        println("A: ${someVariable}")
    }
}
task someBTask(dependsOn: ["someCommonTask"]) {
    def otherVariable = someVariable == "absent" || pathExists("/etc")
    doLast {
        println("B: ${otherVariable}")
    }
}
Is it possible to reorganize this build file so that someATask is in a.gradle and someBTask is in b.gradle? I've made some attempts to use apply from: without success, and if the answer is on this page: https://docs.gradle.org/current/userguide/organizing_gradle_projects.html it eludes me.
In the real project, there would be a couple of dozen tasks in each of those subordinate files. The goal of dividing up the build.gradle file isn't to have them be subprojects, per se; they're just logical groupings of tasks.
This split can be done using apply from: ... to pull in the separate files.
E.g.
build.gradle:
plugins {
    id "java"
}
apply plugin: 'java'
repositories {
    mavenLocal()
    mavenCentral()
}
ext {
    someVariable = "absent" // XXX
}
apply from: "setup.gradle" // XXX
apply from: "common.gradle" // XXX
task someATask(dependsOn: ["someCommonTask"]) {
    doLast {
        println("A: ${someVariable}")
    }
}
setup.gradle:
task setup() {
    someVariable = "present"
}
common.gradle:
task someCommonTask(dependsOn: ["setup"]) {
    doLast {
        println("Did some setup stuff: ${someVariable}")
    }
}
# gradle someATask
> Task :someCommonTask
Did some setup stuff: present
> Task :someATask
A: present
BUILD SUCCESSFUL in 403ms
My guess is that the problems you ran into come from top-level code in the applied scripts and from using script-local variables instead of extra (ext) properties; a sketch of splitting out the remaining tasks follows below.
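To push the split all the way to the question's a.gradle and b.gradle (hypothetical file names from the question), the same pattern applies: the tasks read the shared value through the project's ext property, and any script-level helper such as pathExists must also be exposed via ext (for example ext.pathExists = this.&pathExists in build.gradle), because methods defined in one script are not visible from applied scripts. A rough sketch:
a.gradle:
task someATask(dependsOn: ["someCommonTask"]) {
    doLast {
        // someVariable resolves to the ext property declared in build.gradle
        println("A: ${someVariable}")
    }
}
b.gradle:
task someBTask(dependsOn: ["someCommonTask"]) {
    // both someVariable and pathExists come from build.gradle's ext block
    def otherVariable = someVariable == "absent" || pathExists("/etc")
    doLast {
        println("B: ${otherVariable}")
    }
}
with build.gradle additionally containing:
apply from: "a.gradle"
apply from: "b.gradle"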
I'm trying to create a task that deletes all the old versions of my dependency. I got it working, but the problem is that my dependency version uses '.+' for the build number, so when I read it, all the folders get deleted instead of only the older ones. I currently have this:
task cleanTerraCore(type: Delete) {
    doLast {
        def dirName = new File("${gradle.gradleUserHomeDir}/caches/minecraft/deobfedDeps/deobf/terrails/terracore/TerraCore")
        dirName.eachDir { dir ->
            project.configurations.deobfCompile.dependencies.each {
                System.out.println(it.version)
                if (dir.name.contains("SNAPSHOT") && it.name.contains("TerraCore")) {
                    if (!dir.name.contains(it.version)) {
                        delete(dir)
                    }
                }
            }
        }
    }
}

dependencies {
    deobfCompile("terrails.terracore:TerraCore:" + getMajorMC() + "-" + "${terracore_version}-SNAPSHOT.+")
}
'it.version' always prints out 'SNAPSHOT.+', so I'm not sure how to handle this. Could I somehow efficiently check for the biggest build number?
Figured it out after a couple of hours. It now gets the latest version without the '+' symbol:
task cleanTerra(type: Delete) {
    doLast {
        def dirName = new File("${gradle.gradleUserHomeDir}/caches/minecraft/deobfedDeps/deobf/terrails/terracore/TerraCore")
        dirName.eachDir { dir ->
            configurations.deobfCompile.resolvedConfiguration.firstLevelModuleDependencies.each {
                System.out.println(it.moduleVersion + ", ${it.moduleName}")
                if (dir.name.contains("SNAPSHOT") && it.moduleName.contains("TerraCore")) {
                    if (!dir.name.contains(it.moduleVersion)) {
                        delete(dir)
                    }
                }
            }
        }
    }
}
So the println now prints out the full resolved version (currently "1.12-2.1.10-SNAPSHOT.7") and the name of the dependency, "TerraCore".
In my project, a dependency I'm using has many similar version names, e.g. 1.0.0, 1.0.0-dev, 1.0.0-dev2... Is there a way to list all versions starting with 1.0.0 and select the interesting version from that list?
I was thinking about resolutionStrategy, but it doesn't provide a list of the possible versions.
You could do this:
configurations.all {
    resolutionStrategy {
        List<DependencyResolveDetails> drdList = []
        eachDependency { DependencyResolveDetails details ->
            if (details.requested.group == 'foo' && details.requested.name == 'bar') {
                drdList << details
            }
        }
        if (drdList.size() > 1) {
            List<String> versionOptions = drdList*.requested*.version
            String selectedVersion = selectBestVersion(versionOptions) // TODO: implement
            drdList.each { DependencyResolveDetails details ->
                if (details.requested.version != selectedVersion) {
                    details.useVersion(selectedVersion).because("I picked $selectedVersion using a custom strategy")
                }
            }
        }
    }
}
Perhaps you could create a plugin for this so that applying a custom strategy for a group/name combination is a bit cleaner, possibly by registering a Comparator<String> for that combination. E.g.:
apply plugin: 'x.y.custom-version-strategy'

Comparator<String> customVersionComparator = { String version1, String version2 -> ... } as Comparator<String>

customVersionStrategy {
    strategy 'foo', 'bar', customVersionComparator
}
It looks like I have finally found a solution:
configurations.all {
    resolutionStrategy {
        componentSelection {
            all { ComponentSelection selection ->
                if (notInteresting(selection.candidate.version)) {
                    selection.reject("")
                }
            }
        }
    }
}
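For illustration only, a hypothetical notInteresting predicate matching the versioning scheme from the question (1.0.0, 1.0.0-dev, 1.0.0-dev2, ...) might reject the -dev pre-releases; the exact rule obviously depends on which versions you consider interesting:
// hypothetical helper: treat any "-dev" / "-devN" suffix as not interesting
def notInteresting(String version) {
    return version ==~ /.*-dev\d*/
}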
Inspired by this neat TestNG task, and this SO question, I thought I'd whip up something quick for re-running only the failed JUnit tests from Gradle.
But after searching around for a while, I couldn't find anything analogous that was quite as convenient.
I came up with the following, which seems to work pretty well and adds a <testTaskName>Rerun task for each task of type Test in my project.
import static groovy.io.FileType.FILES
import java.nio.file.Files
import java.nio.file.Paths

// And add a task for each test task to rerun just the failing tests
subprojects {
    afterEvaluate { subproject ->
        // Need to store tasks in static temp collection, else new tasks will be picked up by live collection leading to StackOverflow
        def testTasks = subproject.tasks.withType(Test)
        testTasks.each { testTask ->
            task "${testTask.name}Rerun"(type: Test) {
                group = 'Verification'
                description = "Re-run ONLY the failing tests from the previous run of ${testTask.name}."
                // Depend on anything the existing test task depended on
                dependsOn testTask.dependsOn
                // Copy runtime setup from existing test task
                testClassesDirs = testTask.testClassesDirs
                classpath = testTask.classpath
                // Check the output directory for failing tests
                File textXMLDir = subproject.file(testTask.reports.junitXml.destination)
                logger.info("Scanning: $textXMLDir for failed tests.")
                // Find all failed classes
                Set<String> allFailedClasses = [] as Set
                if (textXMLDir.exists()) {
                    textXMLDir.eachFileRecurse(FILES) { f ->
                        // See: http://marxsoftware.blogspot.com/2015/02/determining-file-types-in-java.html
                        String fileType
                        try {
                            fileType = Files.probeContentType(f.toPath())
                        } catch (IOException e) {
                            logger.debug("Exception when probing content type of: $f.", e)
                            // Couldn't determine this to be an XML file. That's fine, skip this one.
                            return
                        }
                        logger.debug("Filetype of: $f is $fileType.")
                        if (['text/xml', 'application/xml'].contains(fileType)) {
                            logger.debug("Found testsuite file: $f.")
                            def testSuite = new XmlSlurper().parse(f)
                            def failedTestCases = testSuite.testcase.findAll { testCase ->
                                testCase.children().find { it.name() == 'failure' }
                            }
                            if (!failedTestCases.isEmpty()) {
                                logger.info("Found failures in file: $f.")
                                failedTestCases.each { failedTestCase ->
                                    def className = failedTestCase['@classname']
                                    logger.info("Failure: $className")
                                    allFailedClasses << className.toString()
                                }
                            }
                        }
                    }
                }
                if (!allFailedClasses.isEmpty()) {
                    // Re-run all tests in any class with any failures
                    allFailedClasses.each { c ->
                        def testPath = c.replaceAll('\\.', '/') + '.class'
                        include testPath
                    }
                    doFirst {
                        logger.warn('Re-running the following tests:')
                        allFailedClasses.each { c ->
                            logger.warn(c)
                        }
                    }
                }
                outputs.upToDateWhen { false } // Always attempt to re-run failing tests
                // Only re-run if there were any failing tests, else just print warning
                onlyIf {
                    def shouldRun = !allFailedClasses.isEmpty()
                    if (!shouldRun) {
                        logger.warn("No failed tests found for previous run of task: ${subproject.path}:${testTask.name}.")
                    }
                    return shouldRun
                }
            }
        }
    }
}
Is there any easier way to do this from Gradle? Is there any way to get JUnit to output a consolidated list of failures somehow so I don't have to slurp the XML reports?
I'm using JUnit 4.12 and Gradle 4.5.
Here is one way to do it. The full file will be listed at the end, and is available here.
Part one is to write a small file (called failures) for every failed test:
test {
    // `failures` is defined elsewhere, see below
    afterTest { desc, result ->
        if ("FAILURE" == result.resultType as String) {
            failures.withWriterAppend {
                it.write("${desc.className},${desc.name}\n")
            }
        }
    }
}
In part two, we use a test filter (doc here) to restrict the tests to any that are present in the failures file:
def failures = new File("${projectDir}/failures.log")
def failedTests = []
if (failures.exists()) {
    failures.eachLine { line ->
        def tokens = line.split(",")
        failedTests << tokens[0]
    }
}
failures.delete()

test {
    filter {
        failedTests.each {
            includeTestsMatching "${it}"
        }
    }
    // ...
}
The full file is:
apply plugin: 'java'

repositories {
    jcenter()
}

dependencies {
    testCompile('junit:junit:4.12')
}

def failures = new File("${projectDir}/failures.log")
def failedTests = []
if (failures.exists()) {
    failures.eachLine { line ->
        def tokens = line.split(",")
        failedTests << tokens[0]
    }
}
failures.delete()

test {
    filter {
        failedTests.each {
            includeTestsMatching "${it}"
        }
    }
    afterTest { desc, result ->
        if ("FAILURE" == result.resultType as String) {
            failures.withWriterAppend {
                it.write("${desc.className},${desc.name}\n")
            }
        }
    }
}
The Test Retry Gradle plugin is designed to do exactly this. It will rerun each failed test a certain number of times, with the option of failing the build if too many failures have occurred overall.
plugins {
    id 'org.gradle.test-retry' version '1.2.1'
}

test {
    retry {
        maxRetries = 3
        maxFailures = 20 // Optional attribute
    }
}
I want to be able to download the dependency artifacts manually in the future, once Gradle has all of them available, so I would like to get the URLs Gradle used to download those artifacts.
Is there a way to get the URLs of the dependency artifacts that Gradle has downloaded?
Using Gson as an example:
dependencies {
    // https://mvnrepository.com/artifact/com.google.code.gson/gson
    compile 'com.google.code.gson:gson:2.8.6'
}
Create a task to print the URL:
task getURLofDependencyArtifact() {
    doFirst {
        project.configurations.compile.dependencies.each { dependency ->
            for (ArtifactRepository repository : project.repositories.asList()) {
                def url = repository.properties.get('url')
                //https://repo.maven.apache.org/maven2/com/google/code/gson/gson/2.8.6/gson-2.8.6.jar
                def jarUrl = String.format("%s%s/%s/%s/%s-%s.jar", url.toString(),
                        dependency.group.replace('.', '/'), dependency.name, dependency.version,
                        dependency.name, dependency.version)
                try {
                    def jarfile = new URL(jarUrl)
                    def inStream = jarfile.openStream()
                    if (inStream != null) {
                        println(String.format("%s:%s:%s", dependency.group, dependency.name, dependency.version)
                                + " -> " + jarUrl)
                        return
                    }
                } catch (Exception ignored) {
                }
            }
        }
    }
}
Run ./gradlew getURLofDependencyArtifact:
> Task :getURLofDependencyArtifact
com.google.code.gson:gson:2.8.6 -> https://jcenter.bintray.com/com/google/code/gson/gson/2.8.6/gson-2.8.6.jar
PS: the result depends on your project's repositories, e.g.:
repositories {
    jcenter()
    mavenCentral()
}
so the result may instead be:
> Task :getURLofDependencyArtifact
com.google.code.gson:gson:2.8.6 -> https://repo.maven.apache.org/maven2/com/google/code/gson/gson/2.8.6/gson-2.8.6.jar
Using Gradle 6.0 or above, another way of outputting the URLs is to combine --refresh-dependencies with --info:
// bash/terminal
./gradlew --info --refresh-dependencies
// cmd
gradlew --info --refresh-dependencies
or output to a file:
// bash/terminal
./gradlew --info --refresh-dependencies > urls.txt
// cmd
gradlew --info --refresh-dependencies > urls.txt
A note on --refresh-dependencies:
It’s a common misconception to think that using --refresh-dependencies
will force download of dependencies. This is not the case: Gradle will
only perform what is strictly required to refresh the dynamic
dependencies. This may involve downloading new listing or metadata
files, or even artifacts, but if nothing changed, the impact is
minimal.
source: https://docs.gradle.org/current/userguide/dependency_management.html
see also: How can I force gradle to redownload dependencies?
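If the goal is to make Gradle re-check dynamic or changing versions without passing --refresh-dependencies every time, the resolution strategy cache settings can be tuned instead; a minimal sketch (the zero timeouts are only for illustration and make every build hit the repositories):
configurations.all {
    resolutionStrategy {
        // always re-check dynamic version listings such as 1.+
        cacheDynamicVersionsFor 0, 'seconds'
        // always re-check changing modules such as -SNAPSHOT artifacts
        cacheChangingModulesFor 0, 'seconds'
    }
}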
I wanted something similar, but on Android with the Kotlin DSL, so based on @andforce's answer I developed this, which will hopefully be useful to others as well:
import org.jetbrains.kotlin.utils.addToStdlib.firstNotNullResult
import java.net.URL

val dependenciesURLs: Sequence<Pair<String, URL?>>
    get() = project.configurations.getByName(
        "implementation"
    ).dependencies.asSequence().mapNotNull {
        it.run { "$group:$name:$version" } to project.repositories.mapNotNull { repo ->
            (repo as? UrlArtifactRepository)?.url
        }.flatMap { repoUrl ->
            "%s/%s/%s/%s/%s-%s".format(
                repoUrl.toString().trimEnd('/'),
                it.group?.replace('.', '/') ?: "", it.name, it.version,
                it.name, it.version
            ).let { x -> listOf("$x.jar", "$x.aar") }
        }.firstNotNullResult { url ->
            runCatching {
                val connection = URL(url).openConnection()
                connection.getInputStream() ?: throw Exception()
                connection.url
            }.getOrNull()
        }
    }

tasks.register("printDependenciesURLs") {
    doLast {
        dependenciesURLs.forEach { (dependency: String, url: URL?) -> println("$dependency => $url") }
    }
}
Update: it might not be able to find indirect (transitive) dependencies, however; a possible way to handle those is sketched after the snippet below.
We also need to take care of aar artifacts:
project.configurations.getByName(
        "implementation"
).dependencies.each { dependency ->
    for (ArtifactRepository repository : rootProject.repositories.asList()) {
        def url = repository.properties.get('url')
        def urlString = url.toString().endsWith("/") ? url.toString() : url.toString() + "/"
        def jarUrl = String.format("%s%s/%s/%s/%s-%s.jar", urlString,
                dependency.group.replace('.', '/'), dependency.name, dependency.version,
                dependency.name, dependency.version)
        def aarUrl = String.format("%s%s/%s/%s/%s-%s.aar", urlString,
                dependency.group.replace('.', '/'), dependency.name, dependency.version,
                dependency.name, dependency.version)
        try {
            def jarfile = new URL(jarUrl)
            def inStreamJar = jarfile.openStream()
            if (inStreamJar != null) {
                println(String.format("%s:%s:%s", dependency.group, dependency.name, dependency.version)
                        + " -> " + jarUrl)
                return
            }
        } catch (Exception ignored) {
        }
        try {
            def aarfile = new URL(aarUrl)
            def inStreamAar = aarfile.openStream()
            if (inStreamAar != null) {
                println(String.format("%s:%s:%s", dependency.group, dependency.name, dependency.version)
                        + " -> " + aarUrl)
                return
            }
        } catch (Exception ignored) {
        }
    }
}
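To address the transitive-dependency limitation mentioned in the update above, one rough approach is to iterate over the resolved artifacts of a resolvable configuration instead of the declared dependencies. A minimal Groovy sketch, assuming a resolvable configuration name such as runtimeClasspath (implementation itself is not resolvable; on Android it would be a variant classpath like debugRuntimeClasspath) and the standard Maven repository layout:
task printResolvedDependencyURLs {
    doLast {
        // resolvedArtifacts includes transitive modules, unlike the declared dependency list
        def artifacts = configurations.getByName("runtimeClasspath")
                .resolvedConfiguration.resolvedArtifacts
        artifacts.each { artifact ->
            def id = artifact.moduleVersion.id
            def ext = artifact.extension ?: 'jar'
            project.repositories.findAll { it.hasProperty('url') }.each { repo ->
                def base = repo.url.toString().endsWith('/') ? repo.url.toString() : repo.url.toString() + '/'
                // candidate URL following the Maven repository layout (classifiers are not handled here)
                def candidate = String.format("%s%s/%s/%s/%s-%s.%s", base,
                        id.group.replace('.', '/'), id.name, id.version,
                        id.name, id.version, ext)
                try {
                    new URL(candidate).openStream().close()
                    // prints one line per repository that actually hosts the artifact
                    println("${id.group}:${id.name}:${id.version} -> ${candidate}")
                } catch (Exception ignored) {
                    // not available in this repository; try the next one
                }
            }
        }
    }
}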