Gradle - get URL of dependency artifact

I want to be able to download the dependency artifacts manually later, after Gradle has already fetched them, so I would like to get the URLs Gradle used to download those artifacts.
Is there a way to get the URLs of the dependency artifacts that Gradle has downloaded?

Use Gson as an example:
dependencies {
// https://mvnrepository.com/artifact/com.google.code.gson/gson
compile 'com.google.code.gson:gson:2.8.6'
}
Create a task to print the URL:
task getURLofDependencyArtifact() {
    doFirst {
        project.configurations.compile.dependencies.each { dependency ->
            for (ArtifactRepository repository : project.repositories.asList()) {
                def url = repository.properties.get('url')
                //https://repo.maven.apache.org/maven2/com/google/code/gson/gson/2.8.6/gson-2.8.6.jar
                def jarUrl = String.format("%s%s/%s/%s/%s-%s.jar", url.toString(),
                        dependency.group.replace('.', '/'), dependency.name, dependency.version,
                        dependency.name, dependency.version)
                try {
                    def jarfile = new URL(jarUrl)
                    def inStream = jarfile.openStream()
                    if (inStream != null) {
                        println(String.format("%s:%s:%s", dependency.group, dependency.name, dependency.version)
                                + " -> " + jarUrl)
                        // found it in this repository, move on to the next dependency
                        return
                    }
                } catch (Exception ignored) {
                }
            }
        }
    }
}
run ./gradlew getURLofDependencyArtifact
Task :getURLofDependencyArtifact
com.google.code.gson:gson:2.8.6 -> https://jcenter.bintray.com/com/google/code/gson/gson/2.8.6/gson-2.8.6.jar
PS: the result depends on your project's repositories:
repositories {
jcenter()
mavenCentral()
}
so the result may also be:
Task :getURLofDependencyArtifact
com.google.code.gson:gson:2.8.6 -> https://repo.maven.apache.org/maven2/com/google/code/gson/gson/2.8.6/gson-2.8.6.jar
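One possible refinement (my own suggestion, not part of the answer above): instead of opening the artifact stream just to check that it exists, you can send an HTTP HEAD request, so nothing is actually downloaded during the probe. A minimal Groovy sketch, assuming the repository is reachable over HTTP(S):
def urlExists = { String candidateUrl ->
    // open the connection and only ask for the headers (HEAD), not the body
    def connection = new URL(candidateUrl).openConnection()
    connection.requestMethod = 'HEAD'      // works because http(s) URLs give an HttpURLConnection
    connection.connectTimeout = 5000
    connection.readTimeout = 5000
    connection.responseCode in 200..299    // the request is sent when the response code is read
}
The try/catch above could then call urlExists(jarUrl) instead of opening the stream.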

With Gradle 6.0 or above, another way of outputting the URLs is to combine --refresh-dependencies with --info:
// bash/terminal
./gradlew --info --refresh-dependencies
// cmd
gradlew --info --refresh-dependencies
Or redirect the output to a file:
// bash/terminal
./gradlew --info --refresh-dependencies > urls.txt
// cmd
gradlew --info --refresh-dependencies > urls.txt
A note on --refresh-dependencies:
It’s a common misconception to think that using --refresh-dependencies
will force download of dependencies. This is not the case: Gradle will
only perform what is strictly required to refresh the dynamic
dependencies. This may involve downloading new listing or metadata
files, or even artifacts, but if nothing changed, the impact is
minimal.
source: https://docs.gradle.org/current/userguide/dependency_management.html
see also: How can I force gradle to redownload dependencies?

I wanted something similar but for Android and the Kotlin DSL, so based on andforce's answer I developed the following, which will hopefully be useful for others as well:
import org.jetbrains.kotlin.utils.addToStdlib.firstNotNullResult
import java.net.URL
val dependenciesURLs: Sequence<Pair<String, URL?>>
    get() = project.configurations.getByName(
        "implementation"
    ).dependencies.asSequence().mapNotNull {
        it.run { "$group:$name:$version" } to project.repositories.mapNotNull { repo ->
            (repo as? UrlArtifactRepository)?.url
        }.flatMap { repoUrl ->
            "%s/%s/%s/%s/%s-%s".format(
                repoUrl.toString().trimEnd('/'),
                it.group?.replace('.', '/') ?: "", it.name, it.version,
                it.name, it.version
            ).let { x -> listOf("$x.jar", "$x.aar") }
        }.firstNotNullResult { url ->
            runCatching {
                val connection = URL(url).openConnection()
                connection.getInputStream() ?: throw Exception()
                connection.url
            }.getOrNull()
        }
    }
tasks.register("printDependenciesURLs") {
    doLast {
        dependenciesURLs.forEach { (dependency: String, url: URL?) -> println("$dependency => $url") }
    }
}
Update: it might not be able to find indirect (transitive) dependencies, however.
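A possible way around that limitation (my own sketch, written in the Groovy DSL rather than Kotlin, and not part of the answer above) is to walk the resolved artifacts instead of the declared dependencies, since resolution also covers transitive modules. It assumes a resolvable configuration such as the Java plugin's runtimeClasspath; on Android it would be a variant-specific one like debugRuntimeClasspath:
tasks.register('printAllDependencyCoordinates') {
    doLast {
        def artifacts = configurations.getByName('runtimeClasspath').resolvedConfiguration.resolvedArtifacts
        artifacts.each { artifact ->
            def id = artifact.moduleVersion.id
            // same group/name/version layout that the URL building above relies on
            def path = "${id.group.replace('.', '/')}/${id.name}/${id.version}/${id.name}-${id.version}.${artifact.extension}"
            println "${id} -> ${path}"
        }
    }
}
Prepending each repository base URL to the printed path gives the candidate download URLs, including those of indirect dependencies.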

We also need to take care of AAR artifacts:
project.configurations.getByName(
        "implementation"
).dependencies.each { dependency ->
    for (ArtifactRepository repository : rootProject.repositories.asList()) {
        def url = repository.properties.get('url')
        // make sure the repository base URL ends with a slash
        def urlString = url.toString().endsWith("/") ? url.toString() : url.toString() + "/"
        def jarUrl = String.format("%s%s/%s/%s/%s-%s.jar", urlString,
                dependency.group.replace('.', '/'), dependency.name, dependency.version,
                dependency.name, dependency.version)
        def aarUrl = String.format("%s%s/%s/%s/%s-%s.aar", urlString,
                dependency.group.replace('.', '/'), dependency.name, dependency.version,
                dependency.name, dependency.version)
        try {
            def jarfile = new URL(jarUrl)
            def inStreamJar = jarfile.openStream()
            if (inStreamJar != null) {
                println(String.format("%s:%s:%s", dependency.group, dependency.name, dependency.version)
                        + " -> " + jarUrl)
                return
            }
        } catch (Exception ignored) {
        }
        try {
            def aarfile = new URL(aarUrl)
            def inStreamAar = aarfile.openStream()
            if (inStreamAar != null) {
                println(String.format("%s:%s:%s", dependency.group, dependency.name, dependency.version)
                        + " -> " + aarUrl)
                return
            }
        } catch (Exception ignored) {
        }
    }
}
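One caveat with repository.properties.get('url') (my own note, not from the answer): it assumes every declared repository exposes a URL, which is not true for flatDir and similar repositories. A hedged sketch of filtering the repository list first, assuming Gradle 6+ where UrlArtifactRepository is available:
project.repositories.each { repository ->
    if (repository instanceof UrlArtifactRepository) {
        def base = repository.url.toString()
        if (!base.endsWith('/')) {
            base += '/'
        }
        println "usable repository base URL: ${base}"
    } else {
        // e.g. a flatDir repository has no URL to probe
        println "skipping non-URL repository: ${repository.name}"
    }
}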

Related

Jacoco Kotlin DSL multi-module exclude

I am trying to exclude some files from the merged JaCoCo report. I am using:
(root Gradle file)
tasks.register<JacocoReport>("codeCoverageReport") {
    subprojects {
        val subProject = this
        subProject.plugins.withType<JacocoPlugin>().configureEach {
            subProject.tasks.matching { it.extensions.findByType<JacocoTaskExtension>() != null }.configureEach {
                val testTask = this
                sourceSets(subProject.sourceSets.main.get())
                executionData(testTask)
            }
            subProject.tasks.matching { it.extensions.findByType<JacocoTaskExtension>() != null }.forEach {
                rootProject.tasks["codeCoverageReport"].dependsOn(it)
            }
        }
    }
    reports {
        xml.isEnabled = false
        html.isEnabled = true
        csv.isEnabled = false
    }
}
And for the per-module exclusions in each module's JaCoCo report (e.g. for the common module):
tasks.withType<JacocoReport> {
    classDirectories.setFrom(
        sourceSets.main.get().output.asFileTree.matching {
            exclude(JacocoExcludes.commonModule)
        }
    )
}
This works for each module, but when I try to wire it into the root Gradle task, either the Gradle sync fails or it only adds the files from the last module. Any help?
Thanks
I had the same problem and used the following code:
tasks.jacocoTestReport {
    // tests are required to run before generating the report
    dependsOn(tasks.test)
    // print the report url for easier access
    doLast {
        println("file://${project.rootDir}/build/reports/jacoco/test/html/index.html")
    }
    classDirectories.setFrom(
        files(classDirectories.files.map {
            fileTree(it) {
                exclude("**/generated/**", "**/other-excluded/**")
            }
        })
    )
}
as suggested here : https://github.com/gradle/kotlin-dsl-samples/issues/1176#issuecomment-610643709

Gradle zip task with lazy include property includes itself

Hi, I've got this zip task which works great:
def dir = new File("${projectDir.parentFile}/test/")
task testZip(type: Zip) {
    from dir
    destinationDirectory = dir
    include 'toast/**'
    archiveFileName = 'test.zip'
}
but then when I make the include property lazy (because I need to in my real case)
def dir = new File("${projectDir.parentFile}/test/")
task testZip(type: Zip) {
    from dir
    destinationDirectory = dir
    include {
        'toast/**'
    }
    archiveFileName = 'test.zip'
}
then it creates a zip that includes everything in the folder (so the generated archive too). In this test case the inner zip is just corrupted (it doesn't grow infinitely), but in the real-world case it does make an infinite zip. (Not sure why; maybe my test case has too few or too small files.) Either way, the test case shows the problem: the generated zip contains a zip even though it should only contain the toast directory and all of its content.
How do I fix this? I need a lazy include because the directory I want to include is computed by other tasks. I get the exact same problem with Tar, except it refuses to create the archive since it would include itself.
Using exclude '*.zip' is a dumb workaround which makes the archive include other folders I don't want. I only want to include a specific folder, lazily.
Here's what the monster looks like in the real-world case. I basically need to retrieve the version of the project from Java and then use that version to name the folders I'm packaging. (I'm making a libGDX game and packaging it with a JRE using packr.) The problematic tasks are the 'makeArchive_' + platform ones.
String jumpaiVersion;
task fetchVersion(type: JavaExec) {
outputs.upToDateWhen { jumpaiVersion != null }
main = 'net.jumpai.Version'
classpath = sourceSets.main.runtimeClasspath
standardOutput = new ByteArrayOutputStream()
doLast {
jumpaiVersion = standardOutput.toString().replaceAll("\\s+", "")
}
}
def names = [
'win64' : "Jumpai-%%VERSION%%-Windows-64Bit",
'win32' : "Jumpai-%%VERSION%%-Windows-32Bit",
'linux64' : "Jumpai-%%VERSION%%-Linux-64Bit",
'linux32' : "Jumpai-%%VERSION%%-Linux-32Bit",
'mac' : "Jumpai-%%VERSION%%-Mac.app"
]
def platforms = names.keySet() as String[]
def jdks = [
'win64' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-win_x64.zip',
'win32' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-win_i686.zip',
'linux64' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-linux_x64.tar.gz',
'linux32' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-linux_i686.tar.gz',
'mac' : 'https://cdn.azul.com/zulu/bin/zulu9.0.7.1-jdk9.0.7-macosx_x64.zip'
]
def formats = [
'win64' : 'ZIP',
'win32' : 'ZIP',
'linux64' : 'TAR_GZ',
'linux32' : 'TAR_GZ',
'mac' : 'ZIP'
]
File jdksDir = new File(project.buildscript.sourceFile.parentFile.parentFile, 'out/jdks')
File gameJar = new File("${projectDir.parentFile}/desktop/build/libs/Jumpai.jar")
File gameData = new File("${projectDir.parentFile}/desktop/build/libs/Jumpai.data")
File packrDir = new File("${projectDir.parentFile}/out/packr/")
File minimalTmpDir = new File("${projectDir.parentFile}/desktop/build/libs/minimal-tmp")
task minimizeGameJar {
dependsOn ':desktop:dist'
doFirst {
minimalTmpDir.mkdirs()
copy {
from zipTree(gameJar)
into minimalTmpDir
}
for(file in minimalTmpDir.listFiles())
if(file.getName().contains("humble"))
file.delete()
}
}
task makeMinimal(type: Zip) {
dependsOn minimizeGameJar
dependsOn fetchVersion
from minimalTmpDir
include '**'
archiveFileName = provider {
"Jumpai-${->jumpaiVersion}-Minimal.jar"
}
destinationDir packrDir
doLast {
minimalTmpDir.deleteDir()
}
}
task copyGameJar(type: Copy) {
outputs.upToDateWhen { gameData.exists() }
dependsOn ':desktop:dist'
from gameJar.getAbsolutePath()
into gameData.getParentFile()
rename("Jumpai.jar", "Jumpai.data")
}
task setWindowsIcons(type: Exec) {
dependsOn fetchVersion
workingDir '.'
commandLine 'cmd', '/c', 'set_windows_icons.bat', "${->jumpaiVersion}"
}
for(platform in platforms) {
task("getJdk_" + platform) {
String url = jdks[platform]
File jdkDir = new File(jdksDir, platform + "-jdk")
File jdkFile = new File(jdkDir, url.split("/").last())
outputs.upToDateWhen { jdkFile.exists() }
doFirst {
if(!jdkDir.exists())
jdkDir.mkdirs()
if(jdkFile.exists())
{
println jdkFile.getName() + " is already present"
return
}
else
{
println "Downloading " + jdkFile.getName()
new URL(url).withInputStream {
i -> jdkFile.withOutputStream { it << i }
}
}
for(file in jdkDir.listFiles()) {
if(file.equals(jdkFile))
continue
if(file.isFile()) {
if (!file.delete())
println "ERROR: could not delete " + file.getAbsoluteFile()
} else if(!file.deleteDir())
println "ERROR: could not delete content of " + file.getAbsoluteFile()
}
if(url.endsWith(".tar.gz"))// don't mix up archive type of what we downloaded vs archive type of what we compress (in formats)
{
copy {
from tarTree(resources.gzip(jdkFile))
into jdkDir
}
}
else if(url.endsWith(".zip"))
{
copy {
from zipTree(jdkFile)
into jdkDir
}
}
}
}
File packrInDir = new File(packrDir, platform)
String platformRawName = names[platform]
task("packr_" + platform, type: JavaExec) {
outputs.upToDateWhen { new File(packrDir, platformRawName.replace("%%VERSION%%", jumpaiVersion)).exists() }
dependsOn fetchVersion
dependsOn copyGameJar
dependsOn 'getJdk_' + platform
main = 'com.badlogicgames.packr.Packr'
classpath = sourceSets.main.runtimeClasspath
args 'tools/res/packr_config/' + platform + '.json'
workingDir = project.buildscript.sourceFile.parentFile.parentFile
doLast {
File packrOutDir = new File(packrDir, platformRawName.replace("%%VERSION%%", jumpaiVersion));
packrOutDir.deleteDir()
if(packrOutDir.exists())
{
println "ERROR Could not delete packr output " + packrOutDir.getAbsolutePath()
return
}
if(!packrInDir.renameTo(packrOutDir))
println "ERROR Could not rename packr output dir for " + packrInDir.getName()
}
}
if(formats[platform] == 'ZIP')
{
task('makeArchive_' + platform, type: Zip) {
if(platform.contains("win"))
dependsOn setWindowsIcons
dependsOn fetchVersion
dependsOn 'packr_' + platform
from packrDir
destinationDirectory = packrDir
include {
platformRawName.replace("%%VERSION%%", jumpaiVersion) + "/"
}
archiveFileName = provider {
platformRawName.replace("%%VERSION%%", jumpaiVersion) + ".zip"
}
}
}
else if(formats[platform] == 'TAR_GZ')
{
task('makeArchive_' + platform, type: Tar) {
dependsOn 'packr_' + platform
from packrDir
destinationDirectory = packrDir
include {
platformRawName.replace("%%VERSION%%", jumpaiVersion) + '/**'
}
archiveFileName = provider {
platformRawName.replace("%%VERSION%%", jumpaiVersion) + ".tar.gz"
}
extension 'tar'
compression = Compression.GZIP
}
}
else
println 'Unsupported format for ' + platform
}
task deploy {
dependsOn makeMinimal
for(platform in platforms)
dependsOn 'makeArchive_' + platform
}
You can get what you want by using the doFirst method and modifying the task's properties within the passed action. (The reason your lazy include matched everything: a closure passed to include is treated as a Spec<FileTreeElement>, and your closure simply returns the string 'toast/**', which is truthy for every element, so every file, including the archive being created, gets included.)
task('makeArchive_' + platform, type: Zip) {
    if(platform.contains("win"))
        dependsOn setWindowsIcons
    dependsOn fetchVersion
    dependsOn 'packr_' + platform
    from packrDir
    destinationDirectory = packrDir
    archiveFileName = provider {
        platformRawName.replace("%%VERSION%%", jumpaiVersion) + ".zip"
    }
    doFirst {
        def includeDir = platformRawName.replace("%%VERSION%%", jumpaiVersion)
        // Include only files and directories from 'includeDir'
        include {
            it.relativePath.segments[0].equalsIgnoreCase(includeDir)
        }
    }
}
Please also have a look at this answer to a similar question. My solution is just a workaround; if you know your version at configuration time you can achieve what you want more easily. Writing your own custom tasks or plugins can also help to clean up your build script.
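Applied to the small test case from the top of the question, the same idea looks like this (my own adaptation of the answer above, not tested against the real build):
def dir = new File("${projectDir.parentFile}/test/")
task testZip(type: Zip) {
    from dir
    destinationDirectory = dir
    archiveFileName = 'test.zip'
    doFirst {
        // only keep entries whose first path segment is the wanted folder;
        // the archive being created never matches, so it cannot include itself
        include { it.relativePath.segments[0] == 'toast' }
    }
}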

why android gradle maven publish artifact bundleRelease not found

When I sync the project, Android Studio warns: could not get unknown property 'bundleRelease' for object of type org.gradle.api.publish.maven.internal.publication.DefaultMavenPublication.
I added project.afterEvaluate { /* block */ }, but it doesn't work. What should I do to set the artifact?
Android Gradle Plugin 3.3.x (at least the -alpha releases at the time of writing this answer) has a breaking change: the task bundleRelease was renamed to bundleReleaseAar.
So the solution is to use bundleReleaseAar instead of bundleRelease.
Note: "release" in the task name is the buildType/flavor combination, so it might be different in your setup.
Generic answer: bundleRelease is a task; to find its new name you can run ./gradlew tasks --all
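If you prefer to look the new name up from the build script itself instead of scanning the ./gradlew tasks --all output, here is a small sketch of my own (Groovy DSL) that prints every task whose name contains "bundle":
tasks.register('findBundleTasks') {
    doLast {
        // all task names are known by execution time, including the ones the
        // Android Gradle Plugin adds after evaluation
        tasks.names.findAll { it.toLowerCase().contains('bundle') }.each { println it }
    }
}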
So the answer from Artem Zunnatullin is correct. Just one addition: the project.afterEvaluate { } block is necessary to make it work. This information can be overlooked very easily.
Complete example:
project.afterEvaluate {
publishing {
publications {
mavenDebugAAR(MavenPublication) {
artifact bundleDebugAar
pom.withXml {
def dependenciesNode = asNode().appendNode('dependencies')
configurations.api.allDependencies.each { ModuleDependency dp ->
def dependencyNode = dependenciesNode.appendNode('dependency')
dependencyNode.appendNode('groupId', dp.group)
dependencyNode.appendNode('artifactId', dp.name)
dependencyNode.appendNode('version', dp.version)
if (dp.excludeRules.size() > 0) {
def exclusions = dependencyNode.appendNode('exclusions')
dp.excludeRules.each { ExcludeRule ex ->
def exclusion = exclusions.appendNode('exclusion')
exclusion.appendNode('groupId', ex.group)
exclusion.appendNode('artifactId', ex.module)
}
}
}
}
}
mavenReleaseAAR(MavenPublication) {
artifact bundleReleaseAar
pom.withXml {
def dependenciesNode = asNode().appendNode('dependencies')
configurations.api.allDependencies.each { ModuleDependency dp ->
def dependencyNode = dependenciesNode.appendNode('dependency')
dependencyNode.appendNode('groupId', dp.group)
dependencyNode.appendNode('artifactId', dp.name)
dependencyNode.appendNode('version', dp.version)
if (dp.excludeRules.size() > 0) {
def exclusions = dependencyNode.appendNode('exclusions')
dp.excludeRules.each { ExcludeRule ex ->
def exclusion = exclusions.appendNode('exclusion')
exclusion.appendNode('groupId', ex.group)
exclusion.appendNode('artifactId', ex.module)
}
}
}
}
}
}
repositories {
maven {
name 'nexusSnapshot'
credentials {
username '<User with deployment rights>'
password '<User password>'
}
url '<URL to nexus>'
}
maven {
name 'nexusRelease'
credentials {
username '<User with deployment rights>'
password '<User password>'
}
url '<URL to nexus>'
}
}
}
}

Defining custom ‘build’ task is deprecated when using standard lifecycle plugin has been deprecated and is scheduled to be removed in Gradle 3.0

I am facing an issue in my build.gradle script. My script basically generates the POM file and then copies the artifact to another location under build/libs with a different name. The issue I am facing is how to call the build task below, because it generates an error. I am using Gradle 2.3.
Error: "Defining custom ‘build’ task is deprecated when using standard lifecycle plugin has been deprecated and is scheduled to be removed in Gradle 3.0"
My build task builds the artifact, then generates the POM and moves the artifact to a different location, but I am getting the error above.
My full script is:
apply plugin: 'cpp'
apply plugin: 'java'
//-- set the group for publishing
group = 'com.tr.anal'
/**
* Initializing GAVC settings
*/
def buildProperties = new Properties()
file("version.properties").withInputStream {
stream -> buildProperties.load(stream)
}
//add the jenkins build version to the version
def env = System.getenv()
if (env["BUILD_NUMBER"]) buildProperties.analBuildVersion += "_${env["BUILD_NUMBER"]}"
version = buildProperties.analBuildVersion
println "${version}"
//name is set in the settings.gradle file
group = "com.t.anal"
version = buildProperties.analBuildVersion
println "Building ${project.group}:${project.name}:${project.version}"
repositories {
maven {
url "http://cm.thon.com:900000/artifactory/libs-snapshot-local"
}
maven {
url "http://cm.thon.com:9000000/artifactory/libs-release"
}
}
dependencies {
compile ([
"com.tr.anal:analytics-engine-common:4.+"
])
}
model {
repositories {
libs(PrebuiltLibraries) {
jdk {
headers.srcDirs "${System.properties['java.home']}/../include",
"${System.properties['java.home']}/../include/win32",
"${System.properties['java.home']}/../include/darwin",
"${System.properties['java.home']}/../include/linux"
}
}
}
}
model {
platforms {
x64 { architecture "x86_64" }
x86 { architecture "x86" }
}
}
model {
components {
main(NativeLibrarySpec) {
sources {
cpp {
source {
lib library: 'main', linkage: 'static'
lib library: 'jdk', linkage: 'api'
srcDir "src/main/c++/native"
include "**/JniSupport.cpp"
include "**/DiseaseStagingJni.cpp"
}
}
}
}
}
}
def nativeHeadersDir = file("$buildDir/nativeHeaders")
//def compilePath = configurations.compile.resolve().collect {it.absolutePath}.join(";")
binaries.all {
// Define toolchain-specific compiler and linker options
if (toolChain in Gcc) {
cppCompiler.args "-I${nativeHeadersDir}"
cppCompiler.args "-g"
linker.args '-Xlinker', '-shared -LNativeJNI/src/main/resources/DSresources/DSLib -lds64 -Wl'
}
}
//def nativeHeadersDir = file("$buildDir/nativeHeaders")
task nativeHeaders {
// def nativeHeadersDir = file("$buildDir/nativeHeaders")
def outputFile = file("$nativeHeadersDir/DiseaseStagingJniWrapper.h")
def classes = [
'com.truvenhealth.analyticsengine.common.diseasestaging.DiseaseStagingJniWrapper'
]
inputs.files sourceSets.main.output
inputs.property('classes', classes)
outputs.file outputFile
doLast {
outputFile.parentFile.mkdirs()
def compilePath = configurations.compile.resolve().collect {it.absolutePath}.join(":")
println "Using Compile Path: ${compilePath}"
exec {
executable org.gradle.internal.jvm.Jvm.current().getExecutable('javah')
args '-o', outputFile
args '-classpath', compilePath
args classes
}
}
}
tasks.withType(CppCompile) { task ->
task.dependsOn nativeHeaders
}
/*****************************
* Packaging
*****************************/
apply plugin: "maven"
// Workaround for Jenkins-Artifactory plugin not picking up the POM file
def pomFile = file("${buildDir}/libs/${archivesBaseName.toLowerCase()}-${version}.pom")
task newPom << {
pom {
project {
groupId project.group
artifactId project.name
version project.version
description = "Configuration Management Gradle Plugin"
}
}.writeTo(pomFile)
}
//disabling the install task since we're not using maven for real
install.enabled = false
//for publishing to artifactory via jenkins
if(project.hasProperty('artifactoryPublish')) {
artifactoryPublish {
mavenDescriptor pomFile
}
}
def filechange = file("build/libs/NativeJNI-${project.version}.so")
task copyfile(type: Copy) {
from 'build/binaries/mainSharedLibrary'
into 'build/libs'
include('libmain.so')
rename ('libmain.so', "$filechange")
}
//build.dependsOn copyfile
task build (dependsOn: ["newPom","copyfile"]) << {
println "build in progress"
}
def someFile = file("build/libs/NativeJNI-${project.version}.so")
artifacts {
archives someFile
}
Looks like this was a bug in 3.0,
https://github.com/GradleFx/GradleFx/issues/235
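For what it's worth, my own reading of the warning: it appears because the script re-defines the build lifecycle task that the java plugin already provides. The usual fix is to keep the standard task and only attach the extra work to it, for example:
// keep the java plugin's `build` task and hook the custom tasks onto it
build.dependsOn newPom, copyfile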

How to mix ResolutionStrategies with Gradle

Let's say I have the following in my gradle build script:
configurations.all {
    resolutionStrategy {
        failOnVersionConflict()
        force 'com.google.guava:guava:18.0'
    }
}
This will fail if more than one version of a jar is found, except for guava, which will be forced to version 18.0.
Now let's imagine that I want to have failOnVersionConflict() for all external jars, and be forced to use the force clause (so I know what I'm doing), but I want to use the default resolutionStrategy (newest version) for some specific group, like com.mycompany.
Is such a thing possible?
I was looking at this documentation page:
https://gradle.org/docs/current/dsl/org.gradle.api.artifacts.ResolutionStrategy.html
I found my own answer... but it involves a bit of "hacking". Then again, that's just what Gradle offers:
def dependencyErrors = 0
configurations.all {
    resolutionStrategy {
        def thirdPartyPackages = [:]
        def forced = [
            'com.google.guava:guava' : '18.0'
            //Here all forced dependencies...
        ]
        eachDependency { DependencyResolveDetails details ->
            if (!details.requested.group.startsWith('com.mycompany')) {
                def key = details.requested.group + ":" + details.requested.name
                if(!thirdPartyPackages.containsKey(key)) {
                    if(forced.containsKey(key)) {
                        details.useVersion forced.get(key)
                    }
                    else {
                        thirdPartyPackages.put(key, details.requested.version);
                    }
                }
                else {
                    def existing = thirdPartyPackages.get(key);
                    if(existing != details.requested.version) {
                        logger.error "Conflicting versions for [$key]"
                        logger.error "    [$existing]"
                        logger.error "    [$details.requested.version]"
                        dependencyErrors++
                    }
                }
            }
        }
    }
}
myTask.doFirst {
    //here it might also be doLast, or whatever you need. I just put it before a war task, but it might depend on each need.
    if(dependencyErrors > 0) {
        ant.fail 'There are ' + dependencyErrors + ' conflicting jar versions in the build.'
    }
}
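A small variation on the failure step (my suggestion, not the answerer's): throwing a GradleException produces a regular Gradle build failure instead of going through Ant:
myTask.doFirst {
    if (dependencyErrors > 0) {
        throw new GradleException("There are ${dependencyErrors} conflicting jar versions in the build.")
    }
}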
