In a Maven project, it is easy to add extra dependencies and include extra source directories by defining a new Maven profile.
How can I do the following in a Gradle project?
Include extra dependencies
Include an additional source directory
And, for example, use the presence of an extra property (e.g. passed on the command line) to decide whether to activate it. I am not sure what the best way is in the Gradle world.
I am not recommending this approach, but it can be done using project properties passed on the Gradle command line, a Groovy if (condition) { } block for the dependencies, and multiple source-set definitions.
On the command line:
gradle build -PbProfile=extra1
ext.buildFlag = 'default'
if (project.hasProperty('bProfile')) {
    ext.buildFlag = property('bProfile')
}
println "running profile - ${buildFlag}"

dependencies {
    // common deps
    if ("extra1".equals(buildFlag)) {
        // extra deps
    }
}

if ("extra1".equals(buildFlag)) {
    // custom source-set definition
} // more else-if branches as needed
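For example, the custom source-set branch could add an extra source directory. A minimal sketch, assuming the Java plugin is applied; the src/extra1/java path is a hypothetical convention:

if ("extra1".equals(buildFlag)) {
    sourceSets {
        main {
            java {
                // hypothetical extra source directory, compiled only for this profile
                srcDir 'src/extra1/java'
            }
        }
    }
}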
I use conditionally applied sub-configurations. This is done through the apply from: directive:
if (project.hasProperty('browsers')) {
    ext.browsers.split(',').each {
        def browser = it.trim()
        if (browser) {
            apply from: "${browser}Deps.gradle"
        }
    }
}
This block checks whether the browsers property is specified (either in gradle.properties or via the -P command-line argument). If the property is defined, I split its value on commas and apply sub-configurations whose names follow the pattern <browser>Deps.gradle.
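Each sub-configuration is just an ordinary build script applied into the project. A minimal sketch of what a hypothetical chromeDeps.gradle might contain (the dependency coordinates and version are illustrative only):

// chromeDeps.gradle (hypothetical)
dependencies {
    testImplementation 'org.seleniumhq.selenium:selenium-chrome-driver:4.11.0'
}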
The project in which I use this pattern is here
Module replacement works well in Gradle; however, it only applies when there is a conflict.
Although I understand the reason, this breaks my use case, where configurations extend one another and the conflict happens in some of them but not in others that I need to consume.
I have two special configurations and some module replacement:
configurations {
    lib      // what should be bundled
    provided // what should not be bundled
    implementation.extendsFrom(lib)
    implementation.extendsFrom(provided)
}

dependencies {
    modules {
        module('javax.annotation:javax.annotation-api') {
            replacedBy('jakarta.annotation:jakarta.annotation-api', 'Javax to Jakarta')
        }
    }
}

task collectLibs(type: Copy) {
    // bundle everything from lib which is not provided (not even transitively)
    from configurations.lib - configurations.provided
    into "$buildDir/lib"
}
I also use a company BOM, for example api platform('org.springframework.boot:spring-boot-dependencies:2.5.4'), so I don't want to specify versions anywhere in my project.
Let's assume these dependencies:
dependencies {
    lib 'javax.annotation:javax.annotation-api'
    provided 'jakarta.annotation:jakarta.annotation-api'
}
The dependencies task then correctly resolves compileClasspath and runtimeClasspath to jakarta.annotation-api; however, the collected files in build/lib contain javax.annotation-api-1.3.2.jar, even though it "should have been replaced and subtracted".
If I use dependency substitution instead, it works:
configurations.all {
    resolutionStrategy.dependencySubstitution {
        substitute module('javax.annotation:javax.annotation-api') using module('jakarta.annotation:jakarta.annotation-api:1.3.5')
    }
}
However, there I must specify a version. Is there any way to force module replacement to always apply?
My problem is caused by the subtraction; maybe there is a better way to find all dependencies that come from provided but not lib by looking at runtimeClasspath?
I tried a few things, but it gets too complicated very quickly.
I found a solution. Instead of subtracting the provided configuration, I can exclude everything from the resolved provided configuration. The tricky part is to exclude neither too much nor too little:
platforms must remain, otherwise version resolution will fail
both the requested and the selected modules must be excluded
This is not a general solution; it still requires some fiddling with the configurations (provided must declare both javax and jakarta), but it works for me.
private static excludeFromConfiguration(Configuration configuration, Configuration toExclude) {
    toExclude.incoming.resolutionResult.allDependencies.each { dep ->
        if (dep instanceof ResolvedDependencyResult && dep.requested instanceof ModuleComponentSelector) {
            def isPlatform = dep.requested.attributes.keySet().any {
                // asking for org.gradle.api.attributes.Category.CATEGORY_ATTRIBUTE does not work
                def attribute = dep.requested.attributes.getAttribute(it)
                return attribute == org.gradle.api.attributes.Category.ENFORCED_PLATFORM ||
                        attribute == org.gradle.api.attributes.Category.REGULAR_PLATFORM
            }
            if (!isPlatform) {
                // we exclude both the requested and the selected module, because there
                // could have been module replacement, dependency substitution, or capability matching
                configuration.exclude(group: dep.requested.group, module: dep.requested.module)
                configuration.exclude(group: dep.selected.moduleVersion.group, module: dep.selected.moduleVersion.name)
            }
        }
    }
}
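For reference, the wiring might look roughly like this; a sketch under the assumption that the exclusions are applied to lib before collectLibs resolves it (note that this resolves provided eagerly at configuration time):

// hypothetical call site: apply the exclusions, then collect without subtraction
excludeFromConfiguration(configurations.lib, configurations.provided)

task collectLibs(type: Copy) {
    from configurations.lib
    into "$buildDir/lib"
}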
I have a multi-project Gradle build that also contains non-Java projects.
I want to declare the artifacts created by one such project in a way that lets me use project/configuration dependencies to get them, e.g.
consumer:
dependencies {
    myConf project(path: ':producer', configuration: 'myConf')
}
What I currently have is this:
producer:
configurations {
    myConf
}

task produceFile {
    // ... somehow create the file ...
    outputs.file file('path/to/file')
}

artifacts.add('myConf', produceFile.outputs.files.singleFile, { builtBy produceFile })
Is there a better way to declare the artifact than my clumsy version?
I couldn't figure out a way to pass the task dependency from the artifact to the producing task in one go.
According to the documentation article on legacy publishing and the Javadoc on ArtifactHandler, for your simple example it should be sufficient to pass just the task, as long as the task type extends AbstractArchiveTask (e.g. Zip or Jar):
artifacts.add('myConf', produceFile)
... or in the more Gradle-ish way:
artifacts {
    myConf produceFile
}
The article mentioned above has another example, where a File is passed directly to the add method, which requires you to specify the task to build the file in the way you did in your example.
However, let me propose other syntax variants that may feel more 'lightweight':
artifacts {
    myConf(files(produceFile).singleFile) { builtBy produceFile }
    // or
    myConf file: files(produceFile).singleFile, builtBy: [produceFile]
}
These two examples use the Project.files(...) method to resolve the output(s) of the task instead of accessing them manually. The second example makes use of the map syntax often provided by Gradle.
If you want to standardize the way you publish your custom artifacts, I would propose creating a custom task type that exposes, as a method or property, any of the argument forms the ArtifactHandler can process:
class MyTaskType extends DefaultTask {
    // ... other stuff ... of course this should be part of a plugin

    def getArtifact() {
        return ... // either a (Configurable)PublishArtifact (if a constructor is available) or a map representation
    }
}

task produceFile(type: MyTaskType) {
    // configure somehow
}

artifacts {
    myConf produceFile.artifact
}
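On the consuming side, the declared artifact then resolves like any other configuration. A minimal sketch reusing the consumer snippet from the question (the printProducedFiles task is hypothetical):

configurations {
    myConf
}

dependencies {
    myConf project(path: ':producer', configuration: 'myConf')
}

task printProducedFiles {
    dependsOn configurations.myConf // ensures produceFile runs first (via builtBy)
    doLast {
        configurations.myConf.files.each { println it }
    }
}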
Background: I am running Android Studio 3.0-beta7 and trying to get a javadoc task to work for an Android library (the fact that this is not available as a ready-made task in the first place is really strange). I managed to tweak an answer to a different question (https://stackoverflow.com/a/46810617/1226020) for my needs, ending up with this code:
task javadoc(type: Javadoc) {
    failOnError false
    source = android.sourceSets.main.java.srcDirs
    // Also add the generated R class to avoid errors...
    // TODO: debug is hard-coded
    source += "$buildDir/generated/source/r/debug/"
    // ... but exclude the R classes from the docs
    excludes += "**/R.java"
    // TODO: "compile" is deprecated in Gradle 4.1,
    // but "implementation" and "api" are not resolvable :(
    classpath += configurations.compile
    afterEvaluate {
        // Wait until after evaluation to add the android classpath
        // to avoid "buildToolsVersion is not specified" error
        classpath += files(android.getBootClasspath())
        // Process AAR dependencies
        def aarDependencies = classpath.filter { it.name.endsWith('.aar') }
        classpath -= aarDependencies
        aarDependencies.each { aar ->
            System.out.println("Adding classpath for aar: " + aar.name)
            // Extract classes.jar from the AAR dependency, and add it to the javadoc classpath
            def outputPath = "$buildDir/tmp/exploded-aar/${aar.name.replace('.aar', '.jar')}"
            classpath += files(outputPath)
            // Use a task so the actual extraction only happens before the javadoc task is run
            dependsOn task(name: "extract ${aar.name}").doLast {
                extractEntry(aar, 'classes.jar', outputPath)
            }
        }
    }
}

// Utility method to extract only one entry in a zip file
private def extractEntry(archive, entryPath, outputPath) {
    if (!archive.exists()) {
        throw new GradleException("archive $archive not found")
    }
    def zip = new java.util.zip.ZipFile(archive)
    zip.entries().each {
        if (it.name == entryPath) {
            def path = new File(outputPath)
            if (!path.exists()) {
                path.getParentFile().mkdirs()
                // Surely there's a simpler is->os utility except
                // the one in java.nio.Files? Ah well...
                def buf = new byte[1024]
                def is = zip.getInputStream(it)
                def os = new FileOutputStream(path)
                def len
                while ((len = is.read(buf)) != -1) {
                    os.write(buf, 0, len)
                }
                os.close()
            }
        }
    }
    zip.close()
}
This code finds all dependency AARs, loops through them, extracts classes.jar from each, and puts them in a temp folder that is added to the classpath during javadoc generation. It basically tries to reproduce what the really old Android Gradle plugin used to do with "exploded-aar".
However, the code relies on using compile dependencies. Using api or implementation, which are recommended with Gradle 4.1, will not work, since these are not resolvable from a Gradle task.
Question: how can I get a list of dependencies declared via the api or implementation directives when, e.g., configurations.api renders a "not resolvable" error?
Bonus question: is there a new, better way to create javadocs for a library with Android Studio 3.0 that doesn't involve 100 lines of workarounds?
You can wait for this to be merged:
https://issues.apache.org/jira/browse/MJAVADOC-450
Basically, the current Maven Javadoc plugin ignores classifiers such as AAR.
I ran into the same problem when trying your answer to this question; this error message kept me from resolving the implementation dependencies:
Resolving configuration 'implementation' directly is not allowed
Then I discovered that this answer has a solution that makes resolving of the implementation and api configurations possible:
configurations.implementation.setCanBeResolved(true)
I'm not sure how dirty this workaround is, but it seems to do the trick for the javadocJar task situation.
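For the javadoc task above, that might look like the following; a sketch under the assumption that the flag is flipped before the configurations are first resolved:

// make the declaration-only configurations resolvable again
configurations.implementation.setCanBeResolved(true)
configurations.api.setCanBeResolved(true)

javadoc {
    // use them in place of the deprecated 'compile' configuration
    classpath += configurations.implementation + configurations.api
}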
I want to extract the dependencies defined in a particular Gradle configuration. My code looks like this:
project.configurations.myConfig.files.each { src ->
    logger.debug "Extracting ${src.absolutePath} to ${to}"
    project.copy {
        eachFile { fileCopyDetails ->
            logger.debug("Extracting file : ${fileCopyDetails.file.path}")
        }
        from project.zipTree(src)
        into to
    }
}
But this extracts ALL the files, including the transitive dependencies declared in the POM files. My requirement is to extract just the first-level dependencies, as declared in dependencies { myConfig ... }.
Solution 1
I tried setting transitive = false, and it works, but it breaks the build because the dependent libraries are removed from the classpath.
Solution 2
I tried creating a new configuration which is a copy of myConfig but with transitive = false, and it works.
I'm looking for a better solution where I do not have to copy the configuration.
You can make a copy of a configuration, and then set the copy to non-transitive. This way you only have to specify the dependencies once:
// This will include dependencies from superconfigurations. If that is not
// what you want, use "copy()" instead.
def nonTransitiveMyConfig = configurations.myConfig.copyRecursive()
nonTransitiveMyConfig.transitive = false
nonTransitiveMyConfig.files.each { src ->
    // ...
}
See:
copyRecursive()
copy()
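Putting it together with the extraction loop from the question might look like this (a sketch; the task name and destination directory are placeholders):

def nonTransitiveMyConfig = configurations.myConfig.copyRecursive()
nonTransitiveMyConfig.transitive = false

task extractFirstLevelDeps {
    doLast {
        // only the first-level dependencies are resolved by the non-transitive copy
        nonTransitiveMyConfig.files.each { src ->
            copy {
                from zipTree(src)
                into "$buildDir/extracted" // placeholder destination
            }
        }
    }
}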
In Gradle (1.9), I have multiple subprojects. Each one uses the application plugin to create a tar and CLI. I am trying to get all these tars into a unified tar, but I am having a lot of trouble.
Here is the tar format I am looking for:
${project.name}/${subproject.name}.tar
I have tried using both the Tar task and the distribution plugin, but with each of them I could not find a clean way to just take the generated tars (or any tar) and put them at the top level, excluding everything else.
Here is a sample using the distribution plugin, but it is not giving the output I would like:
apply plugin: 'distribution'

distributions {
    testing {
        contents {
            from(".")
            exclude "*src*"
            exclude "*idea*"
            exclude "*.jar"
            exclude ".MF"
            filesMatching("**/build/distributions/*.tar") {
                if (file.name == "${project.name}-testing.tar") {
                    exclude()
                } else {
                    name file.name
                }
            }
        }
    }
}
Here is what I would like (but it does not work):
apply plugin: 'distribution'

distributions {
    testing {
        contents {
            include "**/*.tar" // shows up at top level
        }
    }
}
EDIT:
Getting closer.
distributions {
    testing {
        contents {
            from subprojects.buildDir
            includeEmptyDirs false
            include "**/*.tar"
            exclude "**/${project.name}-testing.tar"
        }
    }
}
This will give me ${project.name}/distribution/${subproject.name}.tar
Here is the solution for your problem. Put the following in the root project:
task distTar(type: Tar) {
    destinationDir = new File("$buildDir/distributions")
    baseName = 'unifiedTar'
}

subprojects {
    // definitions common to subprojects...

    afterEvaluate {
        def distTar = tasks.findByName('distTar')
        if (distTar) {
            rootProject.distTar.dependsOn distTar
            rootProject.distTar.inputs.file distTar.archivePath
            rootProject.distTar.from distTar.archivePath
        }
    }
}
then invoke "build distTar" on the root project - it will assemble "unifiedTar.tar" in "build/distributions" subfolder (of the root project).
How it works:
"task distTar(...)" declares a new task of type Tar in the root project.
"subprojects" applies the specified closure to each subproject.
"afterEvaluate" ensures that the specified closure is called AFTER the current project (in this case the subproject) is evaluated. This is very important, because we use properties of the subproject which are defined only after its evaluation.
"tasks.findByName" lets us determine whether the given task is defined for the given project. If not, it returns null and the following code is not executed. This way we stay agnostic about the nature of the subproject.
"dependsOn" ensures that distTar of the root project depends on distTar of the given subproject (and therefore runs after it).
"inputs.file" ensures that distTar on the root project is not executed if none of the constituent tars has changed.
"from" adds the constituent tar to the unified tar.