Although similar questions already exist on the internet, I am unable to find any solution to this problem on the Android platform. My project has multiple product flavors and uses Kotlin and Hilt. I believe bytecode transformation during compilation is the root cause of the mismatch.
I first thought Hilt was probably injecting code into classes, so I made sure to copy the classes into a separate folder before the Hilt tasks executed, and then used those classes as the source for JaCoCo. But it didn't work.
Error
[ant:jacocoReport] Classes in bundle 'app' do not match with execution data. For report generation the same class files must be used as at runtime.
[ant:jacocoReport] Execution data for class com/company/myapp/Data$callApi$1 does not match.
[ant:jacocoReport] Execution data for class com/company/myapp/factory/SomeFactory$SomeGenerator does not match.
and the list continues for a whole bunch of classes in the app. Because of these errors, code coverage is always zero, even though the app already has plenty of unit tests.
gradle.projectsEvaluated {
    def hiltTaskPattern = ~/\bhilt.*\w+FlavorADebug\b/
    def tasksList = getSubprojects()
            .collect { it.tasks }
            .flatten()
    def copyFilesTask = tasksList.find { it.name == "copyClassFilesForJacoco" }
    if (copyFilesTask != null) {
        tasksList.findAll { hiltTaskPattern.matcher(it.name).matches() }
                .each { it.dependsOn copyFilesTask }
    }
}
task copyClassFilesForJacoco(dependsOn: "compileDebugJavaWithJavac", type: Copy) {
    def javaDebugTree = fileTree(dir: "${buildDir}/intermediates/javac/flavorADebug/classes", excludes: androidFilesExcluded)
    def kotlinDebugTree = fileTree(dir: "${buildDir}/tmp/kotlin-classes/flavorADebug", excludes: androidFilesExcluded)

    from javaDebugTree, kotlinDebugTree
    into layout.buildDirectory.dir(classFilesPathForInstrumentation)

    doLast {
        println("Files copied to ${classFilesPathForInstrumentation}")
    }
}
Then, in the testCoverage task (of type JacocoReport), classDirectories points to the copied files.
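For reference, that wiring looks roughly like the following. This is only a sketch: `classFilesPathForInstrumentation` and `androidFilesExcluded` come from the snippets above, but the `testFlavorADebugUnitTest` dependency, the source paths, and the execution-data pattern are assumptions.

```groovy
task testCoverage(type: JacocoReport, dependsOn: "testFlavorADebugUnitTest") {
    reports {
        xml.required = true
        html.required = true
    }
    // Use the copied (pre-transformation) class files as the basis for the
    // report instead of the transformed originals.
    classDirectories.setFrom(fileTree(
            dir: layout.buildDirectory.dir(classFilesPathForInstrumentation),
            excludes: androidFilesExcluded))
    sourceDirectories.setFrom(files("src/main/java", "src/main/kotlin"))
    executionData.setFrom(fileTree(dir: buildDir, includes: ["**/*.exec", "**/*.ec"]))
}
```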
Also, kotlinx-kover seemed interesting, but it is still in an early state, lacks support for multiple product flavors, and has outdated documentation, which makes its usage unfavourable.
This answer explains the reason for the problem very well and also provides a potential solution. But it is old and not applicable to an Android project, since it uses the java plugin and the jacoco-ant agent, which are not compatible with Android.
Can someone point me towards potential solutions for the aforementioned problem? TIA
Related
I'm working with a large, multi-module Android application, and I'm trying to define a Gradle task that collects the jars of all runtime dependencies. I'm trying something like this in app/build.gradle:
task collectDeps {
    doLast {
        configurations.releaseRuntimeClasspath.resolvedConfiguration.resolvedArtifacts.each {
            // do stuff
        }
    }
}
I've used this snippet in the past on other Java projects, so I know that it conceptually works; this is just my first time trying it on a project with multiple build types and/or variants.
When run on the Android project, executing this task throws a variant resolution error:
Execution failed for task ':app:collectDeps'.
> Could not resolve all dependencies for configuration ':app:releaseRuntimeClasspath'.
> The consumer was configured to find a runtime of a component, preferably optimized for Android, as well as attribute 'com.android.build.api.attributes.BuildTypeAttr' with value 'release', attribute 'com.android.build.api.attributes.AgpVersionAttr' with value '7.1.1', attribute 'org.jetbrains.kotlin.platform.type' with value 'androidJvm'. However we cannot choose between the following variants of project :myModule:
- Configuration ':myModule:releaseRuntimeElements' variant android-aar-metadata declares a runtime of a component, preferably optimized for Android, as well as attribute 'com.android.build.api.attributes.AgpVersionAttr' with value '7.1.1', attribute 'com.android.build.api.attributes.BuildTypeAttr' with value 'release', attribute 'org.jetbrains.kotlin.platform.type' with value 'androidJvm':
- Unmatched attributes:
- Provides attribute 'artifactType' with value 'android-aar-metadata' but the consumer didn't ask for it
- Provides attribute 'com.android.build.gradle.internal.attributes.VariantAttr' with value 'release' but the consumer didn't ask for it
- Provides a library but the consumer didn't ask for it
- Configuration ':myModule:releaseRuntimeElements' variant android-art-profile declares a runtime of a component, preferably optimized for Android, as well as attribute 'com.android.build.api.attributes.AgpVersionAttr' with value '7.1.1', attribute 'com.android.build.api.attributes.BuildTypeAttr' with value 'release', attribute 'org.jetbrains.kotlin.platform.type' with value 'androidJvm':
- Unmatched attributes:
- Provides attribute 'artifactType' with value 'android-art-profile' but the consumer didn't ask for it
- Provides attribute 'com.android.build.gradle.internal.attributes.VariantAttr' with value 'release' but the consumer didn't ask for it
- Provides a library but the consumer didn't ask for it
I've trimmed the error for brevity; there are ~20 variants in total. Note that myModule is a project dependency of the top-level app; if I remove that dependency, the error is the same but comes from a different module.
I should also note here that every other build target works fine; the application is quite mature, and the only change I've made is to add this new task to app/build.gradle. So I assume there's something about the way I'm resolving the dependencies that Gradle doesn't like, but I'm struggling to figure out what, or how to resolve it.
Googling this error is not very helpful; the Gradle documentation is quite vague about exactly how to resolve variants, and the solutions that are provided seem to focus on changing how dependencies are added to the project; but I don't necessarily want to do that, because the build works fine for every other use case.
Ideally, I'd like to be able to force a variant for the resolution specifically within my collectDeps task (in fact, ideally collectDeps would be defined in a plugin). Is this possible to do?
In case it matters, the build is using Gradle 7.2 and v7.1.1 of the Android Gradle Plugin.
There may be a better way to handle this, but I ultimately managed to resolve my problem by taking inspiration from Sonatype's open source Nexus scanning plugin. The code looks like this (it's in Kotlin, but can be adapted to Groovy without much difficulty):
project.allprojects.forEach { project ->
    val cfg = project.configurations.getByName("releaseRuntimeClasspath")
    try {
        cfg.resolvedConfiguration.resolvedArtifacts.forEach {
            // do stuff
        }
    } catch (e: Exception) {
        when (e) {
            is ResolveException, is AmbiguousVariantSelectionException -> {
                val copyConfiguration = createCopyConfiguration(project)
                cfg.allDependencies.forEach {
                    if (it is ProjectDependency) {
                        project.evaluationDependsOn(it.dependencyProject.path)
                    } else {
                        copyConfiguration.dependencies.add(it)
                    }
                }
                copyConfiguration.resolvedConfiguration.resolvedArtifacts.forEach {
                    // do stuff
                }
            }
            else -> throw e
        }
    }
}
private fun createCopyConfiguration(project: Project): Configuration {
    var configurationName = "myCopyConfiguration"
    var i = 0
    while (project.configurations.findByName(configurationName) != null) {
        configurationName += i
        i++
    }
    val copyConfiguration = project.configurations.create(configurationName)
    copyConfiguration.attributes {
        val factory = project.objects
        this.attribute(Usage.USAGE_ATTRIBUTE, factory.named(Usage::class.java, Usage.JAVA_RUNTIME))
    }
    return copyConfiguration
}
The basic idea is that, if a configuration can't be resolved because of ambiguous variant selection, I create and inject a new parent configuration that specifies the attribute org.gradle.usage='java-runtime'; this is sufficient to disambiguate the variants.
Note that I didn't test this with any other attributes, so it's possible that it could also work by setting, for example, the artifactType attribute instead; but my use case relates specifically to the runtime classpath, so this worked for me.
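For anyone working in Groovy rather than Kotlin, the core of createCopyConfiguration translates to something like this (my translation, untested):

```groovy
// Groovy sketch: a fresh configuration whose Usage attribute is pinned to
// java-runtime, which is what disambiguates the variant selection above.
def copyConfiguration = project.configurations.create('myCopyConfiguration')
copyConfiguration.attributes {
    attribute(Usage.USAGE_ATTRIBUTE,
              project.objects.named(Usage, Usage.JAVA_RUNTIME))
}
```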
I have a Gradle build script that successfully builds my project and compiles all the artifacts I need.
However, in a couple of cases I'd like to give other developers the option to patch some of the files. For example, in one of the archives there's an xml file with information about database hooks - some of the devs use other versions (or even engines) and need to change these before they can use the build output.
Instead of having them make changes to a version-controlled file, which they might commit by mistake, I'd like to give them the option to have a local, individual patch file which the build script applies.
In an old Ant script, we did something like this:
<target name="appcontext-patch" if="applicationContext.patch.file">
<patch patchfile="${applicationContext.patch.file}" originalfile="${dist.dir}/applicationContext.xml"/>
</target>
but I can't figure out how to do the equivalent in Gradle. Is there a better (i.e. more idiomatic) way of doing this than trying to directly convert this into a call to ant.patch?
Some context
This is how the file ends up in the archive in the first place:
into('META-INF') {
    from 'deployment', {
        include 'applicationContext.xml'
        rename { fn -> "jboss-spring.xml" }
    }
}
It would be fantabulous if I could just do something like
into('META-INF') {
    from 'deployment', {
        include 'applicationContext.xml'
        rename { fn -> "jboss-spring.xml" }
        patch 'local/applicationContext.xml.patch'
    }
}
and have the patch file applied before the file is put in the archive. I don't mind writing some code to make this possible, but I'm quite new to Gradle and I have no idea where to begin.
You should be able to translate your Ant call into Gradle pretty directly.
The Gradle docs describe how to do this generically: basically, attributes become named arguments and child tags become closures. The documentation has a bunch of good examples.
Once you have translated the Ant task, you can put it in a doFirst or doLast block on an appropriate task.
My first guess would be something like this:
apply plugin: 'java'
assemble.doFirst {
    ant.patch(patchfile: project.property('applicationContext.patch.file'),
              originalfile: "${project.property('dist.dir')}/applicationContext.xml")
}
That's untested, but I'm pretty sure it will get you started on the right path. The intent is that, just before the java plugin assembles your archive, Gradle calls a closure; in this case, the closure performs an Ant action that patches your XML.
Alternatively, you could use the copy task you already have above and tack the patch onto that.
task myCopyTask(type: Copy) {
    ...
} << {
    ant.patch(patchfile: project.property('applicationContext.patch.file'),
              originalfile: "${project.property('dist.dir')}/applicationContext.xml")
}
In this case you are writing the task yourself, and the left-shift operator (<<) is equivalent to .doLast but a whole lot cooler. (Note, though, that << was deprecated in Gradle 4.x and removed in Gradle 5.0, so on current versions use .doLast directly.) I'm not sure which method you prefer, but if you already have a copy task that gets the file there in the first place, I think doLast keeps the relevant code blocks as close to each other as possible.
RFC 5621 defines an XML patching language that uses XPath to target the location in the document to patch. It's great for tweaking config files.
There is an open source implementation in Java (Disclaimer: I am the author). It includes a filter that can be used from Gradle to patch XML files during any task that implements CopySpec. For example:
buildscript {
    repositories { jcenter() }
    dependencies { classpath "com.github.dnault:xml-patch:0.3.0" }
}

import com.github.dnault.xmlpatch.filter.XmlPatch
task copyAndPatch(type: Copy) {
    // Patch file in RFC 5621 format
    def patchPath = 'local/applicationContext-patch.xml'
    inputs.file patchPath

    into('META-INF') {
        from 'deployment', {
            include 'applicationContext.xml'
            rename { 'jboss-spring.xml' }
            filter(XmlPatch, patch: patchPath)
        }
    }
}
If you'd like to do this more on the fly, I can think of two main techniques. Both involve writing some code, but they may be more appealing to you, and I'm pretty confident Gradle doesn't have this behaviour built in anywhere.
Personally I think #1 is the better solution, since you don't need to muck around with the internals of the Copy task. A custom filter feels cleaner and more reusable.
1) Write a custom filter that you specify in your copy task. I can't help with the details of writing a custom filter, but I'd start here. You should be able to put the custom filter in buildSrc (there's lots of info about that at gradle.org), and then you simply need to import it at the top of your Gradle file. If you write it in Groovy, I think you can even just use ant.patch() again.
task copyAndPatch() {
    into('META-INF') {
        from 'deployment', {
            include 'applicationContext.xml'
            rename { fn -> "jboss-spring.xml" }
            filter(MyCustomFilterThatDoesAPatch, patchFile: 'local/applicationContext.xml.patch')
        }
    }
}
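For what it's worth, such a filter class might look roughly like the sketch below. None of this is from the original answer: the class name, the temp-file shuffle, and shelling out to a `patch` executable assumed to be on the PATH are all assumptions.

```groovy
// buildSrc/src/main/groovy/PatchFilter.groovy
// Gradle instantiates the filter with the upstream Reader, then sets the
// `patchFile` property, so the actual patching happens in the setter.
class PatchFilter extends FilterReader {

    PatchFilter(Reader input) {
        super(input)
    }

    // Called by Gradle for the `patchFile:` entry passed to filter(...)
    void setPatchFile(String patchFile) {
        // Drain the upstream content into a temp file, patch it in place with
        // the system `patch` tool, then swap in a reader over the result.
        File tmp = File.createTempFile('to-patch', '.xml')
        tmp.text = this.in.text
        ['patch', tmp.absolutePath, patchFile].execute().waitFor()
        this.in = new StringReader(tmp.text)
        tmp.delete()
    }
}
```

It would then be used from the copy spec as filter(PatchFilter, patchFile: 'local/applicationContext.xml.patch').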
2) Write a custom task. Again, I'll leave the details to the experts, but you can probably get away with subclassing the Copy task, adding a 'patch' property, and then jumping in during execution to do the dirty work.
Following on from this question.
If I have a build with two instances of the Test task, what is the best (cleanest, least code, most robust) way to completely separate those two tasks so that their outputs don't overlap?
I've tried setting their testResultsDir and testReportsDir properties, but that didn't seem to work as expected. (That is, the output got written to separate directories, but still the two tasks re-ran their respective tests with each run.)
Update for the current situation as of Gradle 1.8: the testReportDir and testResultsDir properties in dty's answer have been deprecated since Gradle 1.3. Test results are now separated automatically in the "test-results" directory, and to set different destination directories for the HTML reports, call
tasks.withType(Test) {
    reports.html.destination = file("${reporting.baseDir}/${name}")
}
Yet again, Rene has pointed me in the right direction. Thank you, Rene.
It turns out that this approach does work, but I must have been doing something wrong.
For reference, I added the following to my build after all the Test tasks had been defined:
tasks.withType(Test) {
    testReportDir = new File("${reportsDir}/${testReportDirName}/${name}")
    testResultsDir = new File("${buildDir}/${testResultsDirName}/${name}")
}
This will cause all instances of the Test task to be isolated from each other by having their task name as part of their directory hierarchy.
However, I still feel that this is a bit evil and there must be a cleaner way of achieving this that I haven't yet found!
Ingo Kegel's answer addresses only the reports directory, not the results directory. That means a test report for a particular test type could be built from more test results than just that type's. This can be addressed by setting the results directory as well.
tasks.withType(Test) {
    reports.html.destination = file("${reporting.baseDir}/${name}")
    reports.junitXml.destination = file("${testResultsDir}/${name}")
}
Just an update: the reports.html.destination assignment is deprecated.
This is the "new" way (Gradle 4.x and later):
tasks.withType(Test) {
    reports.html.setDestination file("${reporting.baseDir}/${name}")
}
I could really use some help with this!
The Gradle docs say that to make the up-to-date logic function, you just declare the task's inputs and outputs:
task transform {
    ext.srcFile = file('mountains.xml')
    ext.destDir = new File(buildDir, 'generated')
    inputs.file srcFile
    outputs.dir destDir
    // (task action omitted)
}
This is all well and good for tasks you define yourself. However, I am using the eclipse plugin to make some modifications to the .classpath file, and the up-to-date check does not work; that is, it runs the task over and over again, out of the box (at least for me). Here is what I have:
eclipse {
    classpath {
        //eclipseClasspath.inputs.file // something like this??? but what to set it to?
        //eclipseClasspath.outputs.file // here too
        file {
            withXml {
                def node = it.asNode()
                // rest of my stuff here
            }
        }
    }
}
I tried a couple of things along the lines of the two commented-out lines. Since those didn't work, I realized I didn't really have a clue and could use some help! Thanks in advance!
In my experience, the Eclipse tasks should not rerun every single time. That makes me think something is causing either the inputs or the outputs to change. If you are modifying your Eclipse project after Gradle generates it, or changing dependencies, etc., you would naturally be failing the up-to-date checks.
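One way to find out what is changing (a general Gradle diagnostic, not specific to the eclipse plugin) is to run the task with --info; Gradle then logs the reason it considers each task out of date:

```shell
# At --info level Gradle prints lines like
# "Task ':eclipseClasspath' is not up-to-date because: ..."
./gradlew eclipseClasspath --info
```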
If you really do need to force Gradle to consider it up to date every time, you might be able to get that to work with this. I'm not sure if I've ever tried using this when other outputs are already defined.
eclipseClasspath {
    outputs.upToDateWhen { true } // there isn't an equivalent for inputs
}
One important note: what you were configuring is the Eclipse model that describes your project, not the actual task itself:
eclipse { // this is the eclipse model
    classpath {
    }
}

eclipseClasspath {
    // this is a task
}

eclipseProject {
    // this is a task
}
I have a multi-project C++ Gradle build which produces a number of libraries and executables. I'm trying to get the executable (but not the library) subprojects to be compiled with a 'fingerprint' object. This works fine if I sprinkle something like this into the individual subprojects' build.gradle files:
compileMain.doFirst {
    // code to generate a 'BuildInfo.cpp' from a template;
    // it embeds the name of the executable, so it has to be generated anew for each exe
}
Following DRY principles, I'd much rather do this once and for all in the top-level build.gradle. This is my attempt to apply it to just the subprojects that use the cpp-exe plugin, following these instructions:
configure(subprojects.findAll { it.plugins.hasPlugin('cpp-exe') }) {
    compileMain.doFirst {
        // same code as above
    }
}
Alas, this never gets triggered. However, putting something like this in a less restrictive configure block demonstrates that the idea of querying for the plugin should work:
configure(subprojects.findAll { true }) {
    task mydebug << {
        if (project.plugins.hasPlugin('cpp-exe')) {
            println ">>> $project.name has it!"
        }
    }
}
Could it be that the plugins have not yet been applied to the subprojects at the time the configure closure is evaluated (in the top-level build.gradle)? Or is there a much simpler way of achieving this altogether?
You probably apply the cpp-exe plugin in the child projects' build scripts. By default, a parent build script is evaluated before its children, which explains why the parent isn't finding any projects with cpp-exe applied.
There are several ways to solve this problem. One way is to move all configuration that's specific to a cpp-exe project (like applying the plugin and adding the action) to the same spot. Either you do all such configuration from the parent build script (for example by enumerating the cpp-exe subprojects and configuring them with a single configure(cppExeProjects) { ... }), or you move the cpp-exe specific configuration into its own build script (say gradle/cpp-exe.gradle) and apply it from selected subprojects like so: apply from: "$rootDir/gradle/cpp-exe.gradle".
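To sketch the second option (the file name and its exact contents are assumptions based on the snippets in the question):

```groovy
// gradle/cpp-exe.gradle -- shared configuration for executable subprojects
apply plugin: 'cpp-exe'

compileMain.doFirst {
    // generate BuildInfo.cpp from the template here,
    // embedding this project's executable name
}
```

Each executable subproject's build.gradle would then replace its own apply plugin: 'cpp-exe' line with:

```groovy
apply from: "$rootDir/gradle/cpp-exe.gradle"
```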
Another solution is to change the evaluation order of build scripts. But I would only use this as a last resort, and it is certainly not necessary here.
Gradle 1.5 is recently out; I am not sure whether this is a new feature, but it looks like you can solve the issue by using afterEvaluate.
Take a look at section 53.6.1 in http://www.gradle.org/docs/current/userguide/build_lifecycle.html
Something like:
subprojects { subProject ->
    afterEvaluate {
        if (subProject.plugins.hasPlugin('cpp-exe')) {
            println "Project $subProject.name has plugin cpp-exe"
        }
    }
}
would give you a start.