I am trying to exclude some files from the merged JaCoCo report. I am using:
(root Gradle file)
tasks.register<JacocoReport>("codeCoverageReport") {
    subprojects {
        val subProject = this
        subProject.plugins.withType<JacocoPlugin>().configureEach {
            subProject.tasks.matching { it.extensions.findByType<JacocoTaskExtension>() != null }.configureEach {
                val testTask = this
                sourceSets(subProject.sourceSets.main.get())
                executionData(testTask)
            }
            subProject.tasks.matching { it.extensions.findByType<JacocoTaskExtension>() != null }.forEach {
                rootProject.tasks["codeCoverageReport"].dependsOn(it)
            }
        }
    }
    reports {
        xml.isEnabled = false
        html.isEnabled = true
        csv.isEnabled = false
    }
}
And for the per-module JaCoCo report exclusions (e.g. for the common module):
tasks.withType<JacocoReport> {
    classDirectories.setFrom(
        sourceSets.main.get().output.asFileTree.matching {
            exclude(JacocoExcludes.commonModule)
        }
    )
}
For each module this works on its own, but when I try to wire it into the root Gradle task, either the Gradle sync fails or the merged report only picks up the files from the last module. Any help?
Thanks
I had the same problem and used the following code:
tasks.jacocoTestReport {
    // tests are required to run before generating the report
    dependsOn(tasks.test)
    // print the report url for easier access
    doLast {
        println("file://${project.rootDir}/build/reports/jacoco/test/html/index.html")
    }
    classDirectories.setFrom(
        files(classDirectories.files.map {
            fileTree(it) {
                exclude("**/generated/**", "**/other-excluded/**")
            }
        })
    )
}
as suggested here: https://github.com/gradle/kotlin-dsl-samples/issues/1176#issuecomment-610643709
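For the merged root report, the same filtering idea can be applied while aggregating. Below is an untested sketch of how the root codeCoverageReport task could collect filtered class directories per subproject; it assumes the JacocoExcludes object from the question is visible to the root script (e.g. it lives in buildSrc), and it appends with from(...) so every module's classes are kept instead of only the last one's:

tasks.register<JacocoReport>("codeCoverageReport") {
    subprojects {
        val subProject = this
        subProject.plugins.withType<JacocoPlugin>().configureEach {
            val main = subProject.the<SourceSetContainer>().getByName("main")
            // from(...) appends, so each module contributes its own filtered classes
            sourceDirectories.from(main.allSource.srcDirs)
            classDirectories.from(
                main.output.asFileTree.matching {
                    exclude(JacocoExcludes.commonModule) // per-module constant from the question
                }
            )
            subProject.tasks.matching { it.extensions.findByType<JacocoTaskExtension>() != null }
                .configureEach {
                    val testTask = this
                    executionData(testTask)
                }
            // the dependsOn wiring stays exactly as in the question
        }
    }
    reports {
        // same report switches as in the question
        xml.isEnabled = false
        html.isEnabled = true
        csv.isEnabled = false
    }
}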
Related
I have a Gradle build file which uses the ProtoBuffer plugin and runs some tasks. At some point certain tasks are run over a set of files, which are those tasks' inputs.
I want to modify the set of files that is the input to those tasks. Say, I want the tasks to be run only with the files that are listed, one per line, in a particular file. How can I do that?
EDIT: Here is part of a rather big build.gradle, to provide some context.
configure(protobufProjects) {
    apply plugin: 'java'
    ext {
        protobufVersion = '3.9.1'
    }
    dependencies {
        ...
    }
    protobuf {
        generatedFilesBaseDir = "$projectDir/gen"
        protoc {
            if (project.hasProperty('protocPath')) {
                path = "$protocPath"
            }
            else {
                artifact = "com.google.protobuf:protoc:$protobufVersion"
            }
        }
        plugins {
            ...
        }
        generateProtoTasks {
            all().each { task ->
                ...
            }
        }
        sourceSets {
            main {
                java {
                    srcDirs 'gen/main/java'
                }
            }
        }
    }
    clean {
        delete protobuf.generatedFilesBaseDir
    }
    compileJava {
        File generatedSourceDir = project.file("gen")
        project.mkdir(generatedSourceDir)
        options.annotationProcessorGeneratedSourcesDirectory = generatedSourceDir
    }
}
The question is how to modify the input file set of an existing task (which already does something with those files), not how to create a new task.
EDIT 2: According to How do I modify a list of files in a Gradle copy task?, this is a bad idea in general, as Gradle makes assumptions about input and output dependencies which can be broken by this approach.
If you had added the Gradle file and been more specific, that would have been very helpful. I will try to give an example based on what I have understood:
fun listFiles(fileName: String): List<String> {
    val file = file(fileName).absoluteFile
    val listOfFiles = mutableListOf<String>()
    file.readLines().forEach {
        listOfFiles.add(it)
    }
    return listOfFiles
}

tasks.register("readFiles") {
    doLast {
        // resolve the property and read the files when the task executes, not at configuration time
        val inputFile: String by project
        val listOfFiles = listFiles(inputFile)
        listOfFiles.forEach {
            val file = file(it).absoluteFile
            file.readLines().forEach { println(it) }
        }
    }
}
Then run Gradle like this: gradle -PinputFile=<path_to_the_file_that_contains_list_of_files> readFiles
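If the goal really is to narrow the inputs of an existing task rather than create a new one, and that task is a SourceTask (compile-style tasks are), one option is to filter its source down to the listed files. This is only a sketch under the same -PinputFile convention; compileJava is just a stand-in for whichever task you actually mean, and, as EDIT 2 notes, tampering with a task's inputs like this can confuse Gradle's up-to-date checks:

tasks.named<JavaCompile>("compileJava") {
    // Only filter when -PinputFile=... was passed; otherwise leave the task untouched.
    val listFile = project.findProperty("inputFile")?.let { project.file(it) }
    if (listFile != null && listFile.exists()) {
        val wanted = listFile.readLines()
            .filter { it.isNotBlank() }
            .map { project.file(it).absoluteFile }
            .toSet()
        // Replace this task's source with only the files named in the list.
        setSource(source.filter { it.absoluteFile in wanted })
    }
}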
Seems all OK! But...
I have created a Kotlin Multiplatform project in JS and Java. I worked on creating the right tests for every target and the right configuration for the build. Everything seemed fine: I managed to create the build and set it up the right way. Today I opened the project and it stopped working; the build completes successfully, but the tests and the Java compilation are not executed.
So how can I configure it?
The code and the results below should make it all clearer.
build.gradle.kts
import com.moowork.gradle.node.npm.NpmTask
import com.moowork.gradle.node.task.NodeTask
import org.jetbrains.kotlin.gradle.dsl.KotlinJsCompile
buildscript {
    repositories {
        mavenCentral()
        jcenter()
        maven {
            url = uri("https://plugins.gradle.org/m2/")
        }
    }
    dependencies {
        classpath("org.jetbrains.kotlin:kotlin-gradle-plugin:1.3.50")
    }
}

plugins {
    kotlin("multiplatform") version "1.3.50"
    id("com.moowork.node") version "1.3.1"
}

repositories {
    mavenCentral()
    jcenter()
}

group = "com.example"
version = "0.0.1"

kotlin {
    jvm()
    js()
    jvm {
        withJava()
    }
    js {
        nodejs()
    }
    sourceSets {
        val commonMain by getting {
            dependencies {
                implementation(kotlin("stdlib-common"))
            }
        }
        val commonTest by getting {
            dependencies {
                implementation(kotlin("test-common"))
                implementation(kotlin("test-annotations-common"))
            }
        }
        jvm {
            compilations["main"].defaultSourceSet {
                dependencies {
                    implementation(kotlin("stdlib-jdk8"))
                }
            }
            compilations["test"].defaultSourceSet {
                dependencies {
                    implementation(kotlin("test"))
                    implementation(kotlin("test-junit"))
                }
            }
        }
        js {
            sequenceOf("", "Test").forEach {
                tasks.getByName<KotlinJsCompile>("compile${it}KotlinJs") {
                    kotlinOptions {
                        moduleKind = "umd"
                        noStdlib = true
                        metaInfo = true
                    }
                }
            }
            compilations["main"].defaultSourceSet {
                dependencies {
                    implementation(kotlin("stdlib-js"))
                }
            }
            compilations["test"].defaultSourceSet {
                dependencies {
                    implementation(kotlin("test-js"))
                }
            }
        }
    }
}

val compileKotlinJs = tasks.getByName("compileKotlinJs")
val compileTestKotlinJs = tasks.getByName("compileTestKotlinJs")
val libDir = "$buildDir/lib"
val compileOutput = compileKotlinJs.getOutputs().getFiles()
val testOutput = compileTestKotlinJs.getOutputs().getFiles()

val populateNodeModules = tasks.create<Copy>("populateNodeModules") {
    afterEvaluate {
        from(compileOutput)
        from(testOutput)
        configurations["testCompile"].forEach {
            if (it.exists() && !it.isDirectory) {
                from(zipTree(it.absolutePath).matching { include("*.js") })
            }
        }
        for (sourceSet in kotlin.sourceSets) {
            from(sourceSet.resources)
        }
        into("$buildDir/node_modules")
    }
    dependsOn("compileKotlinJs")
}

node {
    download = true
}

tasks.create<NpmTask>("installJest") {
    setArgs(setOf("install", "jest"))
}

tasks.create<NodeTask>("runJest") {
    setDependsOn(setOf("installJest", "populateNodeModules", "compileTestKotlinJs"))
    setScript(file("node_modules/jest/bin/jest.js"))
    setArgs(compileTestKotlinJs.outputs.files.toMutableList().map { projectDir.toURI().relativize(it.toURI()) })
}

tasks.getByName("test").dependsOn("runJest")
Look how many tasks are skipped, and how the build dir is created:
build result
build dir
I use Jest for testing in JS.
Thanks in advance for your support.
This is working so far (at least it looks like it works), but I don't feel that it is idiomatic or that it would stand the test of time even for a few months. Is there any way to do it better or more "by the book"?
We have a few multi-project builds in Gradle where one project's tests could touch another project's code, so it is important to see that coverage even when it isn't in the same project, as long as it is in the same multi-project build. I might be solving non-existent problems, but in earlier SonarQube versions I had to "jacocoMerge" coverage results.
jacocoTestReport {
    reports {
        executionData(tasks.withType(Test).findAll { it.state.upToDate || it.state.executed })
        xml.enabled true
    }
}

if (!project.ext.has('jacocoXmlReportPathsForSonar')) {
    project.ext.set('jacocoXmlReportPathsForSonar', [] as Set<File>)
}

task setJacocoXmlReportPaths {
    dependsOn('jacocoTestReport')
    doLast {
        project.sonarqube {
            properties {
                property 'sonar.coverage.jacoco.xmlReportPaths', project.
                        ext.
                        jacocoXmlReportPathsForSonar.
                        findAll { d -> d.exists() }.
                        collect { f -> f.path }.
                        join(',')
            }
        }
    }
}
project.rootProject.tasks.getByName('sonarqube').dependsOn(setJacocoXmlReportPaths)

sonarqube {
    properties {
        property "sonar.java.coveragePlugin", "jacoco"
        property "sonar.tests", []
        property "sonar.junit.reportPaths", []
    }
}

afterEvaluate { Project evaldProject ->
    JacocoReport jacocoTestReportTask = (JacocoReport) evaldProject.tasks.getByName('jacocoTestReport')
    evaldProject.ext.jacocoXmlReportPathsForSonar += jacocoTestReportTask.reports.xml.destination
    Set<Project> dependentProjects = [] as Set<Project>
    List<String> configsToCheck = [
            'Runtime',
            'RuntimeOnly',
            'RuntimeClasspath',
            'Compile',
            'CompileClasspath',
            'CompileOnly',
            'Implementation'
    ]
    evaldProject.tasks.withType(Test).findAll { it.state.upToDate || it.state.executed }.each { Test task ->
        logger.debug "JACOCO ${evaldProject.path} test task: ${task.path}"
        sonarqube {
            properties {
                properties["sonar.junit.reportPaths"] += task.reports.junitXml.destination.path
                properties["sonar.tests"] += task.testClassesDirs.findAll { d -> d.exists() }
            }
        }
        configsToCheck.each { c ->
            try {
                Configuration cfg = evaldProject.configurations.getByName("${task.name}${c}")
                logger.debug "JACOCO ${evaldProject.path} process config: ${cfg.name}"
                def projectDependencies = cfg.getAllDependencies().withType(ProjectDependency.class)
                projectDependencies.each { projectDependency ->
                    Project depProj = projectDependency.dependencyProject
                    dependentProjects.add(depProj)
                }
            } catch (UnknownConfigurationException uc) {
                logger.debug("JACOCO ${evaldProject.path} unknown configuration: ${task.name}Runtime", uc)
            }
        }
    }
    dependentProjects.each { p ->
        p.plugins.withType(JacocoPlugin) {
            if (!p.ext.has('jacocoXmlReportPathsForSonar')) {
                p.ext.set('jacocoXmlReportPathsForSonar', [] as Set<File>)
            }
            p.ext.jacocoXmlReportPathsForSonar += jacocoTestReportTask.reports.xml.destination
            JacocoReport dependentJacocoTestReportTask = (JacocoReport) p.tasks.getByName('jacocoTestReport')
            dependentJacocoTestReportTask.dependsOn(jacocoTestReportTask)
            setJacocoXmlReportPaths.dependsOn(dependentJacocoTestReportTask)
        }
    }
}
Inspired by this neat TestNG task and this SO question, I thought I'd whip up something quick for re-running only the failed JUnit tests from Gradle.
But after searching around for a while, I couldn't find anything analogous that was quite as convenient.
I came up with the following, which seems to work pretty well and adds a <testTaskName>Rerun task for each task of type Test in my project.
import static groovy.io.FileType.FILES

import java.nio.file.Files
import java.nio.file.Paths

// And add a task for each test task to rerun just the failing tests
subprojects {
    afterEvaluate { subproject ->
        // Need to store tasks in static temp collection, else new tasks will be picked up by live collection leading to StackOverflow
        def testTasks = subproject.tasks.withType(Test)
        testTasks.each { testTask ->
            task "${testTask.name}Rerun"(type: Test) {
                group = 'Verification'
                description = "Re-run ONLY the failing tests from the previous run of ${testTask.name}."
                // Depend on anything the existing test task depended on
                dependsOn testTask.dependsOn
                // Copy runtime setup from existing test task
                testClassesDirs = testTask.testClassesDirs
                classpath = testTask.classpath
                // Check the output directory for failing tests
                File textXMLDir = subproject.file(testTask.reports.junitXml.destination)
                logger.info("Scanning: $textXMLDir for failed tests.")
                // Find all failed classes
                Set<String> allFailedClasses = [] as Set
                if (textXMLDir.exists()) {
                    textXMLDir.eachFileRecurse(FILES) { f ->
                        // See: http://marxsoftware.blogspot.com/2015/02/determining-file-types-in-java.html
                        String fileType
                        try {
                            fileType = Files.probeContentType(f.toPath())
                        } catch (IOException e) {
                            logger.debug("Exception when probing content type of: $f.")
                            logger.debug(e)
                            // Couldn't determine this to be an XML file. That's fine, skip this one.
                            return
                        }
                        logger.debug("Filetype of: $f is $fileType.")
                        if (['text/xml', 'application/xml'].contains(fileType)) {
                            logger.debug("Found testsuite file: $f.")
                            def testSuite = new XmlSlurper().parse(f)
                            def failedTestCases = testSuite.testcase.findAll { testCase ->
                                testCase.children().find { it.name() == 'failure' }
                            }
                            if (!failedTestCases.isEmpty()) {
                                logger.info("Found failures in file: $f.")
                                failedTestCases.each { failedTestCase ->
                                    def className = failedTestCase['#classname']
                                    logger.info("Failure: $className")
                                    allFailedClasses << className.toString()
                                }
                            }
                        }
                    }
                }
                if (!allFailedClasses.isEmpty()) {
                    // Re-run all tests in any class with any failures
                    allFailedClasses.each { c ->
                        def testPath = c.replaceAll('\\.', '/') + '.class'
                        include testPath
                    }
                    doFirst {
                        logger.warn('Re-running the following tests:')
                        allFailedClasses.each { c ->
                            logger.warn(c)
                        }
                    }
                }
                outputs.upToDateWhen { false } // Always attempt to re-run failing tests
                // Only re-run if there were any failing tests, else just print warning
                onlyIf {
                    def shouldRun = !allFailedClasses.isEmpty()
                    if (!shouldRun) {
                        logger.warn("No failed tests found for previous run of task: ${subproject.path}:${testTask.name}.")
                    }
                    return shouldRun
                }
            }
        }
    }
}
Is there any easier way to do this from Gradle? Is there any way to get JUnit to output a consolidated list of failures somehow so I don't have to slurp the XML reports?
I'm using JUnit 4.12 and Gradle 4.5.
Here is one way to do it. The full file will be listed at the end, and is available here.
Part one is to record each failed test in a small file (called failures):
test {
    // `failures` is defined elsewhere, see below
    afterTest { desc, result ->
        if ("FAILURE" == result.resultType as String) {
            failures.withWriterAppend {
                it.write("${desc.className},${desc.name}\n")
            }
        }
    }
}
In part two, we use a test filter (doc here) to restrict the tests to those present in the failures file:
def failures = new File("${projectDir}/failures.log")
def failedTests = []
if (failures.exists()) {
    failures.eachLine { line ->
        def tokens = line.split(",")
        failedTests << tokens[0]
    }
}
failures.delete()

test {
    filter {
        failedTests.each {
            includeTestsMatching "${it}"
        }
    }
    // ...
}
The full file is:
apply plugin: 'java'

repositories {
    jcenter()
}

dependencies {
    testCompile('junit:junit:4.12')
}

def failures = new File("${projectDir}/failures.log")
def failedTests = []
if (failures.exists()) {
    failures.eachLine { line ->
        def tokens = line.split(",")
        failedTests << tokens[0]
    }
}
failures.delete()

test {
    filter {
        failedTests.each {
            includeTestsMatching "${it}"
        }
    }
    afterTest { desc, result ->
        if ("FAILURE" == result.resultType as String) {
            failures.withWriterAppend {
                it.write("${desc.className},${desc.name}\n")
            }
        }
    }
}
The Test Retry Gradle plugin is designed to do exactly this. It will rerun each failed test a certain number of times, with the option of failing the build if too many failures have occurred overall.
plugins {
    id 'org.gradle.test-retry' version '1.2.1'
}

test {
    retry {
        maxRetries = 3
        maxFailures = 20 // Optional attribute
    }
}
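If the build script uses the Kotlin DSL instead, the equivalent configuration should look roughly like this (a sketch based on the plugin's documented Kotlin syntax; the Property-style setters are how the extension is exposed there, and the java plugin is assumed so that a test task exists):

plugins {
    java // assumed, so that the test task is available
    id("org.gradle.test-retry") version "1.2.1"
}

tasks.test {
    retry {
        maxRetries.set(3)
        maxFailures.set(20) // optional
    }
}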
I have two Gradle files, setup.gradle and tests.gradle, each having two Gradle tasks of the custom type EMTest.
tests.gradle applies setup.gradle via apply from: 'setup.gradle'.
I want to configure all tasks of type EMTest; for that I added the code below at the end of tests.gradle:
tasks.withType(EMTest) {
    println it.name
}
But this only prints the names of the tasks defined in tests.gradle.
When I run:
tasks.all {
    println it.name + " " + it.class
}
it does list the names of the tasks defined in setup.gradle, and their type as EMTest_Decorated (for all four tasks).
NOTE: I use Gradle 1.11 (no control over upgrading). What is the issue here?
UPDATE on 09/Jun
Here is the main file:
apply plugin: 'java'
apply plugin: 'maven'
apply from: 'emcpsrvs_3n_setup.gradle'

buildscript {
    repositories {
        maven {
            url = "${artifactory_contextUrl}/repo"
        }
    }
    dependencies {
        classpath group: "com.mycompany.myprod.mymodule", name: "TestInfraPlugin", version: "${testinfraVersion}", transitive: true
        classpath group: 'org.codehaus.jackson', name: 'jackson-mapper-asl', version: '1.9.13'
        classpath group: 'com.mycompany.myprod', name: 'common', version: '0.1'
        classpath group: 'org.codehaus.jackson', name: 'jackson-core-asl', version: '1.9.13'
        classpath group: 'com.oracle.weblogic', name: 'jettison-1.1', version: '12.1.2-0-0'
    }
}

repositories {
    /* To check if the jar is available in local maven repository */
    mavenLocal()
    maven {
        url = "${artifactory_contextUrl}/repo"
    }
}

apply plugin: 'TestInfraPlugin'

import com.mycompany.myprod.gradle.testinfra.tasks.EMTest;

repositories {
    maven {
        url = "${artifactory_contextUrl}/repo"
    }
}

dependencies {
    testConfig group: 'com.mycompany.myprod', name: 'ui-integ-tests', version: '1.+'
    testConfig group: 'com.mycompany.myprod', name: 'emaas-platform-tenant-sdk', version: '0.1+'
}

task unitTests(type: EMTest) {
}

// Three tests are disabled due to JIRA-900
task tenantMgmtUITests(type: EMTest, dependsOn: [cleanSmall_deploy_3n_block, small_deploy_3n_block]) {
    useWebdriver = true
    small_deploy_3n_block.mustRunAfter([cleanSmall_deploy_3n_block])
    options.suiteXmlBuilder().suite('parallel': 'none', 'name': 'TenantManagementUI') {
        test('name': 'TenantManagementUI') {
            classes([:]) {
                'class'('name': 'com.mycompany.package.MyTest')
            }
        }
    }
}

small_deploy_3n_cleanup.mustRunAfter([tenantMgmtUITests])

task emcpsrvs_tenant_mgmt_ui_3n(dependsOn: [tenantMgmtUITests, small_deploy_3n_cleanup])
Here is emcpsrvs_3n_setup.gradle, which is applied above:
buildscript {
    repositories {
        maven {
            url = "${artifactory_contextUrl}/repo"
        }
    }
    dependencies {
        classpath group: 'com.mycompany.myprod.emdi', name: 'TestInfraPlugin', version: "${testinfraVersion}", transitive: true
    }
}

apply plugin: 'TestInfraPlugin'

repositories {
    maven {
        url = "${artifactory_contextUrl}/repo"
    }
}

import com.mycompany.myprod.gradle.testinfra.tasks.EMTest;

ext.integDeployVersion = '1.1+'

dependencies {
    testConfig group: 'com.mycompany.myprod.test', name: 'smalldeployment', version: "${integDeployVersion}"
}

/* Setup EMaaS Small Deployment */
task small_deploy_3n_block(type: EMTest) {
    outputs.upToDateWhen { false }
    onlyIf { !System.env.SMALLDEPLOY_IGNORESETUP }
    options.suiteXmlBuilder().suite('name': 'setup_3n_env') {
        test('name': 'emaas_setup_small_deploy') {
            classes([:]) {
                'class'('name': 'mycompany.sysman.test.emaas.integ.EmaasSmallDeploy3n') {
                    methods([:]) {
                        'include'('name': 'setupEmaasSmallDeploy')
                    }
                }
            }
        }
    }
    useWebdriver = true
    useRestAssured = true
}

/* Cleanup EMaaS Small Deployment */
task small_deploy_3n_cleanup(type: EMTest) {
    onlyIf { !System.env.SMALLDEPLOY_IGNORESETUP }
    options.suiteXmlBuilder().suite('name': 'setup_3n_env') {
        test('name': 'emaas_setup_small_deploy') {
            classes([:]) {
                'class'('name': 'mycompany.sysman.test.emaas.integ.EmaasSmallDeploy3n') {
                    methods([:]) {
                        'include'('name': 'logCollectionAndPostCleanup')
                    }
                }
            }
        }
    }
    mustRunAfter([small_deploy_3n_block])
}
And finally, here is the snippet from TestInfraPlugin.groovy (the Gradle plugin):
logger.debug "Configuring the EMTest task with default values."
project.afterEvaluate {
    project.ext.testClassesDir = new File(project.properties['emdi.T_WORK'] + '/testClasses')
    def testTasks = project.tasks.withType(EMTest)
    if (testTasks != null && testTasks.size() == 0) {
        logger.info "There are no tasks of type EMTest."
        return
    }
    def extractTask = project.tasks.findByPath('extractTestClasses') ?:
            project.task('extractTestClasses', type: ExtractConfiguration) {
                configuration = project.configurations.testConfig
                to = project.testClassesDir
            }
    /*
     * 1. Adding the 'extractTask' to all EMTest, to ensure that 'extractTask' is run before any 'EMTest'.
     * 2. For lazy evaluation of lrgConfig, we are NOT running the task here, but just adding as dependent task.
     */
    testTasks.each { task ->
        logger.debug "Adding dependsOn extractTask for task: ${task.name}"
        task.dependsOn extractTask
    }
} // end afterEvaluate
}
What the afterEvaluate {} block is doing:
It checks whether there are any tasks of type EMTest and, if there are, creates a task to extract a configuration (named testConfig). This extract task is added as a dependency of all tasks of type EMTest, so that it runs before any of them.
What is happening:
The extractTestClasses task is added as a dependency of ONLY the two tasks unitTests and tenantMgmtUITests, and thus small_deploy_3n_block executes before extractTestClasses, causing the setup to fail, which in turn causes the tests to fail.
When you reference tasks without qualifying it with an object, it implicitly means project.tasks.
tasks.withType(EMTest) {
    println it.name
}
That is why you only get tasks from your current project.
To include all projects you can use:
allprojects {
    tasks.withType(EMTest) { println it.name }
}
In this closure you are referencing it.tasks, where it loops over every individual project. This will still not work as expected, since Gradle will probably not have loaded all of your sub-projects at this point and will not have seen every task definition during the full configuration phase. Therefore you must define the closure to run after all the projects have been fully evaluated:
allprojects {
    afterEvaluate {
        tasks.withType(EMTest) { println it.name }
    }
}