I have a simple Gradle plugin that I'd like to test-drive, but I'm unsure how to "ask" Gradle to run my tests.
Here is the block I'd like to modify (at the top of my Gradle build file):
buildscript {
    dependencies {
        classpath 'org.testng:testng:6.8.8'
    }
}
When I run gradle from the command line, I do get the following in my output:
:buildSrc:compileJava UP-TO-DATE
:buildSrc:compileGroovy UP-TO-DATE
:buildSrc:processResources UP-TO-DATE
:buildSrc:classes UP-TO-DATE
:buildSrc:jar UP-TO-DATE
:buildSrc:assemble UP-TO-DATE
:buildSrc:compileTestJava UP-TO-DATE
:buildSrc:compileTestGroovy UP-TO-DATE
:buildSrc:processTestResources UP-TO-DATE
:buildSrc:testClasses UP-TO-DATE
:buildSrc:test UP-TO-DATE
:buildSrc:check UP-TO-DATE
:buildSrc:build UP-TO-DATE
:help
This makes me think I am running the "tests" in my directory (structured as shown below).
.
├── main
│   └── groovy
│       └── foobar
│           └── gradle
│               ├── cat
│               │   └── CatFile.groovy
│               └── TaskHelper.groovy
└── test
    └── groovy
        └── foobar
            └── gradle
                └── cat
                    └── CatFileTest.groovy
Yet nothing is run (I'd expect to see my CatFileTest fail). Here is my test file:
package foobar.gradle.cat

import org.testng.annotations.Test

class CatFileTest {
    @Test
    void shouldBlowUp() {
        assert 1 == 2
    }
}
Sounds like the classes of your plugin sit under buildSrc. If that's the case, you'll have to declare the dependency on TestNG in a build.gradle file under buildSrc. The contents would look like this:
repositories {
    mavenCentral()
}

dependencies {
    testCompile 'org.testng:testng:6.8.8'
}
To use TestNG as the test framework, you'll have to reconfigure the default test task provided by the Java plugin, as described in Mark's answer.
The default testing framework is JUnit. You'll have to explicitly enable TestNG support:
test {
    useTestNG()
}
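Putting both answers together, a minimal buildSrc/build.gradle might look like this (a sketch based only on the snippets above; buildSrc already has the Groovy plugin applied, which is why the compileGroovy tasks show up in your output):

// buildSrc/build.gradle
repositories {
    mavenCentral()
}

dependencies {
    testCompile 'org.testng:testng:6.8.8'
}

test {
    useTestNG() // JUnit is the default; switch the test task to TestNG
}

Any subsequent gradle invocation rebuilds buildSrc and runs its tests, so CatFileTest should now fail the build.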
I have the following Gradle project structure:
project-root/
├── adapters/
│   ├── adapter1/
│   │   ├── main
│   │   └── test
│   ├── adapter2/
│   │   ├── main
│   │   └── test
│   └── adapter3/
│       ├── main
│       └── test
└── app-spring-boot/
    ├── main
    ├── test
    └── integrationTest
In the app-spring-boot module, the adapters are included only as runtime dependencies:
// project-root/app-spring-boot/build.gradle.kts
dependencies {
    runtimeOnly(project(":adapters:adapter1"))
    runtimeOnly(project(":adapters:adapter2"))
    runtimeOnly(project(":adapters:adapter3"))
}
In the integrationTest source set of the app-spring-boot module, I would like to have compile-time access to all dependencies: not only those declared directly in app-spring-boot, but also those of the included :adapters projects. I've used the following configuration:
// project-root/app-spring-boot/build.gradle.kts
plugins {
    `jvm-test-suite`
}

testing {
    suites {
        val test by getting(JvmTestSuite::class)
        val integrationTest by registering(JvmTestSuite::class) {
            useJUnitJupiter()
            dependencies {
                implementation(project())
            }
            sources {
                compileClasspath += sourceSets.main.get().runtimeClasspath
            }
        }
    }
}
compileClasspath += sourceSets.main.get().runtimeClasspath does the trick, and all dependencies from the included runtimeOnly projects are accessible at compile time, but I'm wondering what the correct and idiomatic Gradle way of doing this is, especially since I saw @chalimartines' comment.
I agree with the comment you found: adding to the compile classpath is not the right way, as you end up with duplicated dependencies.
When the test suites plugin is applied, it creates a set of configurations similar to the ones for the main and test source sets, prefixed with the name of the test suite. Because your test suite is called integrationTest, its "implementation" configuration is named integrationTestImplementation.
With this, you can add the runtime dependencies to the compile classpath by making the test suite's implementation configuration extend the regular runtimeClasspath configuration of the main source set, e.g.:
testing {
    // ...
}
configurations["integrationTestImplementation"].extendsFrom(configurations["runtimeClasspath"])
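With that wiring in place, ./gradlew :app-spring-boot:integrationTest compiles the suite against everything on the main runtime classpath, including the runtimeOnly adapter projects, and the compile classpath no longer carries duplicated entries.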
I am building a Jenkins shared library (in Groovy) and testing it with JenkinsPipelineUnit from Gradle. Running ./gradlew test jacocoTestReport runs fine, but the report is almost empty (just headers); no coverage is present.
Here are the relevant parts of my build.gradle:
plugins {
    id 'groovy'
    id 'application'
    id 'jacoco'
}

dependencies {
    compile 'org.codehaus.groovy:groovy-all:2.5.4'
    testCompile 'junit:junit:4.12'
    testCompile 'com.lesfurets:jenkins-pipeline-unit:1.1.1-custom' // minor adaptations, but that's another story
}

test {
    systemProperty "pipeline.stack.write", System.getProperty("pipeline.stack.write")
}

jacocoTestReport {
    group = "Reporting"
    reports {
        xml.enabled true
        csv.enabled false
    }
    additionalSourceDirs = files('vars')
    sourceDirectories = fileTree(dir: 'vars')
}
I think the trouble is that my "source" files reside in the vars directory and not in src/main/groovy as expected in a normal Groovy project. This is, however, a requirement for a Jenkins shared library.
I tried specifying
sourceSets {
    main {
        groovy {
            srcDir 'vars'
        }
    }
}
but then Gradle would start compiling this shared library, while it's supposed to be loaded at use time, and this breaks everything...
My folder structure looks like this:
├── build.gradle
├── src
│   └── test
│       ├── groovy
│       │   └── TestSimplePipeline.groovy
│       └── resources
│           └── simplePipeline.jenkins
└── vars
    ├── MyPipeline.groovy
    └── sh.groovy
I think my problem is linked to https://github.com/jenkinsci/JenkinsPipelineUnit/issues/119, but I wouldn't know how to apply the changes proposed there for Maven in Gradle (and I'm not even sure they apply to JaCoCo).
The problem is that JenkinsPipelineUnit evaluates your scripts at runtime, which means the JaCoCo agent cannot instrument the byte code generated at runtime.
To overcome this issue you need to make two changes:
1. Use JaCoCo offline instrumentation. (In my case I used Maven, so I cannot provide a specific example of a Gradle configuration.)
2. Load compiled classes instead of Groovy scripts in your test. Something like this:
import com.lesfurets.jenkins.unit.InterceptingGCL
import org.codehaus.groovy.runtime.InvokerHelper

// load the compiled class rather than evaluating the .groovy source
def scriptClass = helper.getBaseClassloader().loadClass("fooScript")
def binding = new Binding()
script = InvokerHelper.createScript(scriptClass, binding)
// re-attach JenkinsPipelineUnit's method interception to the compiled class
InterceptingGCL.interceptClassMethods(script.metaClass, helper, binding)
Here fooScript is the name of the class (i.e. you have a source file called fooScript.groovy in this case).
Now you can call methods of this class via
def result = script.invokeMethod(methodName, args)
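The Maven-specific offline-instrumentation setup from issue #119 doesn't translate one-to-one to Gradle, but the Gradle side of step 2 could look roughly like this: compile the vars scripts into their own source set (leaving the main source set, and therefore the published library, untouched) so that JaCoCo has real class files to match against. This is only a sketch under those assumptions; the jenkinsVars source-set name is illustrative, not taken from the question's build:

// build.gradle (sketch) — compile vars/ separately so the tests can load classes
sourceSets {
    jenkinsVars {
        groovy {
            srcDir 'vars'
        }
    }
    test {
        // make the compiled vars classes visible to the tests
        compileClasspath += sourceSets.jenkinsVars.output
        runtimeClasspath += sourceSets.jenkinsVars.output
    }
}

dependencies {
    jenkinsVarsCompile 'org.codehaus.groovy:groovy-all:2.5.4'
}

jacocoTestReport {
    // report against the compiled classes and their sources in vars/
    sourceDirectories = files('vars')
    classDirectories = files(sourceSets.jenkinsVars.output)
}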
I have the following simple setup:
File structure
$ tree
.
├── build.gradle
├── modules
│   ├── rest-model
│   └── rest-resource
└── settings.gradle
File contents
settings.gradle
def MODULES = 'modules'

file(MODULES).eachDir {
    include ":${MODULES}:${it.name}"
}
build.gradle
task hello {
    doLast {
        subprojects.each {
            println it.name
        }
    }
}
The task hello above prints out all subprojects. I was expecting only two subprojects: rest-model and rest-resource. However, I am getting three: modules, rest-model, and rest-resource. Here is the gradle output:
$ gradle hello
Starting a Gradle Daemon (subsequent builds will be faster)
> Task :hello
modules
rest-model
rest-resource
BUILD SUCCESSFUL in 2s
1 actionable task: 1 executed
So, why does gradle automatically include the parent folder modules as a subproject? Can I prevent that?
A colon-separated path like :modules:rest-model always creates the intermediate modules project (includeFlat won't help here either, since it is meant for project directories that are siblings of the root project). To keep modules out of the project tree, include only the leaf name and remap the project directory:
def MODULES = 'modules'

file(MODULES).eachDir { dir ->
    include dir.name
    project(":${dir.name}").projectDir = dir
}
With this, gradle hello prints only rest-model and rest-resource.
Consider the following multi-project build script:
build.gradle
subprojects {
    apply plugin: 'java'
    apply plugin: 'maven'

    group = "myorg"
    version = "1.0.0-SNAPSHOT"
}

project(':client') {
    dependencies {
        compile 'myorg:shared:1.0.0-SNAPSHOT'
    }
}
With the following files:
├── build.gradle
├── client
│   └── src
│       └── main
│           └── java
│               └── myorg
│                   └── client
│                       └── MyOrgClient.java
├── settings.gradle
└── shared
    └── src
        └── main
            └── java
                └── myorg
                    └── shared
                        └── MyOrgObj.java
In the above files, MyOrgClient.java imports myorg.shared.MyOrgObj, and settings.gradle has the single line include 'client', 'shared'.
Problem
The project/task build order for Maven-related tasks like installing locally and deploying to remote repositories does not take the implied project dependency into account. Because Gradle does not know that 'myorg:shared:1.0.0-SNAPSHOT' is produced by project(':shared'), the build order is :client -> :shared, which causes errors like the one below:
$ gradle install
:client:compileJava
FAILURE: Build failed with an exception.
* What went wrong:
Could not resolve all dependencies for configuration ':client:compile'.
> Could not find myorg:shared:1.0.0-SNAPSHOT.
Required by:
myorg:client:1.0.0-SNAPSHOT
Question:
Is there a standard way to deal with this problem? I have tried these solutions without success:
- Using mustRunAfter, but I ran into problems with tasks not existing yet. I also don't think this would scale well with a large number of projects.
- Adding archives project(':shared') to the client's dependencies.
- Adding compile project(':shared') to the client's dependencies and then removing it from the generated POM. Unfortunately this doesn't add the dependency to the install task or artifactoryPublish. Edit: This actually was the solution. A project dependency provides the correct version/name/group in the generated pom.xml, so the explicit group:name:version dependency is not needed.
You have to define the dependencies between the projects in more or less the same way as in Maven, for example like this:
project(':app') {
    apply plugin: 'ear'

    dependencies {
        compile project(':webgui')
        compile project(':service')
    }
}
But you also need a settings.gradle that lists the modules, like this:
include 'app'
include 'domain'
include 'service'
include 'service-client'
include 'webgui'
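Applied to the example in the question, the client's dependency on the published coordinates becomes a project dependency (this matches the edit at the end of the question; the project dependency also supplies the correct group/name/version in the generated pom.xml):

project(':client') {
    dependencies {
        compile project(':shared')
    }
}

With this, gradle install builds and installs :shared before :client, and the resolution error disappears.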
In a Maven multi-module project, should I also treat the integration/performance-test projects as modules?
multi-module-project
├── module-war
├── some-other-module
└── etc...
Should I include this as well?
└── performance-tests
ADD:
What if these tests take hours to finish?
What if they require real servers running, like a JBoss cluster?
In the case of a web application, I would suggest having a separate module, as you proposed. I see performance tests as a kind of integration test, like this:
multi-module-project
├── module-war
├── some-other-module
└── integration-tests
    ├── performance-tests
    └── etc...
EDIT: As mentioned in the comments, you can control the execution of the integration tests with a profile.
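For instance, a profile in the parent POM can pull the slow test modules into the build only when it is activated; the module and profile names below are illustrative, not taken from the question:

<!-- parent pom.xml: build the test modules only with -Pperformance -->
<profiles>
  <profile>
    <id>performance</id>
    <modules>
      <module>integration-tests</module>
    </modules>
  </profile>
</profiles>

A plain mvn install then skips the integration-tests module entirely, while mvn install -Pperformance includes it in the reactor, so hour-long runs or a required JBoss cluster only affect builds that explicitly ask for them.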