I am in love with JBoss Tattletale. Typically, in my Ant builds, I follow the docs to define the Tattletale tasks and then run them like so:
<taskdef name="report"
classname="org.jboss.tattletale.ant.ReportTask"
classpathref="tattletale.lib.path.id"/>
...
<tattletale:report source="${src.dir}" destination="${dest.dir}"/>
I am now converting my builds over to Gradle and am struggling to figure out how to get Tattletale running in Gradle. There doesn't appear to be a Gradle-Tattletale plugin, and I'm not experienced enough with Gradle to contribute one. But I also know that Gradle can run any Ant task and can also execute commands from the system shell; I'm just not sure how to do this in Gradle because there aren't any docs on this (yet).
So I ask: How do I run the Tattletale ReportTask from inside a Gradle build?
Update
Here is what the Gradle/Ant docs show as an example:
task loadfile << {
    def files = file('../antLoadfileResources').listFiles().sort()
    files.each { File file ->
        if (file.isFile()) {
            ant.loadfile(srcFile: file, property: file.name)
            println " *** $file.name ***"
            println "${ant.properties[file.name]}"
        }
    }
}
However, nowhere in this example do I see how or where to customize it for Tattletale and its ReportTask.
The following is adapted from https://github.com/roguePanda/tycho-gen/blob/master/build.gradle
It bypasses Ant and directly invokes the Tattletale Java class.
It was changed to process a WAR, and it forces a newer javassist in order to handle Java 8 features such as lambdas.
configurations {
    tattletale
}

configurations.tattletale {
    resolutionStrategy {
        force 'org.javassist:javassist:3.20.0-GA'
    }
}

dependencies {
    // other dependencies here...
    tattletale "org.jboss.tattletale:tattletale:1.2.0.Beta2"
}

task createTattletaleProperties {
    ext.props = [reports: "*", enableDot: "true"]
    ext.destFile = new File(buildDir, "tattletale.properties")

    inputs.properties props
    outputs.file destFile

    doLast {
        def properties = new Properties()
        properties.putAll(props)
        destFile.withOutputStream { os ->
            properties.store(os, null)
        }
    }
}
task tattletale(type: JavaExec, dependsOn: [createTattletaleProperties, war]) {
    ext.outputDir = new File(buildDir, "reports/tattletale")

    outputs.dir outputDir
    inputs.files configurations.runtime.files
    inputs.file war.archivePath

    doFirst {
        outputDir.mkdirs()
    }

    main = "org.jboss.tattletale.Main"
    classpath = configurations.tattletale
    systemProperties "jboss-tattletale.properties": createTattletaleProperties.destFile
    args([configurations.runtime.files, war.archivePath].flatten().join("#"))
    args outputDir
}
The previous answers are either incomplete or unnecessarily complicated. What I did was use the Ant task from Gradle, which works fine. Let's assume your Tattletale jars are beneath rootDir/tools/...
ant.taskdef(name: "tattleTaleTask",
            classname: "org.jboss.tattletale.ant.ReportTask",
            classpath: "${rootDir}/tools/tattletale-1.1.2.Final/tattletale-ant.jar:" +
                       "${rootDir}/tools/tattletale-1.1.2.Final/tattletale.jar:" +
                       "${rootDir}/tools/tattletale-1.1.2.Final/javassist.jar")

def sources = "./src:./src2:./etcetera"

ant.tattleTaleTask(
    source: sources,
    destination: "tattleTaleReport",
    classloader: "org.jboss.tattletale.reporting.classloader.NoopClassLoaderStructure",
    profiles: "java5, java6",
    reports: "*",
    excludes: "notthisjar.jar,notthisjareither.jar,etcetera.jar"
)
So the above code will generate the report beneath ./tattleTaleReport. It's that simple. The annoyance is that the source attribute only accepts directories, so if those directories contain jars you do not wish to scan, you need to add them to the excludes parameter.
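Note that, as written, those ant calls run every time the build script is evaluated, not only when you ask for the report. A minimal sketch of wrapping the same calls in a task, so they are deferred to the execution phase (the task name tattletaleReport is my own, not from the answer):

task tattletaleReport {
    doLast {
        // same ant.taskdef(...) and ant.tattleTaleTask(...) calls as above,
        // just deferred until the task actually runs
        ant.taskdef(name: "tattleTaleTask",
                    classname: "org.jboss.tattletale.ant.ReportTask",
                    classpath: "${rootDir}/tools/tattletale-1.1.2.Final/tattletale-ant.jar:" +
                               "${rootDir}/tools/tattletale-1.1.2.Final/tattletale.jar:" +
                               "${rootDir}/tools/tattletale-1.1.2.Final/javassist.jar")
        ant.tattleTaleTask(
            source: "./src:./src2:./etcetera",
            destination: "tattleTaleReport",
            classloader: "org.jboss.tattletale.reporting.classloader.NoopClassLoaderStructure",
            reports: "*"
        )
    }
}

With that in place, gradle tattletaleReport generates the report, while an ordinary build does not pay for it.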
Related
We have a multi-module setup and we share some test classes between the modules (mainly fake implementations). Our current solution (which you can find below) works only for classes written in Java, but we would like to also support shared Kotlin classes.
if (isAndroidLibrary()) {
    task compileTestCommonJar(type: JavaCompile) {
        classpath = compileDebugUnitTestJavaWithJavac.classpath
        source sourceSets.testShared.java.srcDirs
        destinationDir = file('build/testCommon')
    }
    taskToDependOn = compileDebugUnitTestSources
} else {
    task compileTestCommonJar(type: JavaCompile) {
        classpath = compileTestJava.classpath
        source sourceSets.testShared.java.srcDirs
        destinationDir = file('build/testCommon')
    }
    taskToDependOn = testClasses
}

task testJar(type: Jar, dependsOn: taskToDependOn) {
    classifier = 'tests'
    from compileTestCommonJar.outputs
}
How can I modify compileTestCommonJar so that it also supports Kotlin?
Here is what we do:
In the module with shared test classes, pack the test source set output into a jar
configurations { tests }

...

task testJar(type: Jar, dependsOn: testClasses) {
    baseName = "test-${project.archivesBaseName}"
    from sourceSets.test.output
}

artifacts { tests testJar }
In a module that depends on the shared classes
dependencies {
    testCompile project(path: ":my-project-with-shared-test-classes", configuration: "tests")
}
PS: Honestly, I would prefer to have a separate Gradle module with the common test classes, as it's a more explicit solution.
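For what that separate-module alternative could look like, here is a minimal sketch (the module name :test-common is an assumption, not from the answer): the shared fakes live in the main source set of their own module, and every module that needs them declares a test dependency on it.

// in each consuming module's build.gradle (":test-common" is a hypothetical module name)
dependencies {
    testCompile project(':test-common')
}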
The compileTestCommonJar(type: JavaCompile) task compiles .java files only, because that is what a task of type JavaCompile does.
There is a KotlinCompile task as well, so you would need to add one alongside it; it works much like JavaCompile but compiles .kt files only.
That said, I wouldn't use the task system to share these classes across modules. I would use a separate module and work with the outputs of the default compileTestKotlin and compileTestJava tasks.
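A rough sketch of what that extra KotlinCompile task could look like in the setup from the question. This assumes the kotlin plugin is applied, the testShared source set has a kotlin source directory, and an older Kotlin/Gradle plugin version where classpath and destinationDir are still writable on the task:

task compileTestCommonKotlin(type: org.jetbrains.kotlin.gradle.tasks.KotlinCompile) {
    // mirror the JavaCompile task from the question, but for .kt sources
    classpath = compileTestKotlin.classpath
    source sourceSets.testShared.kotlin.srcDirs
    destinationDir = file('build/testCommon')
}

// pack both the Java and the Kotlin output into the shared test jar
testJar.dependsOn compileTestCommonKotlin
testJar.from compileTestCommonKotlin.outputs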
When performing gradle clean and then gradle swagger, a ClassNotFoundException is thrown. If gradle swagger is then run again (essentially after the api build has completed in the previous run), it works fine.
build.gradle looks as below:
buildscript {
    repositories {
        maven { url hydraMavenRepo }
        maven { url hydraPluginsRepo }
    }
    dependencies {
        classpath "com.github.kongchen:swagger-maven-plugin:3.1.4"
    }
}

apply plugin: 'java'

configurations {
    addclasspath
}

dependencies {
    addclasspath files(project(':api:desktop-api').configurations['runtime'].files)
    addclasspath files(project(':api:desktop-api').sourceSets['main'].output)
    addclasspath files(project(':api:desktop-api').sourceSets.main.output.classesDir)
    runtime project(':api:desktop-api')
}

sourceSets {
    main {
        runtimeClasspath += files(project(':api:desktop-api').sourceSets['main'].output)
        runtimeClasspath += files(project(':api:desktop-api').sourceSets['main'].output.classesDir)
        runtimeClasspath += files(project(':api:desktop-api').configurations['runtime'].files)
    }
}
import com.github.kongchen.swagger.docgen.mavenplugin.ApiDocumentMojo
import com.github.kongchen.swagger.docgen.mavenplugin.ApiSource
import io.swagger.models.Info

task swagger(dependsOn: [':api:desktop-api:build']) {
    doLast {
        logger.info 'Swagger GenDoc...'
        project.file(reportsDir).mkdirs()

        // a trick to have all needed classes in the classpath
        def customClassLoader = new GroovyClassLoader()
        buildscript.configurations.classpath.each {
            //println it.toURI().toURL()
            customClassLoader.addURL(it.toURI().toURL())
        }
        configurations.addclasspath.each {
            customClassLoader.addURL(it.toURI().toURL())
        }

        // the same settings as in the swagger-maven-example/pom.xml
        final ApiDocumentMojo mavenTask = Class.forName('com.github.kongchen.swagger.docgen.mavenplugin.ApiDocumentMojo', true, customClassLoader).newInstance(
            apiSources: [
                new ApiSource(
                    springmvc: false,
                    locations: ['com/vmware/vdi/hydra'],
                    schemes: ['http', 'https'],
                    host: 'vmware.com',
                    basePath: '/api',
                    info: new Info(
                        title: "Hydra DS-REST API's",
                        version: 'v100',
                        description: "Hydra DS-REST API's"
                    ),
                    swaggerDirectory: reportsDir
                )
            ]
        )

        mavenTask.execute()
        logger.info 'Swagger GenDoc task is completed'
    }
}
You have several flaws in your build script.

You should not depend on built artifacts in your build script dependencies. That is a chicken-and-egg problem: you need to execute the build to get the classes that are necessary to execute the build. This cannot work reliably, if at all.

Instead, you should declare them as dependencies outside the buildscript block. The buildscript block is only for dependencies the build script itself needs to run, such as Gradle tasks, Gradle plugins, and classes you use during the build; the swagger-maven-plugin entry is therefore correct in the buildscript block.

Besides that, you execute part of your Swagger work (the instantiation, execution, and printing) during the configuration phase instead of the execution phase. Everything you do in a task closure but outside any doFirst or doLast block runs during the configuration phase, i.e. every time the build is configured, no matter which tasks you actually want to execute and no matter whether the task is already up to date. For the up-to-date check to work and save you time, you need to declare all inputs (files and properties that might change between executions) and all outputs you generate; then Gradle can do its magic and only execute the task when actually necessary.

Also, you should not use println in build scripts. That is like using System.out.println in Java programs. Instead, use the provided logging facility directly, e.g. logger.info 'Swagger GenDoc task is completed'.
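To illustrate the inputs/outputs point, here is a rough sketch of the shape such a task could take, using the reportsDir and addclasspath names from the question's script; this is not a complete solution, only the declaration pattern:

task swagger {
    // declare what the generation reads and writes,
    // so Gradle can skip the task when nothing has changed
    inputs.files configurations.addclasspath
    outputs.dir reportsDir

    doLast {
        // do the actual Swagger generation here, in the execution phase
    }
}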
buildscript.classLoader is what I was looking for.
Below is the code that works:
buildscript {
    repositories {
        maven { url mavenRepo }
    }
    dependencies {
        classpath "com.github.kongchen:swagger-maven-plugin:3.1.4"
    }
}

import com.github.kongchen.swagger.docgen.mavenplugin.ApiDocumentMojo
import com.github.kongchen.swagger.docgen.mavenplugin.ApiSource
import io.swagger.models.Info

task swagger(dependsOn: ':api:build') {
    doLast {
        logger.info 'Swagger GenDoc...'
        project.file(<dir>).mkdirs()

        FileCollection apiRuntimeFiles = files(project(':api').configurations['runtime'].files)
        apiRuntimeFiles.each {
            buildscript.classLoader.addURL(it.toURI().toURL())
        }
        FileCollection apiClassFiles = files(project(':api').sourceSets['main'].output)
        apiClassFiles.each {
            buildscript.classLoader.addURL(it.toURI().toURL())
        }

        final ApiDocumentMojo mavenTask = Class.forName('com.github.kongchen.swagger.docgen.mavenplugin.ApiDocumentMojo', true, buildscript.classLoader).newInstance(
            apiSources: [
                new ApiSource(
                    springmvc: false,
                    locations: ['<loc>'],
                    schemes: ['http', 'https'],
                    host: '<host>',
                    basePath: '/api',
                    info: new Info(
                        title: "REST API's",
                        version: 'v1',
                        description: "REST API's"
                    ),
                    swaggerDirectory: <dir>
                )
            ]
        )

        mavenTask.execute()
        logger.info 'Swagger GenDoc task is completed'
    }
}
I have a Kotlin Gradle project, and I would like to include Kotlin's runtime and stdlib in the jar file. I'm currently using the configuration below, but the runtime and stdlib are not included when I build the project from this build.gradle.
compileKotlin {
    kotlinOptions {
        includeRuntime = true
        noStdlib = false
    }
}
This is the Gradle code I'm using to include the runtime/stdlib in the jar, but it isn't working like I expect it to. Here's the full build.gradle file for some context:
https://github.com/BenWoodworth/CrossPlatformGreeter/blob/bd1da79f36e70e3d88ed871bc35502ecc3a852fb/build.gradle#L35-L43
Kotlin's Gradle documentation seems to indicate that setting kotlinOptions.includeRuntime to true should include the Kotlin runtime in the resulting .jar.
https://kotlinlang.org/docs/reference/using-gradle.html#attributes-specific-for-kotlin
Edit:
This might be related. When I run compileKotlin, I'm getting a couple of warnings related to the runtime:
:compileKotlin
w: Classpath entry points to a non-existent location: <no_path>\lib\kotlin-runtime.jar
BUILD SUCCESSFUL
Here's an alternative I came up with. It'll add the Kotlin runtime and stdlib to the jar using the jar task.
jar {
    from {
        String[] include = [
            "kotlin-runtime-${version_kotlin}.jar",
            "kotlin-stdlib-${version_kotlin}.jar"
        ]

        configurations.compile
            .findAll { include.contains(it.name) }
            .collect { it.isDirectory() ? it : zipTree(it) }
    }
}
Gradle Kotlin DSL:
tasks.withType<Jar> {
    val include = setOf("kotlin-stdlib-1.4.0.jar")
    configurations.runtimeClasspath.get()
        .filter { it.name in include }
        .map { zipTree(it) }
        .also { from(it) }
}
Try this:
Build script:
Unpack the jar, add the Kotlin runtime, and repack it.
Type gradle packJar to create a jar with the Kotlin runtime in it,
or
type gradle runJar to create and run the jar file.
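The build script itself is not shown in that answer; a minimal sketch of what such packJar and runJar tasks might look like (the task names follow the answer, while the main class com.example.MainKt is a placeholder of mine, not from the original):

task packJar(type: Jar, dependsOn: classes) {
    baseName = "${project.name}-with-runtime"
    from sourceSets.main.output
    // unpack every runtime dependency (kotlin-stdlib, kotlin-runtime, ...) into the jar
    from { configurations.compile.collect { it.isDirectory() ? it : zipTree(it) } }
    manifest {
        attributes 'Main-Class': 'com.example.MainKt'   // placeholder main class
    }
}

task runJar(type: JavaExec, dependsOn: packJar) {
    classpath = files(packJar.archivePath)
    main = 'com.example.MainKt'                         // placeholder main class
}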
The question is specific, but it's more of a general "how to do this in Gradle" question.
I have a demo java web app that I can run using the gretty plugin. I would like to selectively control whether a javaagent is applied to the jvmArgs of the gretty process based on a command line flag. The agent jar location is known by getting its path from a dummy configuration:
configurations {
    agent
}

dependencies {
    ...
    agent group: 'com.foo', name: 'foo-agent', version: '1.0'
}
I know I can access the jar file location using something like:
project.configurations.agent.find { it.name.startsWith("foo-agent") }
How can I selectively apply that to the gretty jvmArgs configuration based on a command line property such as
gradle -PenableAgent
I ended up solving this by creating a task and simply calling it before I run the war:
task agent {
    doFirst {
        def agentJar = project.configurations.agent.find { it.name.startsWith("foo-agent") }
        gretty.jvmArgs << "-javaagent:" + agentJar
    }
}
Then I can simply call:
gradle agent appRunWar
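If you want to tie this to the -PenableAgent flag from the question, a small variation of the same idea (my sketch, not part of the original answer) is to make the task conditional and hook it onto the run task:

task agent {
    // only add the agent when the build is started with -PenableAgent
    onlyIf { project.hasProperty('enableAgent') }
    doFirst {
        def agentJar = project.configurations.agent.find { it.name.startsWith("foo-agent") }
        gretty.jvmArgs << "-javaagent:" + agentJar
    }
}

project.afterEvaluate {
    tasks.appRunWar.dependsOn agent
}

Then gradle -PenableAgent appRunWar runs with the agent attached, while a plain gradle appRunWar does not.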
In my project I use Spring Instrument as a Java agent, so this was my solution.
You can make the appRun task depend on the agent task; then no additional Gradle command-line parameter is needed.
dependencies {
    ...
    agent 'org.springframework:spring-instrument:4.2.4.RELEASE'
}

configurations {
    dev
    agent
}

gretty {
    ...
    contextPath = '/'
    jvmArgs = []
    springBoot = true
    ...
}

task agent {
    doFirst {
        def agentJar = project.configurations.agent.find { it.name.contains("spring-instrument") }
        gretty.jvmArgs << "-javaagent:" + agentJar
    }
}

project.afterEvaluate {
    tasks.appRun.dependsOn agent
}
My Gradle project generates some Java code inside gen/main/java using an annotation processor. When I import this project into Eclipse, Eclipse does not automatically add gen/main/java as a source folder to the build path. I can do it manually, but is there a way to automate this?
Thanks.
You can easily add the generated folder to the classpath manually with:
eclipse {
    classpath {
        file.whenMerged { cp ->
            cp.entries.add(new org.gradle.plugins.ide.eclipse.model.SourceFolder('gen/main/java', null))
        }
    }
}
whereby null as the second constructor argument means that Eclipse should put the compiled class files into the default output folder. If you want to change this, just provide a String instead, e.g. 'bin-gen'.
I think it's a little bit cleaner just to add a second source directory to the main source set.
Add this to your build.gradle:
sourceSets {
    main {
        java {
            srcDirs += ["src/gen/java"]
        }
    }
}
This results in the following line generated in your .classpath:
<classpathentry kind="src" path="src/gen/java"/>
I've tested this with Gradle 4.1, but I suspect it'd work with older versions as well.
Andreas' answer works if you generate the Eclipse project from the command line using gradle cleanEclipse eclipse. If you use the STS Eclipse Gradle plugin, then you have to implement an afterEclipseImport task. Below is my full working snippet:
project.ext {
    genSrcDir = projectDir.absolutePath + '/gen/main/java'
}

compileJava {
    options.compilerArgs += ['-s', project.genSrcDir]
}

compileJava.doFirst {
    // the generated-sources directory must exist before javac runs
    ant.mkdir(dir: project.genSrcDir)
}

eclipse.classpath.file.whenMerged { classpath ->
    def genSrc = new org.gradle.plugins.ide.eclipse.model.SourceFolder('gen/main/java', null)
    classpath.entries.add(genSrc)
}
task afterEclipseImport(description: "Post processing after project generation", group: "IDE") {
    doLast {
        compileJava.execute()

        def classpath = new XmlParser().parse(file(".classpath"))
        new Node(classpath, "classpathentry", [kind: 'src', path: 'gen/main/java'])

        def writer = new FileWriter(file(".classpath"))
        def printer = new XmlNodePrinter(new PrintWriter(writer))
        printer.setPreserveWhitespace(true)
        printer.print(classpath)
    }
}