I use a vendor library to generate Java sources from an XML file. This source XML imports other XMLs which exist in some JAR file, and I know the coordinates of that JAR. The vendor library is a black box to me, but I know that it uses the thread context classloader to load the imports from a JAR. However, it fails because it cannot find the imported XMLs on the classpath.
What is the Gradle way of accomplishing this?
// body of gradle task
@TaskAction
void execute(IncrementalTaskInputs inputs) {
    inputs.outOfDate { changes ->
        // CodeGenerator is the vendor library
        CodeGenerator generator = new CodeGenerator();
        // call some setter methods to set the inputs
        generator.setXml(file("<path/to/the-file>"))
        generator.generate();
    }
}
From my other answer we have ascertained that there's no option to set the classloader for the CodeGenerator. So the only option is to have the JAR with the XML files loaded by the same classloader as the CodeGenerator.
Option 1: Add the jar to the buildscript classpath in a buildscript { ... } block
buildscript {
    dependencies {
        classpath 'com.group:jar-with-xmls:1.0'
    }
}
Option 2: Add the jar to the buildscript classpath via buildSrc/build.gradle
dependencies {
    compile 'com.vendor:code-generator:1.0'
    runtime 'com.group:jar-with-xmls:1.0'
}
Does the CodeGenerator have a setClassLoader(ClassLoader) method? You could likely use a Configuration and a URLClassLoader. Eg:
configurations {
    codeGenerator
}
dependencies {
    codeGenerator 'foo:bar:1.0'
    codeGenerator 'baz:biff:2.0'
}
task generateCode {
    inputs.files configurations.codeGenerator
    outputs.dir "$buildDir/codeGenerator"
    doLast {
        URL[] urls = configurations.codeGenerator.files.collect { it.toURI().toURL() } as URL[]
        ClassLoader cl = new URLClassLoader(urls, null)
        CodeGenerator generator = new CodeGenerator()
        generator.setClassLoader(cl)
        ...
        generator.generateTo("$buildDir/codeGenerator")
    }
}
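If there is no such setter and the generator really reads the thread context classloader (as the question suggests), here is a minimal sketch of the same doLast body with the context classloader swapped around the call instead; generateTo is reused from above and everything else is an assumption, not a confirmed vendor API:
doLast {
    URL[] urls = configurations.codeGenerator.files.collect { it.toURI().toURL() } as URL[]
    // parent = null: the generator will only see the configuration's jars as resources
    ClassLoader cl = new URLClassLoader(urls, null)
    // remember the current context classloader so it can be restored
    def previous = Thread.currentThread().contextClassLoader
    Thread.currentThread().contextClassLoader = cl
    try {
        CodeGenerator generator = new CodeGenerator()
        generator.generateTo("$buildDir/codeGenerator")
    } finally {
        // always restore, so the rest of the build is unaffected
        Thread.currentThread().contextClassLoader = previous
    }
}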
See a similar concept here
JAR files are ZIP archives. You can open them as zip files using the java.util.zip classes. In Gradle, you might want to use zipTree to extract them, as explained here:
http://mrhaki.blogspot.jp/2012/06/gradle-goodness-unpacking-archive.html
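A minimal sketch of that approach (the archive path and task name are hypothetical):
task unpackJar(type: Copy) {
    // zipTree lazily opens the archive; the Copy task handles extraction
    from zipTree(file('libs/the-jar-with-xmls.jar')) // hypothetical path
    into "$buildDir/unpacked-xmls"
}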
Can I tell the Quarkus Gradle plugin (gradle quarkusDev or gradlew quarkusBuild -Dquarkus.package.uber-jar=true) to use resources provided by myself instead of choosing resources from dependency jars when they are duplicates?
I get these messages when building an uber-jar:
Duplicate entry META-INF/org.apache.uima.fit/types.txt entry from de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.segmentation-asl::jar:1.10.0(runtime) will be ignored. Existing file was provided by de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.syntax-asl::jar:1.10.0(runtime)
Duplicate entry META-INF/org.apache.uima.fit/types.txt entry from de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.lexmorph-asl::jar:1.10.0(runtime) will be ignored. Existing file was provided by de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.syntax-asl::jar:1.10.0(runtime)
Duplicate entry META-INF/org.apache.uima.fit/types.txt entry from de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.metadata-asl::jar:1.10.0(runtime) will be ignored. Existing file was provided by de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.syntax-asl::jar:1.10.0(runtime)
Duplicate entry META-INF/org.apache.uima.fit/types.txt entry from de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.ner-asl::jar:1.10.0(runtime) will be ignored. Existing file was provided by de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.syntax-asl::jar:1.10.0(runtime)
These DKPro / uimaFIT libraries are NLP libraries that each bring their own META-INF/org.apache.uima.fit/types.txt file. You are supposed to merge these files yourself, add your own types, and then include only this newly merged file in your uber-jar, or as the first one on your classpath.
There is an option quarkus.package.user-configured-ignored-entries in application.properties, but it also removes my own provided files, so that's not what I want (see also https://github.com/quarkusio/quarkus/blob/master/core/deployment/src/main/java/io/quarkus/deployment/pkg/steps/JarResultBuildStep.java#L186). I haven't checked the sources of gradle quarkusDev, but it results in the same runtime exceptions.
For reference for other people using uimaFIT, this incorrect META-INF/org.apache.uima.fit/types.txt file results in an error like
org.apache.uima.analysis_engine.AnalysisEngineProcessException: JCas type "org.apache.uima.conceptMapper.support.tokenizer.TokenAnnotation" used in Java code, but was not declared in the XML type descriptor..
So my question is, how do I tell Gradle or Quarkus to use this file provided by myself instead of randomly choosing a file from a dependency jar?
The example Gradle script is written in the Kotlin DSL. The task generateNlpFiles and the function joinResources automatically generate Java source files from the XML files in src/main/typesystem into build/generated/sources/jcasgen/main/, as required by uimaFIT, and join the duplicate resources like META-INF/org.apache.uima.fit/types.txt into build/generated/resources/uimafit/. You don't need to look at them too hard.
import java.io.FileOutputStream
import java.net.URLClassLoader
import org.apache.commons.io.IOUtils
plugins {
    id("java")
    id("io.quarkus")
    id("eclipse")
}
repositories {
    jcenter()
    // required for downloading OpenNLP models
    maven("https://zoidberg.ukp.informatik.tu-darmstadt.de/artifactory/public-releases/")
}
group = "com.example"
version = "0.0.0-SNAPSHOT"
java.sourceCompatibility = JavaVersion.VERSION_11
java.targetCompatibility = JavaVersion.VERSION_11
dependencies {
    val quarkusPlatformGroupId: String by project
    val quarkusPlatformArtifactId: String by project
    val quarkusPlatformVersion: String by project
    // Quarkus dependencies
    implementation(enforcedPlatform("${quarkusPlatformGroupId}:${quarkusPlatformArtifactId}:${quarkusPlatformVersion}"))
    implementation("io.quarkus:quarkus-jaxb")
    implementation("io.quarkus:quarkus-jackson")
    implementation("io.quarkus:quarkus-resteasy")
    implementation("io.quarkus:quarkus-jdbc-mariadb")
    implementation("io.quarkus:quarkus-resteasy-jsonb")
    implementation("io.quarkus:quarkus-smallrye-openapi")
    implementation("io.quarkus:quarkus-container-image-docker")
    // UIMA
    implementation("org.apache.uima:uimaj-core:2.10.3")
    implementation("org.apache.uima:ConceptMapper:2.10.2")
    implementation("org.apache.uima:uimafit-core:2.4.0")
    // DKPro
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.io.xmi-asl:1.10.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.metadata-asl:1.10.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.langdetect-asl:1.10.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.icu-asl:1.10.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-asl:1.10.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-model-tagger-de-maxent:20120616.1")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-model-tagger-en-maxent:20120616.1")
implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-asl:1.10.0")
implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-model-ner-de-nemgp:20141024.1")
implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-model-ner-en-location:20100907.0")
implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-model-ner-en-organization:20100907.0")
implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-model-ner-en-person:20130624.1")
// tests
testImplementation("io.quarkus:quarkus-junit5")
testImplementation("io.rest-assured:rest-assured")
// for generating NLP type system during compile time
compileOnly("org.apache.uima:uimaj-tools:2.10.4")
}
// joins resource files from classpath into single file
fun joinResources(classLoader: URLClassLoader, inputResourceName: String, outputFile: File) {
    val outputStream = FileOutputStream(outputFile)
    val resources = classLoader.findResources(inputResourceName).toList()
    resources.forEach {
        val inputStream = it.openStream()
        IOUtils.copy(inputStream, outputStream)
        outputStream.write('\n'.toInt())
        inputStream.close()
    }
    outputStream.close()
}
// generate NLP type system from XML files and join uimaFIT files
val generateNlpFiles = task("generateNlpFiles") {
    inputs.files(fileTree("src/main/typesystem"))
    inputs.files(fileTree("src/main/resources"))
    outputs.dir("${buildDir}/generated/sources/jcasgen/main/")
    outputs.dir("${buildDir}/generated/resources/uimafit/")
    val compileClasspath = project.sourceSets.main.get().compileClasspath
    val runtimeClasspath = project.sourceSets.main.get().runtimeClasspath
    val compileClassLoader = URLClassLoader(compileClasspath.map { it.toURI().toURL() }.toTypedArray())
    val runtimeClassLoader = URLClassLoader(runtimeClasspath.map { it.toURI().toURL() }.toTypedArray())
    // from XML files in src/main/typesystem/ generate Java sources into build/generated/sources/jcasgen/main/
    val jCasGen = compileClassLoader.loadClass("org.apache.uima.tools.jcasgen.Jg").newInstance()
    fileTree("src/main/typesystem").forEach { typeSystemFile ->
        doFirst {
            // see https://github.com/Dictanova/gradle-jcasgen-plugin/blob/master/src/main/groovy/com/dictanova/jcasgen/gradle/JCasGenTask.groovy#L45
            val jcasgeninput = "${typeSystemFile}"
            val jcasgenoutput = "${buildDir}/generated/sources/jcasgen/main/"
            val jcasgenclasspath = "${runtimeClasspath.asPath}"
            val arguments: Array<String> = arrayOf("-jcasgeninput", jcasgeninput, "-jcasgenoutput", jcasgenoutput, "-jcasgenclasspath", jcasgenclasspath)
            val main1 = jCasGen.javaClass.getMethod("main1", arguments.javaClass)
            main1.invoke(jCasGen, arguments)
        }
    }
    // collect types.txt and components.txt from classpath and join them in build/generated/resources/uimafit/META-INF/org.apache.uima.fit/
    val uimafitDir = "${buildDir}/generated/resources/uimafit/META-INF/org.apache.uima.fit"
    mkdir(uimafitDir)
    joinResources(runtimeClassLoader, "META-INF/org.apache.uima.fit/types.txt", File("${uimafitDir}/types.txt"))
    joinResources(runtimeClassLoader, "META-INF/org.apache.uima.fit/components.txt", File("${uimafitDir}/components.txt"))
}
eclipse {
    project {
        natures(
            "org.eclipse.wst.common.project.facet.core.nature",
            "org.eclipse.buildship.core.gradleprojectnature"
        )
    }
    classpath {
        file.withXml {
            val attributes = mapOf("kind" to "src", "path" to "build/generated/sources/jcasgen/main")
            this.asNode().appendNode("classpathentry", attributes)
        }
    }
}
tasks {
    compileJava {
        options.encoding = "UTF-8"
        options.compilerArgs.add("-parameters") // was in original Quarkus Gradle file, not sure what this does
        dependsOn(generateNlpFiles)
        // add generated sources to source sets
        sourceSets["main"].java.srcDir(file("${buildDir}/generated/sources/jcasgen/main/"))
        sourceSets["main"].resources.srcDir(file("${buildDir}/generated/resources/uimafit/"))
    }
    compileTestJava {
        options.encoding = "UTF-8"
    }
    "eclipse" {
        dependsOn(generateNlpFiles)
    }
}
One workaround would be using gradlew quarkusBuild -Dquarkus.package.uber-jar=true with entries in quarkus.package.user-configured-ignored-entries and adding my own files manually to the resulting jar, but that wouldn't work with gradle quarkusDev.
I am using Quarkus 1.3.2, as Quarkus 1.4.1 cannot handle multiple resource directories (see also https://github.com/quarkusio/quarkus/blob/master/devtools/gradle/src/main/java/io/quarkus/gradle/tasks/QuarkusDev.java#L391 ), as needed by my project.
I also tried to exclude files with some Gradle JarJar plugins, like https://github.com/shevek/jarjar, but couldn't get them running.
Right now, you can't; it will just take one from the jars providing it.
Could you create a feature request in our tracker: https://github.com/quarkusio/quarkus/issues/new?assignees=&labels=kind%2Fenhancement&template=feature_request.md&title= . It sounds like something useful. Thanks!
I have a project with two subprojects.
One of these subprojects, "A", contains code that is being published to an artifact.
The other subproject, "B", has a task that needs to do exactly what one of the methods in A's code does. I can replicate the logic in groovy, but is there any way I can actually have my task in subproject B call the code that was compiled as part of subproject A?
I tried adding a buildscript block in B that added the artifact from A to the classpath:
buildscript {
    dependencies {
        classpath project(':subproject-a')
    }
}
...but this gave me an error:
Cannot use project dependencies in a script classpath definition.
I don't believe I can move subproject-a to buildSrc, as I'm also publishing its artifact to a maven repository for other projects to use.
You have a chicken-or-egg problem: all of the Gradle project classloaders are resolved before any classes are compiled. This can be worked around using a custom configuration and a ClassLoader.
Eg:
configurations {
    custom
}
dependencies {
    custom project(':subproject-a')
}
task customTask {
    doLast {
        def urls = configurations.custom.files.collect { it.toURI().toURL() }
        ClassLoader cl = new java.net.URLClassLoader(urls as URL[])
        Class myClass = cl.loadClass('com.foo.MyClass')
        // assuming zero args constructor
        Object myObject = myClass.newInstance()
        // assuming method which accepts single String argument
        java.lang.reflect.Method myMethod = myClass.getMethod('myMethodName', String.class)
        myMethod.invoke(myObject, 'methodArg')
    }
}
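One wrinkle worth noting (an assumption about usage, not part of the original answer): resolving the configuration inside doLast does not by itself make Gradle compile subproject-a first. A sketch of wiring that in:
// declaring the configuration's files as task inputs makes Gradle build
// subproject-a's jar before customTask runs, and enables up-to-date checks
customTask {
    inputs.files configurations.custom
}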
I'm writing a multi-module Android app (actually in Kotlin) and I'd like to have some project-wide flags. I was thinking about setting those flags in Gradle and accessing them from my code, but I couldn't find out how to do that. Is it possible?
You could generate a properties file in Gradle which is added to your runtime classpath. Please adapt the following Groovy/Java to Kotlin/Android:
ext {
    someProperty = 'xxx'
}
dependencies {
    compile files("$buildDir/generated/resources")
}
task generateResources {
    // configure task inputs/outputs for up-to-date checking/caching
    inputs.property("someProperty", someProperty)
    outputs.dir "$buildDir/generated/resources"
    doLast {
        file("$buildDir/generated/resources/myprops.properties").text = "someProperty=$someProperty"
    }
}
// important: wire the task into the DAG
compileJava.dependsOn generateResources
Then in your Java code:
InputStream in = MyClass.class.getClassLoader().getResourceAsStream("myprops.properties");
Properties props = new Properties();
props.load(in);
String someProperty = props.getProperty("someProperty"); // xxx
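Since the target is Android rather than plain Java, the compile files(...) trick may not carry the file into the APK; a hedged sketch of instead registering the generated folder with the Android plugin's source sets (assuming the Android Gradle plugin's java-style resources directory behaves like the plain Java one):
android {
    sourceSets {
        main {
            // java-style resources end up on the runtime classpath
            resources.srcDir "$buildDir/generated/resources"
        }
    }
}
// make sure the file exists before resources are merged
preBuild.dependsOn generateResources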
Background: I'm running Android Studio 3.0-beta7 and trying to get a javadoc task to work for an Android library (the fact that this is not available as a ready-made task in the first place is really strange). I managed to tweak an answer to a different question (https://stackoverflow.com/a/46810617/1226020) for my needs, ending up with this code:
task javadoc(type: Javadoc) {
    failOnError false
    source = android.sourceSets.main.java.srcDirs
    // Also add the generated R class to avoid errors...
    // TODO: debug is hard-coded
    source += "$buildDir/generated/source/r/debug/"
    // ... but exclude the R classes from the docs
    excludes += "**/R.java"
    // TODO: "compile" is deprecated in Gradle 4.1,
    // but "implementation" and "api" are not resolvable :(
    classpath += configurations.compile
    afterEvaluate {
        // Wait after evaluation to add the android classpath
        // to avoid "buildToolsVersion is not specified" error
        classpath += files(android.getBootClasspath())
        // Process AAR dependencies
        def aarDependencies = classpath.filter { it.name.endsWith('.aar') }
        classpath -= aarDependencies
        aarDependencies.each { aar ->
            System.out.println("Adding classpath for aar: " + aar.name)
            // Extract classes.jar from the AAR dependency, and add it to the javadoc classpath
            def outputPath = "$buildDir/tmp/exploded-aar/${aar.name.replace('.aar', '.jar')}"
            classpath += files(outputPath)
            // Use a task so the actual extraction only happens before the javadoc task is run
            dependsOn task(name: "extract ${aar.name}").doLast {
                extractEntry(aar, 'classes.jar', outputPath)
            }
        }
    }
}
// Utility method to extract only one entry in a zip file
private def extractEntry(archive, entryPath, outputPath) {
    if (!archive.exists()) {
        throw new GradleException("archive $archive not found")
    }
    def zip = new java.util.zip.ZipFile(archive)
    zip.entries().each {
        if (it.name == entryPath) {
            def path = new File(outputPath)
            if (!path.exists()) {
                path.getParentFile().mkdirs()
                // Surely there's a simpler is->os utility except
                // the one in java.nio.Files? Ah well...
                def buf = new byte[1024]
                def is = zip.getInputStream(it)
                def os = new FileOutputStream(path)
                def len
                while ((len = is.read(buf)) != -1) {
                    os.write(buf, 0, len)
                }
                os.close()
                is.close()
            }
        }
    }
    zip.close()
}
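Responding to the "surely there's a simpler utility" comment in the helper above: a possibly simpler sketch using Gradle's own zipTree instead of manual stream copying (behavior assumed equivalent for this use case; the helper name is hypothetical):
// extract a single entry from an archive using Gradle APIs
def extractEntryWithZipTree(File archive, String entryPath, String outputPath) {
    def outputFile = new File(outputPath)
    copy {
        // pull just the one entry out of the archive
        from zipTree(archive).matching { include entryPath }
        into outputFile.parentFile
        rename { outputFile.name }
    }
}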
This code tries to find all dependency AARs, loops through them, extracts classes.jar from each, and puts it in a temp folder that is added to the classpath during javadoc generation. Basically, it tries to reproduce what the really old Android Gradle plugin used to do with "exploded-aar".
However, the code relies on using compile dependencies. Using api or implementation, which are recommended with Gradle 4.1, will not work, since these are not resolvable from a Gradle task.
Question: how can I get a list of dependencies using the api or implementation directives when e.g. configurations.api renders a "not resolvable" error?
Bonus question: is there a new, better way to create javadocs for a library with Android Studio 3.0 that doesn't involve 100 lines of workarounds?
You can wait for this to be merged:
https://issues.apache.org/jira/browse/MJAVADOC-450
Basically, the current Maven Javadoc plugin ignores classifiers such as AAR.
I ran into the same problem when trying your answer to this question; this error message kept me from resolving the implementation dependencies:
Resolving configuration 'implementation' directly is not allowed
Then I discovered that this answer has a solution that makes resolving of the implementation and api configurations possible:
configurations.implementation.setCanBeResolved(true)
I'm not sure how dirty this workaround is, but it seems to do the trick for the javadocJar task situation.
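For illustration, a hedged sketch of how this could slot into the javadoc task from the question (assuming it runs before the configuration is first resolved):
// allow the normally task-unresolvable configuration to be resolved,
// then use it where the question used configurations.compile
configurations.implementation.setCanBeResolved(true)
javadoc {
    classpath += configurations.implementation
}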
I want to extract the dependencies defined in a particular Gradle configuration. My code looks like:
project.configurations.myConfig.files.each { src ->
    logger.debug "Extracting ${src.absolutePath} to ${to}"
    project.copy {
        eachFile { fileCopyDetails ->
            logger.debug("Extracting file : ${fileCopyDetails.file.path}")
        }
        from project.zipTree(src)
        into to
    }
}
But this extracts ALL the files, including transitive dependencies defined in the POM files. My requirement is to extract just the first-level dependencies as defined in dependencies { myConfig ... }.
Solution 1
I tried setting transitive = false and it works, but that breaks the build because we are removing the dependent libraries from the classpath.
Solution 2
I tried creating a new configuration which is a copy of myConfig but with transitive = false, and it works.
I'm looking for a better solution where I do not have to copy the configuration.
You can make a copy of a configuration, and then set the copy to non-transitive. This way you only have to specify the dependencies once:
// This will include dependencies from superconfigurations. If that is not
// what you want, use "copy()" instead.
def nonTransitiveMyConfig = configurations.myConfig.copyRecursive()
nonTransitiveMyConfig.transitive = false
nonTransitiveMyConfig.files.each { src ->
    // ...
}
See:
copyRecursive()
copy()
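Putting this together with the extraction loop from the question, a minimal sketch (to is the destination directory from the asker's snippet):
def nonTransitiveMyConfig = configurations.myConfig.copyRecursive()
nonTransitiveMyConfig.transitive = false
nonTransitiveMyConfig.files.each { src ->
    // only first-level (and superconfiguration) dependencies show up here
    project.copy {
        from project.zipTree(src)
        into to
    }
}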