Gradle publish multiple independent artifacts

I've got a project that builds with Gradle and the ivy-publish plugin. In addition to building a JAR, build.gradle also has a run task that executes XmlFileGenerator.main(), which generates 5 XML files (call them A, B, C, D, and E). I'm looking to publish each of these XML files to our Ivy repository; each should have the same group and version but a different module and a different filename, and each should have its own ivy.xml that lists only itself.
I'm able to set the filename of the file that's published, but the module name remains the same as my project's name, and as a result all of my XML files are published under the same module name instead of under independent ones.
So for example, I want A.xml to be published at {myLocalIvyRootDir}\my-group\A\{version}\xmls\A-{version}.xml and I want B.xml to be published at {myLocalIvyRootDir}\my-group\B\{version}\xmls\B-{version}.xml. But instead A is published at {myLocalIvyRootDir}\my-group\my-project\{version}\xmls\A-{version}.xml and B is published alongside it at {myLocalIvyRootDir}\my-group\my-project\{version}\xmls\B-{version}.xml.
Here's the relevant subset of build.gradle (showing only A but not B-E):
apply plugin: 'ivy-publish'

group = 'my-group'

publishing {
    publications {
        ivy(IvyPublication) {
            artifact jar
        }
        aXml(IvyPublication) {
            artifact('target/A.xml') {
                name = 'A'
                extension = 'xml'
                type = 'xml'
            }
        }
    }
}
mainClassName = 'my-group.my-project.XmlFileGenerator'
I've tried defining the module property on the publication with this code:
aXml(IvyPublication) {
    module 'A'
    artifact('target/A.xml') {
        name = 'A'
        extension = 'xml'
        type = 'xml'
    }
}
But I get the following error message:
> org.gradle.api.internal.MissingMethodException: Could not find method module() for arguments [A] on org.gradle.api.publish.ivy.internal.publication.DefaultIvyPublication_Decorated@32384c50.
And I've tried changing the rootProject.name dynamically with code like:
publishing {
    publications {
        ivy(IvyPublication) {
            artifact jar
        }
        project.metaClass.getName {"A"}
        aXml(IvyPublication) {
            artifact('target/A.xml') {
                name = 'A'
                extension = 'xml'
                type = 'xml'
            }
        }
    }
}
That produced no errors, but also no change in behavior.
I feel like I'm probably just missing something small, but don't know what it is. Can anyone point me in the right direction?

It turned out that this particular project was still pointing at Gradle 1.6, which predates these properties (they were added in 1.7). So all that was needed was to point to 1.7, and everything worked as intended.
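For reference, once the project points at Gradle 1.7 or later, the publication block from the question can set the module directly. This is only a sketch assembled from the snippets above (A and B shown; C, D, and E follow the same pattern):

publishing {
    publications {
        aXml(IvyPublication) {
            module 'A' // publish under my-group/A/{version} instead of my-group/my-project/{version}
            artifact('target/A.xml') {
                name = 'A'
                extension = 'xml'
                type = 'xml'
            }
        }
        bXml(IvyPublication) {
            module 'B'
            artifact('target/B.xml') {
                name = 'B'
                extension = 'xml'
                type = 'xml'
            }
        }
    }
}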

Related

Quarkus Gradle plugin: overriding duplicate file entries coming from dependency libraries

Can I tell the Quarkus Gradle plugin (gradle quarkusDev or gradlew quarkusBuild -Dquarkus.package.uber-jar=true) to use resources that I provide myself instead of picking resources from dependency jars when they are duplicates?
I get these messages when building an uber-jar:
Duplicate entry META-INF/org.apache.uima.fit/types.txt entry from de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.segmentation-asl::jar:1.10.0(runtime) will be ignored. Existing file was provided by de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.syntax-asl::jar:1.10.0(runtime)
Duplicate entry META-INF/org.apache.uima.fit/types.txt entry from de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.lexmorph-asl::jar:1.10.0(runtime) will be ignored. Existing file was provided by de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.syntax-asl::jar:1.10.0(runtime)
Duplicate entry META-INF/org.apache.uima.fit/types.txt entry from de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.metadata-asl::jar:1.10.0(runtime) will be ignored. Existing file was provided by de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.syntax-asl::jar:1.10.0(runtime)
Duplicate entry META-INF/org.apache.uima.fit/types.txt entry from de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.ner-asl::jar:1.10.0(runtime) will be ignored. Existing file was provided by de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.syntax-asl::jar:1.10.0(runtime)
These DKPro / uimaFIT libraries are NLP libraries that each bring their own META-INF/org.apache.uima.fit/types.txt file. You are supposed to merge these files yourself, add your own types, and then include only the newly merged file in your uber-jar, or put it first on your classpath.
There is an option quarkus.package.user-configured-ignored-entries in application.properties, but it also removes the files I provide myself, so that's not what I want (see also https://github.com/quarkusio/quarkus/blob/master/core/deployment/src/main/java/io/quarkus/deployment/pkg/steps/JarResultBuildStep.java#L186). I haven't checked the sources of gradle quarkusDev, but it results in the same runtime exceptions.
For reference for other people using uimaFIT, this incorrect META-INF/org.apache.uima.fit/types.txt file results in an error like
org.apache.uima.analysis_engine.AnalysisEngineProcessException: JCas type "org.apache.uima.conceptMapper.support.tokenizer.TokenAnnotation" used in Java code, but was not declared in the XML type descriptor..
So my question is: how do I tell Gradle or Quarkus to use the file I provide myself instead of randomly choosing a file from a dependency jar?
The example Gradle script is written in the Kotlin DSL. The task generateNlpFiles and the function joinResources automatically generate Java source files from the XML files in src/main/typesystem into build/generated/sources/jcasgen/main/, as required by uimaFIT, and join duplicate resources like META-INF/org.apache.uima.fit/types.txt into build/generated/resources/uimafit/. You don't need to look at them too closely.
import java.io.FileOutputStream
import java.net.URLClassLoader
import org.apache.commons.io.IOUtils

plugins {
    id("java")
    id("io.quarkus")
    id("eclipse")
}

repositories {
    jcenter()
    // required for downloading OpenNLP models
    maven("https://zoidberg.ukp.informatik.tu-darmstadt.de/artifactory/public-releases/")
}

group = "com.example"
version = "0.0.0-SNAPSHOT"
java.sourceCompatibility = JavaVersion.VERSION_11
java.targetCompatibility = JavaVersion.VERSION_11

dependencies {
    val quarkusPlatformGroupId: String by project
    val quarkusPlatformArtifactId: String by project
    val quarkusPlatformVersion: String by project

    // Quarkus dependencies
    implementation(enforcedPlatform("${quarkusPlatformGroupId}:${quarkusPlatformArtifactId}:${quarkusPlatformVersion}"))
    implementation("io.quarkus:quarkus-jaxb")
    implementation("io.quarkus:quarkus-jackson")
    implementation("io.quarkus:quarkus-resteasy")
    implementation("io.quarkus:quarkus-jdbc-mariadb")
    implementation("io.quarkus:quarkus-resteasy-jsonb")
    implementation("io.quarkus:quarkus-smallrye-openapi")
    implementation("io.quarkus:quarkus-container-image-docker")

    // UIMA
    implementation("org.apache.uima:uimaj-core:2.10.3")
    implementation("org.apache.uima:ConceptMapper:2.10.2")
    implementation("org.apache.uima:uimafit-core:2.4.0")

    // DKPro
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.io.xmi-asl:1.10.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.api.metadata-asl:1.10.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.langdetect-asl:1.10.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.icu-asl:1.10.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-asl:1.10.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-model-tagger-de-maxent:20120616.1")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-model-tagger-en-maxent:20120616.1")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-asl:1.10.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-model-ner-de-nemgp:20141024.1")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-model-ner-en-location:20100907.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-model-ner-en-organization:20100907.0")
    implementation("de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.opennlp-model-ner-en-person:20130624.1")

    // tests
    testImplementation("io.quarkus:quarkus-junit5")
    testImplementation("io.rest-assured:rest-assured")

    // for generating NLP type system during compile time
    compileOnly("org.apache.uima:uimaj-tools:2.10.4")
}
// joins resource files from classpath into single file
fun joinResources(classLoader: URLClassLoader, inputResourceName: String, outputFile: File) {
    val outputStream = FileOutputStream(outputFile)
    val resources = classLoader.findResources(inputResourceName).toList()
    resources.forEach {
        val inputStream = it.openStream()
        IOUtils.copy(inputStream, outputStream)
        outputStream.write('\n'.toInt())
        inputStream.close()
    }
    outputStream.close()
}

// generate NLP type system from XML files and join uimaFIT files
val generateNlpFiles = task("generateNlpFiles") {
    inputs.files(fileTree("src/main/typesystem"))
    inputs.files(fileTree("src/main/resources"))
    outputs.dir("${buildDir}/generated/sources/jcasgen/main/")
    outputs.dir("${buildDir}/generated/resources/uimafit/")

    val compileClasspath = project.sourceSets.main.get().compileClasspath
    val runtimeClasspath = project.sourceSets.main.get().runtimeClasspath
    val compileClassLoader = URLClassLoader(compileClasspath.map { it.toURI().toURL() }.toTypedArray())
    val runtimeClassLoader = URLClassLoader(runtimeClasspath.map { it.toURI().toURL() }.toTypedArray())

    // from XML files in src/main/typesystem/ generate Java sources into build/generated/sources/jcasgen/main/
    val jCasGen = compileClassLoader.loadClass("org.apache.uima.tools.jcasgen.Jg").newInstance()
    fileTree("src/main/typesystem").forEach { typeSystemFile ->
        doFirst {
            // see https://github.com/Dictanova/gradle-jcasgen-plugin/blob/master/src/main/groovy/com/dictanova/jcasgen/gradle/JCasGenTask.groovy#L45
            val jcasgeninput = "${typeSystemFile}"
            val jcasgenoutput = "${buildDir}/generated/sources/jcasgen/main/"
            val jcasgenclasspath = "${runtimeClasspath.asPath}"
            val arguments: Array<String> = arrayOf("-jcasgeninput", jcasgeninput, "-jcasgenoutput", jcasgenoutput, "-jcasgenclasspath", jcasgenclasspath)
            val main1 = jCasGen.javaClass.getMethod("main1", arguments.javaClass)
            main1.invoke(jCasGen, arguments)
        }
    }

    // collect types.txt and components.txt from classpath and join them in build/generated/resources/uimafit/META-INF/org.apache.uima.fit/
    val uimafitDir = "${buildDir}/generated/resources/uimafit/META-INF/org.apache.uima.fit"
    mkdir(uimafitDir)
    joinResources(runtimeClassLoader, "META-INF/org.apache.uima.fit/types.txt", File("${uimafitDir}/types.txt"))
    joinResources(runtimeClassLoader, "META-INF/org.apache.uima.fit/components.txt", File("${uimafitDir}/components.txt"))
}
eclipse {
    project {
        natures(
            "org.eclipse.wst.common.project.facet.core.nature",
            "org.eclipse.buildship.core.gradleprojectnature"
        )
    }
    classpath {
        file.withXml {
            val attributes = mapOf("kind" to "src", "path" to "build/generated/sources/jcasgen/main")
            this.asNode().appendNode("classpathentry", attributes)
        }
    }
}

tasks {
    compileJava {
        options.encoding = "UTF-8"
        options.compilerArgs.add("-parameters") // was in original Quarkus Gradle file, not sure what this does
        dependsOn(generateNlpFiles)
        // add generated sources to source sets
        sourceSets["main"].java.srcDir(file("${buildDir}/generated/sources/jcasgen/main/"))
        sourceSets["main"].resources.srcDir(file("${buildDir}/generated/resources/uimafit/"))
    }
    compileTestJava {
        options.encoding = "UTF-8"
    }
    "eclipse" {
        dependsOn(generateNlpFiles)
    }
}
One workaround would be using gradlew quarkusBuild -Dquarkus.package.uber-jar=true with entries in quarkus.package.user-configured-ignored-entries and adding my own files manually to the resulting jar, but that wouldn't work with gradle quarkusDev.
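For what it's worth, that workaround would look roughly like this in application.properties; the exact entry list is my assumption, based on the two files the build script joins:

quarkus.package.uber-jar=true
quarkus.package.user-configured-ignored-entries=META-INF/org.apache.uima.fit/types.txt,META-INF/org.apache.uima.fit/components.txt

The merged files would then still have to be added to the resulting jar by hand (for example with jar uf), which is exactly why this is only a partial workaround and does not help with gradle quarkusDev.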
I am using Quarkus 1.3.2, as Quarkus 1.4.1 cannot handle multiple resource directories (see also https://github.com/quarkusio/quarkus/blob/master/devtools/gradle/src/main/java/io/quarkus/gradle/tasks/QuarkusDev.java#L391 ), as needed by my project.
I also tried to exclude files with some Gradle JarJar plugins, like https://github.com/shevek/jarjar, but couldn't get them working.
Right now, you can't; it will just take one from the jars providing it.
Could you create a feature request in our tracker: https://github.com/quarkusio/quarkus/issues/new?assignees=&labels=kind%2Fenhancement&template=feature_request.md&title= .
Sounds like something useful.
Thanks!

How can I call code from one subproject in a gradle tasks of another subproject?

I have a project with two subprojects.
One of these subprojects, "A", contains code that is being published to an artifact.
The other subproject, "B", has a task that needs to do exactly what one of the methods in A's code does. I can replicate the logic in groovy, but is there any way I can actually have my task in subproject B call the code that was compiled as part of subproject A?
I'd tried adding a buildscript block in B that added the artifact from A to the classpath:
buildscript {
    dependencies {
        classpath project(':subproject-a')
    }
}
...but this gave me an error:
Cannot use project dependencies in a script classpath definition.
I don't believe I can move subproject-a to buildSrc, as I'm also publishing its artifact to a maven repository for other projects to use.
You have a chicken-and-egg problem: all of the Gradle project classloaders are resolved before any classes are compiled. This can be worked around with a custom configuration and a ClassLoader, e.g.:
configurations {
    custom
}

dependencies {
    custom project(':subproject-a')
}

task customTask {
    doLast {
        def urls = configurations.custom.files.collect { it.toURI().toURL() }
        ClassLoader cl = new java.net.URLClassLoader(urls as URL[])
        Class myClass = cl.loadClass('com.foo.MyClass')

        // assuming zero args constructor
        Object myObject = myClass.newInstance()

        // assuming method which accepts single String argument
        java.lang.reflect.Method myMethod = myClass.getMethod('myMethodName', String.class)
        myMethod.invoke(myObject, 'methodArg')
    }
}
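One caveat worth adding (my own note, not from the original answer): as written, nothing guarantees that subproject A's jar has been built before customTask resolves the configuration. If that turns out to be a problem, wiring the task to the configuration should be enough, since Gradle derives the required task dependencies from the project dependency:

customTask.dependsOn configurations.custom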

How do I re-use gradle definitions across projects

I am working on a set of projects that each uses Gradle as the build tool. This is not a multi-project setup although I want to be able to re-use some common Gradle scripts across each project for consistency as the projects are related.
For example, for the Java component, I want the manifest file in the generated JAR file to have the same information. In particular, all the projects will have the same major and minor versions numbers, while the patch version will be project specific.
Here's what I've tried so far:
master.gradle - to be shared across projects
group 'com.example'

ext.majorVersion = 2
ext.minorVersion = 3
ext.patchVersion = 0; // Projects to override

def patchVersion() {
    return patchVersion;
}

apply plugin: 'java'

jar {
    manifest {
        attributes 'Bundle-Vendor': 'Example Company',
            'Bundle-Description': 'Project ABC',
            'Implementation-Title': project.name,
            'Implementation-Version': majorVersion + '.' + minorVersion + '.' + patchVersion()
    }
}
build.gradle - for one of the projects
apply from: 'master.gradle'

patchVersion = 3

task hello {
    println 'Version: ' + majorVersion + '.' + minorVersion + '.' + patchVersion
}
If I run gradle hello jar from the command line, I get Version: 2.3.3 from the hello task. However, the JAR file manifest contains 2.3.0 which is not what I want. How do I get the correct patch version into the manifest? And more generally, how do I let projects supply information to the master scripts?
Based on @Oliver Charlesworth's suggestion I came up with the following. I had to write a simple plugin to hold the version information and use it as an extension object. Please note (as the comments in the Gradle files point out) that the order in which items are applied and set is very important; different orderings result in compile errors or in values being used before they are set.
If anyone wants to suggest improvements, please do so.
master.gradle
group 'com.example'

// N.B. The individual project must have applied the semantic version
// plugin and set the patch version before applying this file.
// Otherwise the following will fail.

// Specify the major and minor version numbers.
project.semver.major = 2
project.semver.minor = 3

project.version = project.semver

apply plugin: 'java'

jar {
    manifest {
        attributes 'Bundle-Vendor': 'Example Company',
            'Bundle-Description': project.description,
            'Implementation-Title': project.name,
            'Implementation-Version': project.semver
    }
}
build.gradle
// Describe the project before importing the master gradle file
project.description = 'Content Upload Assistant'

// Specify the patch version
apply plugin: SemanticVersionPlugin
project.semver.patch = 3

// Load the master gradle file in the context of the project and the semantic version
apply from: 'master.gradle'
The simple plugin can be found below. At the moment it is with the application source code, but it should be moved out into a library, along with the master gradle file.
buildSrc/src/main/groovy/SemanticVersionPlugin.groovy
import org.gradle.api.Plugin
import org.gradle.api.Project

class SemanticVersionPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.extensions.create('semver', SemanticVersion)
    }
}

class SemanticVersion {
    int major
    int minor
    int patch

    String toString() {
        return major + '.' + minor + '.' + patch
    }
}
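Since improvements were invited: an alternative that avoids the plugin entirely is to defer reading the version properties until the consuming build script has finished evaluating, so a patchVersion set after apply from: 'master.gradle' is still picked up. This is only a rough sketch based on the original master.gradle, not part of the approach above:

// master.gradle (sketch): configure the manifest in afterEvaluate so it sees
// the patchVersion that build.gradle sets after applying this file
group 'com.example'
apply plugin: 'java'

ext.majorVersion = 2
ext.minorVersion = 3
ext.patchVersion = 0 // projects override this after the apply from line

afterEvaluate {
    jar {
        manifest {
            attributes 'Implementation-Version': "${majorVersion}.${minorVersion}.${patchVersion}"
        }
    }
}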

Gradle plugin for XML Beans

I am trying to write a Gradle plugin for XML Beans. I have started with one of the 'Hello from Gradle' plugin examples, and also a plugin published by R. Artavia here. That plugin went straight to jar - I am trying to only generate source. The generated source must then be compiled with other project source and included in a single jar. Other goals include
- full plugin - all I should need is "apply plugin: 'xmlbean'"
- I can configure source/code gen location and some features if I want to
- It detects whether it needs to be rebuilt. (well, eventually!!!)
I am off to a pretty good start, but am blocked defining a new sourceSet. I am getting an error "No such property 'srcDirs'" (or 'srcDir'). It seems there is something I have to define someplace to make a new sourceSet work, but I cannot find it. I have tried several different syntaxes (with/without equal sign, brackets, srcDir/srcDirs, etc.) - nothing is working.
What do I need to do inside a plugin to make a new sourceSet entry be properly recognized?
Thank you!
JKE
File: xmlbean.gradle (includes greeting plugin for the moment for debugging)
apply plugin: xmlbean
apply plugin: 'java'

xmlbean {
    message = 'Hi'
    greeter = 'Gradle'
}

class xmlbean implements Plugin<Project> {
    void apply(Project project) {
        project.extensions.create("xmlbean", xmlbeanExtension)
        Task xmlbeanTask = project.task('xmlbean')
        xmlbeanTask << {
            project.configurations {
                xmlbeans
            }
            project.dependencies {
                xmlbeans 'org.apache.xmlbeans:xmlbeans:2.5.0'
            }
            project.sourceSets {
                main {
                    java {
                        srcDirs += '$project.buildDir/generated-source/xmlbeans'
                    }
                }
                xmlbeans {
                    srcDirs = ['src/main/xsd']
                }
            }
            ant.taskdef(name: 'xmlbean',
                classname: 'org.apache.xmlbeans.impl.tool.XMLBean',
                classpath: project.configurations.xmlbeans.asPath)
            ant.xmlbean(schema: project.sourceSets.xmlbean.srcDir,
                srconly: true,
                srcgendir: "$project.buildDir/generated-sources/xmlbeans",
                classpath: project.configurations.xmlbeans.asPath)
            println "${project.xmlbean.message} from ${project.xmlbean.greeter}"
        }
        project.compileJava.dependsOn(xmlbeanTask)
    }
}

class xmlbeanExtension {
    String message
    String greeter
}
File: build.gradle
apply from: '../gradle/xmlbeans.gradle'

dependencies {
    compile "xalan:xalan:$ver_xalan",
        ":viz-common:0.0.1",
        ":uform-repository:0.1.0"
}
Console: Error message:
:idk:xmlbean FAILED
FAILURE: Build failed with an exception.
* Where:
Script 'C:\jdev\cpc-maven\try.g2\comotion\gradle\xmlbeans.gradle' line: 32
* What went wrong:
Execution failed for task ':idk:xmlbean'.
> No such property: srcDirs for class: org.gradle.api.internal.tasks.DefaultSourceSet_Decorated
...
BUILD FAILED
Gradle info: version 2.5 / groovy 2.3.10 / JVM 7u55 on Windows 7 AMD64
You should try to become familiar with the Gradle DSL reference guide, because it's a huge help in situations like this. For example, if you click on the sourceSets { } link in the left navigation bar, you're taken to this section on source sets.
From there, you'll discover that the sourceSets {} block is backed by a class, SourceSetContainer. The next level of configuration nested inside is backed by a SourceSet object, and then within that you have one or more SourceDirectorySet configurations. When you follow the link to SourceDirectorySet, you'll see that there are getSrcDirs() and setSrcDirs() methods.
So how does this help? If you look closely at the exception, you'll see that Gradle is saying it can't find a srcDirs property on DefaultSourceSet_Decorated, which you can hopefully infer is an instance of SourceSet. That interface does not have an srcDirs property. That's because your xmlbeans {} block is configuring a SourceSet, not a SourceDirectorySet. You need to add another nested configuration to gain access to srcDirs.
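To make that concrete, here is a minimal sketch of what the extra nesting might look like, using the resources directory set purely for illustration (whether a dedicated source set is the right model at all is a separate question, as noted below):

project.sourceSets {
    xmlbeans {
        resources {
            srcDirs = ['src/main/xsd']
        }
    }
}

With that shape, the schema location would be read back via something like project.sourceSets.xmlbeans.resources.srcDirs rather than project.sourceSets.xmlbean.srcDir.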
At this point, I'm wondering whether a new source set is the appropriate solution. Unfortunately it's not clear to me exactly what the plugin should be doing, so I can't offer any alternatives at this point.

gradle tar include nested files at top level

In Gradle (1.9), I have multiple subprojects. Each one uses the application plugin to create a tar and a CLI. I am trying to get all these tars into a unified tar, but I am having a lot of trouble.
Here is the tar format I am looking for:
${project.name}/${subproject.name}.tar
I have tried using both the Tar task and the distribution plugin, but for each one, I am not able to find a clean way to just get the generated tars (or any tar), and put them at top level, excluding everything else.
Here is a sample using the distribution plugin, but it's not giving the output I'd like:
apply plugin: 'distribution'

distributions {
    testing {
        contents {
            from(".")
            exclude "*src*"
            exclude "*idea*"
            exclude "*.jar"
            exclude ".MF"
            filesMatching("**/build/distributions/*.tar") {
                if (file.name == "${project.name}-testing.tar") {
                    exclude()
                } else {
                    name file.name
                }
            }
        }
    }
}
Here is what I would like (but not working):
apply plugin: 'distribution'

distributions {
    testing {
        contents {
            include "**/*.tar" // shows up at top level
        }
    }
}
EDIT:
Getting closer.
distributions {
    testing {
        contents {
            from subprojects.buildDir
            includeEmptyDirs false
            include "**/*.tar"
            exclude "**/${project.name}-testing.tar"
        }
    }
}
This will give me ${project.name}/distribution/${subproject.name}.tar
Here is the solution to your problem. Put the following into the root project:
task distTar(type: Tar) {
    destinationDir = new File("$buildDir/distributions")
    baseName = 'unifiedTar'
}

subprojects {
    // definitions common to subprojects...
    afterEvaluate {
        def distTar = tasks.findByName('distTar')
        if (distTar) {
            rootProject.distTar.dependsOn distTar
            rootProject.distTar.inputs.file distTar.archivePath
            rootProject.distTar.from distTar.archivePath
        }
    }
}
then invoke "build distTar" on the root project - it will assemble "unifiedTar.tar" in the "build/distributions" subfolder (of the root project).
How it works:
"task distTar(...)" declares a new task of type Tar in the root project.
"subprojects" applies the specified closure to each subproject.
"afterEvaluate" ensures that the specified closure is called AFTER the current project (in this case subproject) is evaluated. This is very important, because we are going to use properties of the subproject which are defined only after it's evaluation.
"tasks.findByName" allows us to determine, whether the given task is defined for given project. If not, it returns null and the following code is not performed. This way we stay agnostic regarding the nature of the subproject.
"dependsOn" ensures that distTar of the root project depends on distTar of the given project (and, therefore, is executed later than it).
"inputs.file" ensures that distTar on root project is not executed, if none of the constituent tars has changed.
"from" adds constituent tar to unified tar.
