I am wondering how to declare a dependency on an artifact produced by a legacy Ant build.
I have a Gradle build that consists of 2 modules: 'legacy' and 'backend'.
The 'legacy' module is quite old and requires Ant to generate a 'legacy/dist' directory. Unfortunately, I cannot convert this build to Gradle, and it takes a lot of time to run. For convenience, the Ant build is wrapped as a Gradle build.
The 'backend' module uses the 'war' plugin. It needs to package 'legacy/dist' directory into the war.
legacy/build.gradle:
ant.importBuild 'build.xml'
Current backend/build.gradle:
plugins {
    id 'war'
}

dependencies {
    ?
}

war {
    dependsOn ':legacy:dist'
    from('../legacy/dist')
}
This solution works but has several drawbacks. To force Gradle to build 'legacy' before 'backend', I made the :backend:war task depend on the :legacy:dist task (which is not recommended). Every time I build 'backend', Gradle cannot know whether 'legacy' is already built, so it calls the :legacy:dist task, unnecessarily triggering the slow legacy build.
How should I declare in legacy/build.gradle that it produces the dist directory, and in backend/build.gradle that the backend depends on this dist directory, without depending on tasks and without mentioning the directory explicitly? I could not figure out how to accomplish this from the documentation. Thanks.
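What the question seems to be looking for is exposing the dist directory as an outgoing artifact of :legacy and consuming it through a project dependency rather than a task dependency. The following is only a rough sketch of that direction, not verified code; the configuration names ('dist', 'legacyDist') are made up here, and it assumes the Ant-imported 'dist' task is what produces the directory:

legacy/build.gradle:

ant.importBuild 'build.xml'

configurations {
    dist
}

artifacts {
    // expose the directory produced by the Ant 'dist' target as an artifact
    dist(file("$projectDir/dist")) {
        builtBy tasks.named('dist')
    }
}

backend/build.gradle:

configurations {
    legacyDist
}

dependencies {
    legacyDist project(path: ':legacy', configuration: 'dist')
}

war {
    // resolving the configuration builds :legacy only when needed and keeps
    // the directory path out of the backend script
    from configurations.legacyDist
}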
Related
We have a Gradle project with a bunch of modules. One of those modules is a custom code generator, written as a Gradle plugin. We want to run that code-generator plugin in another module later in the same overall multi-module build, in order to test the code generator.
We know how to create a separate project on the fly and run the code generator in that, but we need to run the code generator in the main project, not in a temporary test project.
Nothing we have tried works, and the Gradle documentation doesn't appear to address this. It seems to be fundamental to Gradle's design, because the entire set of plugins used in a build is basically a single program, assembled at the start. Trying to add a just-now-built plugin after the fact seems unsupported, or we're missing something.
The best we've been able to come up with so far is to implement the plugin in Java (Kotlin would also have worked), so the Gradle plugin is just a thin Gradle skin over the implementation, and call the Java implementation directly when running the code generator in the other module. This works, but it means we aren't actually testing the Gradle portion of the code generator.
This is natively supported in Maven (maven multi-module project with one plugin module, and https://maven.apache.org/guides/mini/guide-multiple-modules.html), which is not surprising because every plugin in Maven runs in a separate class loader. If it's not possible in Gradle, that would be one of the few cases where Gradle doesn't have feature parity.
A hacky way to do this is to run the newly-compiled plugin via Gradle's test kit runner.
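For illustration, that usually looks something like the snippet below inside a functional test (a sketch only: the task name 'generateFoo' matches the later example, the project directory is a placeholder, and withPluginClasspath() assumes the java-gradle-plugin's plugin-under-test metadata is available):

import org.gradle.testkit.runner.GradleRunner

// run the just-built plugin against a throwaway project directory
def result = GradleRunner.create()
        .withProjectDir(new File('build/testkit-project'))
        .withPluginClasspath()
        .withArguments('generateFoo')
        .build()
println result.output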
A cleaner way to do this is to write plugins as thin shells of code written to Gradle's API that delegate the real work to plain old Java (or Kotlin) utility methods (a minimal sketch follows the list below). This has a number of advantages:
You can unit test the utility methods.
You can use the utility methods for other purposes unrelated to the plugin.
You can call the utility methods directly from other modules in the project, thereby accomplishing what the plugin would have done if you could have built it and then called it in the same build.
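A minimal sketch of that split (all class, method, and task names here are made up for illustration):

import org.gradle.api.Plugin
import org.gradle.api.Project

// Plain utility class doing the real work; no Gradle types involved, so it
// can be unit tested and called directly from other modules.
class FooGenerator {
    static void generate(String arg, File outputDir) {
        // ... actual code generation ...
    }
}

// Thin Gradle shell: only wires configuration through to the utility method.
class FooPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.tasks.register('generateFoo') {
            doLast {
                FooGenerator.generate('arg1', new File(project.projectDir, 'src/generated/java'))
            }
        }
    }
}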
To expand on the above answer.
Instead of calling the plugin like a plugin, add a main method that accepts the same parameters that the Gradle plugin configuration used to pass to the plugin.
Then call the plugin's main using Gradle's JavaExec task type:
task(generateFoo, type: JavaExec) {
    main = 'com.bar.Foo'
    classpath = configurations.runtimeClasspath
    args = ["arg1", "${projectDir}/src/generated/java"]
}
Note the args: those are the same pieces of information that used to be passed in via Gradle configuration:
apply plugin: 'foo-plugin'

generateFoo {
    theArg "arg1"
    outputDir "${projectDir}/src/generated/java"
}
Because the runtime classpath used by JavaExec here is the calling module's, you may encounter runtime classloading problems.
If that happens, it's easily fixed: just build the rewritten plugin as a fat JAR:
task fatJar(type: Jar) {
    manifest {
        attributes 'Implementation-Title': 'Foo Fat JAR', 'Main-Class': 'com.bar.Foo'
    }
    baseName = project.name + '-exec'
    from { configurations.runtimeClasspath.collect { it.isDirectory() ? it : zipTree(it) } }
    with jar
}

artifacts {
    archives fatJar
}
And then execute the fat JAR with JavaExec:
def fooGenerate = task(generateFoo, type: JavaExec) {
    main = 'com.bar.Foo'
    classpath = files("${projectDir}/../foo-plugin-module/build/libs/foo-plugin-module-exec.jar")
    args = ["arg1", "${projectDir}/src/generated/java"]
}
Finally, make the dependent module's compile task depend on the code generation:
compileJava.dependsOn fooGenerate
If you use the fatJar approach, you don't even need to declare implementation project(":foo") in the dependent modules.
It might also be possible to use Gradle's composite builds for this (https://docs.gradle.org/current/userguide/composite_builds.html).
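If that route fits, the wiring would look roughly like this, assuming the code-generator plugin is split out into its own build that gets included (a sketch only; it requires a reasonably recent Gradle version, and the included build path and plugin id are placeholders):

// settings.gradle of the build that consumes the generator
pluginManagement {
    includeBuild '../foo-plugin-build'
}

// build.gradle of the module that needs the generated code
plugins {
    id 'com.bar.foo-plugin'
}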
I am building an IntelliJ plugin with Gradle, and have the following question:
What is the difference between the predefined tasks build and buildSearchableOptions in Gradle?
I can see that :buildSearchableOptions is called as part of :build and that it produces its own JAR file.
They come from two different plugins.
Assuming a Java project, the build task comes from the java plugin, which in turn builds on the lifecycle (base) plugin:
https://github.com/gradle/gradle/blob/master/subprojects/plugins/src/main/java/org/gradle/api/plugins/JavaBasePlugin.java#L74
https://docs.gradle.org/current/userguide/base_plugin.html
The buildSearchableOptions task comes from the org.jetbrains.intellij plugin:
https://github.com/JetBrains/gradle-intellij-plugin/blob/master/src/main/groovy/org/jetbrains/intellij/IntelliJPlugin.groovy#L350..L360
https://github.com/JetBrains/gradle-intellij-plugin#tasks
I am trying out Gradle and am wondering what is supposed to happen to a project's dependencies after you run gradle build. For example, my sample projects don't run on the command line after they are built, because they are missing dependencies. They seem to compile fine, as Gradle doesn't give me errors or warnings about finding the dependencies.
Gradle projects I've made in IntelliJ IDEA have the same problem. They compile and run inside the IDE, but are missing dependencies and can't run on the command line.
So what is supposed to happen to the dependencies I declare in the build.gradle file? Shouldn't they be output somewhere together with my .class files? Otherwise, what is the point of Gradle when I could manage this by editing my classpath?
Edit: Here is my build.gradle file:
apply plugin: 'java'

jar {
    manifest {
        attributes('Main-Class': 'Animals')
    }
}

repositories {
    flatDir {
        dirs "D:\\libs\\gradleRepo"
    }
}

dependencies {
    compile name: "AnimalTypes-1.0-SNAPSHOT"
}

sourceSets {
    main {
        java {
            srcDirs = ['src']
        }
    }
}
Your Gradle build only takes care of compile time: it lets you use the specified dependencies in your code by adding them to the compile classpath. It does not take care of runtime. Once the JAR is built, you need to specify the runtime classpath yourself and provide all required dependencies.
You may think that this is bad or a disadvantage, but actually it is totally fine and intended: if you build a Java library, you don't need to execute it, you just want to declare it as a dependency of another project. If you distributed your library to a Maven repository, all of its dependencies from Maven repositories (module dependencies) would end up in the POM descriptor as transitive dependencies.
Now, if you want to build a runnable Java application, simply use the Gradle Application Plugin (apply plugin: 'application'), which creates a ZIP file containing your application, its dependencies, and start scripts that set up the runtime classpath for execution.
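With the build.gradle from the question, that would look roughly like this (a sketch; 'Animals' is the main class already named in the jar manifest above, and newer Gradle versions configure it via application { mainClass = ... } instead):

apply plugin: 'java'
apply plugin: 'application'

mainClassName = 'Animals'

// 'gradle distZip' (or 'gradle installDist') then produces a distribution with
// the application JAR, its runtime dependencies in lib/, and start scripts
// that set up the runtime classpath for you.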
Third-party plugins can also produce so-called fat JARs, which are JAR files with all dependencies included. Whether you should use one depends on your use case, because dependency management via repositories is often the better way to go.
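As one example of such a third-party plugin, the Shadow plugin can build a fat JAR (a sketch; the version number is only illustrative):

plugins {
    id 'java'
    id 'com.github.johnrengelman.shadow' version '7.1.2'
}

jar {
    manifest {
        attributes('Main-Class': 'Animals')
    }
}

// 'gradle shadowJar' then produces build/libs/<project>-all.jar with all
// runtime dependencies merged in, runnable with 'java -jar'.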
I'm working on compiling Python bindings using Gradle.
There is a plugin by LinkedIn that facilitates that.
They also include a project called the pivy-importer that converts Python dependencies into an Ivy repository.
I've created a Gradle plugin that wraps the pivy-importer so that it can run as a task in the Python build.
My repositories are declared like this:
repositories {
    pyGradlePyPi()
    ivy {
        name 'pypi-local' // optional, but nice
        url "${project.buildDir.path}/pythonIvy"
        layout "pattern", {
            ivy "[organisation]/[module]/[revision]/[module]-[revision].ivy"
            artifact "[organisation]/[module]/[revision]/[artifact]-[revision](-[classifier]).[ext]"
            m2compatible = true
        }
    }
}
The problem, however, is that the repositories are loaded before the plugin's task executes.
The first task the python plugin runs is pinRequirements, so I added my custom pythonImporter task before it, like this:
pinRequirements.dependsOn pythonImporter
However, even when I do that, the console shows the pythonImporter task running first, but as soon as Gradle tries to resolve the dependencies it cannot find them, even though they do exist on the file system.
If you rerun the build, however, it passes, because the first run has already written the repository to the file system.
TL;DR
I need a way to run a task before dependencies are resolved against the project's configured repositories.
I moved the task's execution into my buildSrc project and made its build task depend on that execution.
This works because the buildSrc project is always evaluated and built before the rest of the projects, so you can put "before the build" logic there.
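A rough sketch of that arrangement (the importer invocation and the output path are placeholders for whatever the pivy-importer wrapper actually does):

// buildSrc/build.gradle
task pythonImporter {
    doLast {
        // run the pivy-importer here and write the Ivy repository into the
        // directory the main build's ivy { } repository points at
        // (placeholder: the pythonIvy directory used above)
    }
}

// buildSrc is built before the main build is configured, so hooking the
// importer into buildSrc's own 'build' task guarantees the repository
// exists before the main projects resolve their dependencies.
build.dependsOn pythonImporter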
gradle multiple build script dependencies
We are in the process of transitioning our build scripts from Ant to Gradle. The Ant build is configured the old way, without Ivy, taking its dependencies from a lib folder.
We have a number of custom Ant tasks packaged in a JAR. To run the tasks in that JAR we also need some other third-party dependencies from the same lib folder.
Because the build is complex, we cannot afford to move everything in one go and would rather move one piece at a time as we find the time to do it.
I was able to run those custom Ant tasks from the Gradle build, but I am having problems accessing classes from our tasks JAR in my Gradle build scripts.
In the buildscript section we have a classpath entry needed for the Artifactory plugin, and I tried to add some more classpath entries to make our local libs available.
buildscript {
    ….
    dependencies {
        // This dependency below is needed by artifactory plugin which we download
        classpath "org.jfrog.buildinfo:build-info-extractor-gradle:3.0.1"
    }
    ….
}
I tried lots of combinations but could not get it to work. What we want is to be able to do something like this:
buildscript {
    …
    dependencies {
        classpath {
            ["org.jfrog.buildinfo:build-info-extractor-gradle:3.0.1",
             fileset(dir: "${antBuildDir}/customTasks", includes: 'myTasks.jar'),
             fileset(dir: "${antBuildDir}/lib", includes: '*.jar')]
        }
    }
    …
}
Any idea how I can address this, or any other suggestions if you think I am on the wrong path?
Thank you in advance.
Julian
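For what it is worth, one direction that may work is adding the local JARs to the buildscript classpath as plain file collections instead of Ant filesets. This is only an untested sketch reusing the paths from the question:

buildscript {
    dependencies {
        classpath "org.jfrog.buildinfo:build-info-extractor-gradle:3.0.1"
        // local JARs added directly as file collections
        classpath files("${antBuildDir}/customTasks/myTasks.jar")
        classpath fileTree(dir: "${antBuildDir}/lib", include: '*.jar')
    }
}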