I have the following dependency structure
prodwebserver -> prod.jar -> transitive prod dependencies
devwebserver -> prodwebserver.jar, runtimeCompiler.jar, dev-router.jar -> more transitive dependencies
My devwebserver will have ZERO source code. In Gradle, I currently have a target called embeddabledevwebserver which depends on those.
Now I need to package up a release into a format of
release
|
|--prod - contains all prod jars
|
|--development - contains ONLY the extra jars that prod is missing
How can I get the difference in jar sets between the two targets such that I only put the extra jars needed for development in the development directory?
(This is nitpicky and not too important.) Is there a way to do this without the empty embeddabledevwebserver project, which is sort of an annoying shell project?
My actual current build file ...(WIP)...
https://github.com/deanhiller/webpieces/blob/gradlePackaging/build.gradle
(comments on that file welcome and appreciated)
EDIT: More specifically, I have these tasks to start copying/syncing files over:
task stageTemplate(type: Copy) {
    from '.'
    into buildDir
    include stagingDirName + '/**'
}
task stageWebServer(type: Sync, dependsOn: [':embeddablewebserver:assemble', 'stageTemplate']) {
    from childProjects.embeddablewebserver.toStagingDir
    into new File(outputStagingDir, 'prod')
}

task stageDevServer(type: Sync, dependsOn: [':http-router-dev:assemble', 'stageWebServer']) {
    from childProjects['http-router-dev'].toStagingDir
    into new File(outputStagingDir, 'development')
    exclude stageWebServer // this does not work; exclude does not accept a task
}
and I can't exclude stageWebServer from stageDevServer, but basically, for all the jars that the stageWebServer task moved over, I want to filter out the jars with those same names from the development directory.
thanks,
Dean
OK, I finally worked through this by adding println to print every object, and got to the following, which works for Copy at least. I am not sure if Sync is fully working (i.e. if I remove a file from prod and it is still in development, will it show up or not?).
task stageWebServer(type: Sync, dependsOn: [':embeddablewebserver:assemble', 'stageTemplate']) {
    from childProjects.embeddablewebserver.toStagingDir
    into new File(outputStagingDir, 'prod')
}

task stageDevServer(type: Sync, dependsOn: [':http-router-dev:assemble', 'stageWebServer']) {
    from childProjects['http-router-dev'].toStagingDir
    into new File(outputStagingDir, 'development')
    exclude { details ->
        // skip any file whose name matches a jar already staged for prod
        def fileNames = stageWebServer.source.collect { entry -> entry.getName() }
        fileNames.contains(details.file.getName())
    }
}
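For what it's worth, the difference could also be computed at the dependency level instead of by file name: configurations are FileCollections, and FileCollection supports subtraction with the - operator (files are compared for equality, which works here because both sides resolve out of the same dependency cache). A rough sketch, assuming hypothetical configurations prodRuntime and devRuntime that hold each server's runtime jars:

task stageDevServer(type: Sync) {
    // keep only the jars that resolve for dev but not for prod
    from(configurations.devRuntime - configurations.prodRuntime)
    into new File(outputStagingDir, 'development')
}

Because the difference is taken between plain dependency configurations, this might also sidestep the empty embeddabledevwebserver shell project.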
Related
How can I make a sub-project copy a file that is produced by a sibling sub-project? All this with proper dependency management, and without assuming that any language-specific plugins (like the JavaPlugin) are used.
I have looked at the updated Gradle 6 draft of "Sharing artifacts between projects", but it does not really answer that question.
My multi-project structure is something like:
top/
build.gradle
settings.gradle
producer/
build.gradle
myFile_template.txt
consumer/
build.gradle
I want a Copy-task in producer/build.gradle to copy+transform myFile_template.txt into $buildDir/target/myFile.txt and another Copy-task in consumer/build.gradle should further copy+transform that myFile.txt to a finalFile.txt.
Presumably a proper solution would be able to use task outputs.files or some such so that consumer/build.gradle does not need to explicitly mention the location of $buildDir/target/myFile.txt.
(I'm completely new to Gradle).
Gradle gives you lots of freedom, but I prefer that projects "share" with each other only via Configurations and/or Artifacts. One project should never concern itself with another project's tasks; the tasks are private to each project.
With this principle in mind you could do something like
project(':producer') {
    configurations {
        transformed
    }
    task transformTemplate(type: Copy) {
        from 'src/main/template'
        into "$buildDir/transformed"
        filter(...) // transformation goes here
    }
    dependencies {
        // file collection derived from a task.
        // Any task which uses this as a task input will depend on the transformTemplate task
        transformed files(transformTemplate)
    }
}

project(':consumer') {
    configurations {
        producerTransformed
    }
    dependencies {
        producerTransformed project(path: ':producer', configuration: 'transformed')
    }
    task transformProducer(type: Copy) {
        from configurations.producerTransformed // this will create a task dependency
        into ...
        filter ...
    }
}
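For completeness, here is a hypothetical filled-in version of the producer task, assuming the transformation is an Ant-style token replacement (the token name, the rename, and the version value are all illustrative):

import org.apache.tools.ant.filters.ReplaceTokens

task transformTemplate(type: Copy) {
    from 'src/main/template'
    into "$buildDir/transformed"
    // produce myFile.txt from the template
    rename 'myFile_template.txt', 'myFile.txt'
    // replace @version@ tokens in the template with a concrete value
    filter(ReplaceTokens, tokens: [version: project.version.toString()])
}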
I have a multi-module project in Gradle
root
-- ProjectA
-- ProjectB
Both ProjectA and ProjectB use the application plugin to create a zip in "ProjectA/build/distributions" and "ProjectB/build/distributions" respectively.
Now I want to copy the two zip files into "root/build/distributions".
I have tried various approaches, e.g. adding this in the build.gradle of the root project:
subprojects {
    task copyFiles(type: Copy) {
        includeEmptyDirs = true
        from "$buildDir/distributions"
        include '*.zip'
        into "$parent.buildDir/distributions"
    }
    copyFiles.dependsOn(build)
}
or just adding a task to the root project:
task copyFiles(type: Copy) {
    from "ProjectA/build/distributions"
    from "ProjectB/build/distributions"
    include "*.zip"
    into "$buildDir/distributions"
}
build.dependsOn(copyFiles)
However, in both cases, nothing happens. No file gets copied.
What am I doing wrong?
I can see two things you are doing wrong:
You have relative paths to the subprojects. This is discouraged, as it means you will always have to invoke Gradle from the root folder; and if a Gradle daemon was started from somewhere else, it will fail. You could fix it by using the rootDir property (e.g. `from "$rootDir/ProjectA/..."`), but there is a better way...
The other problem is that your copyFiles task in the root project has no dependencies on the required distZip tasks in the sub-projects. So if the distributions have not already been built previously, there is no guarantee that it will work (which apparently it doesn't).
To fix it, you can have a look at the question "Referencing the outputs of a task in another project in Gradle", which covers the more general use case of what you ask. There are currently two answers, both of which are good.
So in your case, you can probably do either this:
task copyFiles(type: Copy) {
    from tasks.getByPath(":ProjectA:distZip").outputs
    from tasks.getByPath(":ProjectB:distZip").outputs
    into "$buildDir/distributions"
}
or this:
task copyFiles(type: Copy) {
    subprojects {
        from(tasks.withType(Zip)) {
            include "*.zip"
        }
    }
    into "$buildDir/distributions"
}
Gradle will implicitly make the copy task depend on the other tasks automatically, so you don't need to do that yourself.
Also note that the currently accepted answer to the question I referenced is about configuration variants, and that is probably the most correct way to model the relationships (see here for more documentation on the topic). But I prefer the simplicity of direct access to the tasks over the more verbose, and arguably more complex, way of modeling it through configurations. Let's hope it gets simpler in a later release.
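For reference, the configuration-based modeling could look roughly like this; a sketch, where the configuration names dist and dists are made up:

// in each subproject (ProjectA, ProjectB):
configurations {
    dist
}
artifacts {
    // publishing the zip through a configuration carries the
    // dependency on the distZip task along with it
    dist distZip
}

// in the root project:
configurations {
    dists
}
dependencies {
    dists project(path: ':ProjectA', configuration: 'dist')
    dists project(path: ':ProjectB', configuration: 'dist')
}
task copyFiles(type: Copy) {
    from configurations.dists
    into "$buildDir/distributions"
}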
I am struggling with the Gradle build lifecycle; specifically with the split between the configuration and execution phases. I have read a number of sections in the Gradle manual and have seen a number of ideas online, but have not found a solution to the following problem:
I want to run a specific task to produce an artifact at the end of my java-library-distribution build that is a flattened version of the runtime configuration jars. That is, I only want to produce the artifact when I run the specific task to create the artifact.
I have created the following task:
task packageSamplerTask(type: Tar, dependsOn: distTar) {
    description "Packages the build jars including dependencies as a flattened tar file. Artifact: ${distsDir}/${archivesBaseName}-${version}.tar"
    from tarTree("${distsDir}/${archivesBaseName}-${version}.tar").files
    classifier = 'dist'
    into "${distsDir}/${archivesBaseName}-dist-${version}.tar"
}
Although this task does produce the required artifact, the task runs during gradle's configuration phase. This behavior has the following consequences:
Irrespective of which task I run from the command line, this packageSamplerTask task is always run, often unnecessarily; and
If I clean the project, then the build fails on the next run because $distsDir doesn't exist during the configuration phase (obviously).
It appears that if I extend the Copy task in this manner I'm always going to get this kind of premature behavior.
Is there a way to use the << closure / doLast declarations to get what I want? Or is there something else I'm missing / should be doing?
Update
After further work I have clarified my requirements and resolved my question as follows:
"I want to package my code and my code's dependencies as a flat archive of jars that can be deployed as a jMeter plugin. The package can then be installed by unpacking into the jMeter lib/ext directory, as is. The package, therefore, must not include the jMeter jars (and their dependencies) which are used for building and testing"
Because Gradle doesn't appear to support Maven-like "provided" dependency management, I created a new configuration for my package which excludes the jMeter jars.
configurations {
    jmpackage {
        extendsFrom runtime
        exclude group: 'org.apache.jmeter', name: 'ApacheJMeter_core', version: '2.11'
        exclude group: 'org.apache.jmeter', name: 'ApacheJMeter_java', version: '2.11'
    }
}
And then created the following task (using the closure recommendation from Peter Niederwieser):
task packageSamplerTask(type: Tar, dependsOn: assemble) {
    from { libsDir }
    from { configurations.jmpackage.getAsFileTree() }
    classifier = 'dist'
}
This solution appears to work, and it allows me to use just the Gradle java plugin, too.
The task declaration is fine, but the flattening needs to be deferred too:
...
from { tarTree("${distsDir}/${archivesBaseName}-${version}.tar").files }
Also, the Tar file should be referred to in a more abstract way. For example:
from { tarTree(distTar.archivePath).files }
First, your task isn't executed in the configuration phase; like EVERY task, it is configured in that phase. Your closure is just the configuration of your task (a configuration closure, not an action closure). That is why your code is "executed" in the configuration phase.
If you want your code to run in the execution phase, you have to write it in a doLast or doFirst closure. But in your case it is better to keep it in a configuration closure, because you are configuring your task.
To make sure your build doesn't fail because of the missing folder, you can create it with distsDir.mkdirs().
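To make the distinction concrete, here is a minimal sketch of the two kinds of closures (the phases task name is arbitrary):

task phases {
    // configuration closure: runs during the configuration phase,
    // on every invocation, whichever task was requested
    println 'configuring phases task'
    doLast {
        // action closure: runs during the execution phase,
        // and only when this task is actually executed
        println 'executing phases task'
    }
}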
I'm trying to create multiple jars from a single project that should be published to a maven repo, but I can't seem to get the artifact-details correct.
From examples I've seen handling a sources jar, I tried to dynamically create several tasks that should each create one jar.
def environments = ['local', 'dev', 'test', 'acc', 'prod']

environments.each { e ->
    task "create${e}jar"(type: Jar, dependsOn: classes) << {
        def dir = filterPropertiesForEnv(e)
        from(dir)
        classifier = e
    }
    artifacts.add('archives', tasks["create${e}jar"])
}

File filterPropertiesForEnv(envName) {
    println "filter for $envName"
    def destDir = new File(project.buildDir, envName)
    destDir.mkdir()
    // Do filter stuff based on each envName
    destDir
}
When I run the "install" task, my dynamically created "create[name]jar" tasks run, but they don't create any jar files. If I remove the doLast "<<", the build produces several jar files, but each jar is built before everything else is executed (during the configure stage), so it only contains a manifest file.
Is there a better way of creating multiple jars and attaching them as artifacts? I will also need to create several ear-files based on the same pattern, where this jar is included, so it would be nice to find a reusable solution.
I'm fluent in Maven, but Gradle is a new acquaintance and I haven't really got to grips with how to structure these kinds of problems!
After some more investigation into the matter, I found that if you pass a closure to the from method, the evaluation is done at execution time instead of configuration time.
A working solution would then be:
def environments = ['local', 'dev', 'test', 'acc', 'prod']

environments.each { e ->
    // note: no '<<' here; the closure passed to from() is what defers
    // the evaluation until the task actually executes
    task "create${e}jar"(type: Jar, dependsOn: classes) {
        from {
            filterPropertiesForEnv(e)
        }
        classifier = e
    }
    artifacts.add('archives', tasks["create${e}jar"])
}

File filterPropertiesForEnv(envName) {
    println "filter for $envName"
    def destDir = new File(project.buildDir, envName)
    destDir.mkdir()
    // Do filter stuff based on each envName
    destDir
}
So you want to publish a different jar for each environment? And publish those with an uploadArchives task?
The best way to do this would probably be to create a subproject for each environment; see the sketch below. That way you can have an upload task for every subproject. From the code you posted, it appears you already have the code for the different environments in different directories.
If you are doing some crazy stuff where you filter different files out of one big code base to try to create a Jar file for different environments, that is probably never going to work.
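A sketch of what that layout could look like (all names invented), reusing the filterPropertiesForEnv helper from above; each env-* subproject builds its own classified jar and gets its own install/uploadArchives tasks from the maven plugin:

// settings.gradle
include 'env-local', 'env-dev', 'env-test', 'env-acc', 'env-prod'

// root build.gradle
subprojects { p ->
    if (!p.name.startsWith('env-')) return
    apply plugin: 'java'
    apply plugin: 'maven' // gives each subproject its own install/uploadArchives
    def env = p.name - 'env-'
    p.jar {
        classifier = env
        // the closure defers the filtering to execution time
        from { filterPropertiesForEnv(env) }
    }
}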
I have a task that generates java sources and a set of jars from these sources (say, project a). I would like to export these jars to dependent projects (say, project b). So here's roughly what I have right now:
//a.gradle
configurations {
    generatedJars
}

task generateJars(type: JavaExec) {
    // generate jars ...
    outputs.files += // append generated jars here
}

dependencies {
    generatedJars generateJars.outputs.files
}
//b.gradle
dependencies {
    compile project(path: ':a', configuration: 'generatedJars')
}
It works OK, except that adding generateJars.outputs.files as a dependency does not tell Gradle that it has to run the generateJars task when no jars have been generated yet. I have tried adding the task itself as a dependency, hoping it would work in the same way as when you add a jar/zip task to an artifact configuration (e.g. artifacts { myJarTask }), but it throws an error telling me that I cannot do that. Of course I can inject the generateJars task somewhere into the build process before :b starts evaluating, but that's clumsy and brittle, so I would like to avoid it.
I feel like I should be adding the generated jars to artifacts{ ... } of the project, but I am not sure how to make them then visible to dependent projects. Is there a better way of achieving this?
Dependent projects (project b) will need to set up their IntelliJ IDEA module classpath to point to project a's generated jars. Something rather like this (pseudo-code):
//b.gradle
idea {
    module {
        scopes.COMPILE.plus += project(path: ':a', configuration: 'generatedJars').files
    }
}
So far I have tried simply adding a project dependency on :a's generatedJars in :b, but the IDEA plugin simply adds module :a as a module dependency and assumes that it exports its generated jars (which is probably a correct assumption), therefore not adding the generated jars to :b's classpath.
Any help would be greatly appreciated!
First, do you need a separate configuration? That is, do you have clients of a that should not see the generated Jars? If not, you can add the generated Jars to the archives configuration, which will simplify things.
Second, the correct way to add the generated Jars to the configuration is (instead of the dependencies block):
artifacts {
    generatedJars generateJars
}
This should make sure that the generateJars task gets run automatically when needed.
Third, I'd omit the += after outputs.files, although it might not make a difference. You should also add the necessary inputs.
Fourth, why do you need a JavaExec task to generate the Jars? Can you instead add the generated sources to some source set and let Gradle build them?
Fifth, IDEA doesn't have a concept corresponding to Gradle's project configuration dependencies. Either an IDEA module fully depends on another module, or not at all. You have two options: either use a module dependency and make the generated sources a source folder of the depended-on module (preferably both in the Gradle and the IDEA build), or pass the generated Jars as external dependencies to IDEA. In either case, you should probably add a task dependency from ideaModule to the appropriate generation task. If this still doesn't lead to a satisfactory IDEA setup, you could think about moving the generation of the Jars into a separate subproject.
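For the second option (handing the generated jars to IDEA as external dependencies), a sketch in b.gradle, reusing the generatedJars configuration from the question (the generatedDeps name is made up):

// b.gradle
configurations {
    generatedDeps
}
dependencies {
    generatedDeps project(path: ':a', configuration: 'generatedJars')
}
idea {
    module {
        // IDEA picks up these resolved jars as plain library entries
        scopes.COMPILE.plus += [configurations.generatedDeps]
    }
}
// make sure the jars exist before the .iml file is generated
ideaModule.dependsOn ':a:generateJars'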
For my use case, I had a C++ project which generated some native libraries which my java project needed to load in order to run.
In the project ':native' build.gradle:
task compile(type: Exec, group: 'build') {
    dependsOn ...
    outputs.files(fileTree('/some/build/directory') {
        include 'mylib/libmy.so'
    })
    ...
}
In project java application build.gradle:
configurations {
    nativeDep
}

// Add a dependency on the task that produces the library
dependencies {
    nativeDep files(project(':native').tasks.findByPath('compile'))
}

// Unfortunately, we also have to do this, because Gradle will only run
// the ':native:compile' task if its output files are needed as another
// task's inputs
tasks.withType(JavaCompile) {
    dependsOn ':native:compile'
}

run {
    doFirst {
        // Use the configuration to add our library to java.library.path
        def libDirs = files(configurations.nativeDep.files.collect { it.parentFile })
        systemProperty "java.library.path", libDirs.asPath
    }
}
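As a possible refinement, the explicit dependsOn workaround can often be avoided by publishing the library as an artifact of a configuration, so that consuming the configuration carries the task dependency with it; a sketch (the path is the same hypothetical one from the compile task above, and the nativeLibs name is made up):

// project ':native' build.gradle
configurations {
    nativeLibs
}
artifacts {
    // builtBy ties the artifact to the compile task, so any task that
    // consumes this configuration will run ':native:compile' first
    nativeLibs(file('/some/build/directory/mylib/libmy.so')) { builtBy compile }
}

// java application build.gradle
dependencies {
    nativeDep project(path: ':native', configuration: 'nativeLibs')
}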