I'm trying to create multiple jars from a single project that should be published to a Maven repo, but I can't seem to get the artifact details correct.
Based on examples I've seen handling a sources jar, I tried to dynamically create several tasks, each of which should build one of the jars.
def environments = ['local', 'dev', 'test', 'acc', 'prod']
environments.each { e ->
    task "create${e}jar"(type: Jar, dependsOn: classes) << {
        def dir = filterPropertiesForEnv(e)
        from (dir)
        classifier = e
    }
    artifacts.add('archives', tasks["create${e}jar"])
}
File filterPropertiesForEnv(envName) {
    println "filter for $envName"
    def destDir = new File(project.buildDir, envName)
    destDir.mkdir()
    // Do filter stuff based on each envName
    destDir
}
When I run the "install" task, my dynamically created task "create[name]jar" is run, but it doesn't create any jar file. If I remove the doLast shortcut ("<<"), the build produces several jar files, but each jar is built before everything else is executed (during the configuration stage), so it contains only a manifest file.
Is there a better way of creating multiple jars and attaching them as artifacts? I will also need to create several ear-files based on the same pattern, where this jar is included, so it would be nice to find a reusable solution.
I'm fluent in Maven, but Gradle is a new acquaintance and I haven't really got to grips with how to structure these kinds of problems!
After some more investigation in the matter I found that if you pass a closure to the from method, the evaluation is done at execution time instead of configuration time.
A working solution would then be (note that the "<<" also has to go, so the task is configured during the configuration stage while the closure defers the directory lookup to execution time):
def environments = ['local', 'dev', 'test', 'acc', 'prod']
environments.each { e ->
    task "create${e}jar"(type: Jar, dependsOn: classes) {
        from {
            filterPropertiesForEnv(e)
        }
        classifier = e
    }
    artifacts.add('archives', tasks["create${e}jar"])
}
File filterPropertiesForEnv(envName) {
    println "filter for $envName"
    def destDir = new File(project.buildDir, envName)
    destDir.mkdir()
    // Do filter stuff based on each envName
    destDir
}
So you want to publish a different jar for each environment? And publish those with an uploadArchives task?
The best way to do this would probably be to create a subproject for each environment. That way you can have an upload task for every subproject. From the code you posted, it appears you already have the code for the different environments in different directories.
If you are doing some crazy stuff where you filter different files out of one big code base to try to create a Jar file for different environments, that is probably never going to work.
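For example, a minimal sketch of that layout (the environment names and repository URL are illustrative, and the old maven plugin is assumed since you mention uploadArchives):

// settings.gradle -- one subproject per environment
include 'local', 'dev', 'test', 'acc', 'prod'

// root build.gradle -- every subproject builds and uploads its own jar
subprojects {
    apply plugin: 'java'
    apply plugin: 'maven'

    uploadArchives {
        repositories {
            mavenDeployer {
                repository(url: 'http://repo.example.com/releases') // placeholder URL
            }
        }
    }
}

Each environment's sources and filtered properties then live in their own subproject directory, and running uploadArchives publishes one artifact per environment.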
Related
In build.gradle it is simple enough:
println "project name $name"
... but that wouldn't be available with a distributed version of the app. Nor do I understand how I can easily access that value from inside testing and app code.
The name should be available in "settings.gradle": hopefully this should contain a line like:
rootProject.name = 'MyProject'
If I want to incorporate this name into a test, and also into app code, and also make sure that info is available to a distributed version of the app (using installDist, for example), it seems to me that the best way might be to automate a task, run prior to testing, building and distributing, which extracts that information from settings.gradle and puts it into a .properties file under src/main/resources.
How would I do that? Or... is there a better approach?
PS a possible use case might be for configuring the path of a logfile: .../logging/MyProject... There are probably others.
build.gradle:
task copySettingsGradle {
    doFirst {
        ant.copy(file: 'settings.gradle',
            todir: 'src/main/resources',
            overwrite: true,
            failonerror: true)
    }
}
classes.dependsOn copySettingsGradle
app code:
URL settingsGradleURL = App.class.getResource( '/settings.gradle' )
ConfigObject conf = new ConfigSlurper().parse( settingsGradleURL )
def projectName = conf.rootProject.name
... the point about this being that it works both during Gradle development and in the distributed app...
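An alternative (an untested sketch; the file name project-info.properties is made up for illustration) is to write only the values you need into a plain properties file rather than shipping the whole settings.gradle:

// build.gradle -- generate a small properties file under the resources dir
task writeProjectName {
    def outFile = file('src/main/resources/project-info.properties') // hypothetical location
    doLast {
        outFile.parentFile.mkdirs()
        outFile.text = "projectName=${rootProject.name}\n"
    }
}
classes.dependsOn writeProjectName

The app code can then load it with a plain java.util.Properties lookup instead of a ConfigSlurper.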
I have studied a thousand similar questions on SO and I am still lost. I have a simple multiproject build (settings.gradle):
rootProject.name = 'mwe'
include ":Generator"
include ":projectB"
include ":projectC"
with a top-level build.gradle as follows:
plugins { id "java" }
allprojects { repositories { jcenter() } }
and with two kinds of project build.gradle files. The first one (Generator) exposes a run command that runs the generator taking the command line argument:
plugins {
    id "application"
    id "scala"
}
dependencies { compile "org.scala-lang:scala-library:2.12.3" }
mainClassName = "Main"
ext { cmdlineargs = "" }
run { args cmdlineargs }
The code generator is to be called from projectB (and an analogous projectC, and many others). I am trying to do this as follows (projectB/build.gradle):
task TEST {
    project(":Generator").ext.cmdlineargs = "Hurray!"
    println("Value set:" + project(":Generator").ext.cmdlineargs)
    dependsOn(":Generator:run")
}
Whatever I try (a Gradle newbie here), I am not getting what I need. I have two problems:
The property cmdlineargs is not set at the point that task :projectB:TEST is run. The println sees the right value, but the argument passed to the executed main method is the one configured in Generator/build.gradle, not the one in projectB/build.gradle. As pointed out in the responses, this can be worked around using lazy property evaluation, but it does not solve the second problem.
The generator is only run once, even if I build both projectB and projectC. I need to run Generator:run for each of projectB and projectC separately (to generate different sources for each dependent project).
How can I get this to work? I suppose a completely different strategy is needed. I don't have to use command line and run; I can also try to run the main class of the generator more directly and pass arguments to it, but I do find the run task quite convenient (the complex classpath is set up automatically, etc.). The generator is a Java/Scala project itself that is compiled within the same multi-project build.
Note: tasks aren't like methods in Java. A task will execute either 0 or 1 times per Gradle invocation; it will never execute twice (or more) in a single invocation.
I think you want two or more tasks, e.g.:
task run1(type: xxx) {
    args 'foo'
}
task run2(type: xxx) {
    args 'bar'
}
Then you can depend on run1 or run2 in your other projects.
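For instance, a sketch of what such a task might look like here with JavaExec (untested; the task name generateForB is made up, and it assumes the Generator's main class and runtime classpath can be reached this way -- evaluationDependsOn makes sure the Generator's sourceSets exist when this is configured):

// projectB/build.gradle
evaluationDependsOn(':Generator')

task generateForB(type: JavaExec, dependsOn: ':Generator:classes') {
    classpath = project(':Generator').sourceSets.main.runtimeClasspath
    main = 'Main'      // the Generator's mainClassName
    args 'Hurray!'     // arguments specific to projectB
}

projectC would declare its own generateForC task with its own args, so the generator runs once per consuming project.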
I have a Gradle copy task set up to publish undecorated JAR file(s) for testing and debugging, viz.
Task definition:
task copyJarToStaging( type: Copy ) {
    from jar // shortcut for createJar.outputs.files
    into( "${rootProject.rootDir}/dist/" )
    rename( '-.*\\.jar', ".jar" )
}
That works to put a JAR file into one directory. What's really needed is to drop the JAR into one or more different folders under "dist/".
Following many trials (and errors) I found this version worked for me.
Invoke the copy task:
// build.gradle (module)
assemble.dependsOn copyJarToStaging {
    println "into ==> ${destinationDir}/support"
    into "${destinationDir}/support/"
}
However, it doesn't really smell right.
Is there a cleaner alternative way? I would have liked, for instance, a closure to just append to the into attribute, but that didn't work.
If I wanted the same file in different places, it would be better if I could do something like take the into string and yield each value back.
Is part or all of that possible? Or, am I dreaming???
Typically you'd create multiple copy tasks:
['dev', 'staging', 'uat', 'prod'].each { String dir ->
    Task task = tasks.create("copyJarTo${dir.capitalize()}", Copy) {
        from jar
        into "dist/$dir"
    }
    assemble.dependsOn task
}
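A side benefit of one task per target directory is that each copy gets its own up-to-date checking, and you can still run a single one, e.g. gradle copyJarToUat, without triggering the others.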
I have the following dependency structure
prodwebserver -> prod.jar -> transitive prod dependencies
devwebserver -> prodwebserver.jar, runtimeCompiler.jar, dev-router.jar -> more transitive dependencies
My devwebserver 'will' have ZERO source code. In Gradle, I currently have a target called embeddabledevwebserver which depends on those.
Now I need to package up a release into a format of
release
|
|--prod - contains all prod jars
|
|--development - contains ONLY the extra jars that prod is missing
How can I get the difference in jar sets between the two targets such that I only put the extra jars needed for development in the development directory?
(this is nitpicky and not too important) Is there a way I can do this without having this empty embeddabledevwebserver project which is sort of an annoying shell project?
My actual current build file ...(WIP)...
https://github.com/deanhiller/webpieces/blob/gradlePackaging/build.gradle
(comments on that file welcome and appreciated)
EDIT: More specifically, I have these sections to start copying/syncing files over
task stageTemplate(type: Copy) {
    from '.'
    into buildDir
    include stagingDirName + '/**'
}
task stageWebServer(type: Sync, dependsOn: [':embeddablewebserver:assemble', 'stageTemplate']) {
    from childProjects.embeddablewebserver.toStagingDir
    into new File(outputStagingDir, 'prod')
}
task stageDevServer(type: Sync, dependsOn: [':http-router-dev:assemble', 'stageWebServer']) {
    from childProjects['http-router-dev'].toStagingDir
    into new File(outputStagingDir, 'development')
    exclude stageWebServer
}
and I can't exclude stageWebServer from stageDevServer like that, but basically, for all the jars that the stageWebServer task moved over, I want to filter out the jars with those same names from the development directory.
thanks,
Dean
OK, I finally worked through this by adding println for every object, and got to this, which works for Copy at least... I am not sure if Sync is fully working (i.e. if I remove a file from prod that is still in development, will it show up or not).
task stageWebServer(type: Sync, dependsOn: [':embeddablewebserver:assemble', 'stageTemplate']) {
    from childProjects.embeddablewebserver.toStagingDir
    into new File(outputStagingDir, 'prod')
}
task stageDevServer(type: Sync, dependsOn: [':http-router-dev:assemble', 'stageWebServer']) {
    from childProjects['http-router-dev'].toStagingDir
    into new File(outputStagingDir, 'development')
    exclude { details ->
        def fileNames = stageWebServer.source.collect { entry -> entry.getName() }
        fileNames.contains(details.file.getName())
    }
}
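One possible refinement (an untested sketch using the same task names): memoize the prod file names so the exclude closure doesn't re-collect them for every candidate file it inspects:

def prodFileNames = null
task stageDevServer(type: Sync, dependsOn: [':http-router-dev:assemble', 'stageWebServer']) {
    from childProjects['http-router-dev'].toStagingDir
    into new File(outputStagingDir, 'development')
    exclude { details ->
        // collect the names on the first call, after stageWebServer has run
        if (prodFileNames == null) {
            prodFileNames = stageWebServer.source.collect { it.name } as Set
        }
        details.file.name in prodFileNames
    }
}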
I have a multiproject build. Some of the projects in the build produce test results. One project produces an installer, and I want that project's artifacts to include the test results from all the other projects. I attempted to do that like this (in the project that produces an installer):
// Empty test task, so that we can make the gathering of test results here depend on the tests' having
// been run in other projects in the build
task test << {
}
def dependentTestResultsDir = new File( buildDir, 'dependentTestResults' )
task gatherDependentTestResults( type: Zip, dependsOn: test ) {
    project.parent.subprojects.each { subproject ->
        // Find projects in this build which have a testResults configuration
        if( subproject.configurations.find { it.name == 'testResults' } ) {
            // Extract the test results (which are in a zip file) into a directory
            def tmpDir = new File( dependentTestResultsDir, subproject.name )
            subproject.copy {
                from zipTree( subproject.configurations['testResults'].artifacts.files.singleFile )
                into tmpDir
            }
        }
    }
    // Define the output of this task as the contents of that tree
    from dependentTestResultsDir
}
The problem is that at the point when this task is configured, the test tasks in the other projects haven't run, so their artifacts don't exist, and I get messages during my build like this:
The specified zip file ZIP 'C:\[path to project]\build\distributions\[artifact].zip' does not exist and will be silently ignored. This behaviour has been deprecated and is scheduled to be removed in Gradle 2.0
So I need to do something that will involve the configuration of my task being delayed until the test artifacts have actually been produced. What is the idiomatic way to achieve this?
I seem to need to address questions of this nature quite frequently about Gradle. I think I'm missing something conceptually.
In that instance, you should just be able to make it an action by adding <<, as in:
task gatherDependentTestResults( type: Zip, dependsOn: test ) << {
    // your task code here
}
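If the << approach doesn't pick up the files (a Zip task's copy action runs before any doLast action), an alternative in the spirit of the closure trick from the first answer above is to keep the task configured up front but hand from a closure, so the artifact zips are only resolved at execution time. A sketch, untested, reusing the asker's names:

task gatherDependentTestResults( type: Zip, dependsOn: test ) {
    project.parent.subprojects.each { subproject ->
        if( subproject.configurations.find { it.name == 'testResults' } ) {
            // the closure defers zipTree resolution until the zip is actually built
            from( { zipTree( subproject.configurations['testResults'].artifacts.files.singleFile ) } ) {
                into subproject.name
            }
        }
    }
}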