/--common
/--common/build.gradle
/--common/deploy.gradle
/--project1
/--project1/build.gradle
I have a multi-project structure and have extracted repeating code from my build.gradle file into a separate file, deploy.gradle.
I have placed the deploy.gradle file in the common project, at the same folder level as its build.gradle file. The folder structure is shown above.
In the build.gradle file of the common project I can reference the file using the statement,
apply from: 'deploy.gradle'
This works like a dream, and the common project build works perfectly, pulling in the tasks from the deploy.gradle file.
The problem comes when I try to reference the deploy.gradle file from one of the other projects. When I add the apply... statement to the build.gradle of project1, I get the error:
Error:(23, 0) Could not read script
'C:\path-to-project1-script-file\deploy.gradle' as it does not exist.
So Gradle is looking for the deploy.gradle file in project1 only, even though I have a dependency on the common project in project1's build.gradle file.
The question is: how can I make deploy.gradle from the common project visible to project1?
We successfully use the following project layout
├── a
│ └── build.gradle
├── b
│ └── build.gradle
├── build.gradle
├── gradle-scripts
│ └── deploy.gradle
└── settings.gradle
The root project's build.gradle defines:
ext.gradleScript = { scriptName ->
    file("${rootProject.projectDir}/gradle-scripts/${scriptName}.gradle")
}
Subprojects use the scripts within gradle-scripts this way:
apply from: gradleScript('deploy')
Whole content of the project:
$> find . -type f | while read file; do echo "--- $file ---" && cat $file; done
--- ./a/build.gradle ---
apply from: gradleScript('deploy')
--- ./b/build.gradle ---
apply from: gradleScript('deploy')
--- ./build.gradle ---
// Where common build logic is found
ext.gradleScript = { scriptName ->
    file("${rootProject.projectDir}/gradle-scripts/${scriptName}.gradle")
}
--- ./gradle-scripts/deploy.gradle ---
task myDeployTask {
    doLast { println 'common deploy logic goes here' }
}
--- ./settings.gradle ---
include 'a', 'b'
$> gradle -q b:myDeployTask
common deploy logic goes here
$>
Here is an example project1/build.gradle that references common/deploy.gradle:
// import myDeploy task
apply from: "${rootDir}/common/deploy.gradle"
task myProject1Task(dependsOn: 'myDeploy') {
    doLast {
        println 'TRACER myProject1Task'
    }
}
It is often important to distinguish projectDir from rootDir in multi-project builds: projectDir is the directory of the specific subproject, while rootDir is the directory that contains settings.gradle.
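To make the difference concrete, here is a small sketch (the showDirs task name is just an illustration) that could be added to the root build.gradle to print both values for every project:
allprojects {
    // illustrative helper: prints both directories for each project
    task showDirs {
        doLast {
            println "${project.path}: projectDir=${project.projectDir} rootDir=${rootDir}"
        }
    }
}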
Related
I'm running into the problem that my Gradle wrapper only finds subprojects if I execute it from within its own working directory. For example:
Let's say the project structure is as follows:
.
├── app
│ ├── build.gradle
│ ├── ...
├── build.gradle
├── gradlew
├── settings.gradle
└── ...
It makes a difference whether I run gradlew from its directory or from a different directory. If I run:
$ ./gradlew projects
> Task :projects
------------------------------------------------------------
Root project
------------------------------------------------------------
Root project 'com.name'
+--- Project ':app'
it has no problem finding :app. However, if I go one folder up and execute gradlew from there, it cannot find it:
$ cd ..
$ ./android/gradlew projects
> Task :projects
------------------------------------------------------------
Root project
------------------------------------------------------------
Root project 'com'
No sub-projects
It can't find the projects. This is problematic for me since I need to run a task in :app from a pipeline with a different working directory, e.g. ./xx/yy/gradlew app:publishTask. However, done this way, Gradle can't find the task because it can't find the project. Is there a way to run these commands from any location?
Yes, there is. You have to:
1. store your current location in a temporary variable
2. change to the project directory
3. run ./gradlew
4. restore the directory from the variable
For example:
TMP_DIR=`pwd`
cd /path/to/project
./gradlew projects
cd $TMP_DIR
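Alternatively, Gradle's -p (--project-dir) flag sets the project directory explicitly, so a command along these lines should work from any location (using the android/ layout from the question above):
# -p tells Gradle where the build lives, independent of the current directory
./android/gradlew -p android app:publishTask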
I am building a Jenkins shared library (in Groovy) and testing it with JenkinsPipelineUnit and Gradle. Running ./gradlew test jacocoTestReport works fine, but the report is almost empty (just headers); no coverage is present.
Here are the relevant parts of my build.gradle:
plugins {
    id 'groovy'
    id 'application'
    id 'jacoco'
}

dependencies {
    compile 'org.codehaus.groovy:groovy-all:2.5.4'
    testCompile 'junit:junit:4.12'
    testCompile 'com.lesfurets:jenkins-pipeline-unit:1.1.1-custom' // minor adaptations, but that's another story
}

test {
    systemProperty "pipeline.stack.write", System.getProperty("pipeline.stack.write")
}

jacocoTestReport {
    group = "Reporting"
    reports {
        xml.enabled true
        csv.enabled false
    }
    additionalSourceDirs = files('vars')
    sourceDirectories = fileTree(dir: 'vars')
}
I think the trouble is that my "source" files reside in the vars directory and not in src/main/groovy as expected in a normal Groovy project. This is, however, a requirement for a Jenkins shared library.
I tried specifying
sourceSets {
    main {
        groovy {
            srcDir 'vars'
        }
    }
}
but then Gradle would start compiling this shared library, while it is supposed to be loaded on use, and this breaks everything...
My folder structure looks like this:
├── build.gradle
├── src
│ └── test
│ ├── groovy
│ │ └── TestSimplePipeline.groovy
│ └── resources
│ └── simplePipeline.jenkins
└── vars
├── MyPipeline.groovy
└── sh.groovy
I think my problem is linked to https://github.com/jenkinsci/JenkinsPipelineUnit/issues/119, but I wouldn't know how to apply the changes proposed for Maven in Gradle (I'm not even sure they apply to JaCoCo).
The problem is that JenkinsPipelineUnit evaluates your scripts at runtime, which means the JaCoCo agent cannot instrument byte-code that is generated at runtime.
To overcome this issue you need to make two changes:
1. Use JaCoCo offline instrumentation. In my case I used Maven, so I cannot provide a specific example of a Gradle configuration.
2. Load compiled classes instead of Groovy scripts in your test. Something like this:
// imports assumed: InvokerHelper ships with Groovy; InterceptingGCL comes from JenkinsPipelineUnit
import org.codehaus.groovy.runtime.InvokerHelper
import com.lesfurets.jenkins.unit.InterceptingGCL

// load the compiled class instead of evaluating the .groovy source
def scriptClass = helper.getBaseClassloader().loadClass("fooScript")
def binding = new Binding()
script = InvokerHelper.createScript(scriptClass, binding)
InterceptingGCL.interceptClassMethods(script.metaClass, helper, binding)
Here fooScript is the name of the class (say you have a source file called fooScript.groovy in this case).
Now you can call methods of this class via
def result = script.invokeMethod(methodName, args)
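For instance, assuming fooScript.groovy exposes a call(Map) step (a hypothetical example), the invocation could look like:
def result = script.invokeMethod('call', [[branch: 'main']] as Object[])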
I have the following simple setup:
File structure
$ tree
.
├── build.gradle
├── modules
│ ├── rest-model
│ └── rest-resource
└── settings.gradle
File contents
settings.gradle
def MODULES = 'modules'
file(MODULES).eachDir {
    include ":${MODULES}:${it.name}"
}
build.gradle
task hello {
    doLast {
        subprojects.each {
            println it.name
        }
    }
}
The task hello above will print out all subprojects. I was expecting only two subprojects: rest-model and rest-resource. However, I am getting three: modules, rest-model, and rest-resource. Here is the Gradle output:
$ gradle hello
Starting a Gradle Daemon (subsequent builds will be faster)
> Task :hello
modules
rest-model
rest-resource
BUILD SUCCESSFUL in 2s
1 actionable task: 1 executed
So, why does Gradle automatically include the parent folder modules as a subproject? Can I prevent that?
Use includeFlat instead of include:
def MODULES = 'modules'
file(MODULES).eachDir {
    includeFlat ":${MODULES}:${it.name}"
}
Hi, I am a newbie to Gradle scripting. I am trying to copy a directory to the root of a WAR file, but I ended up copying only the contents of the folder to the root directory; I want the whole directory structure, including the parent directory, to be copied.
Folder structure to copy
polymer-client
├── file1.txt
├── index.html
Gradle script
apply plugin: 'war'

war {
    archiveName = 'WebDeployment.war'
    from 'polymer-client'
}
Generated folder structure
WebDeployment
├── WEB-INF
├── META-INF
├── file1.txt
├── index.html
Expected folder structure
WebDeployment
├── WEB-INF
├── META-INF
└── polymer-client
    ├── file1.txt
    └── index.html
Please let me know how to get this done with a Gradle script.
You can try to do it with into as follows:
apply plugin: 'war'

war {
    archiveName = 'WebDeployment.war'
    into('polymer-client') {
        from 'polymer-client'
    }
}
It should create a polymer-client subdirectory within the WAR archive and copy all the content of the polymer-client directory into it.
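To verify the layout (assuming the default build/libs output directory), build the archive and list its contents; the listing should now contain polymer-client/file1.txt and polymer-client/index.html:
$ ./gradlew war
$ jar tf build/libs/WebDeployment.war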
Consider the following multi-project build script:
build.gradle
subprojects {
    apply plugin: 'java'
    apply plugin: 'maven'

    group = "myorg"
    version = "1.0.0-SNAPSHOT"
}

project(':client') {
    dependencies {
        compile 'myorg:shared:1.0.0-SNAPSHOT'
    }
}
With the following files:
├── build.gradle
├── client
│ └── src
│ └── main
│ └── java
│ └── myorg
│ └── client
│ └── MyOrgClient.java
├── settings.gradle
└── shared
└── src
└── main
└── java
└── myorg
└── shared
└── MyOrgObj.java
In the above files, MyOrgClient.java imports myorg.shared.MyOrgObj, and settings.gradle has the single line include 'client', 'shared'.
Problem
The project/task build order for Maven-related tasks, like installing locally and deploying to remote repositories, does not take the implied project dependency into account. Because Gradle does not know that 'myorg:shared:1.0.0-SNAPSHOT' is produced by project(':shared'), the build order is :client before :shared, which causes errors like the one below:
$ gradle install
:client:compileJava
FAILURE: Build failed with an exception.
* What went wrong:
Could not resolve all dependencies for configuration ':client:compile'.
> Could not find myorg:shared:1.0.0-SNAPSHOT.
Required by:
myorg:client:1.0.0-SNAPSHOT
Question:
Is there a standard way to deal with this problem? I have tried these solutions without success:
- Using mustRunAfter, but I ran into problems with tasks not existing yet. I also don't think this would scale well with a large number of projects.
- Adding archives project(':shared') to the client's dependencies.
- Adding compile project(':shared') to the client's dependencies and then removing it from the generated pom. Unfortunately this doesn't add the dependency to the install task or artifactoryPublish. Edit: This actually was the solution. A project dependency provides the correct version/name/group in the generated pom.xml, so the explicit group:name:version dependency is not needed.
You have to define the dependencies between the projects more or less the same way as in Maven:
For example like this:
project(':app') {
    apply plugin: 'ear'
    dependencies {
        compile project(':webgui')
        compile project(':service')
    }
}
You also need to define the settings.gradle, which contains the modules, like this:
include 'app'
include 'domain'
include 'service'
include 'service-client'
include 'webgui'
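Applied to the client/shared layout from the earlier question, this boils down to replacing the external coordinates with a project dependency (a sketch, matching the edit noted in the question):
project(':client') {
    dependencies {
        // the project dependency gives Gradle the build ordering and puts the
        // correct group/name/version of :shared into the generated pom.xml
        compile project(':shared')
    }
}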