How to bundle dependencies in Gradle for multiple applications

I'd like to enable shared dependency management across different applications using Gradle, and have created a hierarchy of 'bundles' that essentially package Java, Spring, Groovy and various test artifacts.
The root of the 'bundling' consists of config as below:
ext {
// Spring-specific
springVersion = '4.1.7.RELEASE'
springCore = "org.springframework:spring-core:${springVersion}"
springContext = "org.springframework:spring-context:${springVersion}"
springBeans = "org.springframework:spring-beans:${springVersion}"
springBase = [springCore, springContext, springBeans]
... and more Spring
// Miscellaneous
sourceCompatibility = 1.8
groovy = 'org.codehaus.groovy:groovy-all:2.4.4'
servlet = 'javax.servlet:javax.servlet-api:3.1.0'
... and more
}
subprojects {
// Enabling Groovy, Wrapper
}
With a 'Spring bundle' configured as follows:
dependencies {
compile(
groovy,
springBase,
)
}
JAR artifacts like my common-spring then depend on the above 'bundle' - and other projects depend on common-spring.
Building locally using Gradle's Maven plugin, this works when I also install the 'bundles' and 'common' to my local Maven repository. However, I am unsure if this is the way I'm 'supposed' to do it in Gradle.
An alternative (and perhaps more correct?) approach could be publishing to Artifactory and resolving artifacts from there, thus (hopefully?) not using any Maven-y stuff at all.
Or is there a completely different and simpler way I could centralize which versions I use of various third-party artifacts?

Yes, this is a good way to share dependencies, and in fact Gradle's own build uses this same technique.
One difference is that they create a single map to store all of the bundles, with each bundle wrapped in a list.
ext.libs = [:]
libs.springCore = ["org.springframework:spring-core:${springVersion}",]
...

Related

Gradle independent parent module for a microservice application

I'm working on a Spring Boot microservices application using Gradle and I want to have a separate parent module for all the common parts that are going to be used by all of my microservices (abstract entities, common properties, dependency versions ...). This parent module is going to have its own repository. I have done something similar when I worked on a Maven app, by having a <packaging>pom</packaging> for the parent project.
So my questions are: am I doing the right thing by separating the common aspects of my application into a separate repository? And what is the best way to do so in Gradle?
Edit :
To be more precise about my problem, I want to do what is described in this approach, using Gradle instead of Maven.
https://stackoverflow.com/a/27865893/8326336
Thank you for your help.
Personally I would try to reuse only complex parts, or complex tasks/plugins if really needed. There is a maintenance cost to "parent logic", especially for things that change often, and a lot of flexibility can be lost. Also, updating multiple dependent projects is not fun. So be careful.
With Gradle it's possible to reuse some common build logic. One way of doing this is to create a convention Gradle project. I will use the Kotlin DSL in the examples, but the same thing can be done with Groovy.
Convention project
First create a normal Gradle project and put in build.gradle.kts config like:
plugins {
    // Kotlin dsl plugin since we will use Kotlin dsl
    // (you can also use Groovy version if you like Groovy)
    `kotlin-dsl`
    // Plugin needed to publish it
    id("maven-publish")
}
repositories {
    // This repository is needed for getting kotlin-gradle-plugin,
    // you can also add any other repo here if you add any other dep.
    gradlePluginPortal()
}
dependencies {
    // This is needed so we can access gradle constructs
    implementation("org.jetbrains.kotlin:kotlin-gradle-plugin:1.5.30")
}
publishing {
    // Config to publish it
    publications {
        create<MavenPublication>("maven") {
            group = "com.mycompany"
            artifactId = "gradle-conventions"
            version = "1.0.0"
            from(components["java"])
        }
    }
    repositories {
        // my repositories where I want to publish this
    }
}
Where do you put the common logic? You put it in src/main/kotlin (or groovy if you use Groovy). So let's create the following structure:
└── src
└── main
└── kotlin
├── dependencies
│   └── CommonDependencies.kt
├── my-company.java-conventions.gradle.kts
Where CommonDependencies.kt has our dependencies:
package dependencies
open class CommonDependencies {
val guava = "com.google.guava:guava:30.1.1-jre"
}
and my-company.java-conventions.gradle.kts has our common Java settings:
import dependencies.CommonDependencies
plugins {
id("java-library")
}
// register extension so we can nicely access variables from CommonDependencies
extensions.create<CommonDependencies>("commonLibs")
java {
// All our projects will use toolchains with Java11
toolchain.languageVersion.set(JavaLanguageVersion.of("11"))
}
tasks.test {
// All our projects use Junit
useJUnitPlatform()
}
Now, since this is a regular project, you can publish it to a Maven repo. For testing purposes, let's publish it to the local Maven repo with:
./gradlew publishToMavenLocal
OK, our conventions are all set. Now how can we use them?
Consumer project
In the consumer project we have to add our convention project to the build logic classpath. This can be done by adding it to buildSrc/build.gradle(.kts) or to the buildscript block. Let's, for example, put it into buildSrc.
buildSrc/build.gradle.kts example:
repositories {
// I have put maven local here just because I published
// convention project to maven local
mavenLocal()
gradlePluginPortal()
}
dependencies {
implementation("com.mycompany:gradle-conventions:1.0.0")
}
And after this is set and the IDE is reloaded, you can use your conventions in your modules. Example:
plugins {
id("my-company.java-conventions")
}
dependencies {
implementation(commonLibs.guava)
}
Notes
If you don't want to publish your conventions to some Maven repo, you can also just include the project locally with includeBuild(). For example, if you have the projects in sibling folders like this:
├── gradle-conventions
├── gradle-project-consuming-conventions
In settings.gradle(.kts) of gradle-project-consuming-conventions you would then add: includeBuild("../gradle-conventions")
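As a complete sketch, the consuming project's settings file would then be little more than that one line (assuming the folder layout above):
// gradle-project-consuming-conventions/settings.gradle.kts
rootProject.name = "gradle-project-consuming-conventions"
// use the sibling convention build directly instead of a published artifact
includeBuild("../gradle-conventions")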
In my-company.java-conventions.gradle.kts you can skip the my-company. prefix. Name it however you think is best, just be careful that it does not conflict with official plugins.
I used Gradle 7.2

In a multi-module project can Gradle build a plugin as one module and then use that plugin in the same build?

We have a Gradle project with a bunch of modules. One of those modules is a custom code generator, written as a Gradle plugin. We want to run that code-generator plugin in another module later in the same overall multi-module build, in order to test the code generator.
We know how to create a separate project on the fly and run the code generator in that, but we need to run the code generator in the main project, not in a temporary test project.
Nothing we have tried works, and the Gradle documentation doesn't appear to address this. It seems to be fundamental to Gradle's design, because the entire set of plugins used in a build is basically a single program, assembled at the start. Trying to add a just-now-built plugin after the fact seems unsupported, or we're missing something.
The best we've been able to come up with so far is to implement the plugin in Java (Kotlin would also have worked), so the Gradle plugin is just a thin Gradle skin over the implementation, and call the Java implementation directly when running the code generator in the other module. This works, but it means we aren't actually testing the Gradle portion of the code generator.
This is natively supported in Maven (a multi-module Maven project with one plugin module; see https://maven.apache.org/guides/mini/guide-multiple-modules.html), which is not surprising because every plugin in Maven runs in a separate class loader. If it's not possible in Gradle, that would be one of the few cases where Gradle doesn't have feature parity.
A hacky way to do this is to run the newly-compiled plugin via Gradle's TestKit runner.
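In rough terms that looks like the following (a Kotlin sketch only; the plugin id com.example.codegen and the generateCode task are hypothetical, withPluginClasspath() relies on the metadata generated by the java-gradle-plugin plugin, and TestKit still runs the plugin in a throwaway project rather than in the main build, which is exactly the limitation described in the question):
import org.gradle.testkit.runner.GradleRunner
import org.junit.jupiter.api.Test
import java.io.File

class CodeGeneratorPluginTest {
    @Test
    fun runsGeneratorInAThrowawayBuild() {
        // set up a minimal throwaway project that applies the plugin under test
        val projectDir = File("build/testkit-project").apply { mkdirs() }
        projectDir.resolve("settings.gradle").writeText("")
        projectDir.resolve("build.gradle").writeText(
            "plugins { id 'com.example.codegen' }" // hypothetical plugin id
        )
        val result = GradleRunner.create()
            .withProjectDir(projectDir)
            .withPluginClasspath()         // injects the just-built plugin
            .withArguments("generateCode") // hypothetical task added by the plugin
            .build()
        println(result.output)
    }
}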
A cleaner way to do this is to write plugins as thin shells of code written to Gradle's API that delegate the real work to plain old Java (or Kotlin) utility methods. This has a number of advantages:
You can unit test the utility methods.
You can use the utility methods for other purposes unrelated to the plugin.
You can call the utility methods directly from other modules in the project, thereby accomplishing what the plugin would have done if you could have built it and then called it in the same build.
To expand on the above answer: instead of calling the plugin like a plugin, add a main method that accepts the same parameters that the Gradle plugin configuration passed to the plugin.
Then call the plugin's main using Gradle's Java exec task:
task(generateFoo, type: JavaExec) {
main = 'com.bar.Foo'
classpath = configurations.runtimeClasspath
args = ["arg1", "${projectDir}/src/generated/java"]
}
Note the args: those are the same pieces of information that used to be passed in via Gradle configuration:
apply plugin: 'foo-plugin'
generateFoo {
theArg "arg1"
outputDir "${projectDir}/src/generated/java"
}
Because the runtime classpath used by Java exec is the one for the calling module, you may encounter runtime classloader problems.
If that happens, it's easily fixed. Just change the rewritten plugin to a fat jar:
task fatJar(type: Jar) {
manifest {
attributes 'Implementation-Title': 'Foo Fat JAR', 'Main-Class': 'com.bar.Foo'
}
baseName = project.name + '-exec'
from { configurations.runtimeClasspath.collect { it.isDirectory() ? it : zipTree(it) } }
with jar
}
artifacts {
archives fatJar
}
And then execute the fat jar with Java exec:
def fooGenerate = task(generateFoo, type: JavaExec) {
main = 'com.bar.Foo'
classpath = files("${projectDir}/../foo-plugin-module/build/libs/foo-plugin-module-exec.jar")
args = ["arg1", "${projectDir}/src/generated/java"]
}
Finally, make sure the dependent module's compile task runs after the code generation (mustRunAfter only orders the two tasks; use dependsOn if compilation should also trigger the generation):
compileJava.mustRunAfter fooGenerate
If you use the fatJar approach, you don't even need to declare implementation project(":foo") in the dependent modules.
It might also be possible to use Gradle's composite builds for this (https://docs.gradle.org/current/userguide/composite_builds.html).
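A sketch of that idea, assuming the code generator were extracted into its own build (here called foo-plugin, exposing a hypothetical plugin id com.bar.foo) and included from the main build; recent Gradle versions let you include a plugin build via pluginManagement:
// settings.gradle.kts of the main build
pluginManagement {
    // makes the plugins of the included build resolvable by id
    includeBuild("../foo-plugin")
}
include("foo-consumer")

// foo-consumer/build.gradle.kts
plugins {
    id("com.bar.foo") // hypothetical id of the code-generator plugin
}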

Gradle: How do you include your own library as part of build?

I have a commons gradle project which is a shared library for all my other projects.
In the build.gradle of the dependent project, I included the commons jar as follows:
dependencies {
...
runtime files('../commons/build/libs/commons-1.0.jar')
}
And this builds fine with the relative path. But it feels like hard-coding the path to a specific library. What is the standard way to set up the build in this case?
You can publish the common jar to your local Maven repository. But if you are developing within a team, you should publish to a repository manager that the other parties also have access to.
In the dependent projects you then simply add this common jar like you would add any third-party library. This way you don't have to store dependent jar files in your version control system, and when projects are developed by different parties it is more convenient.
Example of using mavenLocal()
// Common project build.gradle
apply plugin: 'maven-publish'
version = '2.0'
...
publishing {
    publications {
        maven(MavenPublication) {
            groupId = 'com.gradle.sample'
            artifactId = 'project1-sample'
            from components.java
        }
    }
}
You can use gradle publishToMavenLocal to publish the common project to the local Maven repository. (With this setup, dependent projects cannot pick up new versions of the common jar until you publish them to mavenLocal again.)
// Dependent project build.gradle
repositories {
mavenLocal()
}
dependencies {
implementation 'com.gradle.sample:project1-sample:2.0'
....
}
Check the following link for details.
https://docs.gradle.org/current/userguide/publishing_maven.html

Boilerplate project configuration in Gradle with Gradle Kotlin DSL

I'm currently trying to improve the way our projects share their configuration. We have lots of different multi-module gradle projects for all of our libraries and microservices (i.e. many git repos).
My main goals are:
To not have my Nexus repository config duplicated in every project (also, I can safely assume that the URL won't change)
To make my custom Gradle plugins (published to Nexus) available to every project with minimal boilerplate / duplication (they should be available to every project, and the only thing the project cares about is the version it's using)
No magic - it should be obvious to developers how everything is configured
My current solution is a custom gradle distribution with an init script that:
adds mavenLocal() and our Nexus repository to the project repos (very similar to the Gradle init script documentation example, except it adds repos as well as validating them)
configures an extension that allows our gradle plugins to be added to the buildscript classpath (using this workaround). It also adds our Nexus repo as a buildscript repo as that's where the plugins are hosted. We have quite a few plugins (built upon Netflix's excellent nebula plugins) for various boilerplate: standard project setup (kotlin setup, test setup, etc), releasing, publishing, documentation, etc and it means our project build.gradle files are pretty much just for dependencies.
Here is the init script (sanitised):
/**
 * Gradle extension applied to all projects to allow automatic configuration of Corporate plugins.
 */
class CorporatePlugins {
    public static final String NEXUS_URL = "https://example.com/repository/maven-public"
    public static final String CORPORATE_PLUGINS = "com.example:corporate-gradle-plugins"
    def buildscript
    CorporatePlugins(buildscript) {
        this.buildscript = buildscript
    }
    void version(String corporatePluginsVersion) {
        buildscript.repositories {
            maven {
                url NEXUS_URL
            }
        }
        buildscript.dependencies {
            classpath "$CORPORATE_PLUGINS:$corporatePluginsVersion"
        }
    }
}
allprojects {
    extensions.create('corporatePlugins', CorporatePlugins, buildscript)
}
apply plugin: CorporateInitPlugin
class CorporateInitPlugin implements Plugin<Gradle> {
    void apply(Gradle gradle) {
        gradle.allprojects { project ->
            project.repositories {
                all { ArtifactRepository repo ->
                    if (!(repo instanceof MavenArtifactRepository)) {
                        project.logger.warn "Non-maven repository ${repo.name} detected in project ${project.name}. What are you doing???"
                    } else if (repo.url.toString() == CorporatePlugins.NEXUS_URL || repo.name == "MavenLocal") {
                        // Nexus and local maven are good!
                    } else if (repo.name.startsWith("MavenLocal") && repo.url.toString().startsWith("file:")) {
                        // Duplicate local maven - remove it!
                        project.logger.warn("Duplicate mavenLocal() repo detected in project ${project.name} - the corporate gradle distribution has already configured it, so you should remove this!")
                        remove repo
                    } else {
                        project.logger.warn "External repository ${repo.url} detected in project ${project.name}. You should only be using Nexus!"
                    }
                }
                mavenLocal()
                // define Nexus repo for downloads
                maven {
                    name "CorporateNexus"
                    url CorporatePlugins.NEXUS_URL
                }
            }
        }
    }
}
Then I configure each new project by adding the following to the root build.gradle file:
buildscript {
    // makes our plugins (and any others in Nexus) available to all build scripts in the project
    allprojects {
        corporatePlugins.version "1.2.3"
    }
}
allprojects {
    // apply plugins relevant to all projects (other plugins are applied where required)
    apply plugin: 'corporate.project'
    group = 'com.example'
    // allows quickly updating the wrapper for our custom distribution
    task wrapper(type: Wrapper) {
        distributionUrl = 'https://com.example/repository/maven-public/com/example/corporate-gradle/3.5/corporate-gradle-3.5.zip'
    }
}
While this approach works, allows reproducible builds (unlike our previous setup which applied a build script from a URL - which at the time wasn't cacheable), and allows working offline, it does make it a little magical and I was wondering if I could do things better.
This was all triggered by reading a comment on Github by Gradle dev Stefan Oehme stating that a build should work without relying on an init script, i.e. init scripts should just be decorative and do things like the documented example - preventing unauthorised repos, etc.
My idea was to write some extension functions that would allow me to add our Nexus repo and plugins to a build in a way that looked like they were built into Gradle (similar to the extension functions gradleScriptKotlin() and kotlin-dsl() provided by the Gradle Kotlin DSL).
So I created my extension functions in a kotlin gradle project:
package com.example
import org.gradle.api.artifacts.dsl.DependencyHandler
import org.gradle.api.artifacts.dsl.RepositoryHandler
import org.gradle.api.artifacts.repositories.MavenArtifactRepository
fun RepositoryHandler.corporateNexus(): MavenArtifactRepository {
return maven {
with(it) {
name = "Nexus"
setUrl("https://example.com/repository/maven-public")
}
}
}
fun DependencyHandler.corporatePlugins(version: String) : Any {
return "com.example:corporate-gradle-plugins:$version"
}
With the plan to use them in my project's build.gradle.kts as follows:
import com.example.corporateNexus
import com.example.corporatePlugins
buildscript {
repositories {
corporateNexus()
}
dependencies {
classpath(corporatePlugins(version = "1.2.3"))
}
}
However, Gradle was unable to see my functions when used in the buildscript block (unable to compile script). Using them in the normal project repos/dependencies worked fine though (they are visible and work as expected).
If this worked, I was hoping to bundle the jar into my custom distribution, meaning my init script could just do simple validation instead of hiding away the magical plugin and repo configuration. The extension functions wouldn't need to change, so it wouldn't require releasing a new Gradle distribution when plugins change.
What I tried:
adding my jar to the test project's buildscript classpath (i.e. buildscript.dependencies) - doesn't work (maybe this doesn't work by design as it doesn't seem right to be adding a dependency to buildscript that's referred to in the same block)
putting the functions in buildSrc (which works for normal project deps/repos but not buildscript, but is not a real solution as it just moves the boilerplate)
dropping the jar in the lib folder of the distribution
So my question really boils down to:
Is what I'm trying to achieve possible (is it possible to make custom classes/functions visible to the buildScript block)?
Is there a better approach to configuring a corporate Nexus repo and making custom plugins (published to Nexus) available across lots of separate projects (i.e. totally different codebases) with minimal boilerplate configuration?
If you want to benefit from all the Gradle Kotlin DSL goodness you should strive to apply all plugins using the plugins {} block. See https://github.com/gradle/kotlin-dsl/blob/master/doc/getting-started/Configuring-Plugins.md
You can manage plugin repositories and resolution strategies (e.g. their version) in your settings files. Starting with Gradle 4.4 this file can be written using the Kotlin DSL, aka settings.gradle.kts. See https://docs.gradle.org/4.4-rc-1/release-notes.html.
With this in mind you could then have a centralized Settings script plugin that sets things up and apply it in your builds settings.gradle.kts files:
// corporate-settings.gradle.kts
pluginManagement {
    repositories {
        maven {
            name = "Corporate Nexus"
            url = uri("https://example.com/repository/maven-public")
        }
        gradlePluginPortal()
    }
}
and:
// settings.gradle.kts
apply(from = "https://url.to/corporate-settings.gradle.kts")
Then in your project build scripts you can simply request plugins from your corporate repository:
// build.gradle.kts
plugins {
id("my-corporate-plugin") version "1.2.3"
}
If you want the project build scripts in a multi-project build to not repeat the plugin version, you can, starting with Gradle 4.3, declare the versions in your root project. Note that you could also set the versions in settings.gradle.kts using pluginManagement.resolutionStrategy if having all builds use the same plugin version is what you need.
Also note that for all this to work, your plugins must be published with their plugin marker artifact. This is easily done by using the java-gradle-plugin plugin.
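For reference, a minimal sketch of the plugin project's build script that produces such a marker (the implementation class name is a placeholder; the id matches the example above):
// build.gradle.kts of the plugin project
plugins {
    `java-gradle-plugin`
    `maven-publish`
}
gradlePlugin {
    plugins {
        create("corporatePlugin") {
            id = "my-corporate-plugin"
            // hypothetical implementation class
            implementationClass = "com.example.gradle.MyCorporatePlugin"
        }
    }
}
Publishing this project then also publishes the marker artifact my-corporate-plugin:my-corporate-plugin.gradle.plugin, which is what lets the plugins {} block resolve the id from a plain Maven repository.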
I promised @eskatos that I would come back and give feedback on his answer - so here it is!
My final solution consists of:
Gradle 4.7 wrapper per project (pointed at a mirror of http://services.gradle.org/distributions set up in Nexus as a raw proxy repository, i.e. it's vanilla Gradle but downloaded via Nexus)
Custom Gradle plugins published to our Nexus repo along with plugin markers (generated by the Java Gradle Plugin Development Plugin)
Mirroring the Gradle Plugin Portal in our Nexus repo (i.e. a proxy repo pointing at https://plugins.gradle.org/m2)
A settings.gradle.kts file per project that configures our maven repo and gradle plugin portal mirror (both in Nexus) as plugin management repositories.
The settings.gradle.kts file contains the following:
pluginManagement {
    repositories {
        // local maven to facilitate easy testing of our plugins
        mavenLocal()
        // our plugins and their markers are now available via Nexus
        maven {
            name = "CorporateNexus"
            url = uri("https://nexus.example.com/repository/maven-public")
        }
        // all external gradle plugins are now mirrored via Nexus
        maven {
            name = "Gradle Plugin Portal"
            url = uri("https://nexus.example.com/repository/gradle-plugin-portal")
        }
    }
}
This means that all plugins and their dependencies are now proxied via Nexus, and Gradle will find our plugins by id as the plugin markers are published to Nexus as well. Having mavenLocal in there as well facilitates easy testing of our plugin changes locally.
Each project's root build.gradle.kts file then applies the plugins as follows:
plugins {
// plugin markers for our custom plugins allow us to apply our
// plugins by id as if they were hosted in gradle plugin portal
val corporatePluginsVersion = "1.2.3"
id("corporate-project") version corporatePluginsVersion
// `apply false` means this plugin can be applied in a subproject
// without having to specify the version again
id("corporate-publishing") version corporatePluginsVersion apply false
// and so on...
}
And configures the Gradle wrapper to use our mirrored distribution, which, combined with the above, means that everything (Gradle, plugins, dependencies) comes via Nexus:
tasks {
"wrapper"(Wrapper::class) {
distributionUrl = "https://nexus.example.com/repository/gradle-distributions/gradle-4.7-bin.zip"
}
}
I was hoping to avoid the boilerplate in the settings files using @eskatos's suggestion of applying a script from a remote URL in settings.gradle.kts, i.e.
apply { from("https://nexus.example.com/repository/maven-public/com/example/gradle/corporate-settings/1.2.3/corporate-settings-1.2.3.kts") }
I even managed to generate a templated script (published alongside our plugins) that:
configured the plugin repos (as in the above settings script)
used a resolution strategy to apply the version of the plugins associated with the script if the requested plugin id was one of our plugins and the version wasn't supplied (so you can just apply them by id)
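That version-defaulting logic can be sketched in a settings script roughly like this (illustrative only; the id prefix and version are placeholders):
pluginManagement {
    resolutionStrategy {
        eachPlugin {
            // default our own plugins to the version this script was published with
            if (requested.id.id.startsWith("corporate-") && requested.version == null) {
                useVersion("1.2.3")
            }
        }
    }
}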
However, even though it removed the boilerplate, it meant our builds were reliant on having a connection to our Nexus repo, as it seems that even though scripts applied from a URL are cached, Gradle does a HEAD request anyway to check for changes. It also made it annoying to test plugin changes locally, as I had to point it manually at the script in my local maven directory. With my current config, I can simply publish the plugins to maven local and update the version in my project.
I'm quite happy with the current setup - I think it's far more obvious to developers now how the plugins are applied. And it's made it far easier to upgrade Gradle and our plugins independently now that there's no dependency between the two (and no custom gradle distribution required).
I've been doing something like this in my build:
buildscript {
project.apply {
from("${rootProject.projectDir}/sharedValues.gradle.kts")
}
val configureRepository: (Any) -> Unit by extra
configureRepository.invoke(repositories)
}
In my sharedValues.gradle.kts file I have code like this:
/**
* This method configures the repository handler to add all of the maven repos that your company relies upon.
* When trying to pull this method out of the [ExtraPropertiesExtension] use the following code:
*
* For Kotlin:
* ```kotlin
* val configureRepository : (Any) -> Unit by extra
* configureRepository.invoke(repositories)
* ```
* Any other casting will cause a compiler error.
*
* For Groovy:
* ```groovy
* def configureRepository = project.configureRepository
* configureRepository.invoke(repositories)
* ```
*
* @param repoHandler The RepositoryHandler to be configured with the company repositories.
*/
fun repositoryConfigurer(repoHandler : RepositoryHandler) {
repoHandler.apply {
// Do stuff here
}
}
var configureRepository : (RepositoryHandler) -> Unit by extra
configureRepository = this::repositoryConfigurer
I follow a similar pattern for configuring the resolution strategy for plugins.
The nice thing about this pattern is that anything you configure in sharedValues.gradle.kts can also be used from your buildSrc project, meaning that you can reuse repository declarations.
Updated:
You can apply another script from a URL, for example doing this:
apply {
// This was actually a plugin that I used at one point.
from("http://dl.bintray.com/shemnon/javafx-gradle/8.1.1/javafx.plugin")
}
Simply host the script that you want all your builds to share on some HTTP server (I would highly recommend using HTTPS so your build can't be targeted by a man-in-the-middle attack).
The downside of this is that, as far as I know, scripts applied from URLs are not cached, so they will be re-downloaded every time you run your build.
This may have been fixed by now, I'm not certain.
A solution offered to me by Stefan Oehme when I was having a similar problem was to vendor my own custom distribution of Gradle. According to him this is a common thing to do at large companies.
Simply create a custom fork of the Gradle repo, add your company's special sauce, and use this custom version of Gradle in every project.
I encountered a similar problem where common config was replicated in each and every project. I solved it with a custom Gradle distribution that has the common settings defined in an init script.
I created a Gradle plugin for preparing such custom distributions - custom-gradle-dist. It works perfectly for my projects; e.g. the build.gradle for a library project looks like this (this is the complete file):
dependencies {
compile 'org.springframework.kafka:spring-kafka'
}

Gradle, OSGI and dependency management

I'm new to Gradle, so please help me understand the following. I'm trying to build an OSGi web app with IntelliJ IDEA + Gradle. I've found that Gradle has an OSGi plugin, which is described here:
https://docs.gradle.org/current/userguide/osgi_plugin.html
But I have no idea how to add a dependency on, for example, org.apache.felix.dependencymanager, which is an OSGi bundle. I need this jar at compile time, but I don't need it in my resulting jar. I think I need something similar to Maven's 'provided' scope, or something like that.
P.S. Does anyone know what 'TBD' means in the Gradle documentation? Does it mean the feature has yet to be implemented, or that some mechanism is implemented but not yet described in the docs?
Please check out the plugin I wrote, osgi-run, which was designed to make it extremely easy to play with OSGi using just Gradle, without external tools like Eclipse (though osgi-run can generate a manifest file for you, which you can point your IDE at to get OSGi support - this is what I do with IntelliJ).
With osgi-run, you just add a dependency on whatever you want, as with any Java project... whether it should be provided by the environment or not does not matter at compile time; that is a deployment-time concern.
For example, add to your build.gradle file:
apply plugin: 'osgi' // or other OSGi plugin if you prefer
repositories {
mavenCentral() // add repos to get your dependencies from
}
dependencies {
compile "org.apache.felix:org.apache.felix.dependencymanager:4.3.0"
}
Note: the osgi plugin is just required to turn your jar into a bundle. osgi-run does not do that.
If you have any runtime dependencies that should be present in the OSGi environment but not in the compile classpath, do something like this:
dependencies {
...
osgiRuntime 'org.apache.felix:org.apache.felix.configadmin:1.8.8'
}
Now write some code, and once you're ready to run an OSGi container with your stuff in it, add these lines to the build.gradle file:
// this should be the first line
plugins {
id "com.athaydes.osgi-run" version "1.4.3"
}
...
// deployment to OSGi container config
runOsgi {
// which bundles do you want to add?
// transitive deps will be automatically added
bundles += project
// do not deploy jars matching these regexes (not needed, this is the default)
excludedBundles = ['org\\.osgi\\..*']
// make the manifest visible to the IDE for OSGi support
copyManifestTo file( 'auto-generated/MANIFEST.MF' )
}
Run:
gradle createOsgiRuntime
And find your full OSGi environment, ready to run, in the build/osgi directory.
Run it with:
build/osgi/run.sh # or run.bat in Windows
You can even run it during the build already:
gradle runOsgi
So you probably want to make your own provided configuration.
configurations {
// define new scope
provided
}
sourceSets {
// add the configurations to the compile classpath but not runtime
main.compileClasspath += configurations.provided
// be sure to add the provided configs to your tests too if needed
test.compileClasspath += configurations.provided
}
dependencies {
// declare your provided dependencies
provided 'org.apache.felix:org.apache.felix.dependencymanager:4.3.0'
}
Also, the suggestion above about using the bnd tool directly instead of the Gradle-provided osgi plugin is a good one. The Gradle plugin has many deficiencies and is really just a wrapper around bnd anyway. Also, the Gradle team has stated that they do not have the bandwidth or expertise to fix the osgi plugin [1].
[1] https://discuss.gradle.org/t/the-osgi-plugin-has-several-flaws/2546/5
