I'm trying to fetch a maven artifact's dependencies using aether. I see a RepositorySystem.collectDependencies(), but that fetches only compile and runtime scoped dependencies. How do I fetch all dependencies for the artifact, including test and provided?
Take a look at jcabi-aether (I'm one of its developers), which is a wrapper around Sonatype Aether:
File repo = this.session.getLocalRepository().getBasedir();
Collection<Artifact> deps = new Aether(this.getProject(), repo).resolve(
new DefaultArtifact("junit", "junit-dep", "", "jar", "4.10"),
JavaScopes.RUNTIME
);
Assuming you are using DefaultRepositorySystemSession you may do the following:
defaultRepositorySystemSession.setDependencySelector(new DependencySelector() {
    @Override
    public boolean selectDependency(Dependency dependency) {
        return true;
    }

    @Override
    public DependencySelector deriveChildSelector(DependencyCollectionContext context) {
        return this;
    }
});
and then
CollectResult result = repositorySystem.collectDependencies(defaultRepositorySystemSession, request);
Here is an example project that does this.
These three files:
https://github.com/terraframe/Runway-SDK/tree/v1.8.0/runwaysdk-server/src/main/java/com/runwaysdk/business/generation/maven
are a working, stand-alone example using Aether.
It worked for a few months, then an issue suddenly popped up where it would sometimes throw a DependencyResolutionException on com.sun:tools.jar on the Mac JRE.
Good luck if you decide to use it; I'm instead going to use the maven-dependency-plugin's dependency:build-classpath goal.
You can use a DependencyFilter in Eclipse Aether. A complete version of the sample below can be found in this awesome set of Aether snippets. To include every scope, pass all of the JavaScopes constants (COMPILE, PROVIDED, RUNTIME, TEST, SYSTEM) to classpathFilter.
DependencyFilter classpathFilter = DependencyFilterUtils.classpathFilter(JavaScopes.COMPILE, JavaScopes.PROVIDED);
CollectRequest collectRequest = new CollectRequest();
collectRequest.setRoot( new Dependency( artifact, JavaScopes.COMPILE ) );
collectRequest.setRepositories(repositories);
DependencyRequest dependencyRequest = new DependencyRequest( collectRequest, classpathFilter );
List<ArtifactResult> artifactResults =
system.resolveDependencies( session, dependencyRequest ).getArtifactResults();
UPDATE
Version 0.9.0M3 is not compatible with Maven 3.1.0, so don't use it inside Maven, i.e. in a plugin.
Take a look at this github project: https://github.com/Gemba/mvn-dd
It downloads all dependencies including test and provided.
It uses the Aether library to fetch them.
Related
I have a file called reference.conf that exists in many dependencies; it comes from Typesafe Config (https://github.com/lightbend/config).
I use Quarkus for my application.
When Quarkus builds the uber-jar it keeps only one of these files (the one from the last dependency that it parses).
How can I merge all these files into a single one?
Thanks to this commit in Quarkus, https://github.com/quarkusio/quarkus/commit/b3d3788ae92542d5fb39d89488890e16d64cec90 ("Introduce UberJarMergedResourceBuildItem and UberJarIgnoredResourceBuildItem"),
which is available as of release https://github.com/quarkusio/quarkus/releases/tag/1.13.4.Final, we can create an extension in Quarkus and use it to merge any resource we want.
Many thanks to George Gastaldi for this.
Creating the extension is quite easy; I did it for the first time for this feature.
c:\projects> mvn io.quarkus:quarkus-maven-plugin:1.13.4.Final:create-extension -N -DgroupId=myproject.quarkus -DextensionId=quarkus-files-extension -DwithoutTests
Done. The extension has been created with all the needed Maven projects, configuration, etc. Now edit the extension's processor file
c:\projects\quarkus-files-extension\deployment\src\main\java\myproject\quarkus\files\extension\deployment\QuarkusFilesExtensionProcessor.java
and add a build step that returns an UberJarMergedResourceBuildItem:
class QuarkusFilesExtensionProcessor {
    private static final String PATH = "reference.conf";

    @BuildStep
    UberJarMergedResourceBuildItem feature() {
        return new UberJarMergedResourceBuildItem(PATH);
    }
}
Build the Quarkus extension Maven project, i.e.
c:\projects\quarkus-files-extension> mvn clean install
Now add the extension to the project that needs the merging by executing
c:\project\mybigproject> mvn quarkus:add-extension -Dextensions="myproject.quarkus:quarkus-files-extension"
which simply adds the extension as a dependency in your pom.xml.
That's it. The next time you build the uber-jar in the project, it merges all the reference.conf files into a single one.
The only thing missing is a verbose message, which is not possible with the current version.
You can verify that it works either by checking the reference.conf in the uber-jar,
or by noticing that the following message is now missing from the build log:
[WARNING] [io.quarkus.deployment.pkg.steps.JarResultBuildStep] Dependencies with duplicate files detected. The dependencies [groupId1:artifact1::jar:1.0(compile), groupId2:artifact2::jar:1.0(compile)] contain duplicate files, e.g. reference.conf
They also offer a feature to ignore files with a certain name, with the BuildItem UberJarIgnoredResourceBuildItem. You create an extension exactly the same way.
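Why is simple concatenation enough for reference.conf? In HOCON, when a key appears more than once the later occurrence wins, so appending the files preserves every key while later dependencies override earlier ones. Here is a stdlib-only toy sketch of that duplicate-key rule (not the real Typesafe Config parser; just a flat "key = value" reader for illustration):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ToyHocon {
    // Toy parser for flat "key = value" lines; later duplicates win,
    // mirroring HOCON's duplicate-key rule.
    public static Map<String, String> parse(String text) {
        Map<String, String> result = new LinkedHashMap<>();
        for (String line : text.split("\n")) {
            int eq = line.indexOf('=');
            if (eq > 0) {
                result.put(line.substring(0, eq).trim(), line.substring(eq + 1).trim());
            }
        }
        return result;
    }

    // "Merging" = concatenating the files, then parsing the result.
    public static Map<String, String> merge(List<String> files) {
        return parse(String.join("\n", files));
    }
}
```

So merging `"a = 1\nb = 2"` with `"b = 3"` keeps `a = 1` and ends up with `b = 3`, which is the behavior you want from the uber-jar merge.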
The solution proposed by Andreas is fine for files that can be appended sequentially; however, that's not true for XML or JSON files, for example.
For these cases, it's preferable to perform the merge in a @BuildStep method in an extension and produce a GeneratedResourceBuildItem instead.
Here is an example using XmlCombiner:
@BuildStep
void uberJarMergedResourceBuildItem(BuildProducer<GeneratedResourceBuildItem> generatedResourcesProducer,
        PackageConfig packageConfig) {
    if (packageConfig.isUberJar()) {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try {
            XmlCombiner combiner = new XmlCombiner();
            List<URL> resources = Collections
                    .list(getClass().getClassLoader().getResources("META-INF/wsdl.plugin.xml"));
            for (URL resource : resources) {
                try (InputStream is = resource.openStream()) {
                    combiner.combine(is);
                }
            }
            combiner.buildDocument(baos);
        } catch (ParserConfigurationException | SAXException | TransformerException | IOException e) {
            e.printStackTrace();
        }
        generatedResourcesProducer.produce(new GeneratedResourceBuildItem("META-INF/wsdl.plugin.xml", baos.toByteArray()));
    }
}
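For intuition about what a combine step produces, here is a stdlib-only sketch that simply appends each document's root children under one root. This is not the XmlCombiner API (XmlCombiner does smarter matching, e.g. by id attributes); the merge helper here is a hypothetical illustration:

```java
import java.io.ByteArrayInputStream;
import java.io.StringWriter;
import java.nio.charset.StandardCharsets;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class NaiveXmlMerge {
    // Parse each XML string and append the children of every root element
    // under the first document's root, then serialize the result.
    public static String merge(List<String> xmls) throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        Document out = dbf.newDocumentBuilder()
                .parse(new ByteArrayInputStream(xmls.get(0).getBytes(StandardCharsets.UTF_8)));
        for (String xml : xmls.subList(1, xmls.size())) {
            Document doc = dbf.newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            NodeList children = doc.getDocumentElement().getChildNodes();
            for (int i = 0; i < children.getLength(); i++) {
                out.getDocumentElement().appendChild(out.importNode(children.item(i), true));
            }
        }
        StringWriter writer = new StringWriter();
        Transformer transformer = TransformerFactory.newInstance().newTransformer();
        transformer.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        transformer.transform(new DOMSource(out), new StreamResult(writer));
        return writer.toString();
    }
}
```

Merging `<r><a/></r>` with `<r><b/></r>` this way yields a single root containing both `<a/>` and `<b/>`.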
See a test in https://github.com/quarkusio/quarkus/pull/17199
I am very excited about Gradle's incubating version catalogs and have been experimenting with them. I've found that the information in my gradle/libs.versions.toml is accessible in the build.gradle.kts scripts for my app and utility-lib projects.
However, I am unable to use the content of the toml file for buildSrc/build.gradle.kts or the convention files.
The only way that I could build was to hard-code the dependencies into those files, as I did before the version catalog feature.
In the buildSrc folder, I created a settings.gradle.kts file and inserted the dependencyResolutionManagement code for versionCatalogs, which is pointing to the same file as for my app and utility-lib projects.
Based on the Gradle 7 docs, it seems that sharing a version catalog with buildSrc and modules is possible… I'd appreciate a nudge toward getting it to work with buildSrc, if possible.
Here is a simple sample project, which I created via gradle init: my-version-catalog
Thank you for your time and help,
Mike
With Gradle 7.3.3, it is possible. Note that version catalogs have been GA since Gradle 7.4.
The code snippet assumes Gradle is at least 7.4; if you need this prior to that version, insert enableFeaturePreview("VERSION_CATALOGS") at the beginning of each settings.gradle.kts.
Using buildSrc
buildSrc/settings.gradle.kts
dependencyResolutionManagement {
versionCatalogs {
create("libs") {
from(files("../gradle/libs.versions.toml"))
}
}
}
buildSrc/build.gradle.kts
dependencies {
implementation(libs.gradleplugin.intellij) // <- the lib reference
}
You can even use the version catalog for plugins
gradle/libs.versions.toml
...
[plugins]
kotlin-jvm = { id = "org.jetbrains.kotlin.jvm", version.ref = "kotlin" }
jetbrains-changelog = { id = "org.jetbrains.changelog", version.ref = "changelog-plugin" }
jetbrains-intellij = { id = "org.jetbrains.intellij", version.ref = "intellij-plugin" }
hierynomus-license = { id = "com.github.hierynomus.license", version.ref = "license-plugin" }
nebula-integtest = { id = "nebula.integtest", version.ref = "nebula-integtest-plugin" }
build.gradle.kts
plugins {
id("java")
alias(libs.plugins.kotlin.jvm)
alias(libs.plugins.nebula.integtest)
alias(libs.plugins.jetbrains.intellij)
alias(libs.plugins.jetbrains.changelog)
alias(libs.plugins.hierynomus.license)
}
Note: for accessing the catalog within scripts, please refer to the section below; the trick is the same.
Using convention plugins and included build
In the main project, include the Gradle project that holds the convention plugins.
build.gradle.kts
includeBuild("convention-plugins") // here it's a subfolder
convention-plugins/settings.gradle.kts
dependencyResolutionManagement {
repositories {
gradlePluginPortal()
}
versionCatalogs {
create("libs") {
from(files("../gradle/libs.versions.toml"))
}
}
}
rootProject.name = "convention-plugins"
The trick that enables convention plugins to access the version catalog comes in two parts. First, add an ugly implementation dependency that locates the version catalog's generated classes:
libs.javaClass.superclass.protectionDomain.codeSource.location
Then in the convention plugin refer to the libs extension via Project::the.
val libs = the<LibrariesForLibs>()
This is tracked by gradle/gradle#15383.
convention-plugins/build.gradle.kts
plugins {
`kotlin-dsl`
}
dependencies {
implementation(libs.gradleplugin.kotlin.jvm)
// https://github.com/gradle/gradle/issues/15383
implementation(files(libs.javaClass.superclass.protectionDomain.codeSource.location))
}
And in the actual convention plugin
import org.gradle.accessors.dm.LibrariesForLibs
plugins {
id("org.jetbrains.kotlin.jvm")
}
// https://github.com/gradle/gradle/issues/15383
val libs = the<LibrariesForLibs>()
dependencies {
detektPlugins(libs.bundles.kotlinStuff) // access catalog entries
}
The org.gradle.accessors.dm.LibrariesForLibs class is generated by Gradle somewhere in the local Gradle folder: ./gradle/<version>/dependency-accessors/<hash>/classes
Quick note: older IntelliJ IDEA (as of 2022.3) reports alias(libs.gradleplugin.thePlugin) as an error in the editor,
although the dependencies are correctly resolved.
This is tracked by KTIJ-19369; the ticket indicates it is actually a bug in the Gradle Kotlin DSL (gradle/gradle#22797), and someone made a simple IntelliJ IDEA plugin to hide this error until it's resolved.
Brice, it looks like a can of worms to go down that path, particularly in my situation, where I'm trying to use a libs.versions.toml file from an Android project while the custom plugin is, of course, a Java/Kotlin project. I tried creating the libs file by hardwiring the path to the toml file in the custom plugin. It might work if both were Java projects, but I never tried that, since that's not what I'm after. The ideal solution would be for the plugin to use the libs file from the project it is applied to, but it looks like the version catalog needs to be created in the settings file, before you even have access to a Project; that's why you would have to hardwire the path.
Short answer: no, but there are other techniques for a custom plugin to get project version data from the project it is applied to.
Say I'm using the palantir/gradle-git-version Gradle plugin, and have the following code in build.gradle.kts to determine the project version:
// If release branch, return after incrementing patch version.
// Else, return $lastTag-SNAPSHOT.
val projectVersion: String by lazy {
val versionDetails: groovy.lang.Closure<VersionDetails> by extra
with(versionDetails()) {
if (!lastTag.matches("^(?:(?:\\d+\\.){2}\\d+)\$".toRegex())) {
throw GradleException("Tag '$lastTag' doesn't match 'MAJOR.MINOR.PATCH' format")
}
// If in detached state, get branch name from GitLab CI env var
val branch = branchName ?: System.getenv("CI_COMMIT_REF_NAME")
if (branch?.startsWith("release/") == true) {
val tokens = lastTag.split('.')
"${tokens[0]}.${tokens[1]}.${tokens[2].toInt() + commitDistance}"
} else "$lastTag-SNAPSHOT"
}
}
This works, but the code is duplicated across all the projects, which is difficult to maintain except for a very small number of projects.
This is just one example; the same applies to other Gradle tasks that assume certain conventions within the company/team, like creating a Dockerfile.
What is a good way to centralize such code so that all projects can use them? Note that code like this doesn't usually stand on its own, but relies on Gradle plugins.
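For reference, the version-derivation rule described above boils down to the following plain-Java restatement, where lastTag, branch, and commitDistance are hypothetical stand-ins for the values that the palantir git-version plugin exposes via versionDetails():

```java
public class VersionLogic {
    // If on a release branch: bump the patch component by the commit distance.
    // Otherwise: return "<lastTag>-SNAPSHOT".
    public static String projectVersion(String lastTag, String branch, int commitDistance) {
        if (!lastTag.matches("^(?:\\d+\\.){2}\\d+$")) {
            throw new IllegalArgumentException("Tag '" + lastTag + "' doesn't match 'MAJOR.MINOR.PATCH' format");
        }
        if (branch != null && branch.startsWith("release/")) {
            String[] t = lastTag.split("\\.");
            return t[0] + "." + t[1] + "." + (Integer.parseInt(t[2]) + commitDistance);
        }
        return lastTag + "-SNAPSHOT";
    }
}
```

For example, projectVersion("1.2.3", "release/1.2", 4) yields "1.2.7", while projectVersion("1.2.3", "main", 4) yields "1.2.3-SNAPSHOT".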
What is a good way to centralize such code so that all projects can use them?
You'll want to create a custom Gradle plugin to hold your project's conventions.
If you have Gradle installed locally, you can use the Build Init Plugin to create a skeleton plugin project: simply run gradle init in a new project directory and follow the prompts.
As a concrete example (assuming you generated a plugin project as mentioned earlier), to apply your versioning conventions, a plugin could be:
// Plugin's build.gradle.kts
dependencies {
// Add dependency for plugin, GAV can be found on the plugins page:
// https://plugins.gradle.org/plugin/com.palantir.git-version
implementation("com.palantir.gradle.gitversion:gradle-git-version:0.12.3")
}
Then a versioning conventions plugin could be:
import com.palantir.gradle.gitversion.VersionDetails
import groovy.lang.Closure
import org.gradle.api.GradleException
import org.gradle.api.Plugin
import org.gradle.api.Project
class VersioningConventionsPlugin : Plugin<Project> {
override fun apply(project: Project) {
// Apply plugin to project as you would in the main Gradle build file.
project.pluginManager.apply("com.palantir.git-version")
// Configure version conventions
val projectVersion: String by lazy {
// Gradle generates some Kotlin DSL code on the fly, in a plugin implementation we don't have that.
// So we must convert the DSL to the Gradle API.
val versionDetails: Closure<VersionDetails> = project.extensions.extraProperties.get("versionDetails") as Closure<VersionDetails>
with(versionDetails.call()) {
if (!lastTag.matches("^(?:(?:\\d+\\.){2}\\d+)\$".toRegex())) {
throw GradleException("Tag '$lastTag' doesn't match 'MAJOR.MINOR.PATCH' format")
}
val branch = branchName ?: System.getenv("CI_COMMIT_REF_NAME")
if (branch?.startsWith("release/") == true) {
val tokens = lastTag.split('.')
"${tokens[0]}.${tokens[1]}.${tokens[2].toInt() + commitDistance}"
} else "$lastTag-SNAPSHOT"
}
}
// Set the version as an extra property on the project
// Accessible via extra["projectVersion"]
project.extensions.extraProperties["projectVersion"] = projectVersion
}
}
I gave a Kotlin example since your sample used the Kotlin DSL. Once you've finished development work of your conventions plugin, then you would publish to a repository such as the Gradle Plugins repository. If it's an internal company plugin, then publish it to an internal Nexus Repository or similar.
Follow the docs for the maven-publish plugin for more details on publishing. Gradle plugins can be published like any other artifact/JAR.
I'm considering moving a project from boot to Gradle with clojurephant hoping to leverage more of the Gradle ecosystem. This project builds one large uberjar that contains a Clojure project with Ring and Jetty that in turn ships a ClojureScript app built with re-frame.
In boot, I essentially just required boot-cljs, added
(boot-cljs/cljs :optimizations :advanced)
to my build task, which also calls (pom), (aot) and (uber) (all standard boot tasks), and everything worked smoothly.
With Clojurephant, I find that the Clojure and ClojureScript parts end up in different subdirectories. In particular, I find the following underneath build:
clojure/main
clojurescript/main
resources/main (essentially a copy of my resources project folder)
Adding to my confusion, these paths don't translate in any way that I can see to the structure of the uberjar that Gradle builds using the shadow plugin.
Some excerpts from my build.gradle:
plugins {
id 'dev.clojurephant.clojure' version '0.5.0'
id 'dev.clojurephant.clojurescript' version '0.5.0'
id 'application'
id 'com.github.johnrengelman.shadow' version '5.0.0'
id 'maven-publish'
id 'distribution'
id 'com.meiuwa.gradle.sass' version '2.0.0'
}
// ...
clojure {
builds {
main {
aotAll()
}
}
}
// ...
clojurescript {
builds {
all {
compiler {
outputTo = 'public/app.js'
outputDir = 'public/js/out'
main = 'com.example.mycljsstuff'
assetPath = 'js/out'
}
}
main {
compiler {
optimizations = 'advanced'
sourceMap = 'public/app.js.map'
}
}
dev {
compiler {
optimizations = 'none'
preloads = ['com.example.mycljsstuff']
}
}
}
}
EDIT: Forgot to mention that for boot I configure the init function to start loading the CLJS code in a file called app.cljs.edn. With Clojurephant I only found a way to set a main namespace, not a function.
My question here is ultimately, how can I configure the ClojureScript build so that it works in the end when being delivered from the Uberjar?
The Clojure things seem to work. Ring and Jetty run and happily deliver a first static webpage. But all the CLJS/JS things can't be found.
I'd be super happy to just receive some pointers to other projects where I can learn, documentation, or tutorials. I haven't found a lot and then got lost in understanding the code of Clojurephant itself.
A year ago at work I was able to split up a combined CLJ & CLJS (backend/frontend) project into 2 separate projects: pure Clojure for the backend, and pure ClojureScript for the frontend. This resolved many, many problems we had and I would never try to keep two codebases in the same project again.
The backend CLJ part continued to use Lein as the build tool. It's not perfect but it is well understood.
The frontend CLJS part was transitioned from the original Figwheel (aka "1.0") to the newer Figwheel-Main (aka "2.0"). Following the lead from figwheel.org
we chose to restructure the build into using Deps/CLI (the original combined project used Lein for everything). The transition away from Lein to Deps/CLI was a real winner for CLJS work.
While Deps/CLI works great for pure Clojure code, be aware that it does not natively support the inclusion of Java source code. I have a template project
you can clone that shows a simple workaround for this.
For any CLJS project, I highly recommend the use of Figwheel-Main over the original Figwheel as it is a major "2.0" type of upgrade, and will make your life much, much better.
Enjoy!
Trying to answer my own question here, since I managed to get it running.
Clojure
plugins {
id 'dev.clojurephant.clojure' version '0.5.0'
id 'application'
id 'com.github.johnrengelman.shadow' version '5.0.0'
// ... more to come further down
}
group = 'com.example'
version = '1.0.0-SNAPSHOT'
targetCompatibility = 1.8
mainClassName = 'com.example.myproject'
dependencies {
implementation(
'org.clojure:clojure:1.10.1',
'ring:ring:1.8.0',
// and many more
)
testImplementation(
'junit:junit:4.12',
'clj-http-fake:clj-http-fake:1.0.3',
'ring:ring-mock:0.4.0'
)
devImplementation(
'org.clojure:tools.namespace:0.3.0-alpha4',
'cider:cider-nrepl:0.21.1',
'org.clojure:java.classpath',
'jonase:eastwood:0.3.11',
'lein-bikeshed:lein-bikeshed:0.5.2'
)
}
clojure {
builds {
main {
aotAll()
}
}
}
clojureRepl {
handler = 'cider.nrepl/cider-nrepl-handler'
}
This is enough to get an executable JAR running -main from com.example.myproject namespace when calling ./gradlew shadowJar from the command line. Not sure if the application plugin is relevant here. Also, ./gradlew clojureRepl spins up an nrepl that Emacs/Cider can connect to.
ClojureScript
// Merge with plugins above. Reproduced only the CLJS relevant
// part here
plugins {
id 'dev.clojurephant.clojurescript' version '0.5.0'
}
// Again, merge with dependencies above
dependencies {
implementation(
// ....
'org.clojure:clojurescript:1.10.597',
're-frame:re-frame:0.10.5',
'reagent:reagent:0.7.0',
// and a few more
)
}
clojurescript {
builds {
all {
compiler {
outputTo = 'public/app.js'
outputDir = 'public/state/'
main = 'com.example.myproject.webui'
assetPath = 'status/app.out'
}
}
main {
compiler {
optimizations = 'advanced'
sourceMap = 'public/app.js.map'
}
}
dev {
compiler {
optimizations = 'none'
preloads = ['com.example.myproject.webui']
}
}
}
}
This creates a /public folder at the top level of the JAR, and app.js inside that folder, which is where the HTML file delivered by Ring expects it.
One important step for me was to call my init CLJS function in the CLJS file, which was taken care of by some other component before. I'm not sure that this setup is totally correct and will do the Figwheel setup eventually. Maybe the call to init will not be necessary then.
CSS
// Merge ....
plugins {
id 'com.meiuwa.gradle.sass' version '2.0.0'
}
sassCompile {
output = file("$buildDir/resources/main/public/")
source = fileTree("${rootDir}/src/main/resources/public/")
include("**/*.scss")
exclude("**/_*.sass", "**/_*.scss")
}
This compiles my app.scss to app.css in the right spot, where my HTML file searches for it.
Pros & Cons
After migrating, I get for free:
Faster compilation locally and in CI after setting up the caches correctly.
License and OWASP dep check reports by using the plugins com.github.jk1.dependency-license-report and org.owasp.dependencycheck, for which equivalents exist in Leiningen but not Boot (AFAIK).
Maven publishing with authentication via HTTP header instead of username/password which is not available in boot.
On the downside:
Yucky syntax in my build file. No, really.
I have to enter the nrepl port number in Emacs manually.
I am creating a Maven plugin with a rather unique requirement for proper operation: it needs to spawn new processes of itself and then wait for those processes to complete a task.
While this is relatively trivial to do on the command line, Maven plugins do not get invoked in the same manner as traditional Java code and thus there is no classpath. I cannot figure out how to resolve the correct classpath inside the plugin such that I can spawn a new JVM (invoking the Main method of another class inside the plugin).
Using the current artifact's MavenProject I am able to get an Artifact reference to myself (the plugin) and get its relative directory inside the local Maven repository:
Artifact self = null;
for (Artifact artifact : project.getPluginArtifacts()) {
if ("my-group-id".equals(artifact.getGroupId()) && "my-artifact-id".equals(artifact.getArtifactId())) {
self = artifact;
break;
}
}
if (self == null) {
throw new MojoExecutionException("Could not find representation of this plugin in project.");
}
for (ArtifactRepository artifactRepository : project.getPluginArtifactRepositories()) {
String path = artifactRepository.pathOf(self);
if (path != null) {
getLog().info("relative path to self: " + path);
break;
}
}
How do I get a reference to all of its dependencies (and transitive dependencies) such that I can construct a full classpath for a new invocation? I see that self has a dependency filter but I don't know where to apply it.
Is this the proper way to create a new process of "myself" inside a plugin? Is there a better way?
I found a great article on the differences between dependency resolution in Maven 2 and Maven 3.
Given an Artifact it boils down to the following:
private Set<Artifact> getDependenciesForArtifact(Artifact artifact) {
ArtifactResolutionRequest arr = new ArtifactResolutionRequest()
.setArtifact(artifact)
.setResolveTransitively(true)
.setLocalRepository(local);
return repositorySystem.resolve(arr).getArtifacts();
}
With the Set<Artifact> you can construct a classpath by calling pathOf on an ArtifactRepository for each element and joining the results with File.pathSeparator.
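Joining the resolved paths and spawning the new JVM can look like the sketch below. The artifact paths and the com.example.HelperMain class are hypothetical placeholders; in the real mojo the paths come from artifactRepository.pathOf(artifact) for each resolved Artifact:

```java
import java.io.File;
import java.util.List;

public class ForkSelf {
    // Join resolved artifact paths into a single classpath string.
    public static String buildClasspath(List<String> artifactPaths) {
        return String.join(File.pathSeparator, artifactPaths);
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical resolved paths standing in for pathOf(...) results.
        String classpath = buildClasspath(List.of(
                "/repo/my/group/my-plugin/1.0/my-plugin-1.0.jar",
                "/repo/junit/junit/4.12/junit-4.12.jar"));
        // Use the same JVM binary that is running the current process.
        String java = System.getProperty("java.home") + File.separator + "bin" + File.separator + "java";
        ProcessBuilder pb = new ProcessBuilder(java, "-cp", classpath, "com.example.HelperMain")
                .inheritIO();
        // pb.start().waitFor(); // uncomment to actually fork and wait
    }
}
```

Resolving the JVM binary via java.home keeps the child process on the same Java version as the parent plugin.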
Hm, not really an answer, but some hints. Why do you need such a complex thing? I would take a deep look at the maven-surefire-plugin, which can fork a JVM for unit tests and handles the classpath. You could also look at maven-invoker or the maven-invoker-plugin, which can fork Maven completely. Ah, one thing I missed: take a look at the maven-dependency-plugin, which has a dedicated goal for creating the classpath; you can look at its sources to see how they construct it.