I am creating a Maven plugin with a rather unique requirement for proper operation: it needs to spawn new processes of itself and then wait for those processes to complete a task.
While this is relatively trivial to do on the command line, Maven plugins do not get invoked in the same manner as traditional Java code and thus there is no classpath. I cannot figure out how to resolve the correct classpath inside the plugin such that I can spawn a new JVM (invoking the Main method of another class inside the plugin).
Using the current MavenProject I am able to get an Artifact reference to myself (the plugin) and its relative path inside the local Maven repository:
Artifact self = null;
for (Artifact artifact : project.getPluginArtifacts()) {
    if ("my-group-id".equals(artifact.getGroupId()) && "my-artifact-id".equals(artifact.getArtifactId())) {
        self = artifact;
        break;
    }
}
if (self == null) {
    throw new MojoExecutionException("Could not find representation of this plugin in project.");
}
for (ArtifactRepository artifactRepository : project.getPluginArtifactRepositories()) {
    String path = artifactRepository.pathOf(self);
    if (path != null) {
        getLog().info("relative path to self: " + path);
        break;
    }
}
How do I get a reference to all of its dependencies (and transitive dependencies) such that I can construct a full classpath for a new invocation? I see that self has a dependency filter but I don't know where to apply it.
Is this the proper way to create a new process of "myself" inside a plugin? Is there a better way?
I found a great article on the differences in dependency resolution between Maven 2 and Maven 3.
Given an Artifact it boils down to the following:
private Set<Artifact> getDependenciesForArtifact(Artifact artifact) {
    ArtifactResolutionRequest arr = new ArtifactResolutionRequest()
            .setArtifact(artifact)
            .setResolveTransitively(true)
            .setLocalRepository(local);
    return repositorySystem.resolve(arr).getArtifacts();
}
With the Set you can construct a classpath string by calling pathOf on an ArtifactRepository for each element and joining the results with File.pathSeparator.
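For the original goal of forking a new JVM, a minimal sketch (my own illustration, not part of the linked article) would join those paths and hand them to a ProcessBuilder. Here local is the local ArtifactRepository and com.example.ForkedMain is a hypothetical main class inside the plugin:
private void forkSelf(Set<Artifact> artifacts, ArtifactRepository local) throws IOException {
    // Build an absolute classpath entry for every resolved artifact
    StringBuilder classpath = new StringBuilder();
    for (Artifact dependency : artifacts) {
        if (classpath.length() > 0) {
            classpath.append(File.pathSeparator);
        }
        classpath.append(new File(local.getBasedir(), local.pathOf(dependency)).getAbsolutePath());
    }
    // Launch the current JRE's java executable with that classpath
    String java = new File(new File(System.getProperty("java.home"), "bin"), "java").getAbsolutePath();
    new ProcessBuilder(java, "-cp", classpath.toString(), "com.example.ForkedMain")
            .inheritIO()
            .start();
}
ProcessBuilder#start returns a Process, so the plugin can call waitFor() on it to block until the forked task has completed.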
Hm. Not really an answer, but some hints. Why do you need such a complex thing? Furthermore, I would take a deep look into the maven-surefire-plugin, which can fork a JVM for unit tests and handle the classpath. On the other hand, you can take a look at the maven-invoker or the maven-invoker-plugin, which can fork Maven completely. Ah, one thing I missed: take a look at the maven-dependency-plugin, which has a particular goal for creating the classpath (build-classpath); its sources show how they construct the classpath.
Related
We have a huge monolith application which is built by multiple tools (shell scripts, Ant and Maven). The build process is quite complex:
a lot of manual steps
hidden dependencies between Ant targets
different steps must be executed depending on the operating system used
We decided to simplify it by creating Gradle scripts which wrap all this logic (it is quite impossible to fix it, so we created a wrapper which standardizes the way of executing all the logic). We have to download some files from the Maven repository, but we cannot use the dependencies syntax:
we don't always need to download all the files
the versions of the downloaded artifacts are dynamic (they depend on configuration located in a completely different place)
we need a path to the downloaded files (e.g. we have to unpack an artifact distributed as zip)
How can we achieve this?
The easiest way to achieve this is to create a dynamic configuration with dependencies and then resolve it. The resolve method returns the paths of the dependencies on the local disk. It is important to use a unique name for every configuration; otherwise, executing the logic twice would fail (a configuration with a given name cannot be overwritten).
Here is an example method which returns the path to an artifact. If the artifact is already available in the Gradle cache it won't be downloaded a second time, but the path will still be returned. In this example all artifacts are downloaded from Maven Central.
Method:
ext.resolveArtifact = { CharSequence identifier ->
    def configurationName = "resolveArtifact-${UUID.randomUUID()}"
    return rootProject.with {
        configurations.create(configurationName)
        dependencies.add(configurationName, identifier)
        return configurations.getByName(configurationName, {
            repositories {
                mavenCentral()
            }
        }).resolve()[0]
    }
}
Usage:
def jaCoCoZip = resolveArtifact('org.jacoco:jacoco:0.8.6')
def jaCoCoAgent = resolveArtifact('org.jacoco:org.jacoco.agent:0.8.6')
I'm currently writing a small plugin, but I'm stuck when I want to get a list of all dependencies that are used.
What I'm doing
Inside the plugin I create a new configuration
def config = project.configurations.create(sourceSet.getTaskName('foo', 'bar'))
In the build.gradle that uses the plugin I add some dependencies to this configuration
dependencies {
    fooTestBar(project(':module'))
}
and in module/build.gradle I have
plugins {
    id 'java-library'
}
dependencies {
    implementation('org.apache.commons:commons-collections4:4.4')
    api('org.springframework:spring-context:5.2.11.RELEASE')
}
When I now do the following in the plugin
List<ResolvedArtifact> artifacts = config.resolvedConfiguration.firstLevelModuleDependencies.allModuleArtifacts.flatten()
I get the artifacts from both declarations in :module, but what I'm interested in is only the api dependency, i.e. the one that is also used when compiling against the project.
It looks like the entire configuration is treated as a runtime configuration, so I get all artifacts, including the transitive ones, from both declarations, instead of only the api one plus its transitive dependencies.
Until now I was not able to find any way to tell whether a resolved dependency / artifact comes from api or from implementation, so I cannot filter out the ones I do not want in my result list.
I had to add the Usage attribute to the configuration:
config.attributes {
    attribute(Usage.USAGE_ATTRIBUTE, objects.named(Usage, Usage.JAVA_API))
}
https://docs.gradle.org/current/userguide/variant_model.html
https://docs.gradle.org/current/userguide/variant_attributes.html
Thanks to Chris Doré at https://discuss.gradle.org/t/custom-configuration-and-resolving-only-compile-dependencies/38891
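If the plugin is written in Java instead of Groovy, the same idea looks roughly like this (a sketch under the assumption that project and sourceSet are available inside the plugin; this is not code from the forum thread):
// Request the API variant so resolution only yields api dependencies (plus their transitives)
Configuration config = project.getConfigurations().create(sourceSet.getTaskName("foo", "bar"));
config.getAttributes().attribute(
        Usage.USAGE_ATTRIBUTE,
        project.getObjects().named(Usage.class, Usage.JAVA_API));

// Collect the resolved artifacts, mirroring the flatten() call from the question
Set<ResolvedArtifact> apiArtifacts = new LinkedHashSet<>();
for (ResolvedDependency dependency : config.getResolvedConfiguration().getFirstLevelModuleDependencies()) {
    apiArtifacts.addAll(dependency.getAllModuleArtifacts());
}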
This article describes an interesting feature of Gradle 4.10+ called a source dependency:
https://blog.gradle.org/introducing-source-dependencies
It allows you to use a Git (for example GitHub) source code repository to build a dependency from. However, it seems to support only Gradle projects as source dependencies. Is it possible to use a Maven project as well? If so, please show an example.
When I try to use this feature with a Maven project, Gradle tries to find a build.gradle file there anyway (I can see this when running Gradle with the --info option) and fails with an error message like:
Git repository at https://github.com/something/something.git did not contain a project publishing the specified dependency.
The short answer
... is: "no".
Under the hood, source dependencies are composite builds. These need to be Gradle projects, as the external projects are sort of merged with the main project.
The long answer
... is: "yes but it is hard".
It is actually mentioned in the same blog post you linked to (emphasis mine):
Source dependencies make these use cases simpler to implement. Gradle takes care of automatically checking out the correct versions of dependencies, making sure the binaries are built when required. It does this everywhere that the build is run. The checked out project doesn’t even need to have an existing Gradle build. This example shows a Gradle build consuming two source dependencies that have no build system by injecting a Gradle build via plugins. The injected configuration could do anything a regular Gradle plugin can do, such as wrapping an existing CMake or Maven build.
Because it sounded like it wasn't the biggest thing in the world to create a bridge between a Maven and a Gradle project in source dependencies, I gave it a shot. And I have it working, except for transitive dependencies. You basically need to do what is shown in the examples linked to above, but instead of building native libraries, you make a call out to Maven (e.g. using a Maven plugin for Gradle).
However, the scripts I ended up with are complex enough that I would suggest you instead build the Maven project yourself, deploy it to a local Maven repository and then add that repository to the Gradle project.
The loooooooong answer
Alright, so here is how to actually do it. The feature is poorly documented, and appears to be mostly targeted towards native projects (like C++ or Swift).
Root project setup
Take a normal Gradle project with the Java plugin applied. I did a "gradle init" in an empty folder. Assume that in this project, you are depending on a library called `org.example:my-maven-project` that you later want to include as a source dependency:
// [root]/build.gradle
dependencies {
    implementation 'org.example:my-maven-project:1.1'
}
Note that the version number defined here must match a Git tag in the repository. This is the code revision that will be checked out.
Then in the settings file, we define a source dependency mapping for it:
// [root]/settings.gradle
rootProject.name = 'my-project'

includeBuild('plugins') // [1]

sourceControl {
    gitRepository("https://github.com/jitpack/maven-simple") { // [2]
        producesModule("org.example:my-maven-project") // [3]
        plugins {
            id "external-maven-build" // [4]
        }
    }
}
[1]: This includes a Gradle project called plugins that will be explained later.
[2]: This is just an arbitrary Maven project that I found, which was relatively simple. Substitute with the actual repository you have.
[3]: This is the name of the Maven module (the same as in the dependency block) that we are defining a source build for
[4]: This defines a custom settings plugin called external-maven-build that is defined in the plugins project, which will be explained later.
Plugins project structure
Inside the root project, we define a new Gradle project. Again, you can use gradle init to initialize it as a Groovy (or whatever you like) project. Delete all generated sources and tests.
// [root]/plugins/settings.gradle
// Empty, but used to mark this as a stand-alone project (and not part of a multi-build)
// [root]/plugins/build.gradle
plugins {
    id 'groovy'
    id 'java-gradle-plugin' // [1]
}

repositories {
    gradlePluginPortal() // [2]
}

dependencies {
    implementation "gradle.plugin.com.github.dkorotych.gradle.maven.exec:gradle-maven-exec-plugin:2.2.1" // [3]
}

gradlePlugin {
    plugins {
        "external-maven-build" { // [4]
            id = "external-maven-build"
            implementationClass = "org.example.ExternalMavenBuilder"
        }
    }
}
[1]: In this project, we are defining a new Gradle plugin. This is a standard way to do that.
[2]: To invoke Maven, I am using another 3rd party plugin, so we need to add the Gradle plugin portal as a repository.
[3]: This is the plugin used to invoke Maven. I am not too familiar with it, and I don't know how production-ready it is. One thing I noticed is that it does not model inputs and outputs, so there is no built-in support for up-to-date checking. But this can be added retrospectively.
[4]: This defines the custom plugin. Notice that it has the same ID as used in the settings file in the root project.
Plugin implementation class
Now comes the fun stuff. I chose to do it in Groovy, but it can be done in any supported JVM language, of course.
The plugin structure is just like any other Gradle plugin. One thing to note is that it is a Settings plugin, whereas you normally write Project plugins. This is needed because we are basically defining a Gradle project at run-time, which has to be done as part of the initialization phase.
// [root]/plugins/src/main/groovy/org/example/ExternalMavenBuilder.groovy
package org.example

import com.github.dkorotych.gradle.maven.exec.MavenExec
import org.gradle.api.Plugin
import org.gradle.api.artifacts.ConfigurablePublishArtifact
import org.gradle.api.initialization.Settings

class ExternalMavenBuilder implements Plugin<Settings> {
    void apply(Settings settings) {
        settings.with {
            rootProject.name = 'my-maven-project' // [1]
            gradle.rootProject {
                group = "org.example" // [2]
                pluginManager.apply("base") // [3]
                pluginManager.apply("com.github.dkorotych.gradle-maven-exec") // [4]
                def mavenBuild = tasks.register("mavenBuild", MavenExec) {
                    goals('clean', 'package') // [5]
                }
                artifacts.add("default", file("$projectDir/target/maven-simple-0.2-SNAPSHOT.jar")) { ConfigurablePublishArtifact a ->
                    a.builtBy(mavenBuild) // [6]
                }
            }
        }
    }
}
[1]: Must match the Maven module name
[2]: Must match the Maven module group
[3]: Defines tasks like "build" and "clean"
[4]: The 3rd party plugin that makes it easier to invoke Maven
[5]: For options, see https://github.com/dkorotych/gradle-maven-exec-plugin
[6]: Adds the Maven output as an artifact in the "default" configuration
Be aware that it does not model transitive dependencies, and it is never up-to-date due to missing inputs and outputs.
This is as far as I got with a few hours of playing around with it. I think it can be generalized into a generic plugin published to the Gradle portal. But I think I have too much on my plate as it is already. If anyone would like to continue on from here, you have my blessing :)
Currently I am writing a Gradle plugin and I need to add and download a Maven dependency programmatically in a given task.
I evaluated DependencyHandler and ArtifactResolutionQuery, but I can't figure out where and how to add a Dependency and resolve it against the mavenCentral() repository.
Similar code for Maven looks rather easy:
Artifact artifact = artifactFactory.createArtifactWithClassifier(groupId, artifactId, version, type, classifier);
artifactResolver.resolve(artifact, remoteRepositories, localRepository);
So I guess/hope there is a similarly easy way in Gradle and I am just not seeing it.
Regards
Mathias
Update 1:
So here is some of the stuff I tried, wildly copy-and-pasted from different attempts. It is worth saying that the dependency I want to download has the classifier zip, so normally in my build.gradle I write
compile 'group:artifact:version@zip'
to get the file
ComponentIdentifier componentIdentifier = new DefaultModuleComponentIdentifier("com.sap.cloud",
        "neo-java-web-sdk", "3.39.10");
System.out.println("CompIdentifier = " + componentIdentifier.getDisplayName());
//getProject().getDependencies().add("compile", componentIdentifier.getDisplayName());

Configuration configuration = getProject().getConfigurations().getByName("compile");
org.gradle.api.artifacts.Dependency dep2 = new DefaultExternalModuleDependency("com.sap.cloud",
        "neo-java-web-sdk", "3.39.10");
boolean depList = configuration.getDependencies().add(dep2);

configuration.forEach(file -> {
    getProject().getLogger().lifecycle("Found project dependency # " + file.getAbsolutePath());
});
Set<File> files = configuration.resolve();
for (File file2 : files) {
    System.out.println("Files: " + file2.getName());
}

DependencyHandler dep = getProject().getDependencies();
ComponentModuleMetadataHandler modules = dep.getModules();

ArtifactResolutionQuery a = getProject().getDependencies().createArtifactResolutionQuery()
        .forComponents(componentIdentifier).withArtifacts(MavenModule.class, SourcesArtifact.class);
ArtifactResolutionResult r = a.execute();
Set<ComponentArtifactsResult> set = r.getResolvedComponents();
Set<ComponentResult> c = r.getComponents();
I think the simplest way to download a dependency programmatically in a Gradle plugin is the same as doing it in a build script: just create a new configuration, add your dependency and resolve the configuration. The example below shows how this works in Java (the preferred language for Gradle plugins):
Configuration config = project.getConfigurations().create("download");
config.setTransitive(false); // if required
project.getDependencies().add(config.getName(), "com.sap.cloud:neo-java-web-sdk:3.39.10@zip");
File file = config.getSingleFile();
For this example, the name of the configuration ("download") can be any string not already used as a configuration name (like compile or runtime). Since the configuration will be resolved afterwards, you must use a different name whenever you reuse this code snippet (or if you call it multiple times).
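If the snippet may run more than once, one option (a sketch of mine, not part of the original answer) is to wrap it in a helper that derives a unique configuration name per call, similar to the UUID trick in the Gradle answer above; downloadArtifact is a hypothetical helper name:
private File downloadArtifact(Project project, String dependencyNotation) {
    // A unique name ensures repeated calls never clash with an existing configuration
    Configuration config = project.getConfigurations().create("download-" + UUID.randomUUID());
    config.setTransitive(false); // if required
    project.getDependencies().add(config.getName(), dependencyNotation);
    return config.getSingleFile();
}
Usage: File sdkZip = downloadArtifact(getProject(), "com.sap.cloud:neo-java-web-sdk:3.39.10@zip");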
I am trying to build a Gradle plugin which does the following:
As part of one of its tasks, it creates a new configuration
It adds a DefaultExternalModuleDependency to this configuration; more specifically, it constructs a dependency on the application server zip file (available on Nexus). This information can be overridden by the invoking project as well.
It tries to resolve this newly added dependency and then unpacks the file to a local folder
All of this was working well when I had the details hard-coded in a build file, but it looks like adding dependencies as part of a task is not treated the same way as having that information available at parsing time.
So my question is, how do I get the project to reload the configurations / dependencies?
The code looks like the following:
@TaskAction
void installAppserver() {
    Dependency dependency = new DefaultExternalModuleDependency(group, name, version)
    Configuration configuration = project.configurations.detachedConfiguration(dependency)
    configuration.setTransitive(false)
    configuration.files.each { file ->
        if (file.isFile() && file.name.endsWith('.zip')) {
            println 'Attempting to unzip: ' + file + ' into folder: ' + appServerFolder
            new Copy().from(project.zipTree(file)).into(appServerFolder).execute()
        }
    }
}
The problem is that the actual artifacts are not getting resolved!
A task can't configure the build model (that's what plugins do). It is fine to create and resolve a detached configuration in a task. If this doesn't work, there is likely a problem with the task's code or with the dependency it tries to resolve. Note that dependencies can only be resolved if the correct repositories are defined.
Instead of new DefaultExternalModuleDependency() (which is an internal class), project.dependencies.create() should be used. Instead of new Copy().execute() (Task#execute must not be called from user code), project.copy should be used.
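Putting both suggestions together, the task action could look roughly like the following Java sketch (an illustration, not the answerer's code; group, name, version and appServerFolder are assumed to be properties of the task, and the repositories still have to be defined on the project):
@TaskAction
public void installAppserver() {
    // Public API instead of the internal DefaultExternalModuleDependency class
    Dependency dependency = getProject().getDependencies().create(group + ":" + name + ":" + version);
    Configuration configuration = getProject().getConfigurations().detachedConfiguration(dependency);
    configuration.setTransitive(false);
    for (File file : configuration) { // iterating resolves the detached configuration
        if (file.isFile() && file.getName().endsWith(".zip")) {
            getLogger().lifecycle("Unzipping " + file + " into " + appServerFolder);
            // project.copy instead of instantiating and executing a Copy task by hand
            getProject().copy(spec -> spec.from(getProject().zipTree(file)).into(appServerFolder));
        }
    }
}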