Add dependency to Maven pom file using org.apache.maven.Model - maven

I’m trying to add dependency to maven pom.xml file using approach described in this post…
def model = readMavenPom file: 'pom.xml'
dep = [
    groupId    : "org.foo",
    artifactId : "bar"
]
model.addDependency(model.&addDependency.parameterTypes[0].newInstance(dep))
…but I’m facing an error which sounds like:
groovy.lang.MissingMethodException: No signature of method: java.lang.Class.newInstance() is applicable for argument types: (java.util.LinkedHashMap) values: [[groupId:org.foo, artifactId:bar]]
Possible solutions: newInstance(), newInstance(), newInstance([Ljava.lang.Object;), isInstance(java.lang.Object)
What am I doing wrong? I've spent seven days so far trying to fix this and add a dependency to the pom file, but no luck. The only emergency exit here is to replace the file contents using a shell script, but that's the least desirable solution. Yes, I could also parse the XML file using Groovy methods, but that's not what we want either.
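For what it's worth, one workaround sketch that sidesteps the map-based newInstance call entirely: use the no-argument constructor and set the fields individually. This assumes the readMavenPom/writeMavenPom steps from the Jenkins Pipeline Utility Steps plugin and is untested in a sandboxed pipeline:

```groovy
// Sketch: avoid Class.newInstance(Map), which triggers the error above,
// by using the no-arg constructor and setting properties one at a time.
def model = readMavenPom file: 'pom.xml'

// The parameter type of Model.addDependency is org.apache.maven.model.Dependency
def depClass = model.&addDependency.parameterTypes[0]
def dep = depClass.newInstance()   // the no-arg newInstance() overload does exist
dep.groupId    = 'org.foo'
dep.artifactId = 'bar'

model.addDependency(dep)
writeMavenPom model: model
```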

Related

How to copy gradle dependencies from a subproject in a multi-project build?

I have a Java library built with Gradle for which I would like to modify the repositories it reads from and publishes to, without changing the original library files.
So, I created a new project library (lib-internal) which is just overriding the repositories and publishing options of the library that I don't want to modify (lib-open-source).
I could force lib-internal to use the source from lib-open-source, BUT I failed to copy its dependencies.
In my build.gradle of lib-internal, I have something like this to copy the sources:
sourceSets.main.java.srcDirs = [project(':lib-open-source').projectDir.toString() + '/src/main/java']
But I am looking for something similar for its dependencies.
In short, I'm looking for the correct syntax of something like:
dependencies = project(':lib-open-source').getDependencies()
I also tried something with the configurations, as suggested by the help of the getDependencies() method but can't find the correct syntax.
configurations.add(project(':lib-open-source').configurations.compileClasspath)
If I copy the dependencies block of lib-open-source into lib-internal, it works as I want to, but I want to avoid this copy-paste.
Thank you!
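One direction that may be worth trying (a sketch, not verified against this project layout): instead of assigning the whole dependencies block, iterate over lib-open-source's declared dependencies for a given configuration and re-add them to lib-internal:

```groovy
// Sketch: re-declare lib-open-source's 'implementation' dependencies
// in lib-internal. Assumes both projects use the same configuration names.
project(':lib-open-source').configurations.implementation.dependencies.each { dep ->
    project.dependencies.add('implementation', dep)
}
```

Depending on evaluation order, `evaluationDependsOn(':lib-open-source')` may be needed so that the other project's dependencies are declared before this loop runs.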

Karaf features:addurl syntax

I have seen two different syntaxes in Karaf to add a repo, e.g.,
features:addurl mvn:org.apache.camel/camel-example-osgi/2.10.0/xml/features
features:addurl mvn:org.apache.camel/camel-example-osgi/2.10.0/xml
Can someone explain the difference between the 2? I believe they are both referring to a features file but they are in different locations?
features:addurl mvn:org.apache.camel/camel-example-osgi/2.10.0/xml
doesn't actually work for me, but I think I can break down what is happening.
mvn:org.apache.camel/camel-example-osgi/2.10.0 is a Maven URL with an implicit 'type' and 'classifier'. By default the type is 'jar' and the classifier is empty. Therefore it resolves to a file called camel-example-osgi-2.10.0.jar (artifactId-version[-classifier].type).
In this case:
mvn:org.apache.camel/camel-example-osgi/2.10.0/xml is a type of 'xml' and no classifier. This resolves to a file called camel-example-osgi-2.10.0.xml, which doesn't exist.
mvn:org.apache.camel/camel-example-osgi/2.10.0/xml/features is a type of 'xml' and a classifier of 'features'. This, then, resolves to a file called camel-example-osgi-2.10.0-features.xml . We can look on the server and see that this file exists: http://repo1.maven.org/maven2/org/apache/camel/camel-example-osgi/2.10.0/
I can't find good documentation for it but 'classifier' adds the -$classifier to the filename. This is how some maven artifacts have a classifier of -jdkN and -jdkM or -jdbc4 or -jdbc3 on them.
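Putting the pieces together, the mvn: URL scheme used above follows this pattern (reconstructed from the examples in this answer):

```
mvn:groupId/artifactId/version[/type[/classifier]]
resolves to: artifactId-version[-classifier].type

mvn:org.apache.camel/camel-example-osgi/2.10.0              -> camel-example-osgi-2.10.0.jar
mvn:org.apache.camel/camel-example-osgi/2.10.0/xml          -> camel-example-osgi-2.10.0.xml
mvn:org.apache.camel/camel-example-osgi/2.10.0/xml/features -> camel-example-osgi-2.10.0-features.xml
```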
References: https://ops4j1.jira.com/wiki/display/paxurl/Mvn+Protocol
http://maven.apache.org/pom.html#POM_Relationships

OSGi bundle build issue in Gradle

I have a simple use case of building an OSGi bundle using Gradle build tool. The build is successful if there are java files present in the build path, but it fails otherwise.
I am using the 'osgi' plugin in the Gradle script and trying to build without any Java files. The build always fails with the following error:
Could not copy MANIFEST.MF to
I am sure there must be some way to do it in Gradle, but I am not able to find it. Any idea what can be done to resolve this?
I ran into this today as well, and @Peter's fix didn't work for me (I hadn't applied the java plugin in the first place). However, after hours of Googling I did find this thread, which helped me find the problem.
Basically, it seems that the error occurs (as Peter stated) when no class files are found in the jar - my guess is because the plugin then cannot scan the classes for package names on which to base all the Import and Export information.
My solution was to add the following to the manifest specification:
classesDir = theSourceSet.output.classesDir
classpath = theSourceSet.runtimeClasspath
In my actual build code, I loop over all source sets to create jar tasks for them, so then it looks like this:
sourceSets.each { ss ->
    assemble.dependsOn task("jar${ss.name.capitalize()}", type: Jar, dependsOn: ss.getCompileTaskName('Java')) {
        from ss.output
        into 'classes'
        manifest = osgiManifest {
            classesDir = ss.output.classesDir
            classpath = ss.runtimeClasspath
            // Other properties, like name and symbolicName, also set based on
            // the name of the source set
        }
        baseName = ss.name
    }
}
Running with --stacktrace indicates that the osgi plugin doesn't deal correctly with the case where both the osgi and the java plugins are applied, but no Java code is present. Removing the java plugin should solve the problem.
I had the same issue also when java code was present.
Adding these two lines to the osgiManifest closure fixed the problem:
classesDir = sourceSets.main.output.classesDir
classpath = sourceSets.main.runtimeClasspath
-- erik

Handling missing configuration in dependency in Gradle build

I have a Gradle build which produces the main deliverable artifact (an installer) of my product. The Gradle project which models this has a number of different dependencies in different configurations. Many of those dependencies are on the default configuration of external modules, and some of those modules have a testResults configuration that contains the (zipped) results of the test task.
It's important that those test results for all dependencies, where they exist, be published as artifacts of the main product build (to use as evidence that testing took place and was successful). It's not an issue if they don't exist.
I tried to do this by iterating over all configurations of the product build, iterating over the dependencies in each and adding a programmatically created dependency (in a new configuration created for this purpose) on the testResults configuration of the module.
In other words, I create dependencies like this:
def processDependencyForTests( Dependency dependency ) {
    def testResultsDependency = [
        'group'         : dependency.group,
        'name'          : dependency.name,
        'version'       : dependency.version,
        'configuration' : 'testResults'
    ]
    project.dependencies.add 'allTestResults', testResultsDependency
}
This populates that configuration just fine, but of course when I try to do anything with it, it fails the first time I encounter a dependency on a module that doesn't actually have a testResults configuration:
def resolvedConfiguration = configurations.allTestResults.resolvedConfiguration
Results in this:
Build file 'C:\myproduct\build.gradle' line: 353
* What went wrong:
Execution failed for task ':myproduct:createBuildRecord'.
> Could not resolve all dependencies for configuration ':myproduct:allTestResults'.
> Module version group:mygroup, module:myproduct, version:1.2.3.4, configuration:allTestResults declares a dependency on configuration 'testResults' which is not declared in the module descriptor for group:mygroup, module:mymodule, version:1.0
It's not really practical to instead explicitly list the dependencies in a declarative fashion, because I want them to be derived from "whatever real dependencies the product project has".
How can I ensure that such expected missing configurations don't derail my build? I thought something to do with lenient configurations might be the answer, but I haven't even got that far here (I need to get a ResolvedConfiguration first, as far as I can tell). Alternatively, if the way I'm doing this is insane, what's a more natural Gradle idiom to achieve this?
You'll need to check for the existence of the configuration before referencing it. In cases like this, the gradle DSL documentation is your friend. In fact, the gradle project is one of the most well-documented open source projects I've ever worked with.
Here, you'll find that configurations is simply a container of configuration objects. They are instances of ConfigurationContainer and Configuration respectively. Knowing this, all you need to do is to check whether the configurations container contains a configuration named "testResults".
This can be achieved by the following code:
if (configurations.find { it.name == 'testResults' }) {
    // do your stuff
}
It seems implied that the Dependency instances passed to your processDependencyForTests method are module dependencies in a multi-module build.
In this case you could cast them to ProjectDependency which has a dependencyProject property that will allow you to reach the Project object of that dependency. From there you can use depProject.configurations.findByName to test if the configuration exists.
Something along the lines of:
def processDependencyForTests( Dependency dependency ) {
    if( dependency instanceof ProjectDependency ) {
        ProjectDependency projDep = (ProjectDependency) dependency
        if( projDep.dependencyProject.configurations.findByName( 'testResults' ) ) {
            def testResultsDependency = [
                'group'         : dependency.group,
                'name'          : dependency.name,
                'version'       : dependency.version,
                'configuration' : 'testResults'
            ]
            project.dependencies.add 'allTestResults', testResultsDependency
        }
    }
}
HTH

export all defined maven project properties to file?

I have a Maven 3 project. In the POM, I define numerous <properties>, some under <project> and others under specific <profile> elements. Is there a way in Maven to export all declared properties to a .properties file?
My current way of doing so is to:
create env.properties file in src/main/resources
for each property 'myProp' add this line to env.properties: myProp=${myProp}
enable resource filtering during builds
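For reference, step 3 (resource filtering) is standard Maven configuration in the POM, along these lines:

```xml
<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
      <filtering>true</filtering>
    </resource>
  </resources>
</build>
```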
Seems like there ought to be a way to eliminate step 2 above...
thanks,
-nikita
Use properties-maven-plugin and its write-project-properties goal.
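A minimal configuration sketch for that goal (the version and phase here are illustrative; check the plugin's documentation for current values):

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>properties-maven-plugin</artifactId>
  <version>1.0.0</version>
  <executions>
    <execution>
      <phase>generate-resources</phase>
      <goals>
        <goal>write-project-properties</goal>
      </goals>
      <configuration>
        <!-- Writes all project properties to this file during the build -->
        <outputFile>${project.build.outputDirectory}/env.properties</outputFile>
      </configuration>
    </execution>
  </executions>
</plugin>
```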
If I understand your requirements correctly, you can also do this using the antrun plugin coupled with Ant's echoproperties task. An example of this configuration is in the linked Stack Overflow question.