I'd like to externalize the dependency versions in the POM file to a properties file.
Example: pom.xml
<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>${externalized.junit.version}</version>
</dependency>
abc.properties
externalized.junit.version=4.8.2
Can this be done? If so, what would be the best way to do it?
No, you cannot do that.
When you're launching a Maven command on a project, the first action it takes is creating the effective model of the project. This notably means reading your POM file, reasoning with activated profiles, applying inheritance, performing interpolation on properties... all of this to build the final Maven model. This work is done by the Maven Model Builder component, whose entry point is the ModelBuilder interface, and exit point is the Model class.
The process of building the model is quite complicated, with many steps divided into up to two phases, but the crux of the issue here is simple: at the end of that step, there is a valid effective model to be used by Maven. This means that all dependencies must have valid group IDs, artifact IDs, and versions; that there are no duplicate dependencies, no plugin executions with the same IDs, etc. (refer to the model description).
Note that interpolation happens during model building: if a property such as ${myProperty} is used and is available at that time, it is replaced with its corresponding value. Among the properties available are:
POM content, like ${project.version} or ${project.artifactId};
${project.basedir}, which corresponds to the directory where the pom.xml is;
user properties set on the command line with the -D switch (like -DmyProperty=myValue);
any properties set directly in the POM file, inside the <properties> element;
several built-in Maven properties, like ${maven.build.timestamp}, which represents the date/time at which the build started, or ${maven.version} for the current Maven version;
Java system properties, as returned by System.getProperties()
environment variables;
and finally properties set inside the settings.xml file.
The critical point here is that the versions of dependencies must be valid when the effective model is built. This is the only way to make sure the dependency graph is stable and consistent during the build.
This is exactly what is failing here: you would want Maven to read versions from a properties file, which means binding a plugin to a specific phase of the lifecycle to read that file and somehow expose the properties it read as standard Maven properties. The trouble is that this would all happen after the model is already built, so it is too late. The whole model-building process happens before the build lifecycle has even started; there is no chance to invoke a plugin before that.
This also implies that it would work if you were to define the property on the command line (since, as outlined above, such properties are available during the interpolation process when building the model). But that is not a best practice: specifying a dependency version as a command-line property makes the build very hard to reproduce. It is better to specify it as a Maven property inside the <properties> element of the POM, to make use of <dependencyManagement> in a parent POM, or to import a BOM.
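For illustration, the in-POM alternative looks like this (the property name is chosen arbitrarily):

```xml
<properties>
  <junit.version>4.8.2</junit.version>
</properties>

<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <!-- resolved during interpolation, before the model is finalized -->
    <version>${junit.version}</version>
  </dependency>
</dependencies>
```

Because the property lives inside the POM itself, it is available during interpolation and the effective model ends up with a concrete, reproducible version.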
If you really need something like this, the easiest way would be to write a shell script (or some other short command-line program) that copies the POM, replaces the specified properties in the copy, and then invokes Maven on it.
But as was said before: Probably there are more Maven-like ways to achieve what you want to achieve.
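As a sketch of that copy-and-replace approach (all file names and the @junit.version@ placeholder are made up for illustration; none of this is a Maven feature):

```shell
# Create the externalized version file and a POM template (illustrative).
cat > versions.properties <<'EOF'
externalized.junit.version=4.8.2
EOF
cat > pom-template.xml <<'EOF'
<version>@junit.version@</version>
EOF

# Pull the version out of the properties file...
version=$(grep '^externalized.junit.version=' versions.properties | cut -d'=' -f2)
# ...and substitute it into a copy of the POM.
sed "s/@junit.version@/${version}/" pom-template.xml > pom.xml
cat pom.xml   # → <version>4.8.2</version>
# mvn -f pom.xml clean install   # finally, run Maven on the generated POM
```

The substitution happens entirely outside Maven, so by the time the model is built the POM already contains a literal version.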
I want to chain two Maven plugins which should execute in sequence. The output from the first plugin should be used as input for the second plugin. Let me explain:
I want to write a plugin which generates resources and sources, such as configuration files, Java classes, ... Let's call this plugin generator-plugin.
This plugin needs input information to generate all this. This information can be retrieved from file system or from a SQL database. Possibly, in the future one might introduce several other input sources. My idea is to write two plugins, one for getting all information from the file system and another from a SQL database.
This gives:
information-plugin-file ---\
                           |--- generator-plugin
information-plugin-sql  ---/
How can this be done with Maven? Can you chain plugins? I am familiar with writing basic Mojos, but I have no idea how to approach this, hence this question.
One possibility is to output to a standardized file in information-plugin-file/information-plugin-sql and let the subsequent generator-plugin plugin read from the same file (the Unix way of working, everything is a file).
But I am looking for more direct, Maven specific approaches of doing this. Are there such approaches?
With regards to execution order, all plugins will run in the generate-sources phase and will be defined in the correct order in the <plugins> section, so that is already covered, I think.
AFAIK, plugins in Maven are designed to be totally independent, so the following methods of sharing information can be used:
Sharing via Maven properties:
It is possible to set a property in the first plugin, and it should then be accessible from within the second plugin:
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;
// Inject the current project into the mojo of the first plugin:
@Parameter(defaultValue = "${project}", readonly = true)
private MavenProject project;
// Inside the "execute" method of the first plugin:
project.getProperties().setProperty("mySampleProperty", <SOME_VALUE_GOES_HERE>);
// Inside the second plugin, read the property back:
String value = project.getProperties().getProperty("mySampleProperty");
Sharing via Files
The first plugin can generate some output file in the 'target' folder
And the second plugin can read this file
Write a "wrapping" plugin that executes other plugins (like both the first and the second plugin). After all, mojos are just Java code that can be called from the aggregator plugin.
You can find more information about this method here.
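For what it is worth, here is a sketch of how the two plugins from the question could be bound to the same phase so that declaration order controls execution order (the group id and goal names are invented for illustration):

```xml
<build>
  <plugins>
    <!-- Both bound to generate-sources; with matching phases, Maven
         runs them in the order they are declared. -->
    <plugin>
      <groupId>com.example</groupId>
      <artifactId>information-plugin-file</artifactId>
      <executions>
        <execution>
          <phase>generate-sources</phase>
          <goals><goal>gather</goal></goals>
        </execution>
      </executions>
    </plugin>
    <plugin>
      <groupId>com.example</groupId>
      <artifactId>generator-plugin</artifactId>
      <executions>
        <execution>
          <phase>generate-sources</phase>
          <goals><goal>generate</goal></goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```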
I believe the only way you can order things in Maven is through the lifecycle. You could have your first plugin (for the input information) run in the generate-sources phase, and the second in the process-sources phase.
I want to create an archetype from a project, but this archetype needs to be parameterized. I added my custom parameter to archetype-metadata.xml, but it is removed from the generated archetype (/target/generated-sources/archetype/src/main/resources/META-INF/maven/archetype-metadata.xml).
<requiredProperties>
<requiredProperty key="custom_parameter"/>
...
What am I doing wrong?
The second thing is that I need to edit some XML files. In the archetype they should contain my parameter (${custom_parameter}). Can this be done, for example, by Groovy during archetype generation?
You used the archetype:create-from-project goal? Then you need to specify the propertyFile parameter (http://maven.apache.org/archetype/maven-archetype-plugin/create-from-project-mojo.html#propertyFile) to define the replacements during the creation of your archetype.
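For example, such a replacement file could look like this (the file name and the -Darchetype.properties user property come from the documentation page linked above; the key is the one from the question, the value is illustrative):

```properties
# archetype.properties — values to turn into archetype parameters
custom_parameter=someDefaultValue
```

It would then be picked up with something like: mvn archetype:create-from-project -Darchetype.properties=archetype.properties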
I do not completely understand your second point, but as far as I know you cannot run code during the generation of a project from an archetype. You can specify custom properties, though (as above), but this is pure text replacement thing. Maybe you can achieve more elaborate things through the embedded Velocity engine.
Zeppelin has an object ZeppelinContext, which can then be used to share state between languages and bind variables to angular and thus create cool user interfaces inside Zeppelin notebooks.
We have written numerous convenience methods to create things like drop down menus, buttons, UI stuff, from scala. These methods call ZeppelinContext. We wish to add these methods to an sbt project, so that we can package them in a jar, but it seems the Zeppelin project provides no artifact that contains ZeppelinContext (we have tried many).
Rather, there only seem to be two workarounds:
Build all of Zeppelin and add the resulting jar as an unmanaged jar (not nice).
Use duck typing (also really not nice).
Question: Is there a lesser known resolver / artifact id to get hold of this type?
The ZeppelinContext class is available on github.
From the related pom.xml file the Maven coordinates are:
<groupId>org.apache.zeppelin</groupId>
<artifactId>zeppelin-spark_2.10</artifactId>
Which leads to this Maven dependency on the Maven Central repository.
<dependency>
    <groupId>org.apache.zeppelin</groupId>
    <artifactId>zeppelin-spark_2.10</artifactId>
    <version>0.6.1</version>
</dependency>
Effectively, the jar file contains the ZeppelinContext.class.
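Since the question is about an sbt project, the same coordinates in build.sbt form (note the plain %, because the Scala suffix is already part of the artifact id):

```scala
// Scala suffix _2.10 is baked into the artifact id, so use % rather than %%
libraryDependencies += "org.apache.zeppelin" % "zeppelin-spark_2.10" % "0.6.1"
```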
What exactly would happen when multiple profiles are activated and they have conflicting definitions of a property? For example, if two profiles both define the property ${platform-path} but give it two different values, what would be the final effective value?
I tried using help:effective-pom, and it seems it is the profile defined later in the settings.xml file that has the last word, but I could not find this behavior documented on either the Maven site or in the Sonatype book.
I guess it depends on the implementation of the XML parser.
A quick test showed me that the last definition of the property in the POM file is the one that wins.
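A minimal setup to reproduce this, assuming both profiles are activated (e.g. mvn help:effective-pom -Pone,two; profile ids and paths are illustrative):

```xml
<profiles>
  <profile>
    <id>one</id>
    <properties>
      <platform-path>/opt/platform-one</platform-path>
    </properties>
  </profile>
  <profile>
    <id>two</id>
    <properties>
      <!-- with both profiles active, this later definition is the
           one observed in the effective POM -->
      <platform-path>/opt/platform-two</platform-path>
    </properties>
  </profile>
</profiles>
```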
We are using Maven (3.0.3) as our build tool and we need to have different versions for different environments (DEV, TEST, QA). If we pass the version property value at build time based on the environment, the installed POM does not contain the passed property values; instead it still has the literal ${app-version} string.
I saw already there is a bug for this http://jira.codehaus.org/browse/MNG-2971
Is there any other alternative? We cannot have a different POM file for each environment, as that would be hard to maintain.
Thanks
Vijay
Create different artifacts for the environments and use the parameter as a classifier. The POM is the same for all three artifacts, but the classifier separates them.
Apparently Maven does not perform any variable/property substitution when installing the POM; it is installed as is, and that is by design. You had better not read any properties from the POM (unless it is, e.g., the version number). Instead, configure your properties in external files (one per stage, e.g. dev.properties, test.properties, ...), then configure Maven profiles (again, one per stage) and invoke Maven like mvn -Pdev depending on what you want to build. In the profile you can package your final application with whatever properties you like (e.g. with the help of build-helper-maven-plugin:add-resource or maven-antrun-plugin plus a copy rule).
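A sketch of that profile-per-stage layout (profile ids and directory names are illustrative), selected with mvn -Pdev:

```xml
<profiles>
  <profile>
    <id>dev</id>
    <build>
      <resources>
        <!-- package only the dev stage's property files -->
        <resource>
          <directory>src/main/config/dev</directory>
        </resource>
      </resources>
    </build>
  </profile>
  <!-- analogous profiles with ids "test" and "qa" -->
</profiles>
```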
Alternatively, you can filter your resources. For example, you can filter your Spring context XML file, which refers to the properties file (so you package all property files, but Spring will only refer to a specific one). Or you can filter another properties file from which you will learn which is the "main" properties file to use (double indirection).
You should create the archives for your different targets within a single build and, as already mentioned, use the classifier to separate those artifacts from each other.