I have two top-level Maven projects, backend and frontend, that advance versions at their own individual pace. Since each has multiple modules, I define my dependency versions in dependencyManagement sections in the parent/aggregate POMs and use a property for the version number.
I want to cleanly update the property with the version number on frontend, preferably arbitrarily, but I can live with requiring a live upstream version to match. I've tried using versions:update-property, but that goal seems to be completely non-functional; regardless of whether there's actually a matching upstream version, I get this debug output:
$ mvn versions:update-property -Dproperty=frontend.version -DnewVersion=0.13.2 -DautoLinkItems=false -X
...
[DEBUG] Searching for properties associated with builders
[DEBUG] Property ${frontend.version}
[DEBUG] Property ${frontend.version}: Looks like this property is not associated with any dependency...
[DEBUG] Property ${frontend.version}: Set of valid available versions is [0.9.0, 0.9.1, 0.9.2, 0.9.3, 0.9.4, 0.9.5, 0.10.0, 0.10.1, 0.11.0, 0.12.0, 0.13.0, 0.13.1, 0.13.2, 0.13.3]
[DEBUG] Property ${frontend.version}: Restricting results to 0.13.2
[DEBUG] Property ${frontend.version}: Current winner is: null
[DEBUG] Property ${frontend.version}: Searching reactor for a valid version...
[DEBUG] Property ${frontend.version}: Set of valid available versions from the reactor is []
[INFO] Property ${frontend.version}: Leaving unchanged as 0.13.1
[INFO] ------------------------------------------------------------------------
I've specified -DautoLinkItems=false, and this appears to have no effect; versions-maven-plugin still scans all of my POMs for matching dependencies, throws up its hands, and quits. I've also tried setting searchReactor to false for that property in the plugin configuration. It appears that the plugin (1) incorrectly scans the dependencies even when I've explicitly said to ignore them and (2) even filters out an explicit specific match.
Is there a simple way to rewrite a Maven property entry to a specific value, either forcing versions-maven-plugin to do what I say without validating for a version number or by using another goal? I'd prefer to avoid a tool like sed that doesn't understand XML (as I've seen recommended in a similar question), but I would be okay with a simple XPath manipulation.
Is there a simple way to rewrite a Maven property entry to a specific value
Since version 2.5 we can use set-property (documentation):
mvn versions:set-property -Dproperty=your.property -DnewVersion=arbitrary_value
As documented, the set-property goal does not perform any 'sanity checks' on the value you specify, so it should always work, but you should use it with care.
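For the property from the question, that would be, for example:
mvn versions:set-property -Dproperty=frontend.version -DnewVersion=0.13.2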
The newVersion parameter is badly documented (as is most of this plugin). Judging by the integration tests, it takes a Maven version range, not a simple version number. It also does not let you provide an arbitrary value - it must be one that Maven can actually resolve. The parameter would be better if it were called constrainRange.
For anyone else in future, try this:
mvn versions:update-property -Dproperty=frontend.version -DnewVersion=[0.13.2]
If you need to update to a snapshot make sure you set the property allowSnapshots to true
mvn versions:update-property -Dproperty=frontend.version -DnewVersion=[0.13.2] -DallowSnapshots=true
How to update property in existing POM:
Try using filtering via the maven-resources-plugin (a sketch of the configuration follows the steps below):
specify version in property file;
add custom filter with path to this file (in child pom.xml, where dependency should be injected);
update version in property file;
run build.
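A minimal sketch of that setup, assuming a hypothetical versions.properties file at the project root (note that, as the disadvantages below point out, filtering only rewrites resources - the pom.xml itself keeps the placeholder):
<build>
    <filters>
        <!-- hypothetical file containing e.g. frontend.version=0.13.2 -->
        <filter>${project.basedir}/versions.properties</filter>
    </filters>
    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <!-- placeholders in these resources are replaced during process-resources -->
            <filtering>true</filtering>
        </resource>
    </resources>
</build>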
Advantages:
it should work;
version is specified only once;
property file could be added under version control;
process-resources is one of the first maven lifecycle steps.
Disadvantages:
well, pom.xml still uses a placeholder;
additional work is needed to automatically update the property file from the initial build (too complicated; I suppose there should be easier solutions).
How to provide a property at build time:
You can specify any property as a build parameter.
For example, I have a property in my pom.xml like:
<properties>
    <build.date>TODAY</build.date>
</properties>
To change it during the build I simply pass a parameter:
mvn compile -Dbuild.date=10.10.2010
I'm pretty sure it will work for the version as well. Also, properties from top-level projects are inherited by their children.
I had the same problem and found nothing that changes pom properties in the file.
I ended up using sed like you suggested:
cat pom.xml | sed -e "s%<util.version>0.0.1-SNAPSHOT</util.version>%<util.version>$bamboo_planRepository_branch</util.version>%" > pom.xml.transformed;
rm pom.xml;
mv pom.xml.transformed pom.xml;
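If an XML-aware tool is preferred (the question mentions being fine with an XPath manipulation), xmlstarlet can do the same in-place edit; this is a sketch, assuming the property sits directly under <properties> in the POM's default namespace:
xmlstarlet ed -L \
    -N m="http://maven.apache.org/POM/4.0.0" \
    -u "/m:project/m:properties/m:util.version" \
    -v "$bamboo_planRepository_branch" \
    pom.xml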
The following applies to versions:update-properties goal. I think the same would apply to versions:update-property.
The goal by default only works if a corresponding property definition and dependency declaration appear in the same POM file.
If, say, the property is defined in a project POM but used in a dependency declaration in a module POM, then the following config is required in the project POM for automatic update via Versions plugin.
<properties>
    <my.version>3.7.11</my.version>
</properties>
...
<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>versions-maven-plugin</artifactId>
            <version>2.2</version>
            <configuration>
                <properties>
                    <property>
                        <name>my.version</name>
                        <dependencies>
                            <dependency>
                                <groupId>com.acme.test</groupId>
                                <artifactId>demo-arti</artifactId>
                            </dependency>
                        </dependencies>
                    </property>
                </properties>
            </configuration>
        </plugin>
    </plugins>
</build>
The plugin configuration comes into action when the Versions maven plugin runs against the POM and attempts to update the property. The configuration tells the Versions plugin that the property will be used for the stated dependency "in a POM somewhere" even if not in the present POM.
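With that configuration in place, an invocation along these lines (the target version here is hypothetical) links the property to the dependency and updates it:
mvn versions:update-property -Dproperty=my.version -DnewVersion=[3.7.12]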
I had an issue with update-property, but managed to make it work with set-property:
mvn versions:set-property -Dproperty=mypropery -DnewVersion=myvalue
When you define your property in the pom.xml, you must declare it as a range if you want update-property to work.
Say your frontend.version is defined as follows:
<frontend.version>0.13.1</frontend.version>
Then, when you set -DnewVersion=0.13.2, the plugin does not recognise the value as a valid one. If instead you define it as a range, the plugin works:
<frontend.version>[0.13.0,0.13.2]</frontend.version>
In one of my tests I get the following result:
mvn versions:update-property -Dproperty=test.version -DnewVersion=[2.20.4] -X
[DEBUG] Property ${test.version}: Set of valid available versions is [2.19.0-RC-REVISION-1, 2.19.0-RC0.1, 2.19.0-RC0.2, 2.19.0-RC0.3, 2.19.0-RC0.4, 2.19.0-RC0.5, 2.19.0-RC0.6, 2.19.0-RC0.7, 2.19.0-RC1, 2.19.0-RC2, 2.19.0-RC3, 2.19.0, 2.19.0-revision, 2.19.0-revision2, 2.19.0.2, 2.19.1, 2.19.2, 2.19.3, 2.19.4, 2.20.0-RC0, 2.20.0-RC0.1, 2.20.0-RC1, 2.20.0-RC2, 2.20.0-RC3, 2.20.0, 2.20.0-PRUEBA-VERSION, 2.20.0-PRUEBA-VERSION-2, 2.20.0-PRUEBA-VERSION-3, 2.20.0i-RC1, 2.20.0i-RC1.1, 2.20.0i, 2.20.0i.2, 2.20.1, 2.20.2, 2.20.4, 2.20.5, 2.20.5-LT, 2.20.5.1, 2.20.6i-RC1, 2.21.0-RCtest1, 2.21.0-RCtest2]
[DEBUG] Property ${test.version}: Restricting results to [2.20.4,2.20.4]
[DEBUG] Property ${test.version}: Current winner is: 2.20.4
[DEBUG] Property ${test.version}: Searching reactor for a valid version...
[DEBUG] Property ${test.version}: Set of valid available versions from the reactor is []
[INFO] Updated ${test.version} from [2.19.0,2.21.0-SNAPSHOT] to 2.20.4
But you need to change the property value to a range.
It's a shame, because I can't use ranges in my POM definitions.
Related
I'm getting an error when running a Maven build (it is unable to resolve a dependency).
[ERROR] Failed to execute goal on . . .
Could not transfer artifact my.group:libme1:${someVariable} from/to . . .
I believe that the developer that published this artifact was supposed to be setting the variable ${someVariable} but didn't. I think this is a bug but I'm trying to work around it by setting the variable.
The POM for the JAR I'm depending on my.group:libme1:1.2.3 looks like this (snippet highlighting the issue):
<groupId>my.group</groupId>
<artifactId>libme1</artifactId>
<parent>
    <groupId>my.group</groupId>
    <artifactId>libme1-parent</artifactId>
    <version>${someVariable}</version>
</parent>
I tried defining it by adding -DsomeVariable=1.2.3 on the command line but it didn't work. For example, this command
mvn -DsomeVariable=1.2.3 clean install
should work based on Baeldung's article but doesn't.
I also ran:
mvn -DsomeVariable=1.2.3 help:effective-pom
and I see the variable being set, so I know the POM I'm using has it defined, but for some reason the other POM doesn't pick up that value (or that is how it appears to me).
Is there any way to set the variable so it can be used in another POM? I'm guessing this is not possible.
Searching for an answer, I found the Maven docs on profile activation:
https://maven.apache.org/settings.html#Activation
If you know that this is a bug, please let me know. I'm also reaching out to the publisher of the artifact to ask them how this is supposed to work.
Basically the dependency's POM is invalid; the reasoning is as follows.
Maven allows developers to do the following things:
define dependencies in the parent POM
impose restrictions on dependencies via <dependencyManagement> in both the current and the parent POM
use placeholders ${...} in the <version> element, which get resolved via system properties and current/parent POM properties
All the features mentioned above are very convenient from a development perspective; however, once you publish artifacts they become a pain: it is no longer possible to use an external library without its parent POM, because the parent POM may define dependencies and properties.
In your particular case someone defined the version of the parent POM as ${someVariable}, which in turn means it is not possible to use that library without knowing the value of ${someVariable}. Even if you knew the "correct" value of ${someVariable} and could specify it via system properties, that would cause weird behaviour: today you might specify one value for ${someVariable}, tomorrow you (or someone else) might specify another, and you would ultimately get different builds. That is why Maven rejects such configurations (it is much better to fail a build than to build something unreliable). It would have been wiser to deny publishing such POMs in the first place, but we have what we have.
It might be that the variable was stored in some user's settings.xml.
This would allow checking out an older version already in production for writing patches.
<settings>
    ...
    <profiles>
        <profile>
            <id>work-in-progress</id>
            <properties>
                <someVariable>1.2.3</someVariable>
            </properties>
        </profile>
    </profiles>
    <activeProfiles>
        <activeProfile>work-in-progress</activeProfile>
    </activeProfiles>
</settings>
So you might do that too. Also search the users' home directories and their .m2 directories, which is where settings.xml is usually stored.
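To verify that the profile (and therefore the property) is actually picked up, the standard maven-help-plugin goals can be used, for example:
mvn help:active-profiles
mvn help:evaluate -Dexpression=someVariable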
I have a Maven project that builds fine even though I have specified a completely invalid plugin in my POM:
<build>
    <plugins>
        <plugin>
            <groupId>bla</groupId>
            <artifactId>bar</artifactId>
            <version>1.9.553342342343</version>
            <executions>
                <execution>
                    <phase>compile</phase>
                </execution>
            </executions>
            <configuration>
                <project>
                    <inceptionYear>123123</inceptionYear>
                    <contributors>
                        asdad
                    </contributors>
                </project>
            </configuration>
        </plugin>
    </plugins>
</build>
I also don't see any errors in Eclipse, and even after deleting the ~/.m2/repository folder it still builds fine. Has something changed in how Maven validates plugins? Or does it only blow up once I declare a goal?
Your question raises different matters, namely the different kinds of validation checks that are performed by Maven and when they are actually done. Sit tight, there is a lot to say.
Step 1: Model validation
The first set of validation is done right at the start of the build, when the model of the project is built. This process is done by the Model Builder component, and its goal is to parse the POM file into a Model object (so that later, a full MavenProject object can be created from it, performing notably dependency mediation). This validation step is actually split into 2 parts:
A raw model validation, which reasons on the POM, before any inheritance or anything is applied. It looks for missing required values, like the presence of a groupId, an artifactId or a version; that repositories have an id; or, in your case here, that a plugin has a groupId and an artifactId. It doesn't actually check if there is a version, because the version could be inherited, and that wasn't done yet.
An effective model validation, which is performed after inheritance, interpolation and profile/default injection. At this point, the model should be completely valid. Notably, it must have a packaging, each dependency must have a version, each plugin must have a version as well, etc. And the plugin version you have is actually perfectly valid, in the sense that 1.9.553342342343 is technically an accepted version number. In fact, practically any String qualifies as a valid version number; the only illegal characters are \, /, :, ", <, >, |, ? and *. Also, the <configuration> of a plugin is not validated, simply because it can't be: it is specific to each plugin, and one could potentially declare a <project> parameter. For the same reason, it doesn't check whether the plugin actually exists in a remote repository, or whether any goals, phases, etc. are specified.
Therefore, at the end of this step, that POM is fully validated and perfectly OK.
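For instance, a deliberately broken sketch that does fail at this stage: omit the plugin's artifactId and the raw model validation rejects the POM immediately, with an error along the lines of "'build.plugins.plugin.artifactId' is missing", before anything is downloaded.
<build>
    <plugins>
        <plugin>
            <groupId>bla</groupId>
            <!-- no artifactId: raw model validation fails right away -->
            <version>1.9.553342342343</version>
        </plugin>
    </plugins>
</build>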
Step 2: Project building
Then comes the step of actually building the MavenProject from it. Because Maven needs to perform dependency mediation on the dependencies of the project, it first has to download them. So if you have any invalid dependencies (i.e. dependencies that cannot be resolved with the configured remote repositories in the settings or the project itself), that'll stop right here.
But if we imagine that the dependencies are correctly resolved, the build will continue to invoke each plugin one by one. The important point is that plugins, and their respective dependencies, are only resolved if Maven detects that they are going to be invoked during the build. If not, Maven will not try to download anything. Furthermore, the validation of the configuration of the plugin is also done when the plugin is actually invoked and the values are injected into it, to be used by it.
Depending on the Maven command that was launched, not all plugins declared in the POM will have work to do. For example, if phases were entered by the user, like mvn clean package, then every plugin bound to a phase of the clean lifecycle, or the default lifecycle up to package, will be invoked; so any plugin bound to the install phase would not be invoked. Also, if the user entered a goal, like mvn org.apache.maven.plugins:maven-clean-plugin:3.0.0:clean then only that specific goal of that specific plugin will be invoked, and all the other plugins will be ignored.
This last part is why the POM in the question poses absolutely no trouble to Maven, and here are multiple points about it:
It is bound to the compile phase, but it doesn't have a <goal>, so even if that phase were to be executed, there is nothing the plugin could do, since no goals were defined. Maven knows about this, and doesn't try to resolve the plugin artifact.
Let's set a <goal> to foo and re-test, by adding <goals><goal>foo</goal></goals> to the plugin declaration. We have in the POM:
<executions>
<execution>
<phase>compile</phase>
<goals><goal>foo</goal></goals>
</execution>
</executions>
Running mvn clean, or mvn clean validate would still cause absolutely no issue: the compile phase was not executed. But now, if we run mvn compile, we'll finally get an error:
Plugin bla:bar:1.9.553342342343 or one of its dependencies could not be resolved
This is, after all, what we wanted. Since the plugin declaration has a phase of compile, and the command used would run that phase, Maven tries to download it (and fails).
So let's remove the phase. What would happen now?
<executions>
<execution>
<goals><goal>foo</goal></goals>
</execution>
</executions>
Actually, running any command with specific phases, like mvn clean or mvn validate, would now fail the build. The reason is that a plugin can have a default phase (see also the defaultPhase attribute on @Mojo-annotated goals). Since each plugin has the discretion of providing a default phase to any of its goals, Maven has to download the plugin artifact and find out whether this particular plugin uses a default. So, our build will fail again, yay!
It's a different story if the user invokes a specific goal. Try mvn clean:clean with the above, and it will not fail. Actually, warnings are just going to get printed that Maven can't resolve the plugin artifact, but none of that is an error, since invoking clean:clean will just invoke the specific clean goal of the maven-clean-plugin. And actually, in theory, there shouldn't be any warnings here; Maven shouldn't try to download anything. It's a side effect of the fact that using the prefix clean requires checking the remote repositories in order to resolve it (refer to this answer to know how that works). But if you fully qualify it, without any plugin prefix resolution needed, with mvn org.apache.maven.plugins:maven-clean-plugin:3.0.0:clean, you're back to zero errors/warnings.
Finally, if we remove everything and end up with
<executions>
<execution>
</execution>
</executions>
it should be pretty clear that nothing you do will result in an error, because in no way can that plugin ever be executed. (You'll still get warnings if using a prefix.)
Step 3: Plugin configuration
The last part of the question is the simple one: the configuration validation of the plugin. You'll notice that at no point this was mentioned here; this is because it only happens when the plugin is actually executed. And since it doesn't even exist, it's not likely to be executed.
Let's suppose it is, for the sake of the explanation. Each plugin is configured with a specific configurator. By default, it maps the XML elements to classes, fields, lists, maps, arrays, just like you would expect. You could provide your own configurator, but that's not a trivial task. There is actually no real validation performed: basically, if the configurator can wire the proper values in the mojo, it's done. You can check the different types of converters that are present by default, but it comes down to: not specifying a String "foo" to an expected integer value; passing a correct enumeration name if the plugin expects that; passing proper XML configuration for a custom class (i.e. each field with their own XML element)... Worth pointing out that setting "foo" to an expected boolean property is not a problem, it'll wire false into the value.
And finally, XML configuration elements that do not map to any parameter of the mojo are completely ignored, so even if the bar plugin existed and didn't take any parameters, passing a <project> in the XML configuration would just be ignored and wouldn't cause any errors.
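To make those last two points concrete, a hedged sketch using a real plugin (maven-surefire-plugin, whose skipTests parameter is a boolean); the comments simply restate the behaviour described above:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <!-- skipTests expects a boolean; a non-boolean string like this
             is wired as false rather than rejected -->
        <skipTests>foo</skipTests>
        <!-- this element maps to no mojo parameter, so it is simply ignored -->
        <someUnknownParameter>whatever</someUnknownParameter>
    </configuration>
</plugin>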
I'm running a multi-module Maven project and am seeing some unexpected behavior. First time I'm seeing this...
My parent module configures the install plugin, defining its classifier.
<plugin>
    <artifactId>maven-install-plugin</artifactId>
    <configuration>
        <generatePom>true</generatePom>
        <classifier>${env}</classifier>
    </configuration>
</plugin>
<!-- ... -->
<modules>
    <module>webapp-formation</module>
    <module>db-formation</module>
</modules>
But when I run mvn install, the .pom files are not generated for my modules. Only my parent is associated with a .pom file in my repositories. Thus, trying to browse to a module's artifact on Archiva (after running mvn deploy of course!) simply fails. I can browse to the parent but not to its children.
So... I need to add the undocumented attribute generatePom to my plugin configuration to have the .pom files generated (copied would be a better word, actually) for all my modules. I say undocumented because this attribute is documented only for the install-file goal, which is not the one run by default. The install goal is not expecting that attribute...
Of course, if I do not configure the install plugin (and thus do not configure the classifier), I have no problem and all .pom files are generated properly.
For you guys, is that a normal behavior? Something that you have already seen? Or should I just file a bug?
Thanks,
Olivier.
What you describe as an undocumented attribute is simply wrong, because attributes are specific to each goal; the given configuration will not change anything, since the generatePom attribute is only valid for the install-file goal. So you can remove it.
In general, such a configuration does not make sense: if you have different environments, you should go a different way. Just remove the <classifier>${env}</classifier> configuration as well and try to deploy via:
mvn clean deploy
Why is this NOT working? How can I pick up the version numbers from the properties file?
Reading properties in pom.xml
<project>
    <build>
        <plugins>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>properties-maven-plugin</artifactId>
                <version>1.0</version>
                <executions>
                    <execution>
                        <phase>initialize</phase>
                        <goals>
                            <goal>read-project-properties</goal>
                        </goals>
                        <configuration>
                            <files>
                                <file>dev.properties</file>
                            </files>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
In dev.properties
org.aspectj.aspectjrt.version=1.6.11
Dependency in pom.xml
<dependency>
    <groupId>org.aspectj</groupId>
    <artifactId>aspectjrt</artifactId>
    <version>${org.aspectj.aspectjrt.version}</version>
</dependency>
Error : dependency must be a valid version
When you start up Maven from the command line, it goes through a number of stages. Here is a pseudo-description of these stages; I am intentionally simplifying the exact sequencing (at the risk of saying things that are slightly incorrect or out of order) so that you can see why what you are trying to do cannot work.
First it parses your command line; any properties defined on the command line using -Dname=value are injected into the MavenSession.
The reactor-defining command line options are checked to decide what the list of projects to build (also known as the reactor) should be. -N means build only the root pom.xml, -pl allows specifying a list of modules to build, and -am and -amd allow adding upstream or downstream modules, respectively, to those specified by -pl. Maven has not parsed any pom.xml files at this point in time.
The -P profile activation rules are parsed to see what profiles to activate.
Now Maven has enough knowledge to start parsing the pom.xml files. It starts by loading and parsing the root pom.xml, i.e. the one in the current directory (or if you specified an alternative pom.xml with -f then that one). This initial parse is just concentrating on figuring out the list of projects to build. Profile activation is only considered in so much as it may affect the list of <modules> that are available. The Group Id, Artifact Id, Version and Packaging coordinates in the pom.xml cannot contain properties because the parsing of the properties in the pom.xml has not taken place at this point in time. (The observant reader will also see that this also explains why you cannot activate profiles based on properties within the pom.xml, as only system properties have been parsed at this stage)
Once the set of projects has been validated, Maven now does some more parsing of those pom.xml files to construct the list of build extensions (if any) and the list of plugins. At this stage the parsing requires evaluation of the <properties> in each project, so this is when these get evaluated and "injected" into the effective model. Thus you can use system properties and pom properties to define the coordinates and additional dependencies within (xpath) /project/build/extensions, /project/build/pluginManagement/plugins/plugin, /project/build/pluginManagement/plugins/plugin/dependencies, /project/build/plugins/plugin and /project/build/plugins/plugin/dependencies.
Now Maven starts parsing the list of goals and phases specified on the command line. Partially specified goals are evaluated for a match against the list of plugins. The match must be unique for all the projects that the plugin goal will be executed against (i.e. if it is an aggregator goal, the match is only required at the root, but for all other "normal" goals, the plugin short name must be the same plugin for all projects). Lifecycle phases must be from one of the default lifecycles, or from a lifecycle defined in a build extension.
From the parsed list of goals and phases, Maven constructs the build plan, i.e. what it is going to do on which projects and in what order. In order to do this Maven must parse the list of project dependencies defined in the reactor projects pom.xml files. This is because a dependency may be produced by another project within the reactor, thereby forcing a sequencing of project execution. Thus you can use system properties and pom properties to define the coordinates and additional dependencies within (xpath) /project/dependencyManagement/dependencies/dependency and /project/dependencies/dependency but note that at this point in time, no plugins have been executed.
Now that Maven has the build plan, it starts following that plan in the order that it constructed. If the first goal/phase on the CLI was a goal, then that goal will be invoked. If the first goal/phase was a phase from the default build lifecycle, then Maven will start with the initialize phase and execute all the plugins bound to that phase... continuing in a similar manner along the list of phases and then the list of projects. Note also that the initialize phase is only executed as part of the default build lifecycle. It is not executed on the default clean or default site lifecycles, and it is not executed on any custom lifecycles. (The observant reader will conclude that this highlights another problem with the technique that the question is attempting). Note: keep in mind that aggregator goals form a "break" in the reactor, so if you ask Maven to run clean package foo:bar site where foo:bar is an aggregator mojo goal, then clean package will be run against all the projects in the reactor, then foo:bar will be run against the root, then site will be run against all the projects in the reactor. In other words, the build plan will take the longest continuous run of non-aggregator goals & phases, split by longest continuous runs of aggregator goals.
Before it calls each mojo (i.e. goal bound to a phase or directly specified from the command line) Maven evaluates the pom.xml for the effective <configuration> of that mojo. At this point Maven has available the system properties, the properties specified in the pom and any properties injected into the MavenSession by previously executed mojos. Thus the <configuration> can reference any of those properties...
Aside
Now there is a caveat... if you, say, set (xpath) /project/build/directory to ${some-property-i-will-set-via-a-mojo} and then reference that from your <configuration>, well, the sad news is that (xpath) /project/build/directory will have been evaluated into the effective pom.xml before any plugin execution, so ${project.build.directory} will have been given the literal value ${some-property-i-will-set-via-a-mojo}. That value is of type java.io.File in the MavenProject, so what will actually have happened is new File(project.getBaseDir(), "${some-property-i-will-set-via-a-mojo}"). If the <configuration> field you are injecting into is of type File, there will be no type conversion required, hence the value will be injected straight in, and no property substitution will have taken place.
There are other edge cases, like the one outlined above, but in general property substitution will work with "mojo injected" properties (such as those provided by Mojo's Properties Maven Plugin) within the <configuration> sections. It will not work outside of those sections.
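As a sketch of what that looks like in practice (the plugin choice and version are illustrative): properties read by the Properties Maven Plugin at initialize can be referenced from another plugin's <configuration>, because that configuration is only interpolated when the mojo actually runs:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <version>3.1.0</version>
    <executions>
        <execution>
            <phase>compile</phase>
            <goals>
                <goal>run</goal>
            </goals>
            <configuration>
                <target>
                    <!-- org.aspectj.aspectjrt.version was loaded from dev.properties
                         during initialize, so it is available here -->
                    <echo message="aspectjrt version: ${org.aspectj.aspectjrt.version}"/>
                </target>
            </configuration>
        </execution>
    </executions>
</plugin>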
So here is Stephen's quick rule of thumb for the different property types:
System Properties
These work everywhere... but are extremely dangerous in /project/(parent/)?/(groupId|artifactId|version|packaging), as you have no control whatsoever over what system properties will be defined when the project is pulled in as a transitive dependency. Use of ${...} expansion within /project/(parent/)?/(groupId|artifactId|version|packaging) should be considered as equivalent to driving a car at 200kph with a 30cm (12 inch) metal spike protruding from the steering wheel in place of an airbag... oh, and no seat belt... and you've just had 10 units of alcohol and two lines of cocaine.
pom.xml properties (and settings.xml properties)
These work in most places, but are never available within /project/(parent/)?/(groupId|artifactId|version|packaging) (as they have not been parsed when those fields are being evaluated) and are not available for consideration of the active profiles (again as they have not been parsed when profile activation is being evaluated)
Mojo injected properties
These work within <configuration> sections and may (due to the recursive interpolation of injected Mojo String parameters) work when used indirectly, but given the uncertainty involved, the recommendation is to restrict their use to the <configuration> section of plugins and reports only.
One final thing
Think about what happens when your project is listed as a dependency. If you had specified its dependencies by using a mojo to pull those from a .properties file on disk, Maven has no way to replicate that when your dependency has been pulled from the Maven repository. So Maven would be unable to determine the dependencies. Thus it could never work.
What you could do, is use an external system (e.g. ANT) to generate the pom.xml from a template with the versions replaced into that file. And then use the instantiated template to build.
The properties file (dev.properties) configuration only works for the build job:
<files>
    <file>dev.properties</file>
</files>
For the dependency version, you need to add the property directly in your pom.xml file, like below:
<properties>
<org.aspectj.aspectjrt.version>1.6.11</org.aspectj.aspectjrt.version>
</properties>