I'm running a multi-module Maven project and am seeing unexpected behavior. This is the first time I've seen this...
My parent module configures the install plugin, defining its classifier.
<plugin>
<artifactId>maven-install-plugin</artifactId>
<configuration>
<generatePom>true</generatePom>
<classifier>${env}</classifier>
</configuration>
</plugin>
<!-- ... -->
<modules>
<module>webapp-formation</module>
<module>db-formation</module>
</modules>
But when I run mvn install, the .pom files are not generated for my modules. Only my parent is associated with a .pom file in my repository. Thus, trying to browse to a module's artifact on Archiva (after running mvn deploy of course!) simply fails. I can browse to the parent but not to its children.
So... I need to add the undocumented attribute generatePom to my plugin configuration to have the .pom files generated (copied would actually be a better word) for all my modules. I say undocumented because this attribute is documented only for the install-file goal, which is not the one run by default. The install goal does not expect that attribute...
Of course, if I do not configure the install plugin at all (so no classifier either), I have no problem and all .pom files are generated properly.
Is this normal behavior for you? Something you have already seen? Or should I just file a bug?
Thanks,
Olivier.
What you describe as an undocumented attribute is simply wrong, because attributes are specific to each goal. The given configuration will not change anything, because the generatePom attribute is only valid for the install-file goal, so you can remove it.
In general such a configuration does not make sense; if you have different environments you should go a different way. Just remove the configuration with <classifier>${env}</classifier> as well and try to deploy via:
mvn clean deploy
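If the goal is to produce environment-specific builds, a more common pattern is to drive the differences through profiles rather than reclassifying every installed artifact. A minimal sketch, assuming hypothetical dev and prod environments and that the env property is consumed elsewhere in the build (for resource filtering, for example):
<profiles>
<profile>
<id>dev</id>
<properties>
<env>dev</env>
</properties>
</profile>
<profile>
<id>prod</id>
<properties>
<env>prod</env>
</properties>
</profile>
</profiles>
The environment is then selected per build with mvn clean deploy -Pprod, and the installed and deployed artifacts keep their standard coordinates.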
I have a Maven project that builds fine even though I have specified a completely invalid plugin in my POM:
<build>
<plugins>
<plugin>
<groupId>bla</groupId>
<artifactId>bar</artifactId>
<version>1.9.553342342343</version>
<executions>
<execution>
<phase>compile</phase>
</execution>
</executions>
<configuration>
<project>
<inceptionYear>123123</inceptionYear>
<contributors>
asdad
</contributors>
</project>
</configuration>
</plugin>
</plugins>
</build>
I also don't see any errors in Eclipse, and even after deleting the ~/.m2/repository folder, it still builds fine. Has something changed in how Maven validates plugins? Or does it only blow up once I declare a goal?
Your question raises several matters, namely the different kinds of validation checks that are performed by Maven, and when they are actually done. Sit tight, there is a lot to say.
Step 1: Model validation
The first set of validation is done right at the start of the build, when the model of the project is built. This process is done by the Model Builder component, and its goal is to parse the POM file into a Model object (so that later, a full MavenProject object can be created from it, performing notably dependency mediation). This validation step is actually split into 2 parts:
A raw model validation, which reasons on the POM, before any inheritance or anything is applied. It looks for missing required values, like the presence of a groupId, an artifactId or a version; that repositories have an id; or, in your case here, that a plugin has a groupId and an artifactId. It doesn't actually check if there is a version, because the version could be inherited, and that wasn't done yet.
An effective model validation, which is performed after inheritance, interpolation and profile/default injection. At this point, the model should be completely valid. Notably, it must have a packaging, each dependency must have a version, each plugin must have a version as well, etc. And the plugin version you have is actually perfectly valid, in the sense that 1.9.553342342343 is technically an accepted version number. In fact, practically any String qualifies as a valid version number; the only illegal characters are \, /, :, ", <, >, |, ? and *. Also, the <configuration> of a plugin is not validated, simply because it can't be: it is specific to each plugin, and one could potentially declare a <project> parameter. For the same reason, it doesn't check whether the plugin actually exists in a remote repository, or whether any goals, phase, etc. are specified.
Therefore, at the end of this step, that POM is fully validated and perfectly OK.
Step 2: Project building
Then comes the step of actually building the MavenProject from it. Because Maven needs to perform dependency mediation on the dependencies of the project, it first has to download them. So if you have any invalid dependencies (i.e. dependencies that cannot be resolved with the configured remote repositories in the settings or the project itself), that'll stop right here.
But if we imagine that the dependencies are correctly resolved, the build will continue to invoke each plugin one by one. The important point is that plugins, and their respective dependencies, are only resolved if Maven detects that they are going to be invoked during the build. If not, Maven will not try to download anything. Furthermore, the validation of the configuration of the plugin is also done when the plugin is actually invoked and the values are injected into it, to be used by it.
Depending on the Maven command that was launched, not all plugins declared in the POM will have work to do. For example, if phases were entered by the user, like mvn clean package, then every plugin bound to a phase of the clean lifecycle, or the default lifecycle up to package, will be invoked; so any plugin bound to the install phase would not be invoked. Also, if the user entered a goal, like mvn org.apache.maven.plugins:maven-clean-plugin:3.0.0:clean then only that specific goal of that specific plugin will be invoked, and all the other plugins will be ignored.
This last part is why the POM in the question poses absolutely no trouble to Maven, and here are multiple points about it:
It is bound to the compile phase, but it doesn't have a <goal>, so even if that phase were to be executed, there is nothing the plugin could do, since no goals were defined. Maven knows about this, and doesn't try to resolve the plugin artifact.
Let's set a <goal> to foo and re-test, by adding <goals><goal>foo</goal></goals> to the plugin declaration. We have in the POM:
<executions>
<execution>
<phase>compile</phase>
<goals><goal>foo</goal></goals>
</execution>
</executions>
Running mvn clean, or mvn clean validate would still cause absolutely no issue: the compile phase was not executed. But now, if we run mvn compile, we'll finally get an error:
Plugin bla:bar:1.9.553342342343 or one of its dependencies could not be resolved
This is, after all, what we wanted. Since the plugin declaration has a phase of compile, and the command used would run that phase, Maven tries to download it (and fails).
So let's remove the phase. What would happen now?
<executions>
<execution>
<goals><goal>foo</goal></goals>
</execution>
</executions>
Actually, running any command with specific phases, like mvn clean or mvn validate, would now fail the build. The reason is that a plugin can have a default phase (see also the defaultPhase attribute on a @Mojo-annotated goal). Since each plugin has the discretion of providing a default phase for any of its goals, Maven has to download the plugin artifact and find out whether this particular plugin declares a default. So, our build will fail again, yay!
It's a different story if the user invokes a specific goal. Try mvn clean:clean with the above, and it will not fail. Actually, warnings are just going to get printed that Maven can't resolve the plugin artifact, but none of that is an error, since invoking clean:clean will just invoke the specific clean goal of the maven-clean-plugin. And actually, in theory, there shouldn't be any warnings here; Maven shouldn't try to download anything. It's a side-effect of the fact that using the prefix clean requires checking remote repositories in order to resolve it (refer to this answer to see how that works). But if you fully qualify it, without any plugin prefix resolution needed, with mvn org.apache.maven.plugins:maven-clean-plugin:3.0.0:clean, you're back to zero errors/warnings.
Finally, if we remove everything and end up with
<executions>
<execution>
</execution>
</executions>
it should be pretty clear that nothing you do will result in an error, because in no way can that plugin ever be executed. (You'll still get warnings if using a prefix.)
Step 3: Plugin configuration
The last part of the question is the simple one: the validation of the plugin's configuration. You'll notice that it was not mentioned anywhere above; that's because it only happens when the plugin is actually executed. And since this plugin doesn't even exist, it is never going to be executed.
Let's suppose it is, for the sake of the explanation. Each plugin is configured with a specific configurator. By default, it maps the XML elements to classes, fields, lists, maps, arrays, just like you would expect. You could provide your own configurator, but that's not a trivial task. There is actually no real validation performed: basically, if the configurator can wire the proper values in the mojo, it's done. You can check the different types of converters that are present by default, but it comes down to: not specifying a String "foo" to an expected integer value; passing a correct enumeration name if the plugin expects that; passing proper XML configuration for a custom class (i.e. each field with their own XML element)... Worth pointing out that setting "foo" to an expected boolean property is not a problem, it'll wire false into the value.
And finally, XML configuration elements that do not map to any parameter of the mojo are completely ignored, so even if the bar plugin existed and didn't take any parameters, passing a <project> in the XML configuration would just be ignored and wouldn't cause any errors.
If all classes are up to date ("Nothing to compile - all classes are up to date"), will Maven create the jar again?
As I see in my log, the jar is not created again, so Maven somehow knows that all classes are up to date.
Question: is there any process or other mechanism that handles this?
The Maven Jar Plugin will create a jar via its jar goal if none exists, or skip its creation if one exists and nothing has changed.
You can force the creation of the jar via its forceCreation option (since version 2.2). From official documentation:
Require the jar plugin to build a new JAR even if none of the contents appear to have changed. By default, this plugin looks to see if the output jar exists and inputs have not changed. If these conditions are true, the plugin skips creation of the jar. This does not work when other plugins, like the maven-shade-plugin, are configured to post-process the jar. This plugin can not detect the post-processing, and so leaves the post-processed jar in place. This can lead to failures when those plugins do not expect to find their own output as an input. Set this parameter to true to avoid these problems by forcing this plugin to recreate the jar every time.
Its default value is false, which explains the behavior you are seeing.
If you want to always force it, you can add this to your POM file:
<project>
...
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.6</version>
<configuration>
<forceCreation>true</forceCreation>
</configuration>
...
</plugin>
</plugins>
</build>
...
</project>
Or, for just a single build, invoke it as follows:
mvn package -Djar.forceCreation=true
So, going back to your question:
is there any process or other mechanism that handles this?
The answer is: yes, the Maven Jar Plugin handles this, and the option above changes its behavior.
I have inherited a POM that attempts to avoid repeating build steps by using a profile
that is only activated when the step output does not exist:
<profile>
<id>run-once</id>
<activation>
<file>
<missing>target/some-output</missing>
</file>
</activation>
<build>
<plugins>
<plugin>
...
<executions>
<execution>
... slow process to produce target/some-output ...
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
However, as Maven experts no doubt realized immediately, this does not work if the developer runs mvn clean install. Maven calculates the active profiles once, before running clean, so if target/some-output was present, the run-once profile is not active. The result is that target/some-output is removed by the clean phase but is not recreated later in the build, and the resulting WAR is broken because some-output is missing.
Is there a standard solution to this problem (besides avoiding mvn clean install)? I'm about to make the plugin unconditional to prevent the silent creation of a broken WAR.
More generally, is there a standard technique to prevent mvn from recreating artifacts like some-output that are up-to-date? Or is the idea that if make-style dependency management is important, one should use gradle or rake instead of maven?
I don't think there is a standard solution to this problem. There are, though, various options that I can think of (there are most likely others as well):
you could obviously activate the profile manually: mvn clean install -Prun-once, but then you have to remember to do that each time of course
configure the maven-enforcer-plugin together with its requireFilesExist rule to make sure the files exist and fail the build if they don't (at least then you won't get the silent creation of a broken WAR); a sketch of such a configuration is shown after this list
modify the profile to have it create the files in a location under your src folder (e.g. src/main/gen), exclude that location from being checked into your source repository (if you are using one), and then configure the maven-resources-plugin and its copy-resources goal to copy these resources to the correct location under your build directory. That way clean won't delete them.
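A minimal sketch of the enforcer option mentioned above, assuming the expected output is target/some-output and that checking during the verify phase is acceptable (the version and phase are placeholders to adjust):
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<version>1.4.1</version>
<executions>
<execution>
<id>enforce-output-exists</id>
<phase>verify</phase>
<goals>
<goal>enforce</goal>
</goals>
<configuration>
<rules>
<requireFilesExist>
<files>
<file>${project.build.directory}/some-output</file>
</files>
</requireFilesExist>
</rules>
</configuration>
</execution>
</executions>
</plugin>
With this in place, mvn clean install fails with an enforcer rule violation instead of silently packaging a WAR that is missing some-output.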
I'm trying to convert a project from Ant to Maven.
The unit tests depend on a third party binary jar, which is not available in any public maven repositories.
How do I make Maven handle this situation? I have found two solutions, neither of which is acceptable. The first is to use a system dependency; this doesn't work because a) the dependency should only be for the tests, and b) the dependency is not found by Eclipse after generating an Eclipse project.
The second is to manually install the dependency in the local repository. This seems to be the recommended way. I don't want to do this because I want users to be able to build and test with a simple 'mvn test'. If users have to read a document and copy/paste some shell commands to be able to build and test, then something's wrong.
I suppose it would be OK if maven itself installed the dependency in the local repository as part of the build - is this possible, and if so, how?
Aled.
You may want to look at install:install-file. You can make it execute in an early phase of your build (validate or initialize) via standard means.
On second thought, if it fails because of the missing dependency in the same project, there are a couple more options. One is to call an Ant script via the antrun plugin to install the artifact.
Or create an additional module that does not depend on your artifact, have it build before the main module, and have that module install the artifact as described earlier.
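A minimal sketch of that approach, assuming the jar is kept in the project under lib/third-party.jar and that the coordinates com.example:third-party:1.0 are placeholders you choose yourself:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<version>2.5.2</version>
<executions>
<execution>
<id>install-third-party-jar</id>
<phase>validate</phase>
<goals>
<goal>install-file</goal>
</goals>
<configuration>
<file>${project.basedir}/lib/third-party.jar</file>
<groupId>com.example</groupId>
<artifactId>third-party</artifactId>
<version>1.0</version>
<packaging>jar</packaging>
</configuration>
</execution>
</executions>
</plugin>
The project can then declare a regular test-scoped dependency on com.example:third-party:1.0.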
First of all, my way would be to use a repository manager such as Nexus and deploy this dependency there.
However, there is another solution. You can include this third-party jar in your project and configure the test plugin to add it to the classpath, like this:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.10</version>
<configuration>
<additionalClasspathElements>
<additionalClasspathElement>path/to/additional/resources</additionalClasspathElement>
<additionalClasspathElement>path/to/additional/jar</additionalClasspathElement>
</additionalClasspathElements>
</configuration>
</plugin>
By the way, I hope you are aware that Maven executes the Surefire plugin to run tests as part of the default lifecycle.