If all classes are up to date ("Nothing to compile - all classes are up to date"), will Maven create the jar again?
As I can see in my log, the jar is not created again, so Maven somehow knows that all classes are up to date.
Question: is there a process or some other mechanism that handles this?
The Maven Jar Plugin creates a jar via its jar goal if none exists, or skips its creation if one already exists and nothing has changed.
You can force the creation of the jar via its forceCreation option (available since version 2.2). From the official documentation:
Require the jar plugin to build a new JAR even if none of the contents appear to have changed. By default, this plugin looks to see if the output jar exists and inputs have not changed. If these conditions are true, the plugin skips creation of the jar. This does not work when other plugins, like the maven-shade-plugin, are configured to post-process the jar. This plugin can not detect the post-processing, and so leaves the post-processed jar in place. This can lead to failures when those plugins do not expect to find their own output as an input. Set this parameter to true to avoid these problems by forcing this plugin to recreate the jar every time.
Its default value is false, which explains the behavior you are seeing.
If you want to force it always, you can add this to your pom file:
<project>
  ...
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
        <version>2.6</version>
        <configuration>
          <forceCreation>true</forceCreation>
        </configuration>
        ...
      </plugin>
    </plugins>
  </build>
  ...
</project>
Or, just for a single build, invoke it as follows:
mvn package -Djar.forceCreation=true
So, going back to your question:
is there any process or another thing which work on this?
The answer is: yes, the Maven Jar Plugin handles this, and the forceCreation option above changes its behavior.
I am creating my own maven-environment-plugin that creates and bundles resources in a predefined folder structure for each environment defined in the configuration. The plugin outputs the folder structure and resources in a zip file and places it in the target folder.
Questions:
How can I make my plugin work like the maven-assembly-plugin, so that its output to the target folder also ends up in my local repository when I run 'mvn install'?
Do I need to mark it somehow? It happens automatically when the maven-assembly-plugin is used.
How does the maven-assembly-plugin manage to ensure this?
I am using a Mojo for my plugin development.
<plugin>
  <groupId>dk.kmd.devops.maven.plugin</groupId>
  <artifactId>envconfiguration-maven-plugin</artifactId>
  <version>1.0.3</version>
  <configuration>
    <environments>
      <environment>${env.local}</environment>
      <environment>${env.dev}</environment>
      <environment>${env.t1}</environment>
      <environment>${env.t2}</environment>
      <environment>${env.p0}</environment>
    </environments>
    <sourceConfigDir>${basedir}/src/main/config</sourceConfigDir>
    <zipEnvironments>true</zipEnvironments>
  </configuration>
  <executions>
    <execution>
      <phase>generate-resources</phase>
      <goals>
        <goal>generateEnv</goal>
      </goals>
    </execution>
  </executions>
</plugin>
You need to attach (that's the correct terminology in this case) the new artifact (the generated zip file) to the build as part of its official artifacts.
This is basically what the attach-artifact goal of the build-helper-maven-plugin does:
Attach additional artifacts to be installed and deployed.
From its official examples, concerning the attach-artifact goal:
Typically run after antrun:run, or another plugin, that produces files that you want to attach to the project for install and deploy.
The "another plugin" in this case can be the plugin you developed. Hence there are two solutions to your case:
Configure the build-helper-maven-plugin to attach the generated artifact via further pom.xml configuration (see the sketch right after this list), or
add to your plugin the functionality to automatically attach the generated file.
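For the first option, a minimal sketch of such a configuration could look like the following; the file path and the classifier are assumptions for illustration, adjust them to wherever your plugin actually writes the zip:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.12</version>
  <executions>
    <execution>
      <id>attach-environments-zip</id>
      <phase>package</phase>
      <goals>
        <goal>attach-artifact</goal>
      </goals>
      <configuration>
        <artifacts>
          <artifact>
            <!-- assumption: the zip produced by your plugin in the target folder -->
            <file>${project.build.directory}/environments.zip</file>
            <type>zip</type>
            <classifier>environments</classifier>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>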
The second case can be covered via the Maven API, using the MavenProjectHelper and its attachArtifact method.
In your mojo, you can inject it as a component via:
/**
* Maven ProjectHelper
*/
@Component
private MavenProjectHelper projectHelper;
Then use the aforementioned method:
projectHelper.attachArtifact(project, "zip", outputFile);
You should probably already have the required Maven dependency providing it, but just in case it would be this one:
<dependency>
  <groupId>org.apache.maven</groupId>
  <artifactId>maven-core</artifactId>
  <version>3.3.9</version>
</dependency>
Note that the artifact will be attached to the build as an additional artifact via a classifier, that is, a suffix to the default artifact name that differentiates it from the default artifact and makes it unique as an output of the build.
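For illustration, here is a minimal mojo sketch for the second option; the class name, the output file name and the classifier are hypothetical, while the generateEnv goal name is taken from your configuration above:
import java.io.File;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.Component;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;
import org.apache.maven.project.MavenProjectHelper;

// Hypothetical mojo class, goal name taken from the pom configuration above
@Mojo(name = "generateEnv")
public class GenerateEnvMojo extends AbstractMojo {

    // The project currently being built, injected by Maven
    @Parameter(defaultValue = "${project}", readonly = true, required = true)
    private MavenProject project;

    // Helper used to attach additional artifacts to the build
    @Component
    private MavenProjectHelper projectHelper;

    @Override
    public void execute() {
        // Assumption: the zip your plugin generated in the target folder
        File outputFile = new File(project.getBuild().getDirectory(), "environments.zip");
        // Attach it with type "zip" and a classifier so it gets installed/deployed along with the main artifact
        projectHelper.attachArtifact(project, "zip", "environments", outputFile);
    }
}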
As a reference to a real example, and to further answer your (last) question, check this query on the GitHub maven-plugins repository: searching for the attachArtifact string, you will see it used in a number of Maven plugins, among them the maven-assembly-plugin, for example in the AbstractAssemblyMojo class.
I have a maven project for which I'm running two separate builds.
In one build I want to save build time by disabling the jar creation of the Maven modules in it (there are 45 Maven modules). The maven-jar-plugin is used to create the jars.
I want to conditionally disable the jar creation at the command line, that is, I am looking for something similar to -DskipTests, which skips the unit tests even though the Surefire plugin is there by default.
The maven-jar-plugin does not provide any skip option.
However, several ways are possible to achieve your requirement.
You may simply skip the phase which, by default (via the default lifecycle bindings), brings in the jar creation, that is, the package phase, and as such just invoke:
mvn clean test
The later phases would not make sense anyway if you do not create a jar file: package, install and deploy would not have anything to process. Moreover, the integration-test phases may also be impacted, depending on your strategy for integration tests, if any.
Alternatively, you can configure your pom as follows:
<properties>
  <jar.creation>package</jar.creation>
</properties>
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <version>3.0.0</version>
      <executions>
        <execution>
          <id>default-jar</id>
          <phase>${jar.creation}</phase>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
As such, the default behavior still creates the jar, while executing Maven as follows:
mvn clean install -Djar.creation=false
would instead skip the creation of the jar.
What we are actually doing:
We are re-defining the default execution of the maven-jar-plugin.
We are overriding its execution id, thus getting more control over it.
We are binding its execution to a phase that is configurable via a property.
The default phase (the property value) remains package.
At the command line you can still change it to any value other than a standard Maven phase; that is, -Djar.creation=none would also work.
I'm running a multi-module Maven project and I'm seeing unexpected behavior. It's the first time I've seen this...
My parent module configures the install plugin, defining its classifier:
<plugin>
  <artifactId>maven-install-plugin</artifactId>
  <configuration>
    <generatePom>true</generatePom>
    <classifier>${env}</classifier>
  </configuration>
</plugin>
<!-- ... -->
<modules>
  <module>webapp-formation</module>
  <module>db-formation</module>
</modules>
But when I run mvn install, the .pom files are not generated for my modules. Only my parent is associated with a .pom file in my repositories. Thus, trying to browse to my modules' artifacts on Archiva (after running mvn deploy, of course!) simply fails. I can browse to the parent but not to its children.
So... I need to add the undocumented attribute generatePom to my plugin configuration to have the .pom files generated (copied would be a better word, actually) for all my modules. I say undocumented because this attribute is documented only for the install-file goal, which is not the one run by default; the install goal does not expect that attribute...
Of course, if I do not configure my install plugin (and thus do not configure the classifier), I have no problem and all the .pom files are generated properly.
For you guys, is that normal behavior? Something you have already seen? Or should I just file a bug?
Thanks,
Olivier.
What you describe as an undocumented attribute is simply wrong, because attributes are specific to a goal, which means the given configuration will not change anything: the generatePom attribute is only valid for the install-file goal. So you can remove it.
In general, such a configuration does not make sense, because if you have different environments you should go a different way. Just remove the configuration with <classifier>${env}</classifier> as well and try to deploy via:
mvn clean deploy
I'm working on a complex multi-module open-source Ivy project, which has Ant's build.xml at the top level to kick off each Ivy module's build. The goal here is not to modify the original build scripts (both ivy.xml and build.xml), but to use Maven as an outer layer to kick off the Ant build, then fetch the built results and publish them to a Nexus server.
The difficulty here is that the built artifacts are multiple jars, and we need to publish all these jars to the Nexus server with Maven. Since one pom.xml maps to only one Maven artifact, and in this case multiple artifacts are built not through Maven but through Ivy, I wonder if there is a feasible way to achieve my goal.
Currently, in the top-level pom.xml, I'm using the maven-antrun-plugin to invoke the top-level build.xml, and the build-helper-maven-plugin to attach artifacts, but it doesn't work.
Currently I'm working on a similar task to yours. We have a huge legacy system whose whole build is written in Ant. This is how we handle the task:
No matter what, you will have to accept it: Maven means one jar per artifact (well, you can use attachments with qualifiers, but that is a real abuse and highly NOT recommended). There is a philosophy behind it: at the end of the day your system consists of (as you said yourself) modules, so each module has to have its version, its sources and (most important) its dependencies on other modules.
To reuse the existing Ant code you can look at the antrun plugin. What we did is "simply" separate all the common build code (i.e. generator execution, attachment creation, assemblies and so on) into parent poms of type "pom". Then we execute the relevant targets simply by activating properties in the children poms. Here is an example:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-antrun-plugin</artifactId>
      <version>1.7</version>
      <executions>
        <execution>
          <id>EXECUTION_NAME</id>
          <phase>generate-sources</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <target if="EXECUTION_TRIGGER_PROPERTY">
              <taskdef resource="net/sf/antcontrib/antlib.xml"/>
            </target>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
And in the child pom we simply define
<properties>
  <EXECUTION_TRIGGER_PROPERTY>true</EXECUTION_TRIGGER_PROPERTY>
</properties>
Remember to look at the Maven lifecycle guide to choose the proper phase for your execution.
Don't forget that you can use Maven plugins to make things easier. For example, instead of running the <javac> task in Ant, breaking your build into artifacts of type jar lets Maven do all the compilation for you. You can also find plugins that generate javadoc, JAX-WS stubs and so on; see the sketch below for one example.
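For instance, a minimal sketch of the maven-javadoc-plugin (version picked for illustration) that builds and attaches a javadoc jar, instead of scripting it in Ant:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <version>2.10.4</version>
  <executions>
    <execution>
      <id>attach-javadocs</id>
      <goals>
        <!-- builds a javadoc jar and attaches it to the build as an additional artifact -->
        <goal>jar</goal>
      </goals>
    </execution>
  </executions>
</plugin>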
As you can see, it's not that simple to make your system work with Maven; it will require you to rethink how your build works. On the other hand, the ability to see and understand your dependencies, the ease of working in modern IDEs, binary repositories and so on are worth it in most cases.
I have a multi-module project.
The last module is "assemble", which is intended to put a few modules' .jars together into one big .jar that I could use for distribution.
This module does nothing else, so I did this:
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <configuration>
      <appendAssemblyId>false</appendAssemblyId>
      ...
    </configuration>
  </plugin>
</plugins>
I want this behavior so that I can simply run the resulting jar from the IDE (NetBeans 7.0).
Maven does exactly what I want, but says this:
[assembly:single]
Reading assembly descriptor: src/assembly/assembly.xml
Building jar: /mnt/ssd1/_projekty/JBoss/bots/JawaBot/2.0/assemble/target/JawaBot-assemble-2.0.0-SNAPSHOT.jar
Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing.
Instead of attaching the assembly file: /mnt/ssd1/_projekty/JBoss/bots/JawaBot/2.0/assemble/target/JawaBot-assemble-2.0.0-SNAPSHOT.jar, it will become the file for main project artifact.
NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic!
Replacing pre-existing project main-artifact file: /mnt/ssd1/_projekty/JBoss/bots/JawaBot/2.0/assemble/target/JawaBot-assemble-2.0.0-SNAPSHOT.jar
with assembly file: /mnt/ssd1/_projekty/JBoss/bots/JawaBot/2.0/assemble/target/JawaBot-assemble-2.0.0-SNAPSHOT.jar
This message suggests that this is not a recommended way to achieve my goal.
Is there a better one?
The assembly plugin will create an artifact with a 'classifier' consisting of the assembly ID from the descriptor. It won't create a main artifact AFAICT.
You might be happier with the maven-shade-plugin, plus configuring the maven-jar-plugin to set the manifest class name so that java -jar works. The shade plugin can produce a main artifact.
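A minimal sketch of such a shade configuration follows; here the Main-Class is set via the shade plugin's ManifestResourceTransformer rather than the maven-jar-plugin, and the main class name is a placeholder for your actual entry point:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- sets Main-Class in the manifest so that java -jar works; the class name is a placeholder -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>org.jawabot.Main</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>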