How to skip a Maven build step without modifying the pom itself?

We have a Maven-based Java EE project controlled by the customer. For internal reasons, we cannot execute one of the build steps, but the rest works fine and produces the jar we want.
Since editing the pom file would require taking care when committing to the customer's SVN, and copying the pom file would require taking care to sync changes coming from there, we are looking for a way to skip this specific step in the build section during the Maven call itself, so to say mvn clean install but-leave-out-this-build-plugin-step. Is there any?
Edit:
The plugin in question is the rpm-maven-plugin, which prevents the build from running on Windows. We found information on how to make it work, but that won't really fit into our current setup. And since we cannot modify the customer's pom, I was looking for a way to trigger the skipping externally. But maybe there are other ways to just ignore/skip/fake this step?

It depends on which plugin you want to skip. Many plugins have the ability to be skipped via a system property (-Dblabla).
For the deploy plugin it is -Dmaven.deploy.skip=true, for Surefire it is -DskipTests=true.
Read the plugin documentation; maybe you can find a skip property.
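For example, to run the full lifecycle while skipping both of those steps in one call:
mvn clean deploy -DskipTests=true -Dmaven.deploy.skip=true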
The rpm plugin has a parameter disabled; unfortunately it is not exposed as a user property. So, if setting this parameter in the customer's pom (or asking them to edit it) with a default value of false is an option, this may be the solution, as sketched below.
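A minimal sketch of that change, wiring the documented disabled parameter to a property of our own choosing (the name rpm.disabled is made up here):

<properties>
  <rpm.disabled>false</rpm.disabled>
</properties>
...
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>rpm-maven-plugin</artifactId>
  <configuration>
    <!-- not exposed as a user property by the plugin, so we wire it to one -->
    <disabled>${rpm.disabled}</disabled>
  </configuration>
</plugin>

On Windows the step can then be skipped with mvn clean install -Drpm.disabled=true, while the default of false leaves the customer's build unchanged.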

Related

How can Gradle plugin access information about included builds?

I know you can access different modules (included using include) in a project via org.gradle.api.Project#getSubprojects(), and I know you can get the names and directories of separate builds that have been included (using includeBuild) via org.gradle.api.invocation.Gradle#getIncludedBuilds().
But how can my plugin get information such as the locations of Java source files and class files for projects included using includeBuild?
My goal here is to determine which files have changed in the current git branch (which I can do), and then collect their corresponding class files into a jar file that's used for our patching mechanism that inserts the patch jars at the front of the classpath rather than redeploying the whole application.
I don’t think it is a goal of Gradle to provide including builds with detailed information on included builds. Currently, the Gradle docs basically only state two goals for such composite builds:
combine builds that are usually developed independently, […]
decompose a large multi-project build into smaller, more isolated chunks […]
Actually, isolation between the involved builds seems to be an important theme in general:
Included builds do not share any configuration with the composite build, or the other included builds. Each included build is configured and executed in isolation.
For that reason, it also doesn’t seem to be possible or even desired to let an including build consume any build configurations (like task outputs) of an included build. That would only couple the builds and hence thwart the isolation goal.
Included builds interact with other builds only via dependency substitution:
If any build in the composite has a dependency that can be satisfied by the included build, then that dependency will be replaced by a project dependency on the included build.
So, if you’d like to consume specific parts of an included build from the including build, then you have to do multiple things:
Have a configuration in the included build which produces these “specific parts” as an artifact.
Have a configuration in the including build which consumes the artifact as a dependency.
Make sure that both configurations are compatible wrt. their capabilities so that dependency substitution works.
Let some task in the including build use the dependency artifact in whatever way you need.
Those things happen kind of automatically when you have a simple dependency between two Gradle projects, like a Java application depending on a Java library. But you can define your own kinds of dependencies, too.
The question is: would that really be worth the effort? Can’t you maybe solve your goal more easily or at least without relying on programmatically retrieved information on included builds? For example: if you know that your included build produces class files under build/classes/java/main, then maybe just take the classes of interest from there via org.gradle.api.initialization.IncludedBuild#getProjectDir().
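In case it helps, here is a rough sketch of that simpler route from plugin code (Java; the build/classes/java/main location assumes the included build uses the default Java layout, and the actual patch-collecting logic is left out):

import java.io.File;
import org.gradle.api.Plugin;
import org.gradle.api.Project;

public class PatchJarPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        project.getGradle().getIncludedBuilds().forEach(included -> {
            // assumes the default Java project layout in the included build
            File classesDir = new File(included.getProjectDir(), "build/classes/java/main");
            project.getLogger().lifecycle(
                "Classes of {} expected under {}", included.getName(), classesDir);
            // ...match your changed sources against class files here
            // and feed them into the task that builds the patch jar...
        });
    }
}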
I know, this may not be the answer you had hoped to get. I still hope it’s useful.

Why is a refresh of the Maven repository not enough for IntelliJ?

I had a NoClassDefFoundError problem with some test launched from IntelliJ. In order to repair the situation, I had to make several changes in many poms of the project: adding new packages and excluding some old ones to escape their overlapping. I also repaired the situation with different versions. But the situation did not improve. Again, some package declared in a pom was not found where it should be.
I refreshed the Maven repository by
mvn -e clean install -U
as advised in https://stackoverflow.com/a/9697970/715269 (an answer so old and so upvoted that it surely looks like Santa).
The problem remained unchanged.
I printed the Maven dependency tree. It was correct and contained everything needed.
I looked at the list of the External Libraries of the project. It was the old, uncorrected list of overlapping jars with the same names and different versions, and it lacked the good packages I had just added, even though they were plainly visible in the Maven tree output!
Already hapless, I reimported the packages in IntelliJ by:
Ctrl+Shift+A, Reimport All Maven Projects.
Ho! The list of libraries got repaired, and the problem mentioned in the subject disappeared.
The question is: how can it happen that the same project, with the very same poms for everything, gets its packages differently when launched from Maven and from IntelliJ?
I know about the feature "delegate IDE build to Maven", and I keep it turned off. But I am NOT talking about different software for building. Whether they are different or not, they should both follow the actual poms. And whereas Maven, if not building automatically, won't know about changes in the poms, IntelliJ KNOWS about them. Keeping the jars in sync with the poms, or with Maven, would make sense; instead it simply kept some old rubbish. Was there some deep thought behind that design?
Every time you manually change the pom.xml file, including its dependencies, you need to load these changes into the IDE. The IDE does this on the Reload from Maven action. See also Import Maven dependencies.
IntelliJ doesn't use Maven to build and run a project unless you delegate the build and run actions to Maven.
Since IDEA doesn't really use Maven to run and build, it uses the pom.xml to import the project structure and "tries" to build the project the same way Maven does.
Actually, there are quite a few differences between these two build processes.
Generating sources or filtering resources (I don't know if this is still an issue) isn't done when building the project with IntelliJ IDEA.
In case you are using code generation, you have to build the project via Maven first; then, when all the resources are filtered and the additional sources are generated, you are able to run, debug and so on with IntelliJ IDEA.
That's an important thing to be aware of, and that's the reason why Maven and IntelliJ IDEA project structures might get out of sync.
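A typical round-trip under that constraint might look like this (the compile phase is just an example; it runs generate-sources and resource filtering along the way):
$ mvn clean compile
Then run or debug the class from IntelliJ IDEA as usual.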
You can enable the "Reload project after changes in build scripts" feature and select the Any changes checkbox to keep your project structure updated.
Why you might want to disable this feature anyway:
If you are working on a build file (whether Gradle or Maven does not matter), reloading the structure on any change can be very annoying. It's CPU-intensive, dependencies are fetched, and so on.
Therefore, I prefer to reload the project structure only in case of an external change. This happens when pulling an updated version of the build file, for example.

Maven module dependency source instead of repository jars

I have a multi-module project, i.e.
parent
module1
module2
In one dev cycle, I added a class mod1.A to module1. Class mod2.B in module2 depends on it.
I do not have the artifacts in my local .m2/repository. Running this:
$ cd prj/module2
$ mvn -o exec:java -Dexec.mainClass=mod2.B
results in an error along the lines of:
The following artifacts could not be resolved: com.example:module1:jar:1.0-SNAPSHOT
After I install the artifacts via mvn install while in the prj folder, it all works as expected.
However, this presents an issue in at least two ways:
I have to go through the slower install phase instead of the faster compile phase
I have two versions of the same project and conflicting modifications in these. I cannot run the same Java class with their respective modifications, only the currently installed modifications, considering they are both the same SNAPSHOT version
There are workarounds for both (skipping parts of the build for the first, different snapshot versions for the second), but they are far from usable in practice.
Is there a way to make Maven use the local modules, instead of using artifacts from the local Maven repository?
If I understand your question correctly, it seems like you are living a bit outside the norm here: you have two local "copies" of the project with different modifications, that you want to work with alternately when running "exec:java". And Maven is getting in your way: it expects your local .m2 repository area to be in play, but the version strings in each copy are the same, so you end up with the changes interfering among the copies.
To me, it sounds like what you are trying to do is to test your changes. I suggest you just write an actual JUnit or TestNG test in module2 that tests what you want (it can just call mod2.B Main if you want). Then, from your chosen project directory, you can run mvn test -Dtest=MyTestName. It won't "install" anything and it will find the dependencies the way you want it to.
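A minimal sketch of such a test (the class name is made up, JUnit 4 is assumed on the test classpath, and mod2.B is assumed to have a standard main method):

package mod2;

import org.junit.Test;

public class BSmokeTest {
    @Test
    public void runsMainLikeExecJava() {
        // same entry point as mvn exec:java -Dexec.mainClass=mod2.B
        B.main(new String[0]);
    }
}

Running mvn test -Dtest=BSmokeTest from the parent directory lets the reactor compile module1 first, with no install involved.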
Otherwise, I can see three options.
Change the version string locally in one of the copies (mvn versions:set -DnewVersion=B-SNAPSHOT can do this for you). That way any "installed" jars from your work on that copy will not be considered by the other copy, and vice-versa. You refer to this as being "far from usable" ... I think it should be fine? These are different versions of the project! They should have different version strings! I strongly recommend this option out of the three. (You can do mvn versions:revert when done if you used :set, or you can rely on version control to undo the change.)
Select a different local repository used by Maven when working on one of the projects, with a command-line flag as per https://stackoverflow.com/a/7071791/58549. I don't really think this is a good solution, since you would have to be very careful about using the right flags every time with both projects. Also you'd end up having to re-download Maven plugins and any other dependencies into your new local repository anyway, which is kind of a waste of time.
Try to avoid using any local repository at all. You seem to be trying to make this option work. I don't think this is a great approach either; you're fighting against Maven's expectations, and it limits your flexibility a lot. Maven will indeed find dependencies from the "reactor" (i.e., the executing mvn process) first, but this means all of the required modules must be available in the reactor to be found, which means you can only run mvn at the top level. So if instead you want to just do "mvn exec:java" inside a single module, mvn needs to find that module's dependencies somewhere ... and that's what the local repo is generally used for.
If you're dead set on going with option 3 (instead of option 1), then I suggest you follow the comments on your question and create a profile that runs your exec selectively against module2 and binds it to a lifecycle phase. But this is in practice very close to just wrapping it with a test.
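For completeness, a sketch of that profile (the profile id, phase, and wiring are illustrative, assuming the exec-maven-plugin in module2's pom):

<profile>
  <id>run-b</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>exec-maven-plugin</artifactId>
        <executions>
          <execution>
            <id>run-mod2-b</id>
            <phase>verify</phase>
            <goals>
              <goal>java</goal>
            </goals>
            <configuration>
              <mainClass>mod2.B</mainClass>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>

Invoked from the top level as mvn verify -Prun-b, the whole reactor is in play and module1 resolves without an install.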
For IntelliJ users:
I solved this problem using IntelliJ's Run configuration. It has the options Resolve workspace artifacts and Add before launch task -> Build. See this picture for clarification:
[screenshot: Run configuration example]
The whole point of modules in Maven is to create decoupling between them. You either build each module independently, so that you can work on one module without touching the other, or include both modules as sub-modules in the parent pom and build the parent, which will resolve dependencies between its sub-modules and trigger their builds.
It looks like you have two options here:
Review the structure of your project. Do you really need to split it into two separate modules, if you change code in both of them simultaneously?
Import the project into a Maven-aware IDE (IntelliJ IDEA is very good at working with Maven), and let the IDE handle the compilation. Once the code-base is finished and stabilized, build normally with Maven.

Maven plugins and resource substitution

I have a regular requirement to execute several goals in Maven, so I decided to write a plugin for it. It seemed easiest to define a new lifecycle for this, with each phase executing the relevant plugin goal. I need to pass configuration to the plugin, specifically a directory and a version number.
I found that if I use a variable in lifecycle.xml, such as ${projname.directory}, the variable appears to be resolved not at plugin compile time but at project compile time. I'm guessing that lifecycle.xml is used within the project and not touched by the plugin. Is there any reference for understanding exactly how this works?
Also, I'd like to be able to use a default directory name if the projname.directory property is not set - storing this in the plugin source somewhere. I have no idea how to go about this - is there an easy way to do it?
Thanks,
-Dave
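For reference, the default for a plugin parameter is usually declared in the mojo source itself rather than in lifecycle.xml; a minimal sketch (the mojo name and default path are made up):

import java.io.File;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;

@Mojo(name = "prepare")
public class PrepareMojo extends AbstractMojo {
    // falls back to the default when -Dprojname.directory is not supplied
    @Parameter(property = "projname.directory",
               defaultValue = "${project.build.directory}/projname")
    private File directory;

    @Override
    public void execute() {
        getLog().info("Working directory: " + directory);
    }
}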

Specify a maven2 dependency version from the command line

I'm working on a contract that has some build oddities... they're using Maven, but the pom file is actually edited by the build script to replace the version number with the Jenkins build number, and then that same number is used to replace the versions of other internal projects which will be used at build time by this project. I am new to Maven, but I know enough to know this feels wrong.
I can pass in the version number, but putting the same property in the dependency block doesn't seem to work.
I know the tao of maven is serious business, please understand this is a very short term contract and build system isn't in my statement of work - I just want to get to a place where the source controlled files aren't edited by the simple act of running a build.
You can definitely define a Maven property with a version value and reference it in the dependency declaration. And Maven properties can be passed into Maven using the "-D" command-line option.
What I'm not sure of is whether the timing of how Maven runs will allow this to change the dependency version. I think (though I'm not 100% certain) that the dependency management will be resolved before command-line options are processed.
I'd try defining a Maven property with the dependency version in it, and referencing that property in the dependency declaration appropriately. Then, when running mvn, supply the desired version as a property value. That'd be the most likely approach, as sketched below.
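A sketch of that setup (the coordinates and property name are placeholders):

<properties>
  <internal.version>1.0.0</internal.version>
</properties>

<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>internal-lib</artifactId>
    <version>${internal.version}</version>
  </dependency>
</dependencies>

The build server could then run mvn clean deploy -Dinternal.version=1.0.${BUILD_NUMBER} instead of editing the pom in place.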
