I understand that this is against Maven best practices, but maybe my situation is one of the few exceptions to the rule - at least I'm stuck trying to think of alternatives :(
The environment is this:
we have a legacy application with proprietary-technology-based interfaces to the outside world
we want to use Flash as the new frontend
based on the legacy interface we generate Flash classes and package them in a Flash SWC to be used by the frontend developers
based on the legacy interface we generate Java classes which bridge the Flash service requests (coming in via BlazeDS) to our legacy interface
To make it more difficult, we don't want to / can't use a POM of its own for each interface, as we have dozens of them (interfaces) and they would only differ in their artifactId. Instead I use a "generic" project structure which gets parameterized (by Jenkins) for each build. The project will only be used in a fully automated environment.
First I tried to put all of this in one "simple" project, which works up to the point where the artifacts should get installed.
My current approach is a multi-module project structure inspired by the Maven reference, chapter 13, which has some disadvantages of its own:
GenericProject
|
+-- GenerateSources from legacy interface
| +-- pom.xml
|
+-- Java
| +-- pom.xml
|
+-- SWC
| +-- pom.xml
|
+-- pom.xml
This approach has the disadvantage that I have references from "Java" & "SWC" to the internal structure of "GenerateSources", which is ugly but tolerable.
What really gets in my way is that I have to heavily tweak the install and deploy plugins to get artifacts with the name and version of the legacy interface which triggered the whole process.
I got it running now, but it looks very brittle.
I considered splitting/duplicating the project into two simple projects:
GenerateSources & Java
GenerateSources & SWC
But this would only solve the minor annoyance of the cross-references.
As Aaron pointed out in his comment, I'm unclear in stating the problem.
After some more experiments this got a lot clearer to me:
Essentially I have two problems to solve:
install/deploy two artifacts together
name the artifacts differently from the project.artifactId
Any suggestions to make the whole process more Maven-like?
Thanks in advance.
After some detours with the multi-module approach I came to the following pragmatic solution:
use the build-helper-maven-plugin to attach a secondary artifact that gets installed/deployed automatically (a sketch follows below)
two-phase build:
2.1 generate a pom.xml via sed which contains the resolved project.artifactId & project.version
2.2 run the Maven build
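A minimal sketch of the attachment, assuming the SWC ends up under target/ (file name, type, and plugin version are assumptions; the artifactId/version in the surrounding POM are the literals substituted by sed in step 2.1):

    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>build-helper-maven-plugin</artifactId>
      <version>3.4.0</version>
      <executions>
        <execution>
          <id>attach-swc</id>
          <phase>package</phase>
          <goals>
            <goal>attach-artifact</goal>
          </goals>
          <configuration>
            <artifacts>
              <artifact>
                <!-- assumed location of the generated SWC -->
                <file>${project.build.directory}/${project.artifactId}.swc</file>
                <type>swc</type>
              </artifact>
            </artifacts>
          </configuration>
        </execution>
      </executions>
    </plugin>

The attached file is then installed/deployed together with the main artifact under the same groupId/artifactId/version, just with type swc.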
Although you theoretically can use expressions such as ${project.artifactId} & ${project.version}, Maven gives you a warning for this... for a good reason:
When you try to reference the produced artifacts, Nexus will give you a "Failed to read artifact descriptor for ..." error.
I suspect this is because the expressions are left unresolved in the POM stored in the repository!
You should write a small Maven plugin that you attach to the generate-sources phase. See the maven-annotation-plugin for an example (main class).
That will include the generated sources in the output of GenerateSources, and you can consume those classes just by including the dependency in the other POMs. Note that you should create those files under target/, not in src/.
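A sketch of such a binding, with a hypothetical generator plugin (the GAV and goal name are placeholders; only the phase and the target/ output directory matter here):

    <plugin>
      <groupId>com.example</groupId>                           <!-- hypothetical -->
      <artifactId>legacy-generator-maven-plugin</artifactId>   <!-- hypothetical -->
      <version>1.0</version>
      <executions>
        <execution>
          <id>generate-from-legacy-interface</id>
          <phase>generate-sources</phase>
          <goals>
            <goal>generate</goal>                              <!-- hypothetical goal -->
          </goals>
          <configuration>
            <!-- generate under target/, not src/ -->
            <outputDirectory>${project.build.directory}/generated-sources/legacy</outputDirectory>
          </configuration>
        </execution>
      </executions>
    </plugin>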
As for install/deploy: those plugins take the artifact names from the plugins which create the artifacts; in your case, that's the JAR plugin. The documentation has an example of how to set the name of the default artifact, so there must be something wrong with how you set the property.
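For illustration, a sketch of what that documentation example looks like, assuming the interface name and version arrive as properties (iface.name and iface.version are placeholder property names):

    <build>
      <!-- controls the file name of the default artifact in target/ -->
      <finalName>${iface.name}-${iface.version}</finalName>
    </build>

Note that this only renames the file in target/; install and deploy still derive the repository coordinates from the POM's artifactId and version.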
Try using Maven overlays; they're used to share resources between multiple web applications.
http://maven.apache.org/plugins/maven-war-plugin/overlays.html
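A minimal sketch, assuming a hypothetical shared-resources WAR (com.example:common-web) that the consuming WAR declares as a dependency of type war:

    <dependencies>
      <dependency>
        <groupId>com.example</groupId>        <!-- placeholder GAV -->
        <artifactId>common-web</artifactId>
        <version>1.0</version>
        <type>war</type>
      </dependency>
    </dependencies>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-war-plugin</artifactId>
          <configuration>
            <overlays>
              <overlay>
                <!-- resources from this WAR are merged into the final WAR -->
                <groupId>com.example</groupId>
                <artifactId>common-web</artifactId>
              </overlay>
            </overlays>
          </configuration>
        </plugin>
      </plugins>
    </build>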
Related
Which Maven build lifecycle phase is executed by clicking "Load Maven Changes" in IntelliJ (you know, the little "m" icon that pops up each time you change something in the POM)?
Cheers!
Well, that's a tricky question.
IDEA gets the project model using Maven component classes; during import it does not execute Maven goals per se.
IDEA loads the Maven libraries, injects itself into the Maven process, and calls Maven classes directly.
In short: first, IDEA reads the project model (using the org.apache.maven.model.io.ModelReader class; I don't know if an exact lifecycle phase corresponds to this).
Then dependencies and plugins are resolved (again using Maven classes directly, but this could be mapped to dependency:resolve). You can look at the implementation in org.jetbrains.idea.maven.server.Maven3XServerEmbedder.
To generate sources, the phase set in File | Settings | Build, Execution, Deployment | Build Tools | Maven | Importing | "Phase to be used for folders update" is used.
Frameworks detection, compiler settings, language level, artifacts configuration, etc. are not taken from a Maven execution at all. For such things IDEA reads the pom.xml files using its own parser.
But what do you want to achieve? If you describe your issue and the result you want, I'll try to help you find a solution.
Not a complete answer, but for illustration, a screenshot from IJ 2020.2.4. Guessing from the text "Analyzing..." that shows up when clicking the "m", IntelliJ performs a mvn dependency:analyze using the Maven Dependency Plugin.
[screenshot]
I have to design the build architecture of my small project and I wonder how to proceed.
I have classes in one project for the so-called core part, which can be used in other projects, and an API part in the same project which uses the core part.
So how should I proceed?
1. Make a multi-module Maven build and produce 2 artifacts (which we upload to our company repo), with api-part having a project dependency on core-part:
pom.xml
|
/core-part
|
/api-part(depends on core-part)
2. Or create separate projects:
Project1
/core-part
|
pom.xml
Project2
/api-part
|
pom.xml -> depends on project1's GAV artifact
Project3, ..., ProjectN depend on Project1?
"Strong advice" : a project belongs in a multi-module build if release of that build requires release of that module, and vice versa. Multi-module projects should consist of things that must be released together.
An API should [practically] never depend on the release of its implementation, whereas the release of an API [nearly always] dictates the release of a new dependent implementation.
If you're not doing formal releases (why aren't you doing formal releases?) then this advice is still in force but less strong.
1. is the way to go if you'd like to build core-part and api-part within one build, by building the aggregator/multi-module project (which has to have <packaging>pom</packaging>).
There is actually a third artifact created (and installed and deployed to the repositories) then: the one for the aggregator project.
2. is the way to go if you'd like to handle (build, install, deploy) core-part and api-part individually. (An option you still have with 1. anyway.)
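A minimal sketch of option 1, with placeholder GAVs:

    <!-- pom.xml (aggregator) -->
    <project>
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.example</groupId>        <!-- placeholder -->
      <artifactId>my-project</artifactId>   <!-- placeholder -->
      <version>1.0.0</version>
      <packaging>pom</packaging>
      <modules>
        <module>core-part</module>
        <module>api-part</module>
      </modules>
    </project>

    <!-- api-part/pom.xml then declares the dependency -->
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>core-part</artifactId>
      <version>1.0.0</version>
    </dependency>

With option 2 the dependency declaration in api-part looks exactly the same; the difference is only that core-part must already have been installed/deployed for the api-part build to resolve it.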
See:
POM Reference, Aggregation (or Multi-Module)
Maven: The Complete Reference, 3.6.2. Multi-module vs. Inheritance.
I was trying to build a Maven POM in something similar to the following hierarchical form:
root
+-- A-POM
+-- B-POM
+-- C-POM
+-- D-POM
I was hoping that this could take care of my changed-module problem. That is, if C is changed, then A must be rebuilt, etc.
But I ran into the issue that the packaging at the root is "pom", and after that I can't have A with packaging "war" and then continue to drill in to have A include B and C as its modules. It seems to me that any POM which does not have "pom" in the <packaging> element can't have child modules. Is my understanding correct? Is there a way to do what I wanted to do?
In addition, I don't seem to be able to chain the "changed" mechanism in Maven (probably due to my lack of knowledge). I'd like to have Maven detect that a dependent project has changed and rebuild all the affected projects.
Thanks so much!
The reactor project (the root of the multi-module project) must have pom packaging. So your nested structure is invalid since A is not of type pom, and I'm pretty sure you won't get it to work this way.
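To illustrate, a minimal sketch (module names from your diagram; the A-war leaf module is a hypothetical workaround): any POM that declares <modules> must use pom packaging, so the WAR itself has to move into a leaf module.

    <!-- root/pom.xml -->
    <packaging>pom</packaging>
    <modules>
      <module>A</module>
    </modules>

    <!-- A/pom.xml: may only list child modules if A itself has pom packaging -->
    <packaging>pom</packaging>
    <modules>
      <module>A-war</module>  <!-- hypothetical leaf module carrying <packaging>war</packaging> -->
      <module>B</module>
      <module>C</module>
    </modules>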
The second point is that Maven is a modularized build system and uses repository mechanisms to locate pre-built artifacts, instead of checking out all modules from version control and building them in a monolithic way like in the old days ;) This means that Maven cannot know what to rebuild when you change something in one module, since it simply does not have all the other modules available at that time.
I think this is more a CI task than something that should be handled by the build system itself. You can achieve such behavior with an appropriate build/CI server like Jenkins, which supports upstream and downstream projects. This means it is able to detect dependencies between the projects and trigger other builds as soon as a dependency has been built. This comes close to the behavior you are trying to achieve.
Btw, rebuilding other projects is only required for SNAPSHOT dependencies. Jenkins with the Maven plugin supports this behavior, but depending on the number of SNAPSHOT dependencies in your project, this can cause long chains of project builds on the server. Some folks are of the opinion that SNAPSHOT versions are, in general, hell for CI tasks, since these artifacts can change over time and are not reproducible. You could consider omitting SNAPSHOT versions completely and building final versions each time. This would also obviate your requirement to rebuild other modules as soon as a module changes. There are simply no changes until you upgrade dependency versions.
In a project I'm working on, we auto-generate the interface APIs in a folder called api/, which contains several sub-folders, each of which has a POM file able to compile the content of the module.
project-root
- api
- module-api-1
- pom.xml
- module-api-2
- pom.xml
- module-api-3
- pom.xml
- module-api-4
- pom.xml
- build
- pom.xml
Basically the pom.xml triggers the code generator, which then generates all the api/* modules. By the time I run mvn clean install within the folder build/, the api folder is empty, because it will only be filled by the code generator in the generate-code Maven phase.
Is there a way to tell build/pom.xml to handle the modules inside api/ (the names are known) within the same build?
If I specify a <module> which does not exist, mvn verify will complain.
Thanks
I believe the resolution depends on the flexibility of the list of APIs:
if the list of API modules is dynamic, it's simply not possible to declare a dependency on a particular module - nobody knows them in advance. I would consider generating the source folders and adding them to a single module (see the sketch after this list). As a result, you'll have a single all-in-one module which contains all the generated and compiled code. Other projects can use it as a dependency
if the list of API modules is fixed, their POM files, which declare the GAVs, should not be generated. Then other projects can use any of them as dependencies, although their code is generated only during the build
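For the first case, a minimal sketch of pulling a generated folder into a single module's compilation via the build-helper-maven-plugin (the output path and plugin version are assumptions):

    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>build-helper-maven-plugin</artifactId>
      <version>3.4.0</version>
      <executions>
        <execution>
          <id>add-generated-api-sources</id>
          <phase>generate-sources</phase>
          <goals>
            <goal>add-source</goal>
          </goals>
          <configuration>
            <sources>
              <!-- assumed output folder of the code generator -->
              <source>${project.build.directory}/generated-sources/api</source>
            </sources>
          </configuration>
        </execution>
      </executions>
    </plugin>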
If it were my project, I would declare the references to the modules statically in the POM (<modules> with module-api-1, module-api-2, ...) and also keep the module projects in a generated state, so that it could theoretically compile without generating the APIs. So what I'm saying is: just treat these modules as full-fledged module projects.
Then, and I assume this is important for you, if you have a change in the code that causes a change in one or more APIs, I would run the generator. If you need to reflect this changed API in a repo, you can still just install the changed module.
I know this probably isn't what you wanted to do, but I'm pretty sure you'll have fewer problems taking "the conservative way".
We are trying to migrate our builds from Ant to Maven. The project I am working on has been using Ant for ages. The scripts are really complex, and the build artifact is a zip file with a definite directory structure. The build creates about 108 unique JARs and packages them into this zip file along with many config files and other 3rd-party JARs.
We need this zip file to be the same as now even after the migration to Maven. I am just a Maven learner as of now. My question to you guys is whether there is a way to use one pom.xml to produce more than one JAR file, providing a list of JARs and include/exclude packages for each.
I googled and found that in case we need multiple JARs from one project but from different packages, we can do so by placing one pom.xml at each package (a JAR will be created from this package) and binding them all together using dependency management.
But this does not solve my problem, as having 100+ pom.xml files does not seem like a good idea.
Hope I have made my question clear. Please suggest if there is a way out.
Adding to Udo's answer, here's another Sonatype blog posting with a diagram explaining the Maven anti-pattern of generating multiple JARs from a single Maven project.
Both articles recommend adopting a modular structure for your code rather than fighting Maven's approach of decoupling large projects into a set of interrelated sub-modules.
Incremental publish approach
Start by looking at the artifacts you actually plan to share. Your mail suggests that the only file you're actually publishing is a large zip file containing 100+ JARs and other files?
You could invoke the Maven command-line tool to publish this zip to the Maven repository:
mvn deploy:deploy-file \
-Durl=$REPO_URL \
-DrepositoryId=$REPO_ID \
-DgroupId=org.myorg \
-DartifactId=myproj \
-Dversion=1.2.3 \
-Dpackaging=zip \
-Dfile=myproj.zip
This approach can also be used to publish JARs and POMs (containing dependencies). Eventually you'll be overwhelmed by the number of POMs to maintain... at which point it would be simpler to restructure the building of that JAR into a sub-module.
Alternative to switching build technology
It's very difficult to walk away from a legacy ANT build. These often contain complex, custom, and difficult-to-reproduce build logic. For such projects I recommend using Apache Ivy to externalize 3rd-party dependencies and share artifacts with other projects (which might be using Maven).
To that end I wrote an ant2ivy script for generating an initial Ivy setup, based on the JARs that already exist in the ANT project's directory (normally committed alongside the source).
Using Ivy doesn't get you away from the fact that it's a good idea to create project sub-modules. However, it does enable you to modernize your ANT build.
Update
Yes, there are Maven ANT tasks available. I don't use them because they are based on Maven 2. I'm disappointed that we're still waiting for their Maven 3 replacement, aether-ant-tasks (only available from GitHub). Ivy is still the no. 1 choice for integrating non-Maven clients with a Maven repository.
Well, you can generate multiple JARs out of one project.
It's not really considered best practice; look at the supplied article and decide for yourself.
In the includes you are not limited to packages; however, this surely makes it easier. :)
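A sketch of one such setup: extra executions of the maven-jar-plugin, each with its own classifier and include patterns (package paths are placeholders). Every additional execution needs a distinct classifier so the artifacts don't collide on install/deploy.

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <executions>
        <execution>
          <id>client-jar</id>
          <phase>package</phase>
          <goals>
            <goal>jar</goal>
          </goals>
          <configuration>
            <classifier>client</classifier>
            <includes>
              <include>com/example/client/**</include>  <!-- placeholder package -->
            </includes>
          </configuration>
        </execution>
        <execution>
          <id>server-jar</id>
          <phase>package</phase>
          <goals>
            <goal>jar</goal>
          </goals>
          <configuration>
            <classifier>server</classifier>
            <includes>
              <include>com/example/server/**</include>  <!-- placeholder package -->
            </includes>
          </configuration>
        </execution>
      </executions>
    </plugin>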