Background: we are migrating an application and have to adopt some new structures so the automatic installation process works at our customer's site.
Until now we have been using Maven classifiers to build 3 different flavours of the same module (say flavours A, B and C). The problem is that the automatic installation process at our customer's site can't handle this as we expect: while downloading from the Nexus, the 3 artifacts overwrite each other and only the last one actually ends up there.
There is absolutely no way we can fix this issue from the customer side (although it would be trivial!). Our customer won't change anything at all on their site - we have no control whatsoever over the environment there.
So the proposed solution to make this work is to deploy 3 artifacts with different artifactIds, i.e. modA, modB and modC.
My question is: how to achieve this in the 'most elegant way'?
We don't want to restructure the whole project as truly triplicating the number of modules would be horrendous.
My ideas so far, of which I'm not even sure whether they would work:
Use the antrun plugin to copy/rename the files, generate the corresponding pom files from a template, and then use deploy-file to upload them to the Nexus.
Using assemblies? How?
Creating new submodules with the corresponding pom files (containing the GAV the customer requires) inside each of the modules which should be deployed. Each of these submodules would basically only copy the artifacts from targetA/app--, targetB/... and targetC/... This way I hope to get an at least more or less clean deploy step (without using deploy-file to deploy a WAR and a pom). The pom files would be minimal, and viewed from the root module we would still see only one module 'mod' and hide the ugliness inside it, rather than having 3 modules 'modA', 'modB' and 'modC'. This way we would also decouple the devs' view and the deployment view of the project: devs could continue the way they are used to, and integrators would take care of the 'putting things the way the customer wants them' part. Now here my question is: how can I create a Maven project where the 'complete build' is actually nothing more than copying a WAR? (A rough sketch of what I have in mind follows at the end of this question.)
It is clear to me that all these choices are horrible. If you know of the 'right way' to solve my problem I would really appreciate it. Otherwise I would really just like to know how to achieve the 'copying a WAR' part.
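For reference, here is roughly what I imagine such a 'copy only' submodule to look like. This is only a sketch which I have not tried: the coordinates (com.example, the version) and the file name app-A.war are placeholders, I'm guessing at the antrun plugin for the copy, and I have not verified that build-helper's attach-artifact accepts a WAR without a classifier in this setup.

    <project xmlns="http://maven.apache.org/POM/4.0.0">
        <modelVersion>4.0.0</modelVersion>
        <parent>
            <!-- placeholder coordinates -->
            <groupId>com.example</groupId>
            <artifactId>mod</artifactId>
            <version>1.0.0-SNAPSHOT</version>
        </parent>
        <!-- the artifactId the customer's installer expects -->
        <artifactId>modA</artifactId>
        <packaging>pom</packaging>

        <build>
            <plugins>
                <!-- copy the already-built WAR (placeholder file name) out of targetA -->
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-antrun-plugin</artifactId>
                    <executions>
                        <execution>
                            <phase>package</phase>
                            <goals>
                                <goal>run</goal>
                            </goals>
                            <configuration>
                                <target>
                                    <copy file="${project.basedir}/../targetA/app-A.war"
                                          tofile="${project.build.directory}/${project.artifactId}.war"/>
                                </target>
                            </configuration>
                        </execution>
                    </executions>
                </plugin>
                <!-- attach the copied WAR so install/deploy push it under the new artifactId -->
                <plugin>
                    <groupId>org.codehaus.mojo</groupId>
                    <artifactId>build-helper-maven-plugin</artifactId>
                    <executions>
                        <execution>
                            <phase>package</phase>
                            <goals>
                                <goal>attach-artifact</goal>
                            </goals>
                            <configuration>
                                <artifacts>
                                    <artifact>
                                        <file>${project.build.directory}/${project.artifactId}.war</file>
                                        <type>war</type>
                                    </artifact>
                                </artifacts>
                            </configuration>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </build>
    </project>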
Related
I've inherited a few Maven projects which have added a /dependencies directory to capture Java jar libraries that aren't part of the project WAR and must be installed by DevOps into a Tomcat installation.
The libraries in this directory seem to fall into four categories:
"provided" scope libraries,
downstream dependencies of those provided libraries, and
discoverable implementations of api jars
"mystery" libraries, i.e., not available in an external repository, and maybe unsure where they ever came from.
Is there a strategy to get Maven to help manage these dependencies and perhaps fetch them for external install?
There are probably several strategies to choose from.
Number one: leave it as it is. If it works and the build is reproducible (on different environments), that seems like one valid solution.
The "mystery" part of the build might not be more of an issue for new people working with it.
I think it is also valid to create a separate Maven module to be delivered to the infrastructure team. This module can contain the jars currently in the /dependencies folder.
What you would need to do is create a pom.xml and add all the dependencies currently in that directory (of course not the transitive ones). The "mystery" ones would need to go into a repository proxy (Nexus, Artifactory, ...). If you don't have a Maven repository yet: you want one! (It's easy to set up and it helps a lot!)
I would then use the assembly plugin or some Ant task to build the zip to be delivered, so the infrastructure team is able to just unzip / copy the files to where they need to be. This step can then even be scripted (so the upload / unzip is done through SSH or something like that).
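Roughly, such a delivery module could look like the sketch below. The coordinates and the single dependency are made up; the real list would be whatever currently sits in /dependencies, and the descriptor path src/assembly/zip.xml is just a convention I picked here.

    <project xmlns="http://maven.apache.org/POM/4.0.0">
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.example</groupId>
        <artifactId>tomcat-libs</artifactId>
        <version>1.0.0</version>
        <packaging>pom</packaging>

        <dependencies>
            <!-- one entry per jar that is currently checked into /dependencies -->
            <dependency>
                <groupId>commons-lang</groupId>
                <artifactId>commons-lang</artifactId>
                <version>2.6</version>
            </dependency>
        </dependencies>

        <build>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-assembly-plugin</artifactId>
                    <executions>
                        <execution>
                            <phase>package</phase>
                            <goals>
                                <goal>single</goal>
                            </goals>
                            <configuration>
                                <descriptors>
                                    <descriptor>src/assembly/zip.xml</descriptor>
                                </descriptors>
                            </configuration>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </build>
    </project>

The descriptor (src/assembly/zip.xml) then packs the declared jars into a lib folder inside the zip:

    <assembly>
        <id>tomcat-libs</id>
        <formats>
            <format>zip</format>
        </formats>
        <includeBaseDirectory>false</includeBaseDirectory>
        <dependencySets>
            <dependencySet>
                <outputDirectory>lib</outputDirectory>
                <!-- only the jars listed in the pom: no transitive ones, no pom artifact -->
                <useTransitiveDependencies>false</useTransitiveDependencies>
                <useProjectArtifact>false</useProjectArtifact>
            </dependencySet>
        </dependencySets>
    </assembly>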
This is probably only one way to do it. I would assume that resolving the jars in the /dependencies directory may be a bit of a pain.
The advantage is obviously that you document and simplify the management of those libraries. I would also assume that if you update some of them, merging across branches becomes easier since there are no binary files around. So it may be worth the effort.
We are using the latest version of Jenkins CI and we have a large number of projects which have Maven dependencies on other projects. We are also using Jenkins views to group associated projects together.
I would like to be able to generate a graphical representation of the project hierarchy within a view. I am aware that if I select a project that I can see the upstream projects, but going through approximately 40 projects, writing this down and compiling it into a tree would be tedious, time-consuming and error-prone.
Does anyone know of a technique or plugin for Jenkins that could achieve this? Ideally it would work against all the projects within a view.
I would prefer an automated technique rather than performing it manually, since this process would need to be run periodically (say once a month) for a management report.
Update
Having investigated this question, I am not averse to writing a script to query the Jenkins API to get the JSON or XML for the projects within a view and then asking each for its upstream projects. But I'd rather save myself some work and use someone else's tool :)
You can use Maven to generate the dependencies for each project (http://maven.apache.org/plugins/maven-dependency-plugin/tree-mojo.html).
It won't give you a dependency tree for all your Jenkins projects, though. Maybe you could pull from all the Maven outputs and create your own? Or maybe (not really) create a super project in which all modules are your existing projects (again, not really).
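If you go that route, the tree mojo can at least write each module's tree to a file in a machine-readable format, which makes the "pull from all maven outputs" part easier to script. A sketch of the plugin configuration (the output path and the include pattern are just examples; com.yourcompany is a placeholder):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <configuration>
            <!-- write each module's tree to a file instead of the console -->
            <outputFile>${project.build.directory}/dependency-tree.dot</outputFile>
            <outputType>dot</outputType>
            <!-- restrict to your own artifacts to keep the graph readable -->
            <includes>com.yourcompany.*</includes>
        </configuration>
    </plugin>

Running mvn dependency:tree on each project then leaves one .dot file per module, which could be merged and rendered with something like Graphviz.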
There is a Downstream Buildview plugin. It's per job, and it displays job names, but if your job names are named after Maven modules, it shouldn't be an issue.
We are using Maven for building the project. It's a huge legacy project.
We newly added a few .keystore files to its resources folder.
The problem is, once the build is done, the .keystore files are getting tampered with [maybe Maven is trying to replace/search for some placeholders]. Since it's a legacy project, the project structure is very messed up and we have no separate distributions and no choice but to go with the plain build.
What I want is to tell Maven to copy these sorts of files without touching them, while keeping the build as usual, like before.
By the way, there is no explicit <resources> section mentioned in the pom.xml. I tried handling this as per http://maven.apache.org/plugins/maven-resources-plugin/examples/include-exclude.html but it messes up the project build.
I don't want to tamper with the build, since it's a huge legacy project. We are using the Ant plugin.
Just switch off filtering for the respective <resource/> or add an <exclude/> for it.
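Something along these lines, assuming the keystores live under src/main/resources (adjust the directory and the pattern to your layout):

    <build>
        <resources>
            <!-- keep filtering for everything else -->
            <resource>
                <directory>src/main/resources</directory>
                <filtering>true</filtering>
                <excludes>
                    <exclude>**/*.keystore</exclude>
                </excludes>
            </resource>
            <!-- copy the keystores verbatim, without filtering -->
            <resource>
                <directory>src/main/resources</directory>
                <filtering>false</filtering>
                <includes>
                    <include>**/*.keystore</include>
                </includes>
            </resource>
        </resources>
    </build>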
After going through a lot of sources, I found the solution: http://maven.apache.org/plugins/maven-resources-plugin/examples/binaries-filtering.html
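For anyone else hitting this: the configuration from that page boils down to declaring the extension as non-filterable in the resources plugin, roughly like this (using the keystore extension here):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-resources-plugin</artifactId>
        <configuration>
            <nonFilteredFileExtensions>
                <!-- files with this extension are copied byte-for-byte, never filtered -->
                <nonFilteredFileExtension>keystore</nonFilteredFileExtension>
            </nonFilteredFileExtensions>
        </configuration>
    </plugin>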
Thanks :)
We check all of our source code's dependent third-party JARs into source control along with our source code. When needed, we manually download updates to third party JARs and replace those JARs that are in source control with the newer versions. We haven't felt the need to use Maven yet as this process seems simple enough for us. But are we missing something of great value by not using Maven? Or does our scenario not warrant using Maven?
"JARs dont change much", I hear this all the time.....
Storing jars in the SCM is simple at the beginning of a project. Over time the number of jars gets larger and larger.... Wait 2 or 3 years and nobody remembers where the jars came from, what their licensing terms were, and most importantly what versions are being used (important to know when analysing security vulnerabilities).....
The best article I've read recently making the case for a repository manager is:
http://www.sonatype.com/people/2012/07/wait-you-dont-have-a-repository-manager/
A little irreverent, but it does make a valid point about the kind of technical inertia one encounters all the time.
Switching a project team from Ant to Maven can be scary.... Maven works quite differently, so I find it is best deployed with greenfield or adventurous project teams. For old-school Ant users, I recommend the Apache Ivy plugin. Ivy allows such teams to outsource the management of their dependencies but keep the build technology they're comfortable with.
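The entry cost for such a team is essentially one ivy.xml next to the existing build.xml, declaring the dependencies that used to be checked in as jars. A minimal sketch (the module name and the two libraries are only examples):

    <ivy-module version="2.0">
        <info organisation="com.example" module="myapp"/>
        <dependencies>
            <!-- versions are declared here instead of jars being checked into the SCM -->
            <dependency org="commons-lang" name="commons-lang" rev="2.6"/>
            <dependency org="log4j" name="log4j" rev="1.2.17"/>
        </dependencies>
    </ivy-module>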
Ultimately, the biggest benefit of using Maven is not dependency management; it's the standardized build process. I've seen several failed attempts to create a "standard" Ant build process - the problem is that every build engineer has his own opinion on what the standard should be.... Maven's approach of forcing users to write build plugins may appear restrictive in the beginning, but just like with the iPhone, developers eventually discover "there's a Maven plugin for that" :-)
When it comes to dependency management Maven really can be quite valuable. As Mark O'Connor suggests, running a local repository manager would likely be better than checking the artifacts into source control.
There are many tools (like m2e in Eclipse) that can help with dependency management and provide valuable feedback on which modules or dependencies require which other dependencies. Maven will also make sure to get the appropriate version of a dependency even if different modules depend on different versions of a given library. That will help prevent duplicate versions of the same jar showing up in your deployed project, as long as they have the same group and artifact id.
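One common way to keep every module on the same version is a <dependencyManagement> section in the parent pom; a minimal sketch (the library is just an example):

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>com.google.guava</groupId>
                <artifactId>guava</artifactId>
                <version>31.1-jre</version>
            </dependency>
        </dependencies>
    </dependencyManagement>

Modules then declare the dependency without a version and all pick up the managed one.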
Even for a very simple project I don't think I would resort to checking dependencies into the source control system.
It's not only about 3rd-party libraries; it matters mostly when you have multiple repositories. In our case, we had four repositories with lots of inter- and intra-dependencies.
Actually, I started writing this answer and then had to leave for 15 minutes to talk to a colleague about a problem that happened after someone forgot to update the .jar of one project in the other project's lib directory.
And it looks more professional :)
Hi there, I need some information or general tips on a problem with Maven.
Context:
We just migrated one big Eclipse project into 4 Maven projects. (That's one step in the right direction!)
We were building that project with an Ant script (build.xml), selecting the tasks to run "on demand".
To keep it simple, here are the 4 projects: Core, Client, Server, Admin.
Each of those Maven projects builds into a jar. This has been established and it is working perfectly. Core is a dependency of Client and Server.
We use Jenkins-CI and Artifactory on a remote server.
Problem:
I need to create some kind of "parent project" that will build all those other Maven projects and add, "on demand", some of the tasks that we were doing with the Ant script.
Example: we want to build locally (so we don't use Jenkins and Artifactory on this side) for our developers, so they can manually test their changes (yes, we have no tests for now, we are working on a legacy system). In this build, we do not want to obfuscate the code or sign our jars, etc.
We also want a "customer build" (the real release that we push to the server, so it does use Jenkins and Artifactory) that will add some tasks to some of the 4 projects, like obfuscating the code, signing the jars, etc.
For this "customer build", we need to be able to select our dependency of a library "on-the-fly" or more like "On demand". Our program is an extension to another software and all our customer don't use the same version. To make it simple the library "y" can be y-2.0.1.jar or y-2.0.2.jar.. etc
All of those "tasks" I need can be done with different Maven plugins with no problem.
Question: what would be a good practice to solve my problem? We would really like to get rid of our Ant script. Also, we are cleaning up a big, dirty project, so I would like a clean solution without lots of duplicated stuff or lots of manual steps each time we want to build, either locally or on the remote server for our customers.
Idea: I thought I could use different Maven profiles in all those 4 projects, as I saw here:
Ant to Maven - multiple build targets. But then I would have a seriously huge pom.xml for each project with lots of duplicated stuff, so I really don't like this idea. I also thought we could have a parent Maven project, but this would contain no code, so I think I'm wrong with this idea as well.
Thanks for answering and for your time!
Going with Maven profiles is the right thing to do for this kind of customization. You'll probably end up with a developerProfile and a releaseProfile, or something similar.
And yes, your poms will be big and complicated.
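To give a rough idea of the shape, in a shared parent/aggregator pom (only a sketch: the jarsigner plugin stands in for whatever signing/obfuscation plugins you actually use, and the ${y.version} property is a made-up name):

    <modules>
        <module>Core</module>
        <module>Client</module>
        <module>Server</module>
        <module>Admin</module>
    </modules>

    <properties>
        <!-- default version of the extension library; override per customer,
             e.g. mvn -PreleaseProfile -Dy.version=2.0.2 clean deploy -->
        <y.version>2.0.1</y.version>
    </properties>

    <profiles>
        <!-- local developer build: no signing, no obfuscation -->
        <profile>
            <id>developerProfile</id>
            <activation>
                <activeByDefault>true</activeByDefault>
            </activation>
        </profile>

        <!-- customer/release build: adds signing (obfuscation would hook in the same way) -->
        <profile>
            <id>releaseProfile</id>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.apache.maven.plugins</groupId>
                        <artifactId>maven-jarsigner-plugin</artifactId>
                        <executions>
                            <execution>
                                <phase>package</phase>
                                <goals>
                                    <goal>sign</goal>
                                </goals>
                            </execution>
                        </executions>
                    </plugin>
                </plugins>
            </build>
        </profile>
    </profiles>

The project that needs the extension library would declare it with <version>${y.version}</version>; developers just run mvn clean install, while the customer build activates the release profile and passes the version on the command line.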
It looks like your demands are a little bit too much for what Maven can provide out of the box, and it's not the best tool for highly customized builds. Since (as I understand it) you are in the early days of your new build infrastructure, I'd advise you to look at Gradle. You could reuse your Ant tasks, and both Jenkins CI and Artifactory work great with Gradle.