Maven for deployment of a client/server app?

We're currently moving from a rather old Perl + Ant build system to Maven.
The software itself is a client/server app that uses RMI as its main communication mechanism, and the build produces a (large) number of jars.
This software predates the maturity of the usual frameworks, so it does not use:
any of the usual persistence layers (Hibernate/JPA and the like)
any MVC application framework like Spring
any Java EE infrastructure (no war/ear packaging)
In fact, it is shipped as a set of jars and conf files, together with a set of startup/admin scripts.
We have ported the compilation/build system to Maven 3. The client and server deployment trees are currently being ported to Maven with the assembly plug-in.
What we're currently missing is a way to deploy the app on development stations and build servers and to run some basic integration tests.
Ideally we'd like to use Maven to deploy an image of the client, the server and a DB fixture (which is itself generated code).
I've found a lot of pointers on how to deploy a web app, a few on the integration tests themselves, and almost none on how to handle the deployment of a (very) old-school C/S architecture with Maven.
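To give an idea of the kind of image we produce, a minimal assembly descriptor along these lines covers the jars, conf files and scripts (the directory layout and paths below are illustrative, not our real ones):

    <!-- src/assembly/server.xml - minimal sketch, paths are illustrative -->
    <assembly>
      <id>server</id>
      <formats>
        <format>dir</format>
        <format>tar.gz</format>
      </formats>
      <dependencySets>
        <!-- all the jars the server module depends on, plus its own jar -->
        <dependencySet>
          <outputDirectory>lib</outputDirectory>
          <useProjectArtifact>true</useProjectArtifact>
        </dependencySet>
      </dependencySets>
      <fileSets>
        <!-- conf files and startup/admin scripts shipped next to the jars -->
        <fileSet>
          <directory>src/main/conf</directory>
          <outputDirectory>conf</outputDirectory>
        </fileSet>
        <fileSet>
          <directory>src/main/scripts</directory>
          <outputDirectory>bin</outputDirectory>
          <fileMode>0755</fileMode>
        </fileSet>
      </fileSets>
    </assembly>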
So my questions are:
are there any known, mature plug-ins capable of deploying the app and handling the DB fixture in a C/S setup?
If not, what would you advise:
binding Ant tasks to the Maven lifecycle to handle the deployment process (we have a couple of Ant targets that could be adapted to fit in; a sketch follows below)?
writing a Maven plug-in that fits our exact deployment needs?
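For the first option, a minimal sketch of what we have in mind, assuming an existing deploy.xml with a deploy-local target (both names are hypothetical), bound just before the integration tests:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-antrun-plugin</artifactId>
      <version>3.1.0</version>
      <executions>
        <execution>
          <id>deploy-test-image</id>
          <!-- run the existing Ant deployment logic just before the integration tests -->
          <phase>pre-integration-test</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <target>
              <ant antfile="${project.basedir}/deploy.xml" target="deploy-local"/>
            </target>
          </configuration>
        </execution>
      </executions>
    </plugin>

The integration tests themselves would then run in the integration-test phase (e.g. via the failsafe plug-in), with a matching teardown target bound to post-integration-test.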

Related

How to setup multiproject structure with maven?

I'm fairly inexperienced and completely new to the whole world of build tools, so here's my situation: I am developing a webapp with JSF, PrimeFaces and Hibernate on wildfly-9.0.2-final. All Java files (incl. ManagedBeans, DAOs, model classes, etc.) are currently in a regular Eclipse Java project called MyApp-CORE. There is no HTML or any other resources in that project, but it does contain all the third-party libraries like PrimeFaces, commons-xy, etc. Then I have two dynamic web projects with all the .xhtml files and related resources. Both web projects include the CORE in their build path (all done via Eclipse's built-in tools). Basically I followed Structure for multiple JSF projects with shared code so far. All projects are versioned using Git.
I was now asking myself how to mavenize the whole thing and also how to properly include tests. The final result should be:
I want a build file for each web project that includes the CORE dependency and all of its transitive dependencies, creates a .war file and deploys it either on the production system or locally (depending on some parameters I want to be able to maintain).
This build file could then, for instance, test and build the CORE and then the .war file.
Since I'm using JSF, pretty much the only option for testing is JSFUnit. Should I test each web project individually and put all the test cases there (which would be highly redundant, because they're mostly the same and only a few features differ), or should I rather create a separate web project called MyApp-TEST which tests the CORE.jar and also, depending on some configuration, each web project?
I've already created a structure that makes it possible to include the core in the web projects, but unfortunately I lose the perks of hot deployment in WildFly when just including it as a dependency from my local Maven repository.
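The kind of structure I have in mind is a parent aggregator roughly like this (all coordinates and module names below are invented by me):

    <!-- parent pom.xml - aggregates the core jar and the two web projects -->
    <project xmlns="http://maven.apache.org/POM/4.0.0">
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.example.myapp</groupId>
      <artifactId>myapp-parent</artifactId>
      <version>1.0.0-SNAPSHOT</version>
      <packaging>pom</packaging>

      <modules>
        <module>myapp-core</module>   <!-- jar: ManagedBeans, DAOs, model classes -->
        <module>myapp-web-a</module>  <!-- war: first web project -->
        <module>myapp-web-b</module>  <!-- war: second web project -->
      </modules>
    </project>

    <!-- in each war module: pull in the core and its transitive dependencies -->
    <dependency>
      <groupId>com.example.myapp</groupId>
      <artifactId>myapp-core</artifactId>
      <version>1.0.0-SNAPSHOT</version>
    </dependency>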
So, to summarize it:
What would be a best practice for this setup, eventually leading to a continuous integration scenario?
How should I include the test cases (full integration tests that test actual UI behaviour)?
Which Tool (Maven, Gradle, Ant, etc.) would be best for that task?
How can I keep using hot deployment for smooth development?
Thanks in advance for any comments, hints or shared experience!

OSGi vs Maven: which is the better packaging tool?

We have a very big web application containing many features. Now, for maintainability, we want to split the application into components so that we can remove/add particular components (jars). One suggestion that has come up is to use OSGi. I think converting the jars into bundles will take a huge effort, and the same functionality can be achieved with Maven. According to my understanding, OSGi is a packaging tool. If I can make a Maven plug-in for each component, then any particular component can be included or removed at compile time, as opposed to run time as in the case of OSGi.
Modularizing the application using Maven would be simpler than OSGi. I have read a similar post on this site, and it commented that comparing OSGi and Maven is like comparing apples with oranges. But I think that in one sense they are the same, as both are meant for packaging; the difference is that one is used at run time and one at compile time.
Looking forward to a well-thought-out answer :)
best wishes
Shailesh
As you already hinted at yourself: you're comparing apples with oranges.
OSGi is not a packaging tool.
OSGi bundles are plain JAR files with some OSGi-specific metadata in the manifest.
You can create OSGi bundles using Maven, e.g. with the Maven Bundle Plugin (I can recommend this approach). So regardless of whether you use OSGi or not, I strongly recommend using Maven.
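A minimal sketch of what that looks like in the bundle's POM (the package names are placeholders; check for the current plugin version):

    <!-- the module's packaging must be set to "bundle" -->
    <plugin>
      <groupId>org.apache.felix</groupId>
      <artifactId>maven-bundle-plugin</artifactId>
      <version>5.1.8</version>
      <extensions>true</extensions>
      <configuration>
        <instructions>
          <!-- packages other bundles may use -->
          <Export-Package>com.example.myapp.api.*</Export-Package>
          <!-- internal packages, kept out of the public API -->
          <Private-Package>com.example.myapp.internal.*</Private-Package>
        </instructions>
      </configuration>
    </plugin>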
Here are some use cases for OSGi:
You want to create different versions of your application, e.g. for different customers. With OSGi you can just add/remove bundles without having to touch any other configuration.
You need a plugin system so 3rd parties can provide plugins for your application
You want your application to be truly modular
You want to share some code with other applications but want to hide some internal classes
...
OSGi is much, much more than a packaging tool. You could say that OSGi has a packaging tool inside it. Maven is a packaging tool and a dependency manager. Given the level of complexity and the use you say you'll make of this technology, I'd say go with Maven.

OSGi Apache Felix - hot deployment support

A little background: we are using the Apache Felix implementation of OSGi for our web development (Adobe CQ5, which in turn is built on Apache Felix). We have a few bundles of our own (around 10), and each of them is configured as a project.
Issue: during the development lifecycle, we make changes to a bundle and then use an Ant script to build the bundle and deploy it in Felix. I am wondering if there is some way to enable hot deployment of the changes I make during development, which would save developers time.
Based on my research, we can use Felix File Install, which will monitor a folder (or folders) for changes to any bundles and deploy them automatically. But that still means I need to run the Ant script to build the jar file and move it to the auto-deploy folder File Install is watching. Is there a better/faster way to achieve this? The script currently takes around 10 seconds (approx.) to compile the classes, create the OSGi-specific metadata files and bundle the classes + metadata into a new jar. Is there some way to do hot deployment, so that any change I make to a Java file is automatically reflected in the bundle?
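For what it's worth, if the bundle build itself moved to Maven, the copy into the watched folder could at least be chained onto the build instead of being a manual step; a rough sketch, where fileinstall.dir would be a property we define, pointing at whatever directory File Install is configured to watch:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-antrun-plugin</artifactId>
      <version>3.1.0</version>
      <executions>
        <execution>
          <id>copy-to-fileinstall</id>
          <phase>install</phase>
          <goals>
            <goal>run</goal>
          </goals>
          <configuration>
            <target>
              <!-- drop the freshly built bundle into the File Install watch folder -->
              <copy file="${project.build.directory}/${project.build.finalName}.jar"
                    todir="${fileinstall.dir}"/>
            </target>
          </configuration>
        </execution>
      </executions>
    </plugin>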
Many thanks
If you develop your project in Bndtools and run it from the built-in launcher, then Bndtools will immediately rebuild any Java code that you change and deploy the updated bundle into the runtime. This leads to an extremely quick code/test/debug/fix cycle.
Having said that, I'm amazed that it currently takes 10 seconds to compile and build your bundles! Are you building on an extremely ancient computer? Or is the bundle multiple gigabytes in size?
We tried DCEVM and it does almost everything we expected to shorten the develop + fix + test cycle. I recommend it to all Java developers working on big web applications. Thanks for your suggestion on Bndtools, Neil.

Play framework core classes externalized in a Maven module

I'm starting development on a new project, and for now I'm quite pleased with and interested in what Play! 2.1 offers, so I'd like to use it for the main website and the end-user experience. However, I wouldn't want to be dependent on Play for the rest of the project; I'd like to be able to write the admin module with, say, GWT at any time, and so on.
So I'm thinking of using Play! as a kind of front-end for my application, which I'd write using the Spring framework, Hibernate and Maven. I'd then package it as a jar and add it to Play as an SBT dependency.
The problem is this: how do I set up the development environment with two projects, one for the core classes using Maven and the other for the website using Play, so that whenever I change code in the core project I don't have to build and redeploy the jar files to an internal Maven repo from which Play would read them? How can I achieve this kind of setup?
BTW I'm using IntelliJ IDEA if this matters.
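For reference, the core project I'm picturing would just be an ordinary jar POM with a SNAPSHOT version (the coordinates below are invented), installed locally with mvn install and, if I add the local Maven repository as a resolver, picked up from there by the Play/SBT build; the open question is how to avoid that install/redeploy round trip on every change:

    <!-- core module pom.xml - plain jar, installed locally with "mvn install" -->
    <project xmlns="http://maven.apache.org/POM/4.0.0">
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.example.myproject</groupId>
      <artifactId>myproject-core</artifactId>
      <version>0.1-SNAPSHOT</version>
      <packaging>jar</packaging>
    </project>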

Maven: Dealing with a truly circular dependency

I have a somewhat complex situation and I'm not sure what the best way to set up my Maven environment is.
I'm writing a framework to allow the creation of tests for a particular system. The system can host a variety of applications, and I have a set of tests for each application, which I'd like to keep separate from the framework (which handles the general concept of a "test", message sending/receiving, etc.). Also, the framework provides a generic UI, so it can be built as a war and deployed, allowing you to run and configure tests.
What I'm currently doing is building the framework both as a jar and as a war, listing the framework as a dependency in each application test suite, and pulling in all the framework code using an overlay so that each suite can build its own war to deploy. This is gross.
What I'd like is to be able to specify (probably via profiles) which test suites to build into the framework, and to end up with a single framework.war with the specified suites included. Trying to write the POMs for this, I keep running into a circular dependency because:
To build the tests, the test projects must depend on the framework
To pull in the specified test jars, the framework must depend on those test projects
Things I've tried:
Make the test suites sub-projects of the framework:
This doesn't work, as (I think) I can't package the final result as a war (aggregator projects only allow pom packaging).
List the test jars as system dependencies:
This works, but it's gross to have to manually specify a path to each jar.
Put the tests in Java packages inside the framework and compile only what you want via filters:
Technically possible, but I would really prefer the logical separation into separate Maven projects, since each test suite can be configured too, and I'd like to keep all that config out of the framework POM.
What would be ideal would be a parent project pom that would:
compile the framework with no tests
compile the specified test suites
rebuild the framework .war, including the specified test suite jars
I'm not sure if this is possible and/or even advisable, but it seems the best solution to me. Thanks in advance for any suggestions for organizing this project.
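For concreteness, the profile-driven packaging I keep picturing would look roughly like this in the framework's war POM (the suite coordinates are invented); as far as I can tell it only avoids the cycle if the framework jar and the framework war are separate modules, so that the suites depend on the jar while only the war depends on the suites:

    <!-- framework-war/pom.xml: suites are pulled in only when their profile is active -->
    <profiles>
      <profile>
        <id>suite-app-a</id>
        <dependencies>
          <dependency>
            <groupId>com.example.tests</groupId>
            <artifactId>app-a-test-suite</artifactId>
            <version>1.0-SNAPSHOT</version>
            <!-- runtime scope is enough: the jar just needs to land in WEB-INF/lib -->
            <scope>runtime</scope>
          </dependency>
        </dependencies>
      </profile>
    </profiles>

Building with mvn package -P suite-app-a would then produce a framework.war with just that suite included in WEB-INF/lib.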
