Application Server for non-Web Spring/Hibernate Application

We are developing an open source trading platform based on the Spring Framework and Hibernate (http://code.google.com/p/algo-trader/ and http://www.algotrader.ch). The application consists of a trading framework and several strategies that can be started independently. So far, these different parts have been running in separate JVMs communicating through RMI and JMS.
To avoid unnecessary serialization and network overhead we would like to run the entire application within some sort of container (potentially an application server). We do, however, have the requirement that the individual parts of the application can be deployed, started and stopped independently.
We have looked into OSGi, but a lot of the libraries we use are not OSGi ready yet, so this is not currently an option. Also, please note that there is no web GUI in our application.
Any suggestions on this?
Thanks
Andy

If OSGi is not an option, the functionality can be broken into smaller units and deployed as utility JARs; deployed that way, they can be managed independently.
As for the application server, I feel either GlassFish or JBoss would be a good option, considering they are open source and free.
At a later point you could also look at WebLogic (free for development).
So in your case you would split the static data configuration (counterparties, currencies) and the dealing functionality (pricing, quoting, booking) into two separate features.

For your choice of application server I advise JBoss, especially version 7.1, which is faster and more stable!

Related

Update for Java EE application

Our application is built on Spring Boot; the app is packaged as a war file and run with java -jar xx.war -Dspring.profile=xxx. Generally the latest war package is served by a static web server like nginx.
Now we want to know if we can add auto-update to the application.
I have googled, and people suggest using an application server that supports hot deployment; however, we use Spring Boot as described above.
I have thought about starting a new thread once my application starts, then checking for updates and downloading the latest package. But I have to terminate the current application to start the new one since they use the same port, and if I close the current app, the update thread will be terminated too.
So how do you handle this problem?
In my opinion that should be managed by some higher-order dev-ops-level orchestration system, not by the app or its container. The decision to replace an app should be made at the dev-ops level, not the app level.
One major advantage of Spring Boot is the inversion of the traditional application-in-web-container model into a web-container-in-app model. As such the web container is usually (and as best practice with Spring Boot) built into the app itself. Hence it is fully self-contained and, crucially, immutable. It therefore should not be the role of the app or its container to replace either part of or all of itself.
Of course you can do whatever you like, but you might find that the solution is not easy because it is not the convention to do it this way.
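To make this concrete, here is a minimal sketch of a self-contained Spring Boot application (class and package names are made up for illustration): the embedded servlet container is started by the application itself, so the deployable artifact is one immutable unit.

    // Minimal sketch of a self-contained Spring Boot application. The embedded
    // servlet container is started by the app itself, so the whole deployable
    // unit is a single immutable jar/war. Names are illustrative.
    package com.example.trading;

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;

    @SpringBootApplication
    public class TradingApplication {

        public static void main(String[] args) {
            // Starts the Spring context and the embedded web server; replacing
            // the app means replacing this whole artifact, not parts of it.
            SpringApplication.run(TradingApplication.class, args);
        }
    }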

Spring Boot project published to production: choose war (standalone Tomcat) or jar (embedded Tomcat)?

In my latest project I used Spring Boot, and I am preparing to deploy it to the production environment. I want to know which way of running the application gives better performance, or whether they perform the same:
generate a war package and put it in a standalone Tomcat
generate a jar package and use the embedded Tomcat
In addition, when publishing to the production environment, should the devtools dependency be removed?
This is a broad question. The answer is it depends on your requirements.
Personally, I prefer standalone applications with Spring Boot today. One app, one JVM. It gives you more flexibility and reliability with regard to deployments and runtime behaviour. Spring Boot 1.3.0.RELEASE comes with init scripts which allow you to run your Spring Boot application as a daemon on a Linux server. For instance, you can integrate the rpm-maven-plugin into your build pipeline in order to package and publish your application as an RPM for deployment, or you can dockerize your application easily.
With a classic deployment into a servlet container like Tomcat you will face various memory leaks after redeployment, for example from logging frameworks, badly managed thread-local objects, JDBC drivers and a lot more.
Either you spend time fixing all of those memory leaks inside your application and the frameworks you use, or you just restart the servlet container after a deployment. Running your application standalone, you don't need to care about those memory leaks, because you are forced to restart anyway in order to bring your new version up.
In the past, several webapps ran inside one servlet container. This could lead to performance degradation for all webapps, because every webapp has its own memory, CPU and GC characteristics which may interfere with each other. Furthermore, resources like thread pools were shared among all webapps.
In fact, a standalone application is not safe from performance degradation due to high load on the server, but it does not interfere with others in terms of memory utilization or GC. Keep in mind that performance or GC tuning is much simpler if you can focus on the characteristics of just one application. It gets complicated as soon as you need to find a common denominator for several webapps in one servlet container.
In the end, your decision may depend on your work environment. If you are building an application in a corporation where software is running and maintained by operations, it is more likely that you are forced to build a war. If you have the freedom to choose your deployment target, then I recommend a standalone application.
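For reference, a single entry point can support both packaging options mentioned in the question: run as an executable jar with the embedded Tomcat, or deploy the war to a standalone Tomcat. This is only a sketch; the class names are made up, and the import path of SpringBootServletInitializer has moved between Boot versions (the one shown is the Boot 2.x location).

    // Sketch of an entry point that works both as an executable jar (embedded
    // Tomcat) and as a war deployed to a standalone Tomcat. Names are illustrative.
    package com.example.demo;

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.boot.builder.SpringApplicationBuilder;
    import org.springframework.boot.web.servlet.support.SpringBootServletInitializer;

    @SpringBootApplication
    public class Application extends SpringBootServletInitializer {

        // Used when the war is deployed to an external servlet container.
        @Override
        protected SpringApplicationBuilder configure(SpringApplicationBuilder builder) {
            return builder.sources(Application.class);
        }

        // Used when the packaged artifact is started with "java -jar".
        public static void main(String[] args) {
            SpringApplication.run(Application.class, args);
        }
    }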
In order to remove devtools from a production build, you can set the excludeDevtools build property to completely remove the JAR. The property is supported by both the Maven and Gradle plugins.
See the Spring Boot documentation.

When do we need to run a Java application in a container?

Lately I started to learn Java EE and related technologies, and there are some concepts which confuse me. Somewhere I read that whenever one is building a Java EE application, it is more or less mandatory to use a container.
Currently, I am learning the Spring Framework and trying to build a small application with it to get hands-on experience. Now I am not sure whether it is mandatory for me to use a container (say Tomcat), or whether it depends on the application I am building.
If it depends on the application that one is building, then what are the factors which help to decide whether a container should be used or not?
Puuhhh, this is a very big question and there is no simple answer. But I will do my best to explain my own opinion at least:
What are containers?
Containers provide functionality to you. Such functionality can be to handle web requests and dispatch them to servlets - in this case we call them servlet containers (e.g. Tomcat or Jetty).
But containers can also provide other things, e.g. they can provide user authentication, logging or the connection to a database. Most containers (e.g. Tomcat) do several of those things (e.g. Tomcat does all of the ones I mentioned). Some containers do more than others, e.g. JBoss can do much more than Tomcat.
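As an illustration of what "dispatching to servlets" means, here is a minimal servlet (names are made up); the container, e.g. Tomcat or Jetty, accepts the HTTP request, manages threading and the request/response lifecycle, and calls this class.

    // Minimal servlet sketch: the servlet container receives the HTTP request
    // and dispatches it to this class; the application only supplies the callback.
    import java.io.IOException;

    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    @WebServlet("/hello")
    public class HelloServlet extends HttpServlet {

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            resp.setContentType("text/plain");
            resp.getWriter().println("Hello from the servlet container");
        }
    }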
Trade Off
However, there is a trade-off: if you use a simple container (like Tomcat), you need to do a lot of things on your own or by using other frameworks (like Spring). But if you use a powerful container, you must know the container very well, and the chance is high that your application will depend on this concrete container sooner or later.
The point is that using a container is not mandatory. It is a decision. Some people will argue for it, others against it. But depending on the books you read, this decision has already been made (e.g. J2EE needs a J2EE container; that's how it works).
The trend (IMHO)
Years ago the trend was to use big and powerful (J2EE) containers which provide as much as possible. IMHO the trend today is to use smaller and lightweight solutions. Most developers would prefer to use a Tomcat server instead of a JBoss server today.
Frameworks without containers
While J2EE needs a container, there are other frameworks/technologies which support the development of web applications without any external container. Examples of such frameworks are Play! and Spark Java.
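For example, here is a hedged sketch of a Spark Java endpoint (names are illustrative): the HTTP server is embedded and started by the application code itself, so no external servlet container is needed.

    // Sketch of a web endpoint with Spark Java: the embedded server is started
    // by this code, so no external servlet container is involved.
    import static spark.Spark.get;
    import static spark.Spark.port;

    public class NoContainerExample {

        public static void main(String[] args) {
            port(8080); // embedded server started by the framework itself
            get("/hello", (request, response) -> "Hello without an external container");
        }
    }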
Note
If you are not familiar with containers and Spring, take care not to get confused. Most applications you will develop with Spring are web applications which will be deployed to a servlet container. This is very common. But Spring doesn't rely on that. You can also use Spring without such a container, e.g. to develop a desktop application. But if you want to develop a web application, the Java way is to use a servlet container.
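For example, a minimal sketch (bean and class names are made up) of Spring used without any servlet container, as you might do for a desktop or batch application:

    // Sketch of Spring without a servlet container: the application context is
    // created in a plain main method, e.g. for a desktop or batch application.
    import org.springframework.context.annotation.AnnotationConfigApplicationContext;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    public class DesktopApp {

        interface GreetingService {
            String greet();
        }

        @Configuration
        static class AppConfig {
            @Bean
            GreetingService greetingService() {
                return () -> "Hello from a container-less Spring application";
            }
        }

        public static void main(String[] args) {
            AnnotationConfigApplicationContext ctx =
                    new AnnotationConfigApplicationContext(AppConfig.class);
            System.out.println(ctx.getBean(GreetingService.class).greet());
            ctx.close();
        }
    }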
If your application is using servlets, you'll need a container to handle the requests. Tomcat is a very popular choice.
I'll anticipate the next topic you'll want to cover with this discussion of "application server" versus "container."
There are two kinds of containers. One is the web container (e.g. IIS, Apache) to run web applications, and the other is the "application container" to run enterprise applications.
Web applications = apps developed using HTML, XML, CSS and JSPs
Enterprise applications = apps developed using Java, Java EE and servlets in addition to HTML and XML.

JBoss Deployment Info

More of a standard-practice question:
Is there any difference between deploying an app as an EAR vs a WAR? How do you decide? (I know a WAR is just a web application that may or may not have Java EE features like messaging.)
Let's say I have a Spring MVC application stack with Hibernate (MySQL DB); should this be deployed as a WAR or an EAR?
When do we need to worry about JBoss deployment descriptors if I am not using EJBs (just Spring MVC)? Let's assume I have JMS as well. Do we need to configure/update/create any other JBoss-related config files?
When we package our application as an EAR/WAR, it includes EVERYTHING we need for our app. Is there a scenario where we need to keep some config/XML files outside of this archive in a specific JBoss folder?
Is it common practice to deploy directly from Eclipse, or is it better to use Ant, etc.? Advantages/disadvantages?
Obviously, I am a newbie :-). Trying to understand this.
1.
This is not always an easy decision, but for beginners and for small projects I would say it's nearly always a WAR. The reason for using an EAR is mainly to isolate a business layer from a UI/web layer. See this question for more details: How can one isolate logical layers of a Java EE application
2.
I might be mistaken but I think that Spring people typically prefer WARs.
3.
JBoss (vendor) specific deployment descriptors are mostly needed to configure so-called "administered objects" and security. Sometimes they can be used for extra features that are not covered by the Java EE specification (e.g. setting the web root for a WAR). Administered objects are typically data sources (connection to a database) and JMS destinations (queues and topics).
In the traditional Java EE approach these have to be created as far away from the code as possible, which typically means a system admin would create them inside the target AS using some kind of GUI or admin console. In this setup, you as developer would throw a WAR with "unresolved dependencies" over the wall, and a system admin (or "deployer") would then spend days figuring out what those unresolved dependencies should be.
If the communication between developers and deployers is relatively good, the WAR or EAR might be thrown over the wall together with a readme file that at least gives some insight into which resources are needed. Depending on the organization, the development team might not get any access to, or feedback about, how those "unresolved dependencies" have been resolved. E.g. a data source with a maximum of 5 connections may have been created, but this may be insufficient if some code does, say, 10 parallel queries. Without the development team knowing the exact data source configuration, some classes of runtime problems and performance issues may be relatively hard to solve.
To mitigate these problems, some vendors, for some artifacts, allow the developer to create those "unresolved dependencies" themselves using proprietary deployment descriptors which are then embedded in the WAR or EAR. For simple local JMS destinations this is in most cases the end of it, but for data sources there is a little more to it. Namely, there has to be a mechanism to switch between data sources for different stages such as Dev, Beta, QA, Production etc. Additionally, it's rarely a good idea to have production passwords in the source code.
If you have a simple app that you want to try out locally, stages and production passwords are not a concern. If you deploy for a (large) company it is.
In Java EE 6 you can define a data source using a standard descriptor (web.xml, ejb-jar.xml or application.xml), and in Java EE 7 you can do the same for JMS destinations. There is no standard way to configure those based on stage, but there is a glimmer of hope that Java EE 8 will address this (see e.g. JAVAEE_SPEC-19). Vendors are not universally happy with those standardized methods, and their main documentation will almost always extensively tell you how to do those things using their proprietary tools and descriptors, and, if you're lucky, mention in a small note that there's a standardized way (and then sometimes downplay it or scare you by saying it's not recommended for production).
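As an illustration of the standardized approach, Java EE 6 also provides an annotation counterpart to those descriptor entries, @DataSourceDefinition, which can be placed on any managed component. This is only a sketch: all values are placeholders and the MySQL driver class name is an assumption.

    // Sketch of the Java EE 6+ standardized way to declare a data source from
    // inside the application (annotation counterpart of the <data-source>
    // element in web.xml / ejb-jar.xml / application.xml). Placeholder values.
    import javax.annotation.sql.DataSourceDefinition;
    import javax.servlet.annotation.WebServlet;
    import javax.servlet.http.HttpServlet;

    @DataSourceDefinition(
        name = "java:app/jdbc/appDS",
        className = "com.mysql.cj.jdbc.MysqlDataSource", // driver class is an assumption
        serverName = "localhost",
        portNumber = 3306,
        databaseName = "appdb",
        user = "app_user",
        password = "change-me" // as noted above, keep real passwords out of source
    )
    @WebServlet(urlPatterns = "/internal/datasource-holder") // any managed component can carry the annotation
    public class DataSourceHolder extends HttpServlet {
        // No request handling needed: the container registers the data source
        // when the WAR is deployed.
    }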
4.
See the answer to 3, mostly. One option to solve the problem of how to switch between stages and keep production passwords out of the WAR/EAR is to have the full definition of said data source inside the AS (inside JBoss in your case). Every AS installation is tied to a specific server in this setup. If data sources need to be updated or removed, or new ones added, you have to communicate with your operations team (if any). As said, depending on your organization this can be anything between trivial and practically impossible.
5.
When developing, you most often use your IDE to do a deployment. For production you would never do that. For production you may build with Ant (or Maven) and deploy via something like Jenkins, or e.g. Chef.
Check here: .war vs .ear file
If you read the preceding response, you'd guess that "WAR" it is.
Deployment descriptors are needed to manage the modules of JBoss; if you don't have any conflicts or don't need any tweaking, you won't need any deployment descriptors.
You may need to play with some JBoss files if you want to add modules to JBoss, or configure datasources, etc. Read the JBoss documentation for more info.
You can deploy from Eclipse during your development phase, but as your other environments (qualification, production, test, etc.) should be separated from your development one and won't have Eclipse installed on them, you should get used to managing your server from the command line and dropping your WARs in the right directories.
It's a short answer, but I hope it will help.
Read JBoss documentation for more info.

WebSphere classloader delegation mode

We are using the WebSphere 6.1 application server with the default classloader delegation mode, i.e. PARENT-FIRST. We are thinking about changing it to PARENT-LAST to be able to choose our JSF implementation or our web services stack.
As PARENT-FIRST is the default, I wonder how many people have switched to PARENT-LAST, what the reason to switch was, and whether your life became better since you switched :)
We have a lot of applications in production, so I cannot just switch to see what happens; if we do it, we will have a lot of testing to do, so I'd like to get some feedback from those of you who have switched to PARENT-LAST.
Thanks
On the projects I'm assigned to, we actually do switch to PARENT-LAST for most of our applications. The reason is usually an app-specific implementation of something, or the need for an app-specific property bundle that WebSphere uses too (overriding WebSphere's setup of commons-logging, for example).
If something breaks after the switch, it is usually because of a somewhat incorrect part of the application's setup that suddenly starts being used (whereas before the switch it was overridden by WebSphere's resources).
Portlet applications (deployed on WebSphere Portal Server) always switch their configuration to PARENT-LAST. In my experience it is always better to switch to PARENT-LAST, especially if you are using commons-logging. This is because WebSphere includes a truckload of stuff in its own classloaders, often in different versions/configurations from the ones you want to use.
If you do it, I would recommend scripting the deployment of the application, because this setting can be one of those things that get missed when you do a deployment.
