I am planning to deploy multiple RESTful services (microservices) in AWS. What AWS components are essential for deploying them, and how can the deployment be scripted? I am weighing the two approaches below to see which suits best.
Approach 1:
Create the web service as a Spring Boot application packaged as a jar and deploy it in AWS.
Approach 2:
Create the web service as a Spring Boot application packaged as a war and deploy it to an application server in AWS.
The requirement is part of developing an enterprise application in AWS.
You will probably have to Dockerize your microservices. Spring Boot ships with an embedded Tomcat server by default; you can either use it or switch to Undertow.
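For example, a minimal Dockerfile for a Spring Boot fat jar might look like the sketch below (the base image and jar name are illustrative placeholders, not anything mandated by Spring Boot):

    # Minimal sketch: image for a Spring Boot fat jar with its embedded server.
    # eclipse-temurin:17-jre and target/my-service.jar are placeholders.
    FROM eclipse-temurin:17-jre
    WORKDIR /app
    COPY target/my-service.jar app.jar
    EXPOSE 8080
    ENTRYPOINT ["java", "-jar", "app.jar"]

Build it with docker build -t my-service . and run it with docker run -p 8080:8080 my-service.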
Once the microservices are Dockerized, you can use Docker Hub, Docker Trusted Registry, or any other tool of your choice to store the microservice images.
The images can then be deployed to different virtual machines in a cloud environment (AWS/Azure/any other) using orchestration tools such as DC/OS, Docker Datacenter, Rancher, etc. If you still plan to use your microservices the way monolithic applications work, it will defeat the whole purpose of microservices.
There are plenty of tools available to automate continuous deployment. The right place to start is to try out different tools, analyze their pros and cons, and come to the conclusion that fits you best.
Please refer below for a head start:
http://www.slideshare.net/mongodb/webinar-enabling-microservices-with-containers-orchestration-and-mongodb
I have several Spring Boot applications that I deploy as uber JAR files. They are deployed to a VM running Linux. Given that:
All required library dependencies (Tomcat, etc.) are self-contained in the uber JAR.
Some applications need to access files on the OS.
I have complete control over the Linux machine and OS.
All the apps are Spring Boot applications (with some being web applications).
I currently run these applications using a process manager, Supervisor.
What benefits are there to running each application in a container such as a Docker container?
The point of running applications in a Docker container is that you build the Docker image once and run it anywhere: on any cloud, or on any machine.
If you plan to run your application in only one single on-premises environment, you might not need to care about Docker containers. But as soon as you need to share your application with other people who will run it on their own machines, Docker containers are your best choice.
You will never hear "it works on my machine" again from people who share Docker images. If it works on your machine, it works anywhere.
But make sure you move all environment-specific parameters into environment variables. For example, paths to local files on your OS must be configurable through these environment variables and documented for your Docker image.
To summarize: if you run applications in Docker containers, they will run anywhere, provided the user has configured the environment variables correctly.
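For instance, a Spring Boot bean might resolve such a path as in the minimal sketch below (DATA_DIR and the /tmp/data fallback are illustrative names, not Spring conventions):

    import java.nio.file.Path;
    import java.nio.file.Paths;

    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.stereotype.Component;

    // Minimal sketch: an OS-specific file location supplied from the
    // environment. DATA_DIR and /tmp/data are illustrative placeholders.
    @Component
    public class DataLocation {

        // Spring resolves ${DATA_DIR} from the environment; /tmp/data is the fallback.
        @Value("${DATA_DIR:/tmp/data}")
        private String dataDir;

        public Path dataFile(String name) {
            return Paths.get(dataDir, name);
        }
    }

The same image can then be started with, say, docker run -e DATA_DIR=/srv/data ... in each environment.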
Compare that with the other cases. For example, if you distribute your application as a jar file, it will not run on a standalone Tomcat server; you would have to package it as a war file (and, with Spring Boot, provide a servlet initializer in place of the classic web.xml descriptor).
On the other hand, if you distribute your application as a war file, it will not run as a plain executable in your JVM environment. You would have to change pom.xml to switch the packaging of your application back to jar.
In both of these examples, you have to adapt the sources (pom.xml or the servlet initializer/descriptor) to run the application in the corresponding environment.
So if I want to run my Spring Boot application locally, I package it as a jar file and run it as a Spring Boot application with the java command.
But when I want to run it on a standalone Tomcat server installed on an Amazon EC2 instance, for example, I would have to change pom.xml to package it as a war file and add the servlet initializer.
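For Spring Boot, the war route means setting <packaging>war</packaging> in pom.xml and extending SpringBootServletInitializer; a minimal sketch (the class name is assumed):

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.boot.builder.SpringApplicationBuilder;
    import org.springframework.boot.web.servlet.support.SpringBootServletInitializer;

    // Minimal sketch: an application class that runs both ways.
    // As an executable jar the JVM calls main(); deployed as a war,
    // the servlet container calls configure() instead.
    @SpringBootApplication
    public class Application extends SpringBootServletInitializer {

        @Override
        protected SpringApplicationBuilder configure(SpringApplicationBuilder builder) {
            return builder.sources(Application.class);
        }

        public static void main(String[] args) {
            SpringApplication.run(Application.class, args);
        }
    }

(The import path shown is the Spring Boot 2.x one; older versions keep the class in a different package.)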
With Docker containers you never have to change any source code to run the application anywhere else: all environment-specific configuration is exposed as environment variables.
For more details, see "Store config in the environment" in the Twelve-Factor App manifesto:
https://12factor.net/
Docker containers actually fit all twelve principles of the Twelve-Factor App manifesto, which is another reason to use them.
Can someone share their experience of converting an existing Spring application to Quarkus?
The existing Spring application has dependencies on 1) Swagger, 2) the Oracle JDBC driver, 3) a logging framework, and 4) Spring autowiring.
It depends on whether you want to generate a native executable or not.
In JVM mode:
we have a Swagger extension based on OpenAPI that gets you the Swagger UI in dev mode (https://quarkus.io/guides/openapi-swaggerui-guide). If you want to use Swagger itself, well, you should be able to include it without any issue.
the Oracle JDBC driver should work out of the box
logging wouldn't be an issue: Quarkus comes with JBoss Logging, which has several adapters for other frameworks.
we have a Spring compatibility extension for autowiring that translates the Spring annotations to CDI (see the sketch below): https://quarkus.io/guides/spring-di-guide
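As a rough illustration of what the spring-di extension supports (both class names are invented for the example):

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Service;

    // Minimal sketch: with the quarkus-spring-di extension, Spring's
    // @Service and @Autowired are translated to CDI scopes and injection
    // points at build time. Both class names are illustrative.
    @Service
    public class GreetingService {

        @Autowired
        GreetingConfig config;

        public String greet(String name) {
            return config.prefix() + name;
        }
    }

    @Service
    class GreetingConfig {
        String prefix() {
            return "Hello, ";
        }
    }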
Then there's the GraalVM native executable mode, and here comes the bad news: I don't think the Oracle JDBC driver will work for now. We don't have an extension for it, and I'm pretty sure it won't work out of the box yet.
But Quarkus has benefits even in JVM mode, so it would be worth it anyway.
If you start this journey, we're interested in feedback, either on the mailing list or in GitHub issues.
Which is better in a production environment: an individual embedded Tomcat for each app, or one Tomcat for many apps?
The main idea of microservices is to build services that are developed, deployed, and scaled independently. If you deploy multiple microservices inside the same container/JVM, you may not be able to leverage all of these benefits, and your CI/CD and integration testing may become harder. Prefer embedded servers, or container technologies like Docker, which ensure complete isolation of the microservices.
The decision also depends on your deployment environment: if it is the cloud, going with Docker would be a good idea.
I have a legacy financial application written using EJB 2.1 (making use of entity beans, stateless session beans, and a couple of MDBs).
I want to migrate the application to the latest Java EE or to the Spring framework. The application consists of around 400 entities, and the entity beans are mainly used for creating and updating.
For the viewing part there is a separate DAO layer, and I don't want to touch that part. I also want to keep the business logic written in the session beans, as it is too complex to rewrite.
In other words, I simply want to replace the ORM part of the system. The application makes use of JTA transactions.
Sorry to ask a very high-level question, but which technology can I use to replace the ORM?
Spring/Hibernate
Java EE
The primary considerations for the application are scalability, performance, and ease of deployment.
I just want opinions from people who have used these technologies; I don't want to start a war between 'evangelists'.
If you find this input insufficient, please ask and I can provide more details.
The argument here is really EJB 3.x versus Spring/Hibernate, the first caveat being that one does not necessarily exclude the other (annotations, container, testing, etc.).
There's lots of support for migrating from EJB 2.1 to EJB 3.x, and lots of toolsets to assist. One of the principal challenges I've seen with EJB is integration testing outside the container (for example, in a continuous integration environment). There are JTA solutions, JNDI solutions, and others to support this, but on the whole I've found more out-of-container testing support on the Spring migration path than on the Java EE one. That said, there are frameworks such as Arquillian from JBoss designed to support exactly this.
So I would suggest you look at EJB 2.1 to EJB 3 migration paths, and then look at the Arquillian framework for integration testing support.
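Either way, replacing the ORM mostly means re-mapping the entity beans as JPA entities, which look the same under Java EE and Spring/Hibernate and participate in JTA transactions on both paths. A minimal sketch of what one entity might become (the Account name and columns are invented for illustration):

    import java.math.BigDecimal;

    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.Id;
    import javax.persistence.Table;

    // Minimal sketch: an EJB 2.1 entity bean re-mapped as a JPA entity.
    // The Account table and its columns are invented for this example.
    @Entity
    @Table(name = "ACCOUNT")
    public class Account {

        @Id
        @Column(name = "ACCOUNT_ID")
        private Long id;

        @Column(name = "BALANCE")
        private BigDecimal balance;

        protected Account() { } // no-arg constructor required by JPA

        public Long getId() { return id; }
        public BigDecimal getBalance() { return balance; }
        public void setBalance(BigDecimal balance) { this.balance = balance; }
    }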
I am new to programming with OSGi. Can anyone provide me with a working example of a client/server OSGi service invocation?
I have been trying to achieve this for the last 2 weeks without any success.
My service is being discovered and executed by an Eclipse instance on the same machine, but when I try the same thing from another machine it fails.
Any help will be appreciated.
Thanks.
In the OSGi platform (Release 4 Version 4.1), services discovered through the OSGi service registry are local services, available only inside a single OSGi framework instance (i.e. a single JVM). You cannot expect to execute an OSGi service running on a different machine.
If you want to call OSGi services across multiple framework instances (i.e. multiple JVMs / multiple machines), you should take a look at the Distributed OSGi specification (RFC 119), which will be part of the upcoming OSGi specification (Release 4 Version 4.2), with CXF as a reference implementation.
Update: Another way to call remote OSGi services is to use R-OSGi. It is a middleware that provides an almost transparent way to access services on remote OSGi platforms.
OSGi services are intra-VM, not inter-VM, unless you add distribution on top.
You might want to look at Brian's tutorial, which does a good job of showing how OSGi services can be exported, using ECF to perform the remote distribution. There are quite a few bundles involved, but he explains them well.
Unless you are playing with either CXF's or Eclipse's Distributed OSGi implementations, there's nothing related to remoting in OSGi. You should be able to make any remoting implementation work between 2 OSGi-based processes.
What I will say is that you will probably hit class loader issues if you try to use RMI or any of the RPC patterns available in Spring remoting. This is solvable, but it requires a good understanding of OSGi and class loaders.
Does your code work if you run it outside of OSGi? Are you using a firewall? Can you run any network-based service on your PC that is visible to other PCs on the network?
As described, the problem looks more network-related than OSGi-related.
Also, you didn't mention what failure you get when running across different PCs.
The Riena platform from the Eclipse Foundation provides OSGi remote services by publishing the services as web service endpoints.
Maybe the answers above should be updated, since they are no longer valid.
OSGi Remote Services are now available; you can read about them in the OSGi Enterprise Specification, Chapter 100. There are two main implementations: Eclipse ECF and Apache CXF. There is a good example for ECF here.
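As a rough sketch, exporting a service with Remote Services comes down to registering it with the standard service.exported.* properties; the consuming framework then discovers it like a normal service. The interface and implementation names below are illustrative, and "ecf.generic.server" assumes ECF's generic distribution provider:

    import java.util.Dictionary;
    import java.util.Hashtable;

    import org.osgi.framework.BundleActivator;
    import org.osgi.framework.BundleContext;

    // Minimal sketch: exporting a service via OSGi Remote Services
    // (OSGi Enterprise Specification, Chapter 100). GreetingService and
    // GreetingServiceImpl are illustrative; "ecf.generic.server" assumes
    // the ECF generic distribution provider.
    public class Activator implements BundleActivator {

        public void start(BundleContext context) {
            Dictionary<String, Object> props = new Hashtable<>();
            props.put("service.exported.interfaces", "*");
            props.put("service.exported.configs", "ecf.generic.server");
            context.registerService(GreetingService.class, new GreetingServiceImpl(), props);
        }

        public void stop(BundleContext context) {
            // the framework unregisters the service when the bundle stops
        }
    }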