I have two applications, each using a different Spring application context configuration, running on the same JVM. Every time I try to run both of them together, the configuration of the last one always overrides the previous one, so the Spring context is loaded with only the last configuration. Any advice on how to overcome this, so that each application runs with its own configuration without being affected by the other's Spring context?
I am trying to improve performance of medium tests in Spring Boot.
I am using the Spring Boot - testcontainers library.
For an individual test this works really well; with a few annotations I can get access to kafka, zookeeper, and schema-registry. These are full services, so it takes a few seconds to start everything up; altogether, setup takes about 40 seconds. The test accurately recreates a realistic deployment, and it's beautifully simple.
This would be fine if it just happened once, but it happens every time a Spring context is created. That means every test that uses @MockBean incurs that 40-second cost.
I've tried refactoring into a single TestConfiguration class and referencing that. I've looked into using @ContextHierarchy, but I think that means I'll lose all of the Spring Boot niceties and I'll need to recreate the context (which means it won't look exactly like the context created by the production app).
Is there a better way to do this?
The Spring Framework already takes care of this scenario.
There is a concept of caching the application context for a test class or classes.
See the documentation.
A few lines from the documentation:
The Spring TestContext framework stores application contexts in a static cache. This means that the context is literally stored in a static variable. In other words, if tests run in separate processes, the static cache is cleared between each test execution, which effectively disables the caching mechanism.
So essentially you need to structure your code or context configuration in such a way that your desired test cases share the cached context.
But use this capability wisely; if not thought through properly, it can lead to undesired side effects.
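For example, one way to get most tests onto a single cached context is a shared abstract base class that carries all the context-shaping annotations, so every test resolves to the same context key. A minimal sketch, assuming JUnit 5 and a hypothetical AppTestConfig class that would hold the Testcontainers setup:

    import org.junit.jupiter.api.Test;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.boot.test.context.TestConfiguration;
    import org.springframework.context.annotation.Import;
    import org.springframework.test.context.ActiveProfiles;

    // Hypothetical shared configuration: the container/stub setup would live here,
    // declared once so every test sees an identical configuration.
    @TestConfiguration
    class AppTestConfig {
    }

    // Shared base class: every test extending it has exactly the same context
    // configuration, so the Spring TestContext framework reuses one cached context.
    @SpringBootTest
    @Import(AppTestConfig.class)
    @ActiveProfiles("integration-test")
    public abstract class AbstractIntegrationTest {
    }

    // Concrete tests add no context-changing annotations of their own.
    // Note that @MockBean contributes to the context cache key and forces a new
    // context, so mocks are best declared once in the base class (or avoided).
    class OrderFlowTest extends AbstractIntegrationTest {

        @Test
        void runsAgainstTheSharedContext() {
            // exercise the application here
        }
    }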
We have 40+ Spring Boot apps, and when we try to start all of them in parallel, it takes about 9 to 10 minutes. We also notice that CPU usage stays at 100% throughout this entire duration.
After all the apps come up successfully and register with Eureka, CPU usage is back to normal (on average ~30-40% after startup).
Each Spring Boot app seems to take at least 15-20 seconds to start up, which we are not happy with since the applications are relatively small to begin with.
We also disabled Spring Boot auto-configuration to make sure only the required "matching" classes are loaded at startup. We only gained about 1 or 2 seconds at startup from this change.
We seem to have enough system resources, with an 8-core CPU and 32 GB of memory on this VM.
Spring boot version is 1.3.6.RELEASE.
Is it something to do with Spring Boot? Even starting a single Spring Boot app spikes CPU usage to 70-80%. Your help is very much appreciated!
This has more to do with how many beans and auto-configurations get executed while the application is starting.
Even for a simple web application with JPA, there is a web container with its thread pools, DataSource initialization, and many more supporting beans and auto-configurations that need to be initialized. These are resource-intensive actions, and they are all rushed at the start of the application to get it booted as soon as possible.
Given that you are starting 40+ apps like these simultaneously, the server will have to pay the toll.
There are ways you can improve the application boot time.
Remove unnecessary modules and bean definitions from your application. The most common mistake a developer makes is to include spring-boot-starter-web when the application doesn't even need a web environment. The same goes for other starter modules.
Make use of conditional bean definitions with @ConditionalOnMissingBean, @ConditionalOnProperty, @ConditionalOnClass, @ConditionalOnBean, @ConditionalOnMissingClass, and @ConditionalOnExpression. This might backfire if you make Spring check for beans with lots of conditions (see the sketch after this list).
Make use of Spring profiles. If you don't want a specific set of beans to be part of a running instance, you can group them into a profile and enable or disable them as needed.
Configure the initial number of threads the web container can have. The same goes for DataSources: initialize your pool with only the required number of connections.
Use lazy initialization for beans by annotating your classes or beans with @Lazy. This annotation can be applied per bean or to an entire @Configuration class.
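A rough sketch of what the conditional and lazy definitions mentioned above can look like (the bean names, property name, and classes are purely illustrative, not taken from the question):

    import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.context.annotation.Lazy;

    @Configuration
    public class ReportingConfig {

        // Created on first use rather than at startup.
        @Bean
        @Lazy
        public ReportGenerator reportGenerator() {
            return new ReportGenerator();
        }

        // Registered only when reporting.enabled=true; instances that don't
        // need reporting skip this bean (and its startup cost) entirely.
        @Bean
        @ConditionalOnProperty(name = "reporting.enabled", havingValue = "true")
        public ReportExporter reportExporter() {
            return new ReportExporter();
        }

        // Minimal placeholder types so the sketch is self-contained.
        public static class ReportGenerator { }
        public static class ReportExporter { }
    }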
If that doesn't satisfy your needs, you can always throttle the CPU usage per process with commands like nice or cputools.
Here is an article about cputools.
I have one main Spring Boot application and two dependent Spring Boot applications. The dependent apps load some data (settings, and so on) from the main app on startup. An external service may change this data in the main app, and the dependent apps should then reload it. Is there a library or framework for this?
There are 3 solutions:
Always retrieve the data on every request. I don't think this is your case, since you load the data at startup precisely to avoid this.
Refresh the initial data after some amount of time (cron job).
The main application acts as a micro-service orchestrator and informs all its dependent applications that their cache may be invalid. Every dependent application answers this request by refreshing its cache.
For solution number 2, take a look at the Spring @Scheduled annotation. For example, you can refresh the data every 5 minutes with @Scheduled(fixedRate = 300000).
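A minimal sketch of that scheduled refresh in a dependent application, with a hypothetical SettingsClient standing in for the call back to the main app:

    import org.springframework.scheduling.annotation.EnableScheduling;
    import org.springframework.scheduling.annotation.Scheduled;
    import org.springframework.stereotype.Component;

    // Hypothetical component in the dependent application.
    // @EnableScheduling is usually placed on a @Configuration class; it is shown
    // here only to keep the sketch in one place.
    @Component
    @EnableScheduling
    public class SettingsRefresher {

        private final SettingsClient settingsClient;

        public SettingsRefresher(SettingsClient settingsClient) {
            this.settingsClient = settingsClient;
        }

        // Re-fetch the settings from the main application every 5 minutes (300000 ms).
        @Scheduled(fixedRate = 300000)
        public void refreshSettings() {
            settingsClient.reload();
        }

        // Hypothetical abstraction over the call to the main application.
        public interface SettingsClient {
            void reload();
        }
    }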
Just looking for some information on whether others have solved this pattern. I want to use Spring Integration and Spring Batch together. Both are Spring Boot applications, and ideally I'd like to keep them and their respective configurations separated, so they are each their own executable jar. I'm having problems executing them in their own process space, and I believe I want, unless someone can convince me otherwise, each to run as its own Spring Boot app and initialize itself with its own profiles and properties. What I'm having trouble with is the invocation of the job in my Spring Batch project from my Spring Integration project. At first I couldn't get the properties loaded from the batch project, so I realized I need to pass spring.profiles.active as a job parameter, and that seemed to solve that. But there are other things in the Spring Boot Batch application that aren't loading correctly, like the schema-platform.sql file; the database isn't getting initialized, etc.
On this initial launch of the job I might want the response to go back to Spring Integration for some messaging on job status. There might be times when I want to run a job without Spring Integration kicking it off, but still take advantage of sending statuses back to the Spring Integration project, provided it's listening on a channel or something.
I've reviewed quite a few Spring samples and have yet to find my exact scenario; most have the two dependencies in the same project, so maybe I'm doing something that's not possible, but I'm sure I'm just missing a little something in the Spring configuration.
My questions/issues are:
I don't want the Spring Integration project to know anything about the Spring Batch configuration other than the job it's kicking off. I haven't found a good way to reference the Job bean without loading my entire batch configuration.
Should I keep these two projects separated, or would it be better to combine them since I have two-way communication between both?
How should the job be launched from the integration project? We're using the spring-batch-integration project with JobLaunchRequest and JobLauncher. This seems to run it in the same process as the Spring Integration project, and I'm missing a lot of my Spring Boot Batch project's initialization (see the sketch after this list).
Should I be using a CommandLineRunner instead to force it into another process?
Is SpringApplication.run(BatchConfiguration.class) the answer?
Looking for some general project configuration setup to meet these requirements.
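For reference on the JobLaunchRequest point above, the spring-batch-integration wiring typically looks something like the sketch below; note that this launches the job inside the caller's JVM, which is exactly the in-process behavior described in the question (the channel names, job parameter, and Job bean are illustrative assumptions):

    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.JobParametersBuilder;
    import org.springframework.batch.integration.launch.JobLaunchRequest;
    import org.springframework.integration.annotation.Transformer;
    import org.springframework.messaging.Message;
    import org.springframework.stereotype.Component;

    // Hypothetical transformer on the Spring Integration side: it only needs a
    // reference to the Job bean and turns an incoming message into a
    // JobLaunchRequest that a JobLaunchingGateway downstream can execute.
    @Component
    public class JobLaunchRequestTransformer {

        private final Job importJob;  // the single Job bean exposed by the batch project

        public JobLaunchRequestTransformer(Job importJob) {
            this.importJob = importJob;
        }

        @Transformer(inputChannel = "jobRequests", outputChannel = "jobLaunches")
        public JobLaunchRequest toRequest(Message<String> message) {
            JobParametersBuilder params = new JobParametersBuilder()
                .addString("fileName", message.getPayload())
                .addLong("timestamp", System.currentTimeMillis());
            return new JobLaunchRequest(importJob, params.toJobParameters());
        }
    }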
Spring Cloud Data Flow in combination with Spring Cloud Task does exactly what you're asking. It launches Spring Cloud Task applications (which can contain batch jobs) as new processes on the platform you choose. I'd encourage you to check out that project here: http://cloud.spring.io/spring-cloud-dataflow/
We are using cucumber-jvm to write an integration test layer in our application. One of the challenges we are finding is managing the database between the tests and the web application.
A typical scenario is that we want to persist some entities in a Given step of a scenario, then perform some actions on the user interface that may, in turn, persist more entities. At the end, we want to clean the database. Because the cucumber-jvm tests run in one JVM and the web application runs in another JVM, we cannot share a transaction (at least not in a way of which I am aware), so the database must be cleaned manually.
My initial thought was to use an Embedded Tomcat server running off of an embedded in-memory database (HSQLDB) in the same JVM as the cucumber-jvm test. This way we might be able to share a single spring container, and by extension a single transaction, from which all objects could be retrieved.
During my initial tests it looks like Spring gets loaded and configured twice: once when the test starts and the cucumber.xml is read, and a second time when the embedded tomcat starts and the web application reads its applicationContext.xml. These appear to be in two completely separate containers because if I try to resolve an object in one container that is specified in the other container then it doesn't resolve. If I duplicate my configuration then I get errors about duplicate beans with the same id.
Is there a way that I can tell Spring to use the same container for both my test application and the embedded tomcat?
I'm using Spring 3.2.2.GA and Embedded Tomcat 7.0.39 (latest versions of both libraries).
Am I crazy? Do I need to provide more technical details? Apologies if I use some incorrect terminology.
Thanks
p.s. If my problem seems familiar to you and you can suggest an alternative solution to the one I am trying, please let me know!
Jeff,
It is normal that Spring is loaded twice. There are two places where Spring contexts are created:
In the servlet container listener org.springframework.web.context.ContextLoaderListener that is configured in web.xml. This one reads its configuration from the file set by the contextConfigLocation context-param.
In the implementation of ObjectFactory provided by the cucumber-spring plugin, cucumber.runtime.java.spring.SpringFactory. This one reads its configuration from cucumber.xml.
The two Spring contexts are totally different, and their instances are kept in two different places: as a servlet context attribute for the former, and by the JavaBackend for the latter.
When starting the embedded Tomcat, it is possible to get access to the servlet context and thus set the Spring context used by Tomcat to the one from cucumber ourselves. But Spring has a special class called WebApplicationContext for contexts used in a servlet container, while the cucumber SpringFactory creates its context through ClassPathXmlApplicationContext. So unless there is a way to specify the type of application context from the XML config, we will have to provide an ObjectFactory that produces a WebApplicationContext.
What we can do is have two web.xml files: one for normal use and one for the tests. For the tests, we use our own version of the ContextLoaderListener.
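One way to wire that test-only listener is sketched below; it parents the web application context onto the shared test context via loadParentContext rather than literally swapping the instance, and SharedContextHolder is a hypothetical static holder that the cucumber side would populate when its context starts:

    import javax.servlet.ServletContext;
    import org.springframework.context.ApplicationContext;
    import org.springframework.web.context.ContextLoaderListener;

    // Registered in the test web.xml in place of the standard ContextLoaderListener.
    // The web application context created by Tomcat gets the test context as its
    // parent, so beans defined on the cucumber side resolve inside the web app.
    public class TestContextLoaderListener extends ContextLoaderListener {

        @Override
        protected ApplicationContext loadParentContext(ServletContext servletContext) {
            return SharedContextHolder.get();
        }
    }

    // Hypothetical holder; the cucumber ObjectFactory (or a glue class) would call
    // SharedContextHolder.set(context) once the test context has been created.
    class SharedContextHolder {
        private static ApplicationContext context;
        static void set(ApplicationContext ctx) { context = ctx; }
        static ApplicationContext get() { return context; }
    }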