Spring Cloud Task - Support Multiple Application Contexts - spring-cloud-task

It appears that the Spring Cloud Task lifecycle is incorrectly managed when a Spring Boot application has hierarchical application contexts.
When I add the @EnableTask annotation to the parent ApplicationContext, it registers the task but records execution time from the parent context, failing to record an accurate execution time and exit code (always success, as the parent context closes successfully).
On the other hand, if I add the annotation to the child context (which actually runs the CommandLineRunner), it fails to start the task at all with the exception below:
o.s.c.t.listener.TaskLifecycleListener : [] [] An event to end a task has been received for a task that has not yet started.
s.c.a.AnnotationConfigApplicationContext : [] [] Exception encountered during context initialization - cancelling refresh attempt: org.springframework.context.ApplicationContextException: Failed to start bean 'taskLifecycleListener'; nested exception is java.lang.IllegalArgumentException: Invalid TaskExecution, ID 132515 not found
Looking at the TaskLifecycleListener source, it appears that it reacts to ApplicationEvents from the parent context and catches the parent's ApplicationReadyEvent before the task is started.
Spring Boot 2.2.6 / Spring Cloud Task 2.2.2
Any thoughts?

This is by design. The "task" is registered as a single execution of a Boot application, not the execution of an ApplicationContext. In a microservices world, you would want to break your tasks up into independent artifacts and run them as independent Spring Boot applications. Spring Cloud Task does not support multiple "task" executions within a single Spring Boot application, as that goes against its intent. If you feel this is something that should be added, feel free to open an issue on GitHub where we can explore your use case more deeply.

Actually, I realized what happens is that Spring auto-configures the task in my main parent context, which closes the task. The same appears to happen in my child context. Since the executionId is passed as a parameter, it has already been marked as executed by the parent, so it fails in my child context.
The solution I found was to exclude the SimpleTaskAutoConfiguration auto-configuration from my parent context:
@SpringBootApplication(exclude = {SimpleTaskAutoConfiguration.class})
Instead, I manually import the configuration class in my child context, which executes the task:
@EnableTask
@Import(SimpleTaskAutoConfiguration.class)
That now tracks the actual time it took to execute the CommandLineRunner as a task within the child context. Although it doesn't account for the time of running the whole app, it at least reflects a more accurate picture.
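For illustration, a minimal sketch of that arrangement, assuming the parent/child contexts are built with SpringApplicationBuilder (class names here are made up):

import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.cloud.task.configuration.SimpleTaskAutoConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;

// Parent context: task auto-configuration excluded so it does not record the execution.
@SpringBootApplication(exclude = {SimpleTaskAutoConfiguration.class})
public class ParentConfig {
    public static void main(String[] args) {
        new SpringApplicationBuilder(ParentConfig.class)
                .child(ChildTaskConfig.class)
                .run(args);
    }
}

// Child context: the task is configured here, so the recorded execution wraps the runner below.
@Configuration
@EnableTask
@Import(SimpleTaskAutoConfiguration.class)
class ChildTaskConfig {

    @Bean
    public CommandLineRunner taskRunner() {
        return args -> {
            // actual task work goes here
        };
    }
}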

Related

Running scheduler in Spring boot is spawning a process external to Spring boot application context

I am scheduling a task that runs at a fixed rate in Spring Boot. The function I am using to schedule the task is below:
private void scheduleTask(Store store, int frequency) {
    final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
    Runnable task = store::scan;
    scheduler.scheduleAtFixedRate(task, 0, frequency, TimeUnit.MILLISECONDS);
}
This works fine, but if there is an exception at application startup, the application should exit. What happens instead is that I get the exception in the log and the message "Application Failed to start", but the scheduler appears to keep running; it looks like only the scheduled thread is still alive.
Any hints on how to properly schedule an asynchronous task in a Spring Boot application? I tried the @Scheduled annotation but it does not run at all.
@Scheduled should work. Have you added the @EnableScheduling annotation to a @Configuration class or to the @SpringBootApplication class? The Scheduling Getting Started guide explains it in detail.
Regarding the scheduleTask method: what calls it? Is it started outside the Spring context? If so, Spring won't stop it; you have to take care of its lifecycle yourself.
You should try to use @Scheduled, as it manages the thread pools/executors for you and most people will find it easier to understand.
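As a rough sketch of the annotation-based approach (the property name for the scan frequency is made up for illustration, and Store is assumed to be a Spring bean):

import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;

@Configuration
@EnableScheduling
public class StoreScanScheduler {

    private final Store store;

    public StoreScanScheduler(Store store) {
        this.store = store;
    }

    // Runs on a Spring-managed scheduler, so it stops when the application context shuts down.
    @Scheduled(fixedRateString = "${store.scan.frequency-ms}") // hypothetical property name
    public void scanStore() {
        store.scan();
    }
}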

Run Spring test cases concurrently not sequentially

My Spring Boot application also starts a gRPC service along with its REST (HTTP) service. I've written specific tests for gRPC and REST. When I run gradle test, these tests run sequentially; however, there is no reason they can't run in parallel.
What I'm shooting for here is a single instance of my Spring Boot application running while the tests are executed in parallel.
I've tried setting the test section in my Gradle file so it has 'forkCount', and I also tried setting options such that parallel="classes", but that produces an error about 'parallel' being an unknown property (maybe a JUnit 5 thing?):
test {
    options {
        parallel = "classes"
        // forkCount = 2
    }
}
The forkCount option is not what I'm looking for, since it will start multiple instances of the Spring application.
I've also tried removing the @RunWith from the test classes and making a separate test class (which has the @RunWith annotation) with the following method in it:
@Test
void testRunner() {
    JUnitCore.runClasses(ParallelComputer.classes(), GrpcTests.class, RestTests.class);
}
But the tests still appear to run sequentially.
I've tried several other things as well, sorry I don't have all of them handy.
Goal
Ideally what I'm hoping for is a single instance of my Spring Boot app running while the test classes run in parallel (bonus kudos if I can get the methods to run in parallel too)
Java Version: "1.8.0_171"
Spring Boot Version: 2.0.4.RELEASE
Per the recommendation I tried adding
@Test
public void contextLoads() throws Exception {
}
and adding the 'maxParallelForks' entry in the Gradle file. I had already been using the @SpringBootTest annotation, but this behaved the same as when I used 'forkCount' in that at least two instances were started, as can be seen from the test shutdown log:
2019-04-25 10:24:17.245 LogLevel=INFO 53838 --- shutting down gRPC server since JVM is shutting down
...
2019-04-25 10:24:30.125 LogLevel=INFO 53839 --- shutting down gRPC server since JVM is shutting down
You can see I get two shutdown messages and the PIDs are shown (53838 & 53839).
You need to combine @SpringBootTest with maxParallelForks.
Annotate your unit tests with @SpringBootTest. @SpringBootTest will boot up a Spring Boot context that will be cached across all your tests.
"A nice feature of the Spring Test support is that the application context is cached in between tests, so if you have multiple methods in a test case, or multiple test cases with the same configuration, they only incur the cost of starting the application once"
See:
https://spring.io/guides/gs/testing-web/
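For example, a test class might look roughly like this (JUnit 4, as in the question; the class name is just an example):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
public class RestTests {

    @Test
    public void contextLoads() {
        // REST assertions go here; other test classes with the same configuration
        // reuse the cached application context within the same JVM fork
    }
}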
Add the following to your build.gradle to run multiple tests at the same time:
tasks.withType(Test) {
    maxParallelForks = 4 // your choice here
}
See https://guides.gradle.org/performance/#parallel_test_execution

What are advantages of using #ContextHierarchy over pure #ContextConfiguration

Hi, I don't understand what advantages using @ContextHierarchy like below gives me:
@ContextHierarchy({
    @ContextConfiguration("/test-db-setup-context.xml"),
    @ContextConfiguration("FirstTest-context.xml")
})
@RunWith(SpringJUnit4ClassRunner.class)
public class FirstTest {
    ...
}

@ContextHierarchy({
    @ContextConfiguration("/test-db-setup-context.xml"),
    @ContextConfiguration("SecondTest-context.xml")
})
@RunWith(SpringJUnit4ClassRunner.class)
public class SecondTest {
    ...
}
over using a single @ContextConfiguration with a locations argument, like below:
@ContextConfiguration(locations = {"classpath:test-db-setup-context.xml", "FirstTest-context.xml", "SecondTest-context.xml"})
In each case, application contexts are shared across different JUnit test classes.
The difference is that the contexts in the hierarchy remain separate: beans in the parent context cannot see beans in a child context (while a child can still reference beans in its parent), so you can isolate different parts of the item under test.
An important thing to note here is that in the case of @ContextHierarchy we get SEPARATE contexts that have SEPARATE life-cycles (initialization, shutdown). This is important because, for example, they can fail independently.
A practical example from my own experience: we have a Spring application that communicates with some external services. We wanted an E2E test that starts these dependent services and then runs the tests. So we added an initializer to our @ContextConfiguration:
@ContextConfiguration(classes = TheApp.class, initializers = DockerInitializer.class)
public class TheAppE2ETests {
    // ...
}
The initializer prepared the external services (starting Docker containers), customized the properties so that The App could run, and attached to the context-closed event so that the containers could be cleaned up. There was a problem with this approach when The App context failed to load (e.g. due to a bug):
After a failed initialization the ContextClosedEvent is not fired, so the containers were not stopped and cleaned up.
When the context fails to load, the initializer is called over and over for every test that is run (not only for every test class, but for every test method!).
So the tests kept killing our CI environment every time a bug in The App's context caused the initialization to fail. The containers for the dependent services were started for every single test method and then never cleaned up.
We ended up using @ContextHierarchy with two separate contexts, one for the Docker containers and one for The App itself. This way, in the situation described above, the containers are started in a separate context and therefore live their own lives and can even be shared across multiple Spring tests (thanks to Spring's context-caching mechanism).
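A rough sketch of that final arrangement, assuming the Docker bootstrap lives in its own (here hypothetical) DockerConfig class:

import org.junit.runner.RunWith;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.ContextHierarchy;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextHierarchy({
    // Parent context: only the Docker machinery; cached and reused across Spring tests.
    @ContextConfiguration(classes = DockerConfig.class, initializers = DockerInitializer.class),
    // Child context: The App itself; if it fails to load, the parent context (and its containers) is unaffected.
    @ContextConfiguration(classes = TheApp.class)
})
public class TheAppE2ETests {
    // ...
}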

Spring on WebSphere 8: Quartz job with web service call throws JAXBException "<class> is not known to this context"

I'm facing a JAXBException "<class> is not known to this context" when calling a web service from within a job controlled by Quartz on Spring:
javax.xml.ws.WebServiceException: javax.xml.bind.JAXBException: com.xxxx.yyyy.zzzz.ImageMetaData is not known to this context
at org.apache.axis2.jaxws.ExceptionFactory.createWebServiceException(ExceptionFactory.java:175)
at org.apache.axis2.jaxws.ExceptionFactory.makeWebServiceException(ExceptionFactory.java:70)
at org.apache.axis2.jaxws.ExceptionFactory.makeWebServiceException(ExceptionFactory.java:128)
at org.apache.axis2.jaxws.marshaller.impl.alt.DocLitWrappedMinimalMethodMarshaller.demarshalResponse(DocLitWrappedMinimalMethodMarshaller.java:624)
at org.apache.axis2.jaxws.client.proxy.JAXWSProxyHandler.createResponse(JAXWSProxyHandler.java:593)
at org.apache.axis2.jaxws.client.proxy.JAXWSProxyHandler.invokeSEIMethod(JAXWSProxyHandler.java:432)
at org.apache.axis2.jaxws.client.proxy.JAXWSProxyHandler.invoke(JAXWSProxyHandler.java:213)
at com.sun.proxy.$Proxy299.findAllImageMetaData(Unknown Source)
I have a Spring 3.2.4 Java EE application with JSF running on IBM WebSphere v8.
When calling a specific web service from the JSF part of the application (i.e. from an action or a service), everything's ok.
The exception occurs only when the call is done from within a Quartz/Spring triggered job.
Executing exactly the same job code from the action does not result in an exception.
I tried a lot of different things, like using a corresponding @XmlSeeAlso annotation in the JAXB-generated classes, but even using the annotation on the web service interface itself does not solve the issue.
I also updated the Spring and Quartz libraries to more recent versions but this didn't help.
Anyone any idea?
I've finally solved the issue.
After much analysis I encountered the following issue in the Spring framework:
https://jira.spring.io/i#browse/SPR-11125
When a job is triggered via Spring/Quartz on WebSphere, the wrong ContextClassLoader is set.
This may cause many different problems - among them is the JAXBException as described.
The Spring bug is still open, so as a workaround I had to replace the current thread's context class loader with the correct one:
ClassLoader cl = invoiceService.getClass().getClassLoader();
Thread.currentThread().setContextClassLoader(cl);
The correct class loader can simply be retrieved from any class that has been loaded by the container. Using this class loader as the context class loader for the current thread solved my issue.
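In the job itself, the swap can be wrapped in a try/finally so the original class loader is restored afterwards (a sketch; invoiceService stands for any container-loaded Spring bean, as in the snippet above):

public void runJob() {
    ClassLoader original = Thread.currentThread().getContextClassLoader();
    // Any class loaded by the container will do; a Spring-managed bean works well.
    ClassLoader containerCl = invoiceService.getClass().getClassLoader();
    Thread.currentThread().setContextClassLoader(containerCl);
    try {
        // ... call the web service here ...
    } finally {
        Thread.currentThread().setContextClassLoader(original);
    }
}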

How to refresh the Spring context when using CXF?

We have a web application that uses Spring (3.0.5) and CXF (currently 2.4.2 for various reasons but upgrading is an option if that makes any difference) and is deployed on Tomcat.
The application is initialized using the org.springframework.web.context.ContextLoaderListener.
Starting and shutting the application down works like a charm, but if I try to refresh the Spring application context using
((ConfigurableApplicationContext) applicationContext).refresh();
I run into problems. The application context first destroys all its beans (including CXFBusImpl, or rather its subclass SpringBus). SpringBus, however, calls close() on its application context, leading to a NullPointerException when the application context shortly afterwards tries to close its bean factory:
java.lang.NullPointerException
at org.springframework.context.support.AbstractRefreshableApplicationContext.closeBeanFactory(AbstractRefreshableApplicationContext.java:152)
at org.springframework.context.support.AbstractRefreshableApplicationContext.refreshBeanFactory(AbstractRefreshableApplicationContext.java:124)
at org.springframework.context.support.AbstractApplicationContext.obtainFreshBeanFactory(AbstractApplicationContext.java:467)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:397)
Is there anything I can do to avoid this (other than modifying CXF)? If I skip CXF everything works.
I don't think you can tell CXF not to work that way. What you could do, though, is isolate the parts of your application that need restarting into their own context that you build and tear down as you choose, without involving the main context too much. Perhaps you'd do that with a ClassPathXmlApplicationContext, though there are a few choices. I think you'll be setting the outer context as the parent of the inner one, and referring to outer beans with XML-config syntax like:
<ref parent="foo" />
You'll then need to create some way of proxying the activity with CXF in the outer context to the beans in the inner context. This is the tricky part, as it is usually considered bad form for references to go that way round. You'll probably have to have some kind of registry/proxy in the outer context that (relevant) inner beans connect to as part of their creation/init process (and deregister from at tear-down). You'll also have to decide how to handle the case where a request needs to be served when there is no inner context. Tricky, especially if you want to do it elegantly...
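A minimal sketch of that child-context idea (bean and file names below are made up for illustration):

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class RestartableModule {

    private final ApplicationContext parent;          // the main (CXF-hosting) web application context
    private ClassPathXmlApplicationContext child;

    public RestartableModule(ApplicationContext parent) {
        this.parent = parent;
    }

    public synchronized void restart() {
        if (child != null) {
            child.close();                             // tears down only the inner beans
        }
        child = new ClassPathXmlApplicationContext(
                new String[] {"restartable-context.xml"}, parent); // outer context as parent
    }

    public <T> T getBean(Class<T> type) {
        return child.getBean(type);                    // proxy lookups into the inner context
    }
}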
