I am using Spring Batch 3.0.6 with Spring 4.2.4 and Apache Camel 2.16.2, and in my setup, every time a Spring Batch job (JSR API) stops, it also stops my Camel context! It may well be caused by my configuration, which I will try to explain.
First of all, I am using only the JSR API, not the native Spring Batch API. To be able to use Spring beans inside the job definition (JSR XML), I configured the Spring Batch context to load a shared context (loaded by the ContextLoaderListener). In that shared context I defined the CamelContext via the Spring-Camel tags (so that Spring beans can also be used easily inside the Camel routes).
As long as no job has run, there is no issue: the Camel routes are up and work perfectly. The problem starts as soon as a job is started and then stops. It seems that when a job starts it tries to start the Camel context, and when the job ends it stops the Camel context by sending a 'ContextClosedEvent' that is processed by the CamelContext. From that moment on, Camel is up only while a job is running.
What can I modify in my configuration and/or code to make this work and keep my Camel context up? I may have configured the shared context incorrectly.
Related
We have an application stack, deployed in Tomcat, that consists of several Spring Boot applications. As part of our operations, we want to send some messages to a vm endpoint, where a camel route will consume those messages and then publish them to a JMS topic for any of the other Spring Boot applications that are interested in messages on that topic.
When I start the application stack, there are three spring boot apps that utilize camel, and I see camel start properly in the logs. But when one of the apps sends a message to the vm endpoint, the route that consumes from that endpoint and routes the messages to the jms topic does not seem to get that message. I have placed the camel-core jar in my tomcat lib directory. In the spring boot maven plugin configuration, I have specified an exclusion of the camel-core jar. Oddly enough, that jar is in the WEB-INF/lib of the war anyway! So I have stopped Tomcat, removed that jar from the exploded war, and restarted Tomcat, but that does not change the behavior of the messaging.
Here are the versions that we are using:
Spring Boot 2.3.1
Camel 3.4.2
Tomcat 8.5.5
The first Spring Boot app, the one that links everything together with the Camel route that consumes from the vm endpoint and produces that message on the JMS topic, is our "routing engine". It uses camel-spring-boot-starter, spring-boot-starter-artemis, camel-vm-starter, artemis-jms-server and camel-jms-starter. Its RouteBuilder's configure method looks like this:
from("vm:task")
.log(LoggingLevel.WARN, "********** Received task message");
.to("jms:topic:local.private.task")
.routeId("taskToJms");
The app that produces messages to the vm endpoint uses camel-spring-boot-starter and camel-vm-starter. In that app, there is a @Service class that receives a ProducerTemplate, auto-wired in the constructor. When the application invokes this component to send the message, I see a line in the logs that says
o.a.c.impl.engine.DefaultProducerCache (169) - >>>> vm://task Exchange[]
so it appears that the message is being produced and sent properly to the vm endpoint. However, I see no indication that it has been received/consumed in the routing engine's camel route, since the route's log line is not logging anything, and since I see no other indications of receiving the message in the log. The strange thing is that I am not getting the error of not having any consumers on the vm:task endpoint that I was getting before I put the camel-core jar in tomcat's lib directory.
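For context, the producing service looks roughly like this (a minimal sketch, not the actual code; the class name, payload type and endpoint are illustrative):

import org.apache.camel.ProducerTemplate;
import org.springframework.stereotype.Service;

// Sketch of the producing side, assuming constructor injection of the
// auto-configured ProducerTemplate. Names here are illustrative only.
@Service
public class TaskPublisher {

    private final ProducerTemplate producerTemplate;

    public TaskPublisher(ProducerTemplate producerTemplate) {
        this.producerTemplate = producerTemplate;
    }

    public void publish(String taskPayload) {
        // sends the payload to the vm endpoint that the routing engine consumes from
        producerTemplate.sendBody("vm:task", taskPayload);
    }
}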
Am I doing anything obviously wrong? How can I get the spring boot maven plugin to really exclude camel-core? And why are the messages (sent to the vm endpoint) not being consumed by the route in the routing engine? Thanks in advance for any help.
Edit: I was able to keep camel-core out of the war files by adding an exclusion to the configuration of the war plugin, but I was still not able to consume the message on the vm endpoint.
I will post the answer, or at least "an" answer, for anyone who might have found themselves in the puzzling situation that I found myself in.
In short, the answer is that it is best to avoid trying to send VM messages across separate contexts within one big JVM like Tomcat. Instead, use something like JMS. I used Artemis, and I stood up an embedded broker in one of the Spring Boot apps in Tomcat. For the other apps (the clients) to be able to connect to the embedded Artemis server, you need to add a @Configuration class (in the module that stands up the embedded broker) that implements ArtemisConfigurationCustomizer:
@Configuration
public class ArtemisConfig implements ArtemisConfigurationCustomizer {

    @Override
    public void customize(final org.apache.activemq.artemis.core.config.Configuration configuration) {
        configuration.addConnectorConfiguration("nettyConnector",
                new TransportConfiguration(NettyConnectorFactory.class.getName()));
        configuration.addAcceptorConfiguration(new TransportConfiguration(NettyAcceptorFactory.class.getName()));
    }
}
That lets your other stuff connect to the embedded Artemis broker. Also, you do not have to worry about upgrading camel-core jars in your tomcat shared lib folder when you upgrade camel to a different version. It's good to keep things simple for maintenance purposes!
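The client apps then just need connection settings that match that Netty acceptor. A minimal sketch of the client-side application.properties, assuming the broker is reachable on localhost and the default Netty port 61616 (adjust host and port to whatever your acceptor actually uses):

# hypothetical client-side settings; host and port depend on the acceptor above
spring.artemis.mode=native
spring.artemis.host=localhost
spring.artemis.port=61616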
Anyway, I hope this helps somebody else who might find themselves here someday.
We are using Spring Boot to run Camel routes via the Camel Spring Boot starter. We would like to know exactly when the processing is done in Camel. The issue is that since we are calling the executable jar from an outside batch process, that process needs to know when the Camel processing has finished (say, after processing a bunch of files from a directory). If we enable camel.springboot.main-run-controller=true, the Camel process never returns and the outside batch process waits indefinitely. If we set camel.springboot.main-run-controller=false, the Camel process returns immediately without processing the files (as routes are started in daemon threads). Is there an easier solution?
You can configure Camel Spring Boot to terminate itself after a given period of time, after being idle for a given period, or after processing a given number of messages.
You can configure these with the camel.springboot.XXX options in the application.properties file.
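For example (a sketch; the values are arbitrary, and you would normally pick just the condition you need):

# shut down after a fixed total run time of 300 seconds
camel.springboot.duration-max-seconds=300
# ...or after no messages have been processed for 30 seconds
camel.springboot.duration-max-idle-seconds=30
# ...or after 100 messages have been processed
camel.springboot.duration-max-messages=100

As far as I know, these are meant to be combined with camel.springboot.main-run-controller=true, so the application blocks until one of the conditions is met and then exits.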
An alternative is to use the control bus to stop the route after it has processed all the files. But the former may be easier: just say that after it has been idle for more than 30 seconds the application should shut down.
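The control bus call itself is a one-liner; here is a sketch, where the route id "filesRoute" is an assumption and async=true keeps the route from blocking while stopping itself:

// stop the route "filesRoute" via the Camel control bus component;
// async=true so the in-flight exchange is not blocked by the stop
producerTemplate.sendBody("controlbus:route?routeId=filesRoute&action=stop&async=true", null);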
How to create an automatic shutdown in Spring Integration after all processing has finished for all files?
My application downloads an undefined number of files and processes them all sequentially; the flow is like this:
file:inbound-channel-adapter --> Transformer --> Splitter -->
http-outbound-gateway --> int:aggregator --> mail:outbound-channel-adapter.
At the end, I should shut down my app.
How do I know that all files have been processed?
Does anyone have experience with this?
Thanks
In your case you need to track your inbound directory.
You have to add another step to your flow (probably to track the last file).
I would suggest an ETL technique: read the file descriptions first, then process the files.
When the last one has been processed, you can send an event to shut down the app.
A Spring Boot application can be stopped by using the ConfigurableApplicationContext:
SpringApplication app = new SpringApplication(Application.class);
ConfigurableApplicationContext context = app.run(args);
context.close();
There is also another way, using the Spring Boot Actuator shutdown endpoint:
See How to shutdown a Spring Boot Application in a correct way?
Spring Boot documentation
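For the actuator route, the shutdown endpoint is disabled by default and has to be enabled explicitly. A sketch for Spring Boot 2.x application.properties (on Boot 1.x the property is endpoints.shutdown.enabled instead):

# enable and expose the shutdown endpoint (triggered with POST /actuator/shutdown)
management.endpoint.shutdown.enabled=true
management.endpoints.web.exposure.include=health,info,shutdown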
I'm trying to implement a graceful shutdown sequence for my Spring Boot application. To do that, I registered a custom shutdown hook with Runtime and disabled the one provided by Spring (SpringApplication.setRegisterShutdownHook(false)). From this custom shutdown hook I would first like to pause the embedded Tomcat (or its connectors) and some other schedulers, after which I manually invoke applicationContext.close() to shut down the rest of the Spring application.
What is the best way to get access to the embedded Tomcat instance? I was fiddling around with TomcatEmbeddedServletContainerFactory but this does not seem to give me access to default connectors or EmbeddedServletContainer which has a stop method.
You can access the EmbeddedServletContainer from the EmbeddedWebApplicationContext (just inject that) and downcast it.
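A rough sketch of that approach, assuming Spring Boot 1.x (where TomcatEmbeddedServletContainerFactory and EmbeddedWebApplicationContext live):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.context.embedded.EmbeddedServletContainer;
import org.springframework.boot.context.embedded.EmbeddedWebApplicationContext;
import org.springframework.stereotype.Component;

// Sketch only: pulls the embedded container out of the web application context
// so it can be stopped before the rest of the context is closed.
@Component
public class TomcatStopper {

    @Autowired
    private EmbeddedWebApplicationContext webApplicationContext;

    public void stopTomcat() {
        EmbeddedServletContainer container = webApplicationContext.getEmbeddedServletContainer();
        // downcast to TomcatEmbeddedServletContainer and call getTomcat() if you
        // need the individual connectors rather than a full stop
        container.stop();
    }
}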
I saw some code using a shutdown hook like this:
Runtime.getRuntime().addShutdownHook(new Thread() {
    @Override
    public void run() {
        // close the Spring context
        applicationContext.close();
        // close the thread pool
        threadPool.shutdownNow();
    }
});
Is it useful to do something like this?
I thought that when the JVM exits, the threads might be shut down immediately and the Spring context would be closed anyway.
And what should we do when we need to call System.exit()?
It really depends on your application and the lifecycle of your objects and those threads you appear to have outside of your context. If you are running the spring container inside a standalone java process, then trapping the shutdown hook like this is one way to do that. Another way is to have it listen on a tcp port and send a command to begin the shutdown process. If you are running in a web container like tomcat, then you should follow the standards on normal webapp shutdown, which Spring supports with Context Listeners.
I would also consider redesigning your app so that the threads are all managed by a bean that lives inside your Spring container. For instance, use a bean configured with start/stop (init/destroy) methods, and have that bean use an Executor for thread pooling. This way, your shutdown is ONLY shutting down the Spring container, and Spring provides very good support for orderly shutdown of beans. One of those beans is your object holding the threads within the Executor. It's a much cleaner way than trying to integrate Spring beans with external threads.
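A rough sketch of that idea, using @PostConstruct/@PreDestroy as the start/stop methods (the XML init-method/destroy-method attributes would do the same job; the pool size and the submitted work are placeholders):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import org.springframework.stereotype.Component;

// Sketch: the worker threads live inside a Spring-managed bean, so closing the
// application context shuts them down in order along with everything else.
@Component
public class WorkerPool {

    private ExecutorService executor;

    @PostConstruct
    public void start() {
        executor = Executors.newFixedThreadPool(4);
        // submit whatever background work the application needs here
    }

    @PreDestroy
    public void stop() {
        executor.shutdownNow();
        try {
            executor.awaitTermination(10, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}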
Hope this helps.