Create a job from a JMS message receive - jms

Using WildFly 15 and only Java EE (no Spring), I need to consume messages from a JMS queue, in order, and create a new JBatch job for every message, in sequence, without job overlap.
For example:
JMS queue: --> msgC --> msgB --> msgA
JBatch:
on receive msgC, create JobC, run JobC
wait for JobC to end, watching the JMS queue; on receive msgB, create JobB, run JobB
wait for JobB to end, watching the JMS queue; on receive msgA, create JobA, run JobA
Is it possible to achieve this?

Processing messages in parallel or in the right sequence is standard behaviour for JMS clients, and you can simply configure it to do the right thing; that is why you have a queue. Just ensure you have only one message-driven bean working on it, which gives you a single consumer and nothing running in parallel.
If you hand the task over to the batch API, a different set of threads will process it, and you now need to manually ensure one job terminates before the next can start. So your message-driven bean would have to poll and wait until the batch job has executed.
Why would you do this, as it just makes your life more complicated?
That said, you could still benefit from the easy orchestration of batch steps, the restart capability, or parallel execution, all of which you would otherwise have to implement in your message-driven bean yourself.
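A minimal sketch of the "poll and wait" loop the message-driven bean would need. The status source is abstracted behind a Supplier so the logic is self-contained; in a real MDB it would wrap BatchRuntime.getJobOperator().getJobExecution(executionId).getBatchStatus(). The class and method names here are illustrative, not part of any API.

```java
import java.util.function.Supplier;

// Sketch of the wait-until-terminal loop an MDB would run after
// starting a JBatch job, so the next message is not picked up until
// the current job has finished.
public class JobAwaiter {

    /** Blocks until the reported batch status is terminal, then returns it. */
    public static String awaitTerminalStatus(Supplier<String> status,
                                             long pollMillis) throws InterruptedException {
        while (true) {
            String s = status.get();
            if (s.equals("COMPLETED") || s.equals("FAILED")
                    || s.equals("STOPPED") || s.equals("ABANDONED")) {
                return s;
            }
            Thread.sleep(pollMillis); // back off before polling again
        }
    }

    public static void main(String[] args) throws Exception {
        // Simulated status sequence for a job that eventually completes.
        java.util.Iterator<String> seq =
                java.util.List.of("STARTING", "STARTED", "COMPLETED").iterator();
        String result = awaitTerminalStatus(seq::next, 10);
        System.out.println(result); // prints COMPLETED
    }
}
```

Because onMessage blocks until the job ends, the single MDB instance cannot receive the next queue message in the meantime, which is exactly the no-overlap ordering the question asks for.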

Related

Asynchronous Kafka consumer in Spring Batch Application

In our Spring Batch application, the workers' item processors interact with another service asynchronously through Kafka. The requirement is that we need an acknowledgement in order to retry failed batches, but the condition is not to wait for the acknowledgement.
Is there any mechanism in Spring Batch by which we can asynchronously consume Kafka?
Is it possible to rerun a specific local worker step in a rerun of the job?
We implement producers and consumers over the same step using a Spring Batch decider. Thus, on the first run it will only produce to Kafka, and on the second run it will consume from Kafka.
We are looking for a solution where we can asynchronously consume Kafka in a Spring Batch application in order to rerun a specific worker step.
Is there any mechanism in Spring Batch by which we can asynchronously consume Kafka? Is it possible to rerun a specific local worker step in a rerun of the job?
According to your diagram, you are making that call from an item processor. The closest "feature" you can get from Spring Batch is the AsyncItemProcessor. This is a special processor that processes items asynchronously in a separate thread; the resulting Future is then unwrapped by an AsyncItemWriter, which writes the result of the call.
Other than that, I do not see any other obvious way to do that with a built-in feature from Spring Batch, so you would have to manage it in a custom ItemProcessor.
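To make the AsyncItemProcessor/AsyncItemWriter behaviour concrete, here is the same pattern sketched in plain Java, assuming only the JDK: the "processor" returns a Future immediately, and the "writer" is the only place that blocks to unwrap results. The helper names are illustrative, not Spring Batch API.

```java
import java.util.List;
import java.util.concurrent.*;
import java.util.function.Function;
import java.util.stream.Collectors;

// Plain-Java sketch of what AsyncItemProcessor / AsyncItemWriter do:
// processing happens on a separate thread pool behind a Future, and
// the writer unwraps the Futures at write time, preserving item order.
public class AsyncPattern {

    static <I, O> Function<I, Future<O>> asyncProcessor(
            Function<I, O> delegate, ExecutorService pool) {
        // Like AsyncItemProcessor: returns immediately with a Future.
        return item -> pool.submit(() -> delegate.apply(item));
    }

    static <O> List<O> unwrappingWriter(List<Future<O>> futures) throws Exception {
        // Like AsyncItemWriter: blocks only here to unwrap the results.
        List<O> out = new java.util.ArrayList<>();
        for (Future<O> f : futures) out.add(f.get());
        return out;
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        Function<Integer, Future<Integer>> processor =
                asyncProcessor(i -> i * 10, pool);   // stand-in for the Kafka call
        List<Future<Integer>> futures = List.of(1, 2, 3).stream()
                .map(processor).collect(Collectors.toList());
        System.out.println(unwrappingWriter(futures)); // prints [10, 20, 30]
        pool.shutdown();
    }
}
```

The design point is that the chunk's processing threads are decoupled from the step's main thread, yet the step still sees results in item order because unwrapping happens sequentially.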

Spring Kafka Listener to pause processing of messages for certain period of time

Here is my use case.
I have an end-of-day feed process that runs as a separate process on a scheduled basis.
During the day, I receive real-time updates through Kafka topics, which the Spring Kafka listeners consume and process. But I want to pause the Kafka listeners when the EOD feed job kicks off and resume them once the job completes.
Here are the approaches I am considering.
I have already exposed JMX beans to stop and start the listeners for admin purposes. I want to leverage the same beans to stop/start the Kafka listeners when the EOD feed job kicks in. This is more controlled, since it happens synchronously with the job.
Alternatively, since I have control over when the EOD job kicks in and roughly when it will complete, the Kafka listeners could be paused on a schedule. As this is asynchronous, it will cause problems if for some reason the EOD feed job takes more time to complete than usual.
Are there any other options to stop the Kafka listeners for a particular amount of time?
Assuming you are using @KafkaListener, you can pause/resume (or stop/start) listener containers using the KafkaListenerEndpointRegistry bean.
Give each listener an id so you can get it from the registry.
See @KafkaListener Lifecycle Management in the documentation.
If you are not using @KafkaListener and are using listener containers directly as beans, you can simply pause/resume, or stop/start, the container beans themselves.
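Whichever way you obtain the container, the safe shape of the EOD run is pause, run, resume-in-finally. Here is a plain-Java sketch of that pattern, with the container abstracted behind a tiny interface so it is self-contained; in Spring Kafka the real calls would be registry.getListenerContainer("myListenerId").pause() and .resume(), where "myListenerId" is whatever id you gave the listener.

```java
// Sketch of the pause-around-the-job pattern. Pausable stands in for
// the Spring Kafka MessageListenerContainer; the names here are
// illustrative, not framework API.
public class EodPauseRunner {

    interface Pausable {
        void pause();
        void resume();
    }

    static void runWithListenersPaused(Pausable container, Runnable eodJob) {
        container.pause();
        try {
            eodJob.run();       // the EOD feed job
        } finally {
            container.resume(); // always resume, even if the job fails
        }
    }

    public static void main(String[] args) {
        StringBuilder log = new StringBuilder();
        Pausable fake = new Pausable() {
            public void pause()  { log.append("paused;"); }
            public void resume() { log.append("resumed;"); }
        };
        runWithListenersPaused(fake, () -> log.append("job;"));
        System.out.println(log); // prints paused;job;resumed;
    }
}
```

Driving the pause from the job itself, rather than from a fixed schedule, avoids the problem the question raises of the EOD job overrunning its time window.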

Running RabbitMQ consumer in different thread after consuming message

I need to understand the thread flow when processing consumed messages via Spring's SimpleMessageListenerContainer.
I have the following understanding:
1) Messages are consumed by consumer threads (you can define the consumer thread pools via task executors).
2) The same consumer thread that receives a message also processes it, and stays blocked until it finishes executing the handler method.
3) Meanwhile, other consumer threads get created to consume and process other messages. The interval at which those consumer threads are created is based on the setStartConsumerMinInterval setting.
Please let me know if I am correct.
The next part is:
I want to separate the consuming of messages and the processing of messages into different threads (different pools for consuming and processing). How can we do that?
I have tried this: I annotated the handler's handle method with @Async to run it in different threads. Is that a correct way, or is a better way available?
The last part is:
In my Spring Boot application I am both publishing and consuming messages, and I am using a single connection factory (CachingConnectionFactory). Should I use two connection factories, one for publishing and the other for consuming, and pass the respective connection factory to the publishing and consuming beans?
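The hand-off the question describes (whether done with @Async or manually) boils down to the listener thread submitting the work to a separate processing pool and returning immediately. A self-contained sketch, assuming only the JDK; the class and method names are illustrative:

```java
import java.util.concurrent.*;

// Sketch of separating consuming from processing: the consumer thread
// only hands the message off to a dedicated processing pool and returns
// at once, so it can go back to receiving. Note the trade-off: once the
// handler returns, the broker acknowledgement no longer covers the
// actual processing, so a crash can lose in-flight work.
public class HandoffConsumer {

    private final ExecutorService processingPool = Executors.newFixedThreadPool(8);

    // This would be the listener's handler method, called on a consumer thread.
    public Future<String> onMessage(String body) {
        return processingPool.submit(() -> {
            // ... slow business logic runs on the processing pool ...
            return "processed:" + body;
        });
    }

    void shutdown() { processingPool.shutdown(); }

    public static void main(String[] args) throws Exception {
        HandoffConsumer c = new HandoffConsumer();
        Future<String> f = c.onMessage("hello");
        System.out.println(f.get()); // prints processed:hello
        c.shutdown();
    }
}
```

On the last part of the question: keeping publishing and consuming on separate connections is generally sound advice, since it prevents producer flow control from stalling consumers; hedged here as a general recommendation rather than a hard requirement.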

Deferred consumption of message queue

Sorry this might sound naive to JMS gurus, but still.
I have a requirement where a Spring-based application is not able to connect synchronously to an SAP back-end (via its web-service interface) because the response from SAP is far too slow. We are thinking of a solution where the updates from the GUI would be saved by the Spring middleware in a local database, while simultaneously sending a message to a JMS queue. We want a batch job to run every few hours (or perhaps nightly) to consume the messages from the JMS queue and, based on the message contents, query the local database and send the result to the SAP web service.
Is this approach correct? Would I need a batch job to trigger the JMS message consumption (because I don't want to consume the messages immediately, but in a deferred manner at a pre-decided time)? Is there any way to implement this gracefully in Spring (like Camel)? I appreciate your help.
Spring Batch has a JmsItemReader that can be used in a batch program; an empty queue signals the end of the batch. Spring Cloud Task is built on top of batch and can be used for cloud deployments.
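The key part of the JmsItemReader contract is that each read does a receive with a timeout, and a null return (queue drained) ends the batch, so the nightly job naturally stops when the queue is empty. A plain-Java sketch of that contract, using a BlockingQueue as a stand-in for the JMS destination; the class name is illustrative:

```java
import java.util.concurrent.*;

// Sketch of the JmsItemReader contract: read() polls with a timeout and
// returns null once the queue is drained, signalling the end of the batch.
public class DrainingReader {

    private final BlockingQueue<String> queue;
    private final long timeoutMillis;

    DrainingReader(BlockingQueue<String> queue, long timeoutMillis) {
        this.queue = queue;
        this.timeoutMillis = timeoutMillis;
    }

    /** Returns the next message, or null when the queue is drained. */
    String read() throws InterruptedException {
        return queue.poll(timeoutMillis, TimeUnit.MILLISECONDS);
    }

    public static void main(String[] args) throws Exception {
        BlockingQueue<String> q = new LinkedBlockingQueue<>(
                java.util.List.of("msgA", "msgB"));
        DrainingReader reader = new DrainingReader(q, 50);
        int processed = 0;
        for (String msg; (msg = reader.read()) != null; ) {
            processed++;   // here: query the local DB and call the SAP web service
        }
        System.out.println(processed); // prints 2
    }
}
```

Scheduling the batch job every few hours (e.g. with a cron trigger) then gives exactly the deferred, pre-decided consumption time the question asks about.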

Spring batch step issue

A Spring JMS listener invokes the Spring Batch job on receiving a message. It is configured to use DefaultMessageListenerContainer with a concurrency of 5 and a max concurrency of 15.
The Spring Batch job definition has 4 steps that are configured as tasklets.
When multiple requests are submitted, the JMS listener picks up 5 messages and runs the Spring Batch job.
But occasionally, a few jobs take more time to execute when moving from one step to the next. I couldn't find any specific reason why Spring Batch takes more time between step executions, that is, after one step has completed and before the next step is started. This doesn't happen always.
Any insights on this specific problem?
