Spring RabbitMQ listener completes the process even if it is killed - spring

I have integrated RabbitMQ in a Spring application. In my application I am doing indexing on Solr using RabbitMQ.
On each of my queues I have set only one listener.
I want to stop the listener while a message is in progress. But the problem is that when I stop the listener via registry.stop(), the RabbitMQ UI and logs show that the listener is stopped, yet the message it was working on is still successfully indexed on Solr.
As per my understanding, after killing the listener, the message should not be processed any further.

That's not correct. Stopping the container only stops it from consuming more messages from the queue; messages that are currently in flight are processed gracefully. Why would you want it otherwise? You would lose data that had already been consumed and acknowledged on the broker.
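For reference, here is a minimal sketch of stopping a listener container through the RabbitListenerEndpointRegistry. The listener id solrIndexer and the queue name are hypothetical; the stop() call cancels the consumer but lets the delivery currently being processed finish and be acknowledged.

    import org.springframework.amqp.rabbit.annotation.RabbitListener;
    import org.springframework.amqp.rabbit.listener.RabbitListenerEndpointRegistry;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Component;

    @Component
    public class SolrIndexingListener {

        @Autowired
        private RabbitListenerEndpointRegistry registry;

        // "solrIndexer" is a hypothetical listener id used for illustration
        @RabbitListener(id = "solrIndexer", queues = "solr.index.queue")
        public void onMessage(String payload) {
            // index the payload on Solr ...
        }

        public void stopIndexing() {
            // stops consuming new messages; the delivery currently being
            // processed still completes and is acknowledged
            registry.getListenerContainer("solrIndexer").stop();
        }
    }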

Related

How to retry consuming message, then stop consuming, when error occurs in listener

I have a Kafka listener writing data to a database. In case of a database timeout (JdbcException), I'd like to retry, and if the timeouts persist, stop consuming Kafka messages.
As far as I understand, Spring Kafka 2.9 has two CommonErrorHandler implementations:
DefaultErrorHandler tries to redeliver messages several times and then sends failed messages to the logs or a DLQ
CommonContainerStoppingErrorHandler stops the listener container and message consumption
I would like to chain both of them: first try to redeliver messages several times, then stop container when delivery doesn't succeed.
How can I do that?
Use a DefaultErrorHandler with a custom recoverer that calls a CommonContainerStoppingErrorHandler after the retries are exhausted.
See this answer for an example
How do you exit spring boot application programmatically when retries are exhausted, to prevent kafka offset commit
(It uses the older SeekToCurrentErrorHandler, but the same concept applies.)
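A rough sketch of that chaining, based on the pattern from the linked answer and assuming Spring Kafka 2.9's CommonErrorHandler API: the recoverer delegates to a CommonContainerStoppingErrorHandler once the back-off retries are exhausted, and handleRemaining is overridden to capture the consumer and container references the stopper needs. The back-off values are illustrative.

    import java.util.Collections;
    import java.util.List;
    import java.util.concurrent.atomic.AtomicReference;

    import org.apache.kafka.clients.consumer.Consumer;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.listener.CommonContainerStoppingErrorHandler;
    import org.springframework.kafka.listener.DefaultErrorHandler;
    import org.springframework.kafka.listener.MessageListenerContainer;
    import org.springframework.util.backoff.FixedBackOff;

    @Configuration
    public class KafkaErrorHandlingConfig {

        @Bean
        public DefaultErrorHandler errorHandler() {
            CommonContainerStoppingErrorHandler stopper = new CommonContainerStoppingErrorHandler();
            AtomicReference<Consumer<?, ?>> consumerRef = new AtomicReference<>();
            AtomicReference<MessageListenerContainer> containerRef = new AtomicReference<>();
            // retry 3 times with a 5 second back-off (illustrative values), then stop the container
            return new DefaultErrorHandler(
                    (record, ex) -> stopper.handleRemaining(ex,
                            Collections.singletonList(record), consumerRef.get(), containerRef.get()),
                    new FixedBackOff(5000L, 3)) {

                @Override
                public void handleRemaining(Exception thrownException, List<ConsumerRecord<?, ?>> records,
                        Consumer<?, ?> consumer, MessageListenerContainer container) {
                    // capture the consumer and container so the recoverer can stop them later
                    consumerRef.set(consumer);
                    containerRef.set(container);
                    super.handleRemaining(thrownException, records, consumer, container);
                }
            };
        }
    }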

Can we restrict spring boot rabbitmq message processing only between specific timings?

Using Spring Boot's @RabbitListener, we are able to process AMQP messages.
Whenever a message is sent to the queue, it is immediately published to the destination exchange.
Using @RabbitListener we are able to process the message immediately.
But we need to process the messages only between specific timings, for example 1 AM to 6 AM.
How do we achieve that?
First of all you can take a look at the Delayed Exchange feature of RabbitMQ: https://docs.spring.io/spring-amqp/docs/current/reference/html/#delayed-message-exchange
This way, on the producer side you determine how long the message should be delayed before it is routed to the main exchange for actual consumption.
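A minimal sketch of the producer side of that approach, assuming the rabbitmq_delayed_message_exchange plugin is enabled on the broker; the exchange name, routing key, and delay value are placeholders, and the consuming queue would be bound to this exchange as usual.

    import java.util.HashMap;
    import java.util.Map;

    import org.springframework.amqp.core.CustomExchange;
    import org.springframework.amqp.rabbit.core.RabbitTemplate;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class DelayedExchangeConfig {

        // an exchange backed by the rabbitmq_delayed_message_exchange plugin
        @Bean
        public CustomExchange delayedExchange() {
            Map<String, Object> args = new HashMap<>();
            args.put("x-delayed-type", "direct");
            return new CustomExchange("orders.delayed", "x-delayed-message", true, false, args);
        }

        // producer side: ask the broker to hold the message for the given delay
        public static void sendDelayed(RabbitTemplate rabbitTemplate, Object payload, int delayMillis) {
            rabbitTemplate.convertAndSend("orders.delayed", "orders.key", payload, message -> {
                message.getMessageProperties().setDelay(delayMillis);
                return message;
            });
        }
    }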
Another way is to take a look at Spring Integration and its Delayer component: https://docs.spring.io/spring-integration/docs/5.2.0.BUILD-SNAPSHOT/reference/html/messaging-endpoints.html#delayer
This way you will consume messages from RabbitMQ, but delay them in the target application logic.
Yet another way is to start()/stop() the listener container according to your timing requirements. This way the messages stay in RabbitMQ until you start the listener container: https://docs.spring.io/spring-amqp/docs/current/reference/html/#containerAttributes
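A minimal sketch of that start()/stop() approach, assuming a listener registered with the hypothetical id nightlyListener and @EnableScheduling turned on; the cron expressions correspond to the 1 AM / 6 AM window from the question.

    import org.springframework.amqp.rabbit.listener.RabbitListenerEndpointRegistry;
    import org.springframework.scheduling.annotation.Scheduled;
    import org.springframework.stereotype.Component;

    @Component
    public class ListenerWindowScheduler {

        private final RabbitListenerEndpointRegistry registry;

        public ListenerWindowScheduler(RabbitListenerEndpointRegistry registry) {
            this.registry = registry;
        }

        // start consuming at 1 AM; messages published earlier simply wait in the queue
        @Scheduled(cron = "0 0 1 * * *")
        public void startListener() {
            registry.getListenerContainer("nightlyListener").start();
        }

        // stop consuming at 6 AM
        @Scheduled(cron = "0 0 6 * * *")
        public void stopListener() {
            registry.getListenerContainer("nightlyListener").stop();
        }
    }

The @RabbitListener itself would be declared with id = "nightlyListener" and autoStartup = "false" so it does not begin consuming at application startup.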

Need Spring RabbitMQ to send a message to all consumers - disable round robin for one queue

I have a couple of queues and I need to do the following with ONE of them:
A producer should send a message to this queue, but ALL consumers should receive it. So, if I have 5 Spring listeners on this queue, each of them should receive the message, but not the producer.
I do this because I have a Tomcat cluster and asynchronous RabbitMQ messages, and when I get a response from a worker, I don't know how to dispatch it to the correct Tomcat node. So I decided to broadcast all worker replies to all Tomcat nodes. Each Tomcat cluster node listens on the same output queue. Then, if it's the correct Tomcat instance, the reply is processed; all other copies are discarded, and that's OK.
How do I implement this? How do I make the consumers on the Tomcat side receive the same message at the same time?
Ok, found the solution here:
RabbitMQ / AMQP: single queue, multiple consumers for same message?
It's impossible to do with a single queue in RabbitMQ; you need to create a separate queue for each consumer and bind them all to a fanout exchange.
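A minimal sketch of that layout with Spring AMQP: each Tomcat node declares its own auto-named queue and binds it to a shared fanout exchange, so every node gets its own copy of each reply. The exchange name is illustrative.

    import org.springframework.amqp.core.AnonymousQueue;
    import org.springframework.amqp.core.Binding;
    import org.springframework.amqp.core.BindingBuilder;
    import org.springframework.amqp.core.FanoutExchange;
    import org.springframework.amqp.core.Queue;
    import org.springframework.amqp.rabbit.annotation.RabbitListener;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class WorkerReplyBroadcastConfig {

        // one shared fanout exchange for all worker replies
        @Bean
        public FanoutExchange repliesExchange() {
            return new FanoutExchange("worker.replies");
        }

        // each Tomcat node declares its own auto-named, auto-deleted queue
        @Bean
        public Queue nodeReplyQueue() {
            return new AnonymousQueue();
        }

        @Bean
        public Binding replyBinding(FanoutExchange repliesExchange, Queue nodeReplyQueue) {
            return BindingBuilder.bind(nodeReplyQueue).to(repliesExchange);
        }

        // every node receives its own copy; replies meant for another node are simply dropped
        @RabbitListener(queues = "#{nodeReplyQueue.name}")
        public void onReply(String reply) {
            // check whether this reply belongs to this node, otherwise ignore it
        }
    }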

Stop Spring standalone service

I am using Spring Integration in my project.
We have a requirement to stop the Spring standalone service if the database goes down.
In the message listener, when I persist the data to the database, I check whether I get a CannotGetJdbcConnectionException and, if so, stop the Spring service using the applicationContext.close() method.
The problem is when I have received a message from the queue and then the database goes down.
When I try to close the Spring service, every resource shuts down except the DefaultMessageListenerContainer, which still holds that message.
If I terminate the process manually, the message goes back to the inbound queue, which is correct.
Is there any way to stop the Spring service forcefully and put the message back onto the inbound queue?
I hope I am clear with my point here.
Thanks
Sachin
You should configure the DMLC with setSessionTransacted(true) (acknowledge="transacted" when using the namespace to define the endpoints).
Then any in-flight messages will be rolled back onto the queue.
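A minimal sketch of that configuration in Java, assuming a plain DefaultMessageListenerContainer bean and a hypothetical databaseWritingListener bean; with sessionTransacted enabled, an exception thrown from the listener (or closing the context mid-processing) rolls the un-acknowledged message back onto the queue.

    import javax.jms.ConnectionFactory; // jakarta.jms.ConnectionFactory on newer Spring versions

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.jms.listener.DefaultMessageListenerContainer;

    @Configuration
    public class JmsContainerConfig {

        @Bean
        public DefaultMessageListenerContainer listenerContainer(ConnectionFactory connectionFactory,
                javax.jms.MessageListener databaseWritingListener) {
            DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
            container.setConnectionFactory(connectionFactory);
            container.setDestinationName("inbound.queue"); // illustrative queue name
            container.setMessageListener(databaseWritingListener);
            // local JMS transactions (acknowledge="transacted" in the XML namespace):
            // if the listener throws, e.g. CannotGetJdbcConnectionException,
            // the message is rolled back onto the queue instead of being lost
            container.setSessionTransacted(true);
            return container;
        }
    }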

JBoss doesn't process JMS messages

Although JBoss seems to receive the JMS messages (I can list them through the jmx-console), it doesn't process them. They stay queued forever. What might be the reason for that?
Do you have a message consumer running to process the queue?
This could be something like a message-driven bean, or another JMS client connecting to the queue.
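For reference, a minimal sketch of a message-driven bean consumer, assuming a standard EJB 3 deployment on JBoss; the queue name is illustrative.

    import javax.ejb.ActivationConfigProperty;
    import javax.ejb.MessageDriven;
    import javax.jms.JMSException;
    import javax.jms.Message;
    import javax.jms.MessageListener;
    import javax.jms.TextMessage;

    // without a consumer like this (or an external JMS client), messages just sit in the queue
    @MessageDriven(activationConfig = {
        @ActivationConfigProperty(propertyName = "destinationType", propertyValue = "javax.jms.Queue"),
        @ActivationConfigProperty(propertyName = "destination", propertyValue = "queue/MyQueue")
    })
    public class MyQueueConsumer implements MessageListener {

        @Override
        public void onMessage(Message message) {
            try {
                if (message instanceof TextMessage) {
                    String body = ((TextMessage) message).getText();
                    // process the message body here
                }
            } catch (JMSException e) {
                throw new RuntimeException(e);
            }
        }
    }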
