Spring Boot RabbitMQ dead letter queue config not working

I configured a Spring Boot RabbitMQ dead letter queue, but the ErrorHandler never receives any message. I have searched all the questions about dead letter queues but could not figure it out. Can anyone help me?
RabbitConfig.java to configure the dead letter queue/exchange:
@Configuration
public class RabbitConfig {

    public final static String MAIL_QUEUE = "mail_queue";
    public final static String DEAD_LETTER_EXCHANGE = "dead_letter_exchange";
    public final static String DEAD_LETTER_QUEUE = "dead_letter_queue";

    public static Map<String, Object> args = new HashMap<String, Object>();
    static {
        args.put("x-dead-letter-exchange", DEAD_LETTER_EXCHANGE);
        //args.put("x-dead-letter-routing-key", DEAD_LETTER_QUEUE);
        args.put("x-message-ttl", 5000);
    }

    @Bean
    public Queue mailQueue() {
        return new Queue(MAIL_QUEUE, true, false, false, args);
    }

    @Bean
    public Queue deadLetterQueue() {
        return new Queue(DEAD_LETTER_QUEUE, true);
    }

    @Bean
    public FanoutExchange deadLetterExchange() {
        return new FanoutExchange(DEAD_LETTER_EXCHANGE);
    }

    @Bean
    public Binding deadLetterBinding() {
        return BindingBuilder.bind(deadLetterQueue()).to(deadLetterExchange());
    }
}
ErrorHandler.java to process DEAD LETTER QUEUE:
@Component
@RabbitListener(queues = RabbitConfig.DEAD_LETTER_QUEUE)
public class ErrorHandler {

    @RabbitHandler
    public void handleError(Object message) {
        System.out.println("xxxxxxxxxxxxxxxxxx" + message);
    }
}
MailServiceImpl.java to process MAIL_QUEUE:
@Service
@RabbitListener(queues = RabbitConfig.MAIL_QUEUE)
@ConditionalOnProperty("spring.mail.host")
public class MailServiceImpl implements MailService {

    @Autowired
    private JavaMailSender mailSender;

    @RabbitHandler
    @Override
    public void sendMail(TMessageMail form) {
        //......
        try {
            mailSender.save(form);
        } catch (Exception e) {
            logger.error("error in sending mail: {}", e.getMessage());
            throw new AmqpRejectAndDontRequeueException(e.getMessage());
        }
    }
}

Thank god, I finally found the answer!
All of the configuration is correct. The problem is that the queues such as mail_queue were created before I configured the dead letter queue, so setting x-dead-letter-exchange on a queue after it has already been created does not take effect.
In short: after changing a queue's arguments, you must delete the queue and recreate it! Such a simple tip cost me several hours.
To delete the queue, I followed this answer:
Deleting queues in RabbitMQ
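For anyone else hitting this, here is a minimal sketch of deleting and re-declaring the queue programmatically with RabbitAdmin. The queue name and arguments come from the RabbitConfig above; the standalone main class and localhost connection are assumptions for illustration only.
import java.util.HashMap;
import java.util.Map;

import org.springframework.amqp.core.Queue;
import org.springframework.amqp.rabbit.connection.CachingConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitAdmin;

public class RecreateMailQueue {

    public static void main(String[] args) {
        // Assumes a broker on localhost with default credentials.
        CachingConnectionFactory connectionFactory = new CachingConnectionFactory("localhost");
        RabbitAdmin admin = new RabbitAdmin(connectionFactory);

        // Queue arguments cannot be changed in place, so drop the old queue first.
        admin.deleteQueue("mail_queue");

        // Re-declare it with the dead-letter arguments from RabbitConfig.
        Map<String, Object> queueArgs = new HashMap<>();
        queueArgs.put("x-dead-letter-exchange", "dead_letter_exchange");
        queueArgs.put("x-message-ttl", 5000);
        admin.declareQueue(new Queue("mail_queue", true, false, false, queueArgs));

        connectionFactory.destroy();
    }
}
Deleting the queue from the RabbitMQ management UI works just as well, of course.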

Related

How to use error-channel for catching exception in Spring Integration?

What am I trying to do?: I am new to Spring Integration and have already read many similar questions about error handling, but I don't understand how to catch exceptions using an error-channel.
What I have done so far:
@EnableIntegration
@IntegrationComponentScan
@Configuration
public class TcpClientConfig implements ApplicationEventPublisherAware {

    private ApplicationEventPublisher applicationEventPublisher;

    private final ConnectionProperty connectionProperty;

    @Override
    public void setApplicationEventPublisher(ApplicationEventPublisher applicationEventPublisher) {
        this.applicationEventPublisher = applicationEventPublisher;
    }

    TcpClientConfig(ConnectionProperty connectionProperty) {
        this.connectionProperty = connectionProperty;
    }

    @Bean
    public AbstractClientConnectionFactory clientConnectionFactory() {
        TcpNioClientConnectionFactory tcpNioClientConnectionFactory =
                getTcpNioClientConnectionFactoryOf(
                        connectionProperty.getPrimaryHSMServerIpAddress(),
                        connectionProperty.getPrimaryHSMServerPort());
        final List<AbstractClientConnectionFactory> fallBackConnections = getFallBackConnections();
        fallBackConnections.add(tcpNioClientConnectionFactory);
        final FailoverClientConnectionFactory failoverClientConnectionFactory =
                new FailoverClientConnectionFactory(fallBackConnections);
        return new CachingClientConnectionFactory(
                failoverClientConnectionFactory, connectionProperty.getConnectionPoolSize());
    }

    @Bean
    DefaultTcpNioSSLConnectionSupport connectionSupport() {
        final DefaultTcpSSLContextSupport defaultTcpSSLContextSupport =
                new DefaultTcpSSLContextSupport(
                        connectionProperty.getKeystorePath(),
                        connectionProperty.getTrustStorePath(),
                        connectionProperty.getKeystorePassword(),
                        connectionProperty.getTruststorePassword());
        final String protocol = "TLSv1.2";
        defaultTcpSSLContextSupport.setProtocol(protocol);
        return new DefaultTcpNioSSLConnectionSupport(defaultTcpSSLContextSupport, false);
    }

    @Bean
    public MessageChannel outboundChannel() {
        return new DirectChannel();
    }

    @Bean
    @ServiceActivator(inputChannel = "outboundChannel")
    public MessageHandler outboundGateway(AbstractClientConnectionFactory clientConnectionFactory) {
        TcpOutboundGateway tcpOutboundGateway = new TcpOutboundGateway();
        tcpOutboundGateway.setConnectionFactory(clientConnectionFactory);
        return tcpOutboundGateway;
    }

    @Bean
    @ServiceActivator(inputChannel = "error-channel")
    public void handleError(ErrorMessage em) {
        throw new RuntimeException(String.valueOf(em));
    }

    private List<AbstractClientConnectionFactory> getFallBackConnections() {
        final int size = connectionProperty.getAdditionalHSMServersConfig().size();
        List<AbstractClientConnectionFactory> collector = new ArrayList<>(size);
        for (final Map.Entry<String, Integer> server :
                connectionProperty.getAdditionalHSMServersConfig().entrySet()) {
            collector.add(getTcpNioClientConnectionFactoryOf(server.getKey(), server.getValue()));
        }
        return collector;
    }

    private TcpNioClientConnectionFactory getTcpNioClientConnectionFactoryOf(
            final String ipAddress, final int port) {
        TcpNioClientConnectionFactory tcpNioClientConnectionFactory =
                new TcpNioClientConnectionFactory(ipAddress, port);
        tcpNioClientConnectionFactory.setUsingDirectBuffers(true);
        tcpNioClientConnectionFactory.setDeserializer(new CustomDeserializer());
        tcpNioClientConnectionFactory.setApplicationEventPublisher(applicationEventPublisher);
        tcpNioClientConnectionFactory.setSoKeepAlive(true);
        tcpNioClientConnectionFactory.setConnectTimeout(connectionProperty.getConnectionTimeout());
        tcpNioClientConnectionFactory.setSoTcpNoDelay(true);
        tcpNioClientConnectionFactory.setTcpNioConnectionSupport(connectionSupport());
        return tcpNioClientConnectionFactory;
    }
}
Gateway
@Component
@MessagingGateway(defaultRequestChannel = "outboundChannel", errorChannel = "error-channel")
public interface TcpClientGateway {
    String send(String message);
}
Also, I am currently facing this error:
required a bean of type 'org.springframework.messaging.support.ErrorMessage' that could not be found
I need some assistance!
Thanking you in advance.
EDIT
@AllArgsConstructor
@Service
public class AsyncNonBlockingClient implements Connector {

    TcpClientGateway tcpClientGateway;

    @Override
    public String send(final String payload) {
        return tcpClientGateway.send(payload);
    }
}
See the documentation about messaging annotations: https://docs.spring.io/spring-integration/docs/current/reference/html/configuration.html#annotations_on_beans
Your problem is here:
@Bean
@ServiceActivator(inputChannel = "error-channel")
public void handleError(ErrorMessage em) {
This is a plain POJO method, therefore it cannot be marked with @Bean. You use @Bean only for beans you want to expose. Then you decide whether that has to be a @ServiceActivator or not. So, just remove @Bean from this method and your error-channel consumer should be OK.
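For illustration only, a minimal sketch of the corrected consumer once @Bean is removed. The TcpErrorHandler class name is made up here; the method could equally stay inside TcpClientConfig.
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.messaging.support.ErrorMessage;
import org.springframework.stereotype.Component;

@Component
public class TcpErrorHandler {

    // Plain messaging-annotated method: @ServiceActivator only, no @Bean.
    @ServiceActivator(inputChannel = "error-channel")
    public void handleError(ErrorMessage em) {
        // The payload of an ErrorMessage is the Throwable raised downstream.
        Throwable cause = em.getPayload();
        throw new RuntimeException(String.valueOf(cause));
    }
}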

RabbitMQ message not being sent to Dead Letter Queue on Exception

I have this RabbitMQ Spring Boot Configuration:
@Configuration
public class RabbitConfiguration {

    // Main queue configuration
    @Value("${rabbitmq.main.messages.queue}")
    private String mainQueueName;

    @Value("${rabbitmq.main.exchange.queue}")
    private String mainExchangeName;

    @Value("${rabbitmq.main.routing.key}")
    private String mainRoutingKey;

    // DLQ configuration
    @Value("${rabbitmq.dlq.messages.queue}")
    private String dlqQueueName;

    @Value("${rabbitmq.dlq.exchange.queue}")
    private String dlqExchangeName;

    @Value("${rabbitmq.dlq.routing.key}")
    private String dlqRoutingKey;

    // Connectivity
    @Value("${spring.rabbitmq.host}")
    private String rabbitmqHost;

    @Value("${spring.rabbitmq.port}")
    private int rabbitmqPort;

    @Value("${spring.rabbitmq.username}")
    private String rabbitmqUsername;

    @Value("${spring.rabbitmq.password}")
    private String rabbitmqPassword;

    // Not delivered messages, will be used eventually
    @Value("${rabbitmq.not.delivered.messages.queue}")
    private String notDeliveredMessagesQueue;

    // status with delivered messages (callback default)
    @Value("${rabbitmq.delivered.messages.queue}")
    private String deliveredMessagesQueue;

    @Bean
    DirectExchange deadLetterExchange() {
        return new DirectExchange(dlqExchangeName);
    }

    @Bean
    DirectExchange exchange() {
        return new DirectExchange(mainExchangeName);
    }

    @Bean
    Queue dlq() {
        return QueueBuilder.durable(dlqQueueName).build();
    }

    @Bean
    Queue queue() {
        return QueueBuilder
                .durable(mainQueueName)
                .withArgument("x-dead-letter-exchange", "deadLetterExchange")
                .withArgument("x-dead-letter-routing-key", dlqQueueName).build();
    }

    @Bean
    Binding deadLetterBinding() {
        return BindingBuilder.bind(dlq()).to(deadLetterExchange()).with(dlqRoutingKey);
    }

    @Bean
    Binding binding() {
        return BindingBuilder.bind(queue()).to(exchange()).with(mainQueueName);
    }

    @Bean
    public MessageConverter jsonMessageConverter() {
        return new Jackson2JsonMessageConverter();
    }

    @Bean
    public ConnectionFactory connectionFactory() {
        CachingConnectionFactory connectionFactory = new CachingConnectionFactory(rabbitmqHost);
        connectionFactory.setUsername(rabbitmqUsername);
        connectionFactory.setPassword(rabbitmqPassword);
        return connectionFactory;
    }

    @Bean
    public RabbitAdmin rabbitAdmin() {
        RabbitAdmin admin = new RabbitAdmin(connectionFactory());
        admin.declareQueue(queue());
        admin.declareExchange(exchange());
        admin.declareBinding(binding());
        return admin;
    }

    public AmqpTemplate rabbitTemplate(ConnectionFactory connectionFactory) {
        final RabbitTemplate rabbitTemplate = new RabbitTemplate(connectionFactory);
        rabbitTemplate.setMessageConverter(jsonMessageConverter());
        return rabbitTemplate;
    }
}
The problem is that when an exception is thrown, the message is not sent to the DLQ.
Consumer:
@RabbitListener(queues = { "${rabbitmq.main.messages.queue}" })
public void recievedMessage(@Payload Mensagem item, Channel channel, @Header(AmqpHeaders.DELIVERY_TAG) long tag) throws InvalidMessageException {
    if (item.getIdCliente().equals("69")) {
        logger.info("Something went wrong to: " + item);
        throw new InvalidMessageException();
    } else {
        logger.info("==> Message consumed successfully: " + item);
    }
}
This is the configuration I have in my application.properties:
spring.rabbitmq.listener.simple.retry.enabled=true
spring.rabbitmq.listener.simple.retry.initial-interval=3s
spring.rabbitmq.listener.simple.retry.max-attempts=2
spring.rabbitmq.listener.simple.retry.multiplier=2
spring.rabbitmq.listener.simple.retry.max-interval=10s
When I throw an exception on purpose just to see the message move to the DLQ, nothing happens. What's wrong here? What am I forgetting?
Try adding
spring.rabbitmq.listener.simple.default-requeue-rejected=false
to your application.properties. I think the problem is that the failed deliveries are being requeued instead of being sent to the DLQ.
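The same behaviour can also be set in Java config instead of properties. Below is a minimal sketch, assuming you define your own SimpleRabbitListenerContainerFactory; the bean and class names here are illustrative, not from the question.
import org.springframework.amqp.rabbit.config.SimpleRabbitListenerContainerFactory;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ListenerContainerConfig {

    @Bean
    public SimpleRabbitListenerContainerFactory rabbitListenerContainerFactory(ConnectionFactory connectionFactory) {
        SimpleRabbitListenerContainerFactory factory = new SimpleRabbitListenerContainerFactory();
        factory.setConnectionFactory(connectionFactory);
        // Reject failed deliveries instead of requeuing them, so the broker can dead-letter them.
        factory.setDefaultRequeueRejected(false);
        return factory;
    }
}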

How to build a nonblocking Consumer when using AsyncRabbitTemplate with Request/Reply Pattern

I'm new to RabbitMQ and am currently trying to implement a non-blocking producer with a non-blocking consumer. I've built a test producer where I played around with TypeReference:
@Service
public class Producer {

    @Autowired
    private AsyncRabbitTemplate asyncRabbitTemplate;

    public <T extends RequestEvent<S>, S> RabbitConverterFuture<S> asyncSendEventAndReceive(final T event) {
        return asyncRabbitTemplate.convertSendAndReceiveAsType(QueueConfig.EXCHANGE_NAME, event.getRoutingKey(), event, event.getResponseTypeReference());
    }
}
And, in some other place, the test function that gets called in a RestController:
@Autowired
Producer producer;

public void test() throws InterruptedException, ExecutionException {
    TestEvent requestEvent = new TestEvent("SOMEDATA");
    RabbitConverterFuture<TestResponse> reply = producer.asyncSendEventAndReceive(requestEvent);
    log.info("Hello! The Reply is: {}", reply.get());
}
This was pretty straightforward so far; where I'm stuck now is how to create a consumer that is non-blocking too. My current listener:
@RabbitListener(queues = QueueConfig.QUEUENAME)
public TestResponse onReceive(TestEvent event) {
    Future<TestResponse> replyLater = proccessDataLater(event.getSomeData());
    return replyLater.get();
}
As far as I'm aware, when using @RabbitListener the listener runs in its own thread, and I can configure the listener container to use more than one thread for the active listeners. Because of that, blocking the listener thread with future.get() does not block the application itself. Still, there might be a case where all threads are blocked and new events are stuck in the queue when they don't need to be. What I would like to do is just receive the event without having to return the result immediately, which is probably not possible with @RabbitListener. Something like:
@RabbitListener(queues = QueueConfig.QUEUENAME)
public void onReceive(TestEvent event) {
    /*
     * Some fictional RabbitMQ API call where I get a ReplyContainer which contains
     * the CorrelationID for the event. I can call replyContainer.reply(testResponse) later
     * in the code without blocking the listener thread.
     */
    ReplyContainer replyContainer = AsyncRabbitTemplate.getReplyContainer();
    // processDataLater calls reply on the container when done with its action
    proccessDataLater(event.getSomeData(), replyContainer);
}
What is the best way to implement such behaviour with rabbitmq in spring?
EDIT Config Class:
@Configuration
@EnableRabbit
public class RabbitMQConfig implements RabbitListenerConfigurer {

    public static final String topicExchangeName = "exchange";

    @Bean
    TopicExchange exchange() {
        return new TopicExchange(topicExchangeName);
    }

    @Bean
    public ConnectionFactory rabbitConnectionFactory() {
        CachingConnectionFactory connectionFactory = new CachingConnectionFactory();
        connectionFactory.setHost("localhost");
        return connectionFactory;
    }

    @Bean
    public MappingJackson2MessageConverter consumerJackson2MessageConverter() {
        return new MappingJackson2MessageConverter();
    }

    @Bean
    public RabbitTemplate rabbitTemplate() {
        final RabbitTemplate rabbitTemplate = new RabbitTemplate(rabbitConnectionFactory());
        rabbitTemplate.setMessageConverter(producerJackson2MessageConverter());
        return rabbitTemplate;
    }

    @Bean
    public AsyncRabbitTemplate asyncRabbitTemplate() {
        return new AsyncRabbitTemplate(rabbitTemplate());
    }

    @Bean
    public Jackson2JsonMessageConverter producerJackson2MessageConverter() {
        return new Jackson2JsonMessageConverter();
    }

    @Bean
    Queue queue() {
        return new Queue("test", false);
    }

    @Bean
    Binding binding() {
        return BindingBuilder.bind(queue()).to(exchange()).with("foo.#");
    }

    @Bean
    public SimpleRabbitListenerContainerFactory myRabbitListenerContainerFactory() {
        SimpleRabbitListenerContainerFactory factory = new SimpleRabbitListenerContainerFactory();
        factory.setConnectionFactory(rabbitConnectionFactory());
        factory.setMaxConcurrentConsumers(5);
        factory.setMessageConverter(producerJackson2MessageConverter());
        factory.setAcknowledgeMode(AcknowledgeMode.MANUAL);
        return factory;
    }

    @Override
    public void configureRabbitListeners(final RabbitListenerEndpointRegistrar registrar) {
        registrar.setContainerFactory(myRabbitListenerContainerFactory());
    }
}
I don't have time to test it right now, but something like this should work; presumably you don't want to lose messages so you need to set the ackMode to MANUAL and do the acks yourself (as shown).
UPDATE
@SpringBootApplication
public class So52173111Application {

    private final ExecutorService exec = Executors.newCachedThreadPool();

    @Autowired
    private RabbitTemplate template;

    @Bean
    public ApplicationRunner runner(AsyncRabbitTemplate asyncTemplate) {
        return args -> {
            RabbitConverterFuture<Object> future = asyncTemplate.convertSendAndReceive("foo", "test");
            future.addCallback(r -> {
                System.out.println("Reply: " + r);
            }, t -> {
                t.printStackTrace();
            });
        };
    }

    @Bean
    public AsyncRabbitTemplate asyncTemplate(RabbitTemplate template) {
        return new AsyncRabbitTemplate(template);
    }

    @RabbitListener(queues = "foo")
    public void listen(String in, Channel channel, @Header(AmqpHeaders.DELIVERY_TAG) long tag,
            @Header(AmqpHeaders.CORRELATION_ID) String correlationId,
            @Header(AmqpHeaders.REPLY_TO) String replyTo) {
        ListenableFuture<String> future = handleInput(in);
        future.addCallback(result -> {
            Address address = new Address(replyTo);
            this.template.convertAndSend(address.getExchangeName(), address.getRoutingKey(), result, m -> {
                m.getMessageProperties().setCorrelationId(correlationId);
                return m;
            });
            try {
                channel.basicAck(tag, false);
            }
            catch (IOException e) {
                e.printStackTrace();
            }
        }, t -> {
            t.printStackTrace();
        });
    }

    private ListenableFuture<String> handleInput(String in) {
        SettableListenableFuture<String> future = new SettableListenableFuture<String>();
        exec.execute(() -> {
            try {
                Thread.sleep(2000);
            }
            catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            future.set(in.toUpperCase());
        });
        return future;
    }

    public static void main(String[] args) {
        SpringApplication.run(So52173111Application.class, args);
    }
}

Spring Boot RabbitMQ consumer consuming only the first time. Please help me get the configuration right and show how to configure multiple listeners

My Spring Boot RabbitMQ consumer is consuming only the first time. Please help me get the configuration right and show me how to configure multiple listeners. I am developing an automated mapping solution for which I execute various jobs, and I need to put those jobs in a queue.
Here's my application class, Application.java:
@SpringBootApplication
@ComponentScan(basePackages = { "com.fractal.sago", "com.fractal.grpc" })
public class Application extends SpringBootServletInitializer {

    public static void main(String[] args) throws Exception {
        SpringApplication.run(Application.class, args);
    }
}
RabbitMqConfig class:
@Configuration
public class RabbitMqConfig {

    public final static String JOB_QUEUE_NAME = "jobQueue";
    public final static String JOB_EXCHANGE_NAME = "jobExchange";

    @Bean
    Queue jobQueue() {
        return new Queue(JOB_QUEUE_NAME, true);
    }

    @Bean
    DirectExchange jobExchange() {
        return new DirectExchange(JOB_EXCHANGE_NAME);
    }

    @Bean
    Binding jobBinding(DirectExchange directExchange) {
        return BindingBuilder.bind(jobQueue()).to(jobExchange()).with(jobQueue().getName());
    }

    @Bean
    SimpleMessageListenerContainer jobQueueContainer(ConnectionFactory connectionFactory,
            MessageListenerAdapter joblistenerAdapter) {
        SimpleMessageListenerContainer jobQueueContainer = new SimpleMessageListenerContainer();
        jobQueueContainer.setConnectionFactory(connectionFactory);
        jobQueueContainer.setQueueNames(JOB_QUEUE_NAME);
        jobQueueContainer.setMessageListener(joblistenerAdapter);
        return jobQueueContainer;
    }

    @Bean
    MessageListenerAdapter joblistenerAdapter(JobQueueConsumer messageReceiver) {
        MessageListenerAdapter messageListenerAdapter = new MessageListenerAdapter(messageReceiver, "receiveMessage");
        messageListenerAdapter.setMessageConverter(producerJackson2MessageConverter());
        return messageListenerAdapter;
    }

    @Bean
    public Jackson2JsonMessageConverter producerJackson2MessageConverter() {
        return new Jackson2JsonMessageConverter();
    }
}
Producer: JobQueueProducer
@Component
public class JobQueueProducer {

    @Autowired
    RabbitTemplate rabbitTemplate;

    public void sendMessage(String message) {
        Message messageToSend = MessageBuilder.withBody(message.getBytes())
                .setDeliveryMode(MessageDeliveryMode.PERSISTENT).build();
        rabbitTemplate.convertAndSend(RabbitMqConfig.JOB_EXCHANGE_NAME, RabbitMqConfig.JOB_QUEUE_NAME, messageToSend);
        //rabbitTemplate.convertAndSend(RabbitMqConfig.JOB_QUEUE_NAME, messageToSend);
    }
}
Consumer: JobQueueConsumer
@Component
public class JobQueueConsumer implements MessageListener {

    @Autowired
    SagoAlgo sagoAlgo;

    @Autowired
    CCDMappingService ccdMappingService;

    @RabbitListener(queues = { RabbitMqConfig.JOB_QUEUE_NAME })
    public void receiveMessage(Message message) throws SQLException {
        System.out.println("Received Message: " + new String(message.getBody()));
        Integer jobId = Integer.parseInt(new String(message.getBody()));
        System.out.println(jobId);
        CCDMappingVO ccdVo = ccdMappingService.fetchCCDWithCategoriesById(jobId);
        sagoAlgo.execAlgo(ccdVo); // my algo to be executed
    }

    // when I implement MessageListener, the default method to be executed
    public void onMessage(Message message) {
        // System.out.println("Received Message: " + new String(message.getBody()));
    }
}

Spring Boot: how to use FilteringMessageListenerAdapter

I have a Spring Boot application which listens to messages on a Kafka topic. To filter those messages, I have the following two classes:
@Component
public class Listener implements MessageListener {

    private final CountDownLatch latch1 = new CountDownLatch(1);

    @Override
    @KafkaListener(topics = "${spring.kafka.topic.boot}")
    public void onMessage(Object o) {
        System.out.println("LISTENER received payload *****");
        this.latch1.countDown();
    }
}
@Configuration
@EnableKafka
public class KafkaConfig {

    @Autowired
    private Listener listener;

    @Bean
    public FilteringMessageListenerAdapter filteringReceiver() {
        return new FilteringMessageListenerAdapter(listener, recordFilterStrategy());
    }

    public RecordFilterStrategy recordFilterStrategy() {
        return new RecordFilterStrategy() {
            @Override
            public boolean filter(ConsumerRecord consumerRecord) {
                System.out.println("IN FILTER");
                return false;
            }
        };
    }
}
While messages are being processed by the Listener class, the RecordFilterStrategy implementation is not being invoked. What is the correct way to use FilteringMessageListenerAdapter?
Thanks
The solution was as follows:
There is no need for the FilteringMessageListenerAdapter class.
Instead, create a ConcurrentKafkaListenerContainerFactory rather than relying on what Spring Boot provides out of the box, and set the RecordFilterStrategy implementation on that factory.
@Bean
ConcurrentKafkaListenerContainerFactory<Integer, String> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<Integer, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setRecordFilterStrategy(recordFilterStrategy());
    return factory;
}
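The factory above assumes consumerFactory() and recordFilterStrategy() beans are defined elsewhere. A minimal sketch of what they might look like follows; the bootstrap server, group id, and the KafkaFilterConfig class name are placeholders, not from the original answer.
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.IntegerDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.adapter.RecordFilterStrategy;

@Configuration
public class KafkaFilterConfig {

    @Bean
    public ConsumerFactory<Integer, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "boot-group");              // placeholder
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, IntegerDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public RecordFilterStrategy<Integer, String> recordFilterStrategy() {
        // Returning true filters a record out; returning false lets it reach the listener.
        return consumerRecord -> false;
    }
}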
