Most of the RabbitMQ documentation seems to focus on round-robin, i.e. where a single message is consumed by a single consumer. I have a requirement where I would like the same message from a single queue to be received by multiple subscribed consumers.
Below is my sample consumer code. There are two listeners listening to the same queue, but each message is received by only one of the consumers. How do I configure it so that the same message gets delivered to both consumers (Consumer1 and Consumer2)?
Any help will be highly appreciated.
@Component
public class Consumer1 {

    @RabbitListener(queues = "test.queue.jsa")
    public void receivedMessage(Employee msg) {
        System.out.println("Received Message: " + msg);
    }
}

@Component
public class Consumer2 {

    @RabbitListener(queues = "test.queue.jsa")
    public void receivedMessage(Employee msg) {
        System.out.println("Consumed Message: " + msg);
    }
}
This is not possible; it just doesn't work that way. Each consumer needs its own queue; use a fanout exchange.
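A minimal sketch of that layout with Spring AMQP (the exchange name fanout.jsa and the Employee payload are just illustrative): each consumer gets its own auto-delete queue, both queues are bound to a fanout exchange, and every message published to the exchange is copied to both queues:

import org.springframework.amqp.core.*;
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Component;

@Configuration
public class FanoutConfig {

    @Bean
    FanoutExchange fanout() {
        return new FanoutExchange("fanout.jsa");
    }

    // one auto-delete queue per consumer, each with a broker-generated name
    @Bean
    Queue queue1() {
        return new AnonymousQueue();
    }

    @Bean
    Queue queue2() {
        return new AnonymousQueue();
    }

    @Bean
    Binding binding1(FanoutExchange fanout, Queue queue1) {
        return BindingBuilder.bind(queue1).to(fanout);
    }

    @Bean
    Binding binding2(FanoutExchange fanout, Queue queue2) {
        return BindingBuilder.bind(queue2).to(fanout);
    }
}

@Component
public class Consumer1 {

    // listen on this consumer's own queue; Consumer2 does the same with queue2
    @RabbitListener(queues = "#{queue1.name}")
    public void receivedMessage(Employee msg) {
        System.out.println("Received Message: " + msg);
    }
}

The publisher then sends to the exchange rather than to a named queue, e.g. rabbitTemplate.convertAndSend("fanout.jsa", "", employee).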
I am using the DefaultMessageListenerContainer for consuming messages from an ActiveMQ queue, as below. With this implementation, is there any polling mechanism? Does the listener poll the queue to see whether there is a new message every second or so, or does the onMessage method get invoked whenever there is a new message in the queue? If it uses polling, how can we increase or decrease the polling frequency?
DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
container.setMessageListener(new MessageJmsListener());

public class MessageJmsListener implements MessageListener {

    @Override
    public void onMessage(Message message) {
        if (message instanceof TextMessage) {
            try {
                // process the message and create a record in the database
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }
    }
}
The container polls the JMS client, but the broker pushes messages to the client.
So, no, the container does not poll the queue directly.
If there are no messages in the queue, the receive call times out after receiveTimeout, the container immediately re-polls, and it will get the next message as soon as the broker sends it.
The prefetch determines how many messages are sent to the consumer by the broker; so that might impact performance (but it's 1000 by default, I think, with recent ActiveMQ versions).
Setting the prefetch to 1 will give you the slowest delivery rate.
If you want to slow things down, you can add a Thread.sleep() in your listener.
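If you do want to tune those settings, here's a minimal sketch (the broker URL, queue name, and timeout value below are placeholders, not taken from the question):

import org.apache.activemq.ActiveMQConnectionFactory;
import org.springframework.jms.listener.DefaultMessageListenerContainer;

// prefetch=1: the broker pushes at most one unacknowledged message to this consumer at a time
ActiveMQConnectionFactory connectionFactory =
        new ActiveMQConnectionFactory("tcp://localhost:61616?jms.prefetchPolicy.queuePrefetch=1");

DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
container.setConnectionFactory(connectionFactory);
container.setDestinationName("my.queue");
container.setMessageListener(new MessageJmsListener());
// how long each internal receive() blocks before the container loops and waits again (ms)
container.setReceiveTimeout(1000);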
I have an issue with receiving a message back from the listener to the publisher; I am getting AmqpReplyTimeoutException. Below is the code of the publisher from where I am publishing to the queue.
for (CsvWrapperPojo item : items) {
    resultList.addAll(item.getDbResultList());
    for (CSVPojo pojo : item.getQueueRequestList()) {
        sampleResponseMessageRabbitConverterFuture = asyncRabbitTemplate.convertSendAndReceive(
                "spring-boot-rabbitmq-Interactive.async_Solve_InteractiveMsg", "Interactive_RequestQueue", pojo);
        //CSVPojo res = (CSVPojo) rabbitTemplate.convertSendAndReceive("spring-boot-rabbitmq-Interactive.async_Solve_InteractiveMsg", "Interactive_RequestQueue", pojo);
        System.out.println("heyyyyyy:" + sampleResponseMessageRabbitConverterFuture.get().getLatitute());
        //resultList.add(res);
        //resultList.add(sampleResponseMessageRabbitConverterFuture.get());
    }
}
With this I am able to publish to the queue; my subscriber code is below.
@EnableRabbit
public class ListenerQueueSubscriber {

    @RabbitHandler
    @RabbitListener(containerFactory = "simpleMessageListenerContainerFactory", queues = "Interactive_RequestQueue")
    public void subscribeToRequestQueue(@Payload CSVPojo sampleRequestMessage, Message message) throws InterruptedException {
        System.out.println("inside listener");
        sampleRequestMessage.setResult("Hello");
        Thread.sleep(120000);
        System.out.println("After sleep:" + sampleRequestMessage.getLongitude());
        //return sampleRequestMessage;
    }
}
With the above subscriber I am able to receive the message; I append "Hello", sleep for 2 minutes, and after that I need to send the message back to the publisher that sent it. Unfortunately, I am not receiving the message with "Hello" appended; I get AmqpReplyTimeoutException instead. Can you please help me achieve this behavior?
Thanks in advance!
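For reference, a sketch of the two changes that usually make this request/reply flow work (assuming default settings otherwise; the timeout value is illustrative): the listener has to return the reply, and the AsyncRabbitTemplate's receive timeout has to be longer than the 2-minute processing delay, otherwise the future fails with AmqpReplyTimeoutException before the reply arrives:

// publisher side: allow more time than the listener's Thread.sleep(120000)
asyncRabbitTemplate.setReceiveTimeout(180000); // 3 minutes, in milliseconds

// listener side: return the modified payload so it is routed back to the waiting publisher
@RabbitListener(containerFactory = "simpleMessageListenerContainerFactory", queues = "Interactive_RequestQueue")
public CSVPojo subscribeToRequestQueue(@Payload CSVPojo sampleRequestMessage) throws InterruptedException {
    sampleRequestMessage.setResult("Hello");
    Thread.sleep(120000);
    return sampleRequestMessage;
}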
We are designing a microservices architecture and would like to use RabbitMQ as the message broker.
We want each service to have one specific queue, let's say applicationQueue.
We also defined that our messages would be of two kinds:
Events: Messages that are routed to every service. If a service is interested in a specific event, it intercepts it and creates a task from it.
Tasks: Messages representing jobs the service creates for itself; they should be published only to the service's own queue.
We are struggling to implement that so far using Spring AMQP.
We designed a message producer so that, after a given HTTP request, it creates a task for the service itself:
RestController:
@PostMapping
public void saveProduct(@RequestBody Product product) {
    messageProducer.message("subscriptions.product.create", product)
            .fromHttpRequest(requestContext)
            .send();
}
Our send method of the message producer:
public void send() {
    template.convertAndSend(exchange, routingKey, payload, message -> {
        if (requestContext != null) {
            extractHttpRequestInfo(message);
            message.getMessageProperties().getHeaders()
                    .put(MessageDictionary.TRANSACTION_ID, generateTransactionId());
        } else if (originalMessage != null) {
            extractMessageInfo(message);
        }
        return message;
    });
}
RabbitMQ Configuration:
@Bean
List<Binding> binding(Queue queue, TopicExchange exchange) {
    return Arrays.asList(
            BindingBuilder.bind(queue).to(exchange).with("*.*"),
            BindingBuilder.bind(queue).to(exchange).with("${condohub.rabbitmq.queue.name}.#")
    );
}
and then we subscribe elsewhere (the @Digest annotation is a custom annotation):
@Digest("${condohub.rabbitmq.queue.name}.product.create")
public void createProduct(Product product) {
    service.save(product);
}
Any help is welcome.
Your bindings don't make sense; the first one will match all keys of the form foo.bar, baz.qux, etc., so the second one is irrelevant.
You should probably just use a fanout exchange for the events, and give each service two queues: one bound to the fanout for events and one bound to the topic exchange for jobs (with a narrow binding for just its own jobs).
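A rough sketch of that topology (the exchange and queue names are made up for illustration, and the service name is assumed to come from the condohub.rabbitmq.queue.name property):

import org.springframework.amqp.core.*;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;

@Value("${condohub.rabbitmq.queue.name}")
private String serviceName;

@Bean
FanoutExchange eventsExchange() {
    return new FanoutExchange("events.exchange");
}

@Bean
TopicExchange tasksExchange() {
    return new TopicExchange("tasks.exchange");
}

@Bean
Queue eventsQueue() {
    return new Queue(serviceName + ".events");
}

@Bean
Queue tasksQueue() {
    return new Queue(serviceName + ".tasks");
}

@Bean
Binding eventsBinding(FanoutExchange eventsExchange, Queue eventsQueue) {
    // fanout ignores routing keys, so every service's events queue gets a copy of each event
    return BindingBuilder.bind(eventsQueue).to(eventsExchange);
}

@Bean
Binding tasksBinding(TopicExchange tasksExchange, Queue tasksQueue) {
    // narrow binding: only this service's own task messages land in its tasks queue
    return BindingBuilder.bind(tasksQueue).to(tasksExchange).with(serviceName + ".#");
}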
Is there any way to subscribe to a topic and forward the messages to another layer of the application (i.e. have a new listener for a given topic) using Spring?
Consider the following message handler, which sends messages to the topic /topic/chat/{conversationId}:
public class ConversationController {

    @MessageMapping("/chat/{conversationId}")
    @SendTo("/topic/chat/{conversationId}")
    public ConversationMessage createMessage(
            @Payload CreateMessage message,
            @DestinationVariable String conversationId) {
        log.info("handleMessage {}", message);
        return conversationService.create(message);
    }
}
I would like to listen on this topic and do an action on some messages.
public class Bot {

    @SubscribeMapping("/topic/chat/{conversationId}")
    public void subscribeUserMessages(
            @Payload ConversationMessage message,
            @DestinationVariable String conversationId) {
        // doesn't work
    }
}
I've also tried using SimpMessagingTemplate.convertAndSend(..), but that doesn't work either. Maybe I am doing something wrong.
My application doesn't use a full-fledged message broker, just the default in-memory simple broker.
I have a spring-cloud-stream application with the Kafka binder. I would like to send and receive a message on the same topic from within the same executable (jar). I have my channel definitions as below:
public interface ChannelDefinition {

    @Input("forum")
    public SubscribableChannel readMessage();

    @Output("forum")
    public MessageChannel postMessage();
}
I use @StreamListener to receive messages. I get all sorts of unexpected errors. At times, I receive
No dispatcher found for unknown.message.channel
for every other message.
If I attach a command-line Kafka subscriber to the above forum topic, it receives every other message.
My application receives every other message, a set that is exclusive of the command-line subscriber's. I have made sure that my application subscribes under a specific group name.
Is there a working example of the above use case?
This is the wrong way to define bindable channels (because the forum name is used for both). We should be more thorough and fail fast on it, but you're binding both the input and the output to the same channel, creating a competing consumer within your own application. That also explains your other issue with alternate messages.
What you should do is:
public interface ChannelDefinition {

    @Input
    public MessageChannel readMessage();

    @Output
    public MessageChannel postMessage();
}
And then use application properties to bind your channels to the same queue:
spring.cloud.stream.bindings.readMessage.destination=forum
spring.cloud.stream.bindings.postMessage.destination=forum
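For the sending side, a small sketch of publishing through the postMessage binding (assuming the ChannelDefinition interface above; the ForumPublisher class is just an illustration):

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.messaging.support.MessageBuilder;

@EnableBinding(ChannelDefinition.class)
public class ForumPublisher {

    private final ChannelDefinition channels;

    public ForumPublisher(ChannelDefinition channels) {
        this.channels = channels;
    }

    public void post(String text) {
        // sends to the postMessage binding, which the properties above map to the forum topic
        channels.postMessage().send(MessageBuilder.withPayload(text).build());
    }
}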
Along with the answer above by Marius Bogoevici, here's an example of how to listen to that Input.
@StreamListener
public void handleNewOrder(@Input("input") SubscribableChannel input) {
    logger.info("Subscribing...");
    input.subscribe((message) -> {
        logger.info("Received new message: {}", message);
    });
}
For me, consuming from "input" didn't work. I needed to use the method name on @StreamListener and to add @EnableBinding, like below:
@Slf4j
@RequiredArgsConstructor
@EnableBinding(value = Channels.class)
public class Consumer {

    @StreamListener("readMessage")
    public void retrieve(Something req) {
        log.info("Received {{}}", req);
    }
}