RabbitMQ Direct Reply-To Unroutable Messages (Drop) with Spring Boot

I implemented RabbitMQ Direct Reply-To functionality based on Gary Russell's answer (the second one) to
Problem with RabbitMQ Direct reply-to with Spring.
Now, everything is working as intended, but when I check the messages in the RabbitMQ management overview, around half of them are reported as "Unroutable (Drop)". I don't see any requests going unanswered, but I'm still worried and would like to resolve this. Or are those messages only marked unroutable because I'm using Direct Reply-To?
Greetings
Jan

Related

Spring-AMQP and Direct Reply-To

I found this RabbitMQ "extension" listed here: http://www.rabbitmq.com/direct-reply-to.html. I set the RabbitTemplate's "replyQueue" to amq.rabbitmq.reply-to and tried it with an already functioning RPC call, but now it just times out.
Any help is appreciated!
When using a fixed reply queue (whether user-specified or amq.rabbitmq.reply-to) you have to configure a <reply-listener/> - see the Spring AMQP documentation. For amq.rabbitmq.reply-to you should set the reply container's acknowledge mode to NONE (which is no-ack in RabbitMQ speak).
CORRECTION: The RabbitTemplate does not currently support Direct Reply-To for sendAndReceive() operations; you can, however, specify a fixed reply queue (with a reply-listener). Or you can use rabbitTemplate.execute() with a ChannelCallback to publish and then consume the reply from that "queue" yourself.
I have created a JIRA issue if you wish to track it.
1.4.1 and above now supports direct reply-to.
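As a rough sketch of how this can look with Spring AMQP 1.4.1+ (the exchange and routing key names here are made up; when no reply queue or reply address is configured, the template falls back to the amq.rabbitmq.reply-to pseudo-queue if the broker supports it):

import org.springframework.amqp.rabbit.connection.CachingConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitTemplate;

public class RpcClient {

    public static void main(String[] args) {
        CachingConnectionFactory connectionFactory = new CachingConnectionFactory("localhost");
        RabbitTemplate template = new RabbitTemplate(connectionFactory);
        template.setReplyTimeout(10000); // wait up to 10s for the server side to answer

        // No replyQueue/replyAddress configured, so 1.4.1+ uses the
        // amq.rabbitmq.reply-to pseudo-queue for this call (assuming the
        // broker supports the direct reply-to extension).
        // "rpc.exchange" and "rpc.key" are placeholder names.
        Object reply = template.convertSendAndReceive("rpc.exchange", "rpc.key", "ping");
        System.out.println("Reply: " + reply);

        connectionFactory.destroy();
    }
}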

(How) can I route a message to one particular client?

I'm trying to understand the principles of HornetQ as well as core/JMS messaging using this solution.
In my experimental app, I'd like my end-user application(client) to send messages to a HornetQ which will be read by a backend app. So far this is no problem and I love HornetQ.
But now, I'd like to send an "answer" message from the backend app back to the end user. For this, I have the condition that no other client app should be able to read the answer message (let's say it contains the current bank balance). So user A should only fetch messages meant for himself, and the same applies to any other user.
Is this possible using HornetQ? If so, how do I have to do it?
With HornetQ (or any other messaging system) you always send to a queue, not to a specific consumer.
In this case you have to create a queue per client.
This answer provides some background on the request-response pattern, so I won't need to repeat myself here:
Synchronous request-reply pattern in a Java EE container
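For what it's worth, here is a minimal plain-JMS sketch of that per-client reply queue idea (the queue names and the way the ConnectionFactory is obtained are assumptions; with HornetQ you would typically look the factory up via JNDI, the queues would already be deployed on the server, and queue security would keep other users away from each reply queue):

import javax.jms.*;

public class BankBalanceClient {

    // Hypothetical sketch: ask the backend for a balance and wait for the
    // reply on a queue that only this user consumes from.
    public String requestBalance(ConnectionFactory factory, String userId) throws JMSException {
        Connection connection = factory.createConnection();
        try {
            connection.start();
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

            Queue requests  = session.createQueue("bank.requests");          // shared, read by the backend
            Queue myReplies = session.createQueue("bank.replies." + userId); // private to this client

            TextMessage request = session.createTextMessage("BALANCE");
            request.setJMSReplyTo(myReplies); // backend sends its answer here

            session.createProducer(requests).send(request);

            Message reply = session.createConsumer(myReplies).receive(30000);
            return (reply instanceof TextMessage) ? ((TextMessage) reply).getText() : null;
        } finally {
            connection.close();
        }
    }
}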

JMS design: topic and queue combination

I am relatively new to JMS and I have been reading a lot on it lately.
I am planning to design a web app which would do the following:
User logs into the system and publishes a message/question to a topic.
All the users who have subscribed to the topic read the message/question and reply to it.
The originator reviews all the answers and picks the best answer.
The originator now replies to only the user whose answer he/she picked and asks for further clarification.
The responder gets the message and replies.
So, once the originator has picked the answer, the flow now becomes a request/reply design.
My questions are:
Is it possible to publish to a topic with setJMSReplyTo(tempQueue)?
Can request/reply approach be async?
Is it a good idea to have per user queue?
These questions might seem dumb to some of the experts here, but please bear in mind that I am still learning.
Thanks.
Is it possible to publish to a topic with setJMSReplyTo(tempQueue)?
You should be able to, but I'm not 100% sure about it. By the way, I searched my bookmarks and found this link that should explain what you have to do to build a request/response system with JMS:
http://activemq.apache.org/how-should-i-implement-request-response-with-jms.html
Can request/reply approach be async?
A message listener is an object that acts as an asynchronous event handler for messages. So your request/reply approach, if it uses JMS message listeners, is asynchronous by default.
http://docs.oracle.com/javaee/1.3/jms/tutorial/1_3_1-fcs/doc/prog_model.html#1023398
Is it a good idea to have per user queue?
I don't know how many users you expect to have, but having one queue for each user is not a good way to handle the messages. I had a problem similar to yours, but we used a single queue for each macro area and structured the messages to hold information about the user that sent them, so we could store that information later and use it for further analysis.
JMSReplyTo is just a message header, nothing else, so it is possible to publish a message to a topic with a specific value in this header.
Sure! If you would like to create a scalable system, you should design an event-driven system using an async rather than a blocking approach. MessageListener can help you here.
It is specific to the JMS broker implementation. If queue creation is cheap, there is no problem with such a solution.
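A minimal, hypothetical sketch of the first two points combined: publishing to a topic with a temporary queue in JMSReplyTo and collecting the answers asynchronously with a MessageListener (the topic name and the connection handling are assumptions):

import javax.jms.*;

public class QuestionPublisher {

    // Hypothetical sketch: publish a question to a topic, attach a temporary
    // queue as JMSReplyTo, and let a listener collect answers as they arrive.
    public void publishQuestion(ConnectionFactory factory, String questionText) throws JMSException {
        Connection connection = factory.createConnection();
        connection.start();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

        Topic questions = session.createTopic("questions");      // placeholder topic name
        TemporaryQueue answers = session.createTemporaryQueue(); // lives as long as the connection

        TextMessage question = session.createTextMessage(questionText);
        question.setJMSReplyTo(answers); // responders send their answers here

        session.createProducer(questions).send(question);

        // Async: the provider invokes this listener for each answer.
        session.createConsumer(answers).setMessageListener(message -> {
            try {
                System.out.println("Answer: " + ((TextMessage) message).getText());
            } catch (JMSException e) {
                e.printStackTrace();
            }
        });
        // Note: the connection is deliberately left open so answers keep arriving.
    }
}

A responder would read JMSReplyTo from the incoming message and send its answer to that destination.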

Two consumers on same Websphere MQ JMS Queue, both receiving same message

I am working with someone who is trying to achieve load-balancing behavior using JMS queues with IBM Websphere MQ. As such, they have multiple Camel JMS consumers configured to read from the same queue. Even though this behavior is undefined according to the JMS spec (last time I looked, anyway), they expect a sort of round-robin / load-balancing behavior. And, while the spec leaves this undefined, I'm led to believe that the normal behavior of Websphere MQ is to deliver the message to only one of the consumers, and that it may do some type of load balancing. See here, for example: When multi MessageConsumer connect to same queue(Websphere MQ),how to load balance message-consumer?
But in this particular case, it appears that both consumers are receiving the same message.
Can anyone who is more of an expert with Websphere MQ shed any light on this? Is there any situation where this behavior is expected? Is there any configuration change that can alleviate this?
I'm leaning towards telling everyone here to use the native Websphere MQ clustering facility and go away from having multiple consumers pointing at the same Queue, but that will be a big change for them, so I'd love to discover a way to make this work.
Not that I'm a fan of relying on anything that's undefined, but if they're willing to rely on IBM specific behavior, I'll leave that up to them.
The only ways for both of them to receive the same message are:
1. There are multiple copies of the message.
2. The apps are browsing the message without a lock, then circling back to delete it.
3. The apps are backing out a transaction, making the message available again.
4. The connection is severed before the app acknowledges the message.
Having multiple apps compete for messages in a queue is a recommended practice. If one app goes down the queue is still served. In a cluster this is crucial because the cluster will continue to direct messages to the un-served queue instance until it fills up.
If it's a Dev system, install SupportPac MA0W and tell it to trace just that one queue and you will be able to see exactly what is happening.
See the JMS spec, section 4.4: the provider must never deliver a second copy of an acknowledged message. An exception is made for session handling in 4.4.13, which I cover in #4 above. That's pretty unambiguous and part of the official spec, so it is not IBM-specific behavior.
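As a small illustration of points 3 and 4 in the list above, here is a hypothetical transacted JMS consumer (the queue name is made up): rolling back, or losing the connection before the commit, makes the message available again, and the next delivery will carry the JMSRedelivered flag.

import javax.jms.*;

public class RedeliveryExample {

    // Hypothetical sketch: consume in a transacted session; rolling back
    // (or dying before commit) puts the message back on the queue, so another
    // consumer can legitimately receive the "same" message afterwards.
    public void consumeOnce(ConnectionFactory factory) throws JMSException {
        Connection connection = factory.createConnection();
        try {
            connection.start();
            Session session = connection.createSession(true, Session.SESSION_TRANSACTED);
            MessageConsumer consumer = session.createConsumer(session.createQueue("APP.WORK.QUEUE"));

            Message message = consumer.receive(5000);
            if (message != null) {
                System.out.println("Redelivered? " + message.getJMSRedelivered());
                try {
                    process(message);
                    session.commit();   // message is now gone for good
                } catch (Exception e) {
                    session.rollback(); // message becomes available again (point 3)
                }
            }
        } finally {
            connection.close();
        }
    }

    private void process(Message message) throws JMSException {
        // application-specific work
    }
}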

Using Torquebox to send messages to the browser

So our team has recently adopted TorqueBox for our JRuby on Rails applications. The purpose of this was to be able to receive queue/topic messages from an outside source that is streaming live data.
We have setup our queues/topics and they are receiving the messages without an issue. The next step we want to take is to get these messages on the browser.
So we started looking into leveraging the power of STOMP, but we have come across some issues. It seems from the documentation that the purpose of using STOMP + WebSockets is to receive messages from the client side and push them to other clients. But we want to receive messages on our queues and then push those messages to the client side using WebSockets. Is this possible? Or would we have to implement a different technology such as Pusher or socket.io to get the queue/topic messages to the browser?
Thanks.
I think Stomplets are a good solution for this task. In the Rails application you should use a Ruby-based STOMP client, and in the browser a JavaScript-based STOMP client. In Rails just send the data, and in the browser just receive it.
More detail on how to do this can be found in the TorqueBox documentation:
http://torquebox.org/documentation/2.0.0/stomp.html
It is indeed possible to push messages straight from the server to clients. It took me quite a bit of digging to find it as it is not listed in the documentation directly. Their blog lists it in their example of how to build a chat client using websockets.
http://torquebox.org/news/2011/08/23/stomp-chat-demo-part3/
Basically you use the inject method to choose which channel you're publishing to, and then use the publish method on the returned object to actually send the message. This code excerpt from the article should get you pointed in the right direction.
inject( '/topics/chat' ).publish( message,
  :properties => {
    :recipient => username,
    :sender => 'system'
  } )
It looks like :properties is the same thing as message headers. I'll be giving this a go over the next couple of days to see how well this works in Rails.