How to configure a cluster of consumers in ActiveMQ?
I created a simple embedded ActiveMQ application with two consumers of one Queue; the consumers run in separate threads. But when I send a message to the Queue, JMS delivers it to the first consumer, no matter how long that consumer sleeps after receiving.
I think you're trying to explain that the first consumer is receiving all the messages. There is a FAQ entry for this type of problem available here:
http://activemq.apache.org/i-do-not-receive-messages-in-my-second-consumer.html
Bruce
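In case that link goes stale: the usual cause of this symptom is ActiveMQ's consumer prefetch limit, where the broker pushes a large batch of messages to the first consumer before the second one gets any. A minimal sketch of lowering the prefetch on the connection factory (the vm:// broker URL is an assumption for an embedded broker):

import javax.jms.ConnectionFactory;
import org.apache.activemq.ActiveMQConnectionFactory;

public class PrefetchConfig {

    public static ConnectionFactory connectionFactory() {
        // Assumed URL for an embedded broker; adjust to your setup.
        ActiveMQConnectionFactory factory = new ActiveMQConnectionFactory("vm://localhost");
        // The default queue prefetch is 1000, so one consumer can buffer every
        // pending message. A prefetch of 1 lets the broker dispatch round-robin.
        factory.getPrefetchPolicy().setQueuePrefetch(1);
        return factory;
    }
}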
Related
I am running multiple instances of the same Spring Boot 2.0.4 application, for scaling purposes, that consume messages from an ActiveMQ queue using the following:
@JmsListener(destination = "myQ")
Only the first consumer receives messages; if I stop the first instance, the second instance starts receiving them. I want the consumers to receive messages in round-robin fashion, each consuming different messages rather than the same one. But only the first consumer ever consumes messages.
It sounds like you want a JMS Topic rather than a Queue. You should also research durable subscriptions, shared subscriptions, and durable topics before you settle on the configuration you need for your setup; there is a configuration sketch after the links below.
See:
JMS API Programming Model (Search for JMS Message Consumers)
Queues vs Topics
Durable Queues and Topics
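If you do move to a topic but still want the instances to share the work rather than each receive a copy, here is a rough sketch using Spring's listener container factory, assuming a JMS 2.0 broker that supports shared subscriptions (bean and subscription names are placeholders):

import javax.jms.ConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jms.annotation.JmsListener;
import org.springframework.jms.config.DefaultJmsListenerContainerFactory;

@Configuration
public class TopicListenerConfig {

    // The ConnectionFactory comes from Spring Boot's auto-configuration.
    @Bean
    public DefaultJmsListenerContainerFactory topicFactory(ConnectionFactory connectionFactory) {
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        factory.setConnectionFactory(connectionFactory);
        factory.setPubSubDomain(true);          // topic instead of queue
        factory.setSubscriptionDurable(true);   // subscription survives restarts
        factory.setSubscriptionShared(true);    // JMS 2.0: instances share one subscription
        return factory;
    }
}

// All instances using the same subscription name split the messages between
// them instead of each receiving its own copy.
class MyTopicListener {
    @JmsListener(destination = "myTopic", containerFactory = "topicFactory", subscription = "mySharedSub")
    public void onMessage(String body) {
        System.out.println("received: " + body);
    }
}

Note that classic ActiveMQ 5.x is a JMS 1.1 broker and, as far as I know, does not support shared subscriptions; its virtual topics are the usual alternative there.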
For a Spring Framework app working with ActiveMQ, with or without WebSocket:
The requirement is that, before any message is sent to a Topic, a check should be made on the Number Of Consumers: if it returns 1, the message can be sent safely; if it returns 0, the message must not be sent.
The clients may come from WebSocket, and there are no durable subscriptions. Thus, if a message is sent while there are no clients, the message arrives at the Topic, is effectively lost (never consumed), and Messages Enqueued increments by 1.
I have already done some research and read the following:
JMSTemplate check if topic exists and get subscriber count
Notify ActiveMQ producer if consumer on the destination is down
ActiveMQ get number of consumers listening to a topic from java
Detect change in Consumers of an ActiveMQ topic
Practically all of it is based on Advisory Messages. I have already read:
Advisory Message
Handling Advisory Messages
Apache ActiveMQ Advisory Example
I understand that if a Topic named abc.xyz exists, then ActiveMQ creates ActiveMQ.Advisory.Consumer.Topic.abc.xyz; so far I am fine with this pattern, and I can confirm it through the ActiveMQ Web Console.
What confuses me is that practically all the examples from the links above work by creating a Session and rely mostly on the onMessage method, which I know works as a listener.
Question 01: So who is expected to use that ActiveMQ.Advisory.Consumer.Topic.abc.xyz destination, and what triggers that onMessage method? That is my confusion.
What I need is to work with the Spring Framework API (the app is already working and running with a CachingConnectionFactory, so a Connection can be retrieved, along with the other @Beans for the ActiveMQ infrastructure), get access to that ActiveMQ.Advisory.Consumer.Topic.abc.xyz destination, and retrieve the Number Of Consumers value.
Note: even though an ActiveMQTopic is declared with @Bean and that Destination can be retrieved in some @Component, sadly the API does not offer a method such as getConsumers().
Question 02: How can this be accomplished?
I am assuming the JMS 2.0 API could perhaps help in some way.
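A rough sketch of the advisory-based approach in plain JMS, using the existing connection factory: the broker itself publishes an advisory message on ActiveMQ.Advisory.Consumer.Topic.abc.xyz whenever a consumer of abc.xyz starts or stops, and it is your own subscription to that advisory destination whose onMessage gets triggered. The consumerCount property name is what the ActiveMQ advisory documentation describes, so verify it against your broker version:

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.Session;
import javax.jms.Topic;

public class ConsumerCountWatcher {

    private volatile int consumerCount;

    // connectionFactory would be the existing CachingConnectionFactory bean.
    public void start(ConnectionFactory connectionFactory) throws Exception {
        Connection connection = connectionFactory.createConnection();
        connection.start();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);

        // Advisory destination the broker maintains for the real topic abc.xyz.
        Topic advisory = session.createTopic("ActiveMQ.Advisory.Consumer.Topic.abc.xyz");

        MessageConsumer advisoryConsumer = session.createConsumer(advisory);
        advisoryConsumer.setMessageListener((Message message) -> {
            try {
                // The broker sets this property on consumer start/stop advisories.
                consumerCount = message.getIntProperty("consumerCount");
            } catch (Exception e) {
                e.printStackTrace();
            }
        });
    }

    public boolean safeToSend() {
        return consumerCount > 0;
    }
}

A producer could then consult safeToSend() before publishing to abc.xyz.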
I have ZooKeeper and 3 Kafka brokers running locally.
I started one producer and one consumer, and I can see the consumer consuming messages.
I then started three consumers with the same consumer group name (on different ports, since it's a Spring Boot project). What I found is that all the consumers are now consuming (receiving) the messages, but I expected the messages to be load-balanced, i.e., not repeated across the consumers. I don't know what the problem is.
Here is my property file
spring.cloud.stream.bindings.input.destination=timerTopicLocal
spring.cloud.stream.kafka.binder.zkNodes=localhost
spring.cloud.stream.kafka.binder.brokers=localhost
spring.cloud.stream.bindings.input.group=timerGroup
Here the group is timerGroup.
consumer code : https://github.com/codecentric/edmp-sample-stream-sink
producer code : https://github.com/codecentric/edmp-sample-stream-source
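For reference, a minimal sketch of what a Spring Cloud Stream sink consumer of that binding typically looks like (not taken from the linked repos; class and method names are placeholders):

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;

// Binds the "input" channel from the properties file to this listener.
@EnableBinding(Sink.class)
public class TimerSink {

    // With spring.cloud.stream.bindings.input.group set, all instances join the
    // same consumer group, so each message should go to only one of them.
    @StreamListener(Sink.INPUT)
    public void handle(String payload) {
        System.out.println("received: " + payload);
    }
}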
Can you please update dependencies to Camden.RELEASE (and start using Kafka 0.9+)? In Brixton.RELEASE, Kafka consumers were 0.8-based and required passing instanceIndex/instanceCount as properties in order to distribute partitions correctly.
In Camden.RELEASE we are using the Kafka 0.9+ consumer client, which does load-balancing in the way you are expecting (we also support static partition allocation via instanceIndex/instanceCount, but I suspect this is not what you want). I can enter into more details on how to configure this with Brixton, but I guess an upgrade should be a much easier path.
I have a couple of queues and I need to do the following with ONE of them:
A producer should send a message to this queue, but ALL consumers should receive it. So, if I have 5 Spring listeners on this queue, each of them should receive the message (but not the producer).
I do this because I have a Tomcat cluster and asynchronous RabbitMQ messages, and when I get a response from a worker, I don't know how to dispatch it to the correct Tomcat node. So I decided to broadcast all worker replies to all Tomcat nodes; each Tomcat cluster node listens on the same output queue. The correct Tomcat instance processes the reply, and all other copies are simply dropped, which is fine. How do I implement this? How do I make the consumers on the Tomcat side receive the same message at the same time?
Ok, found the solution here:
RabbitMQ / AMQP: single queue, multiple consumers for same message?
It's impossible to do with a single queue in RabbitMQ; you need to create a separate queue for each consumer.
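A rough Spring AMQP sketch of that per-consumer-queue pattern (the fanout exchange name is an assumption):

import org.springframework.amqp.core.AnonymousQueue;
import org.springframework.amqp.core.Binding;
import org.springframework.amqp.core.BindingBuilder;
import org.springframework.amqp.core.FanoutExchange;
import org.springframework.amqp.core.Queue;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ReplyBroadcastConfig {

    // Worker replies are published to this fanout exchange instead of a single queue.
    @Bean
    public FanoutExchange replyExchange() {
        return new FanoutExchange("worker.replies");
    }

    // Each Tomcat node declares its own auto-named, auto-deleting queue...
    @Bean
    public Queue nodeReplyQueue() {
        return new AnonymousQueue();
    }

    // ...and binds it to the exchange, so every node gets its own copy of each reply.
    @Bean
    public Binding replyBinding(FanoutExchange replyExchange, Queue nodeReplyQueue) {
        return BindingBuilder.bind(nodeReplyQueue).to(replyExchange);
    }
}

Each Tomcat node then attaches a listener (e.g. a @RabbitListener or a listener container) to its own nodeReplyQueue, processes the replies meant for it, and ignores the rest.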
I need to implement a pulling consumer.
Most of the examples I see have the producer pushing a message to the consumer, assuming the consumer is always up.
I want the producer to push messages to a queue and the consumer to consume those messages on its own schedule.
My consumer has an off-hours calendar and cannot process requests during off hours.
How would I configure that in Spring?
TIA
Raghu
Use message-driven POJOs and JMS.
onMessage is a server push; the message is processed as soon as it is delivered to the client. I want the client to decide when it wants to pull the message and process it.
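One hedged way to get pull semantics with Spring is to poll the queue with JmsTemplate.receive() from a scheduled task, so nothing is processed until the consumer decides to look. In this sketch the queue name, schedule, and off-hours check are placeholders, and @EnableScheduling is assumed on a configuration class:

import javax.jms.Message;
import org.springframework.jms.core.JmsTemplate;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class PullingConsumer {

    private final JmsTemplate jmsTemplate;

    public PullingConsumer(JmsTemplate jmsTemplate) {
        this.jmsTemplate = jmsTemplate;
        // Return null quickly if nothing is waiting instead of blocking forever.
        this.jmsTemplate.setReceiveTimeout(1000);
    }

    // Runs every minute; messages simply stay on the queue outside working hours.
    @Scheduled(fixedDelay = 60_000)
    public void pull() {
        if (isOffHours()) {
            return;
        }
        Message message = jmsTemplate.receive("work.queue");
        if (message != null) {
            // process the message here
        }
    }

    private boolean isOffHours() {
        // Placeholder for the real off-hours calendar check.
        return false;
    }
}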