Dynamically Change Concurrent Consumers - RabbitMQ - spring-rabbit

I am using SimpleRabbitListenerContainerFactory and am trying to dynamically increase/decrease concurrentConsumers, which I am not able to achieve. Can anybody help me with this? Thank you.
Code is as follows:
SimpleRabbitListenerContainerFactory factory;

@Bean
public SimpleRabbitListenerContainerFactory rabbitListenerContainerFactory() {
    factory = new SimpleRabbitListenerContainerFactory();
    factory.setConnectionFactory(this.connectionFactory);
    changeConsumers(2);
    return factory;
}

public void changeConsumers(int minConsumers) {
    factory.setConcurrentConsumers(minConsumers);
}
In another package:
if (messageCount > 6000) {
    changeConsumers(5);
}
When the message count is more than 6000, execution reaches the changeConsumers method, but the number of consumers does not change to 5.

The factory is used during application initialization to create listener container objects; changing its properties later won't change the properties on containers it has already created.
You have to change the property on the containers themselves.
You can access the containers using the RabbitListenerEndpointRegistry bean; call getListenerContainers and iterate over them (or get an individual container using its id).
Cast the MessageListenerContainer to SimpleMessageListenerContainer to change its properties.
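A minimal sketch of that approach (the registry and setConcurrentConsumers are standard Spring AMQP API; wiring it into the existing changeConsumers method and the listener id "myListener" are assumptions for illustration):
@Autowired
private RabbitListenerEndpointRegistry registry;

public void changeConsumers(int minConsumers) {
    // Adjust every container the registry created for @RabbitListener endpoints.
    for (MessageListenerContainer container : registry.getListenerContainers()) {
        ((SimpleMessageListenerContainer) container).setConcurrentConsumers(minConsumers);
    }
    // Or target a single listener by the id declared on its @RabbitListener, e.g. "myListener":
    // ((SimpleMessageListenerContainer) registry.getListenerContainer("myListener"))
    //         .setConcurrentConsumers(minConsumers);
}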

Related

ConcurrentKafkaListenerContainerFactory message converter is ignored when configuring listeners automatically

I need to create Kafka listeners at runtime, and everything seems to work, except that the message converter property seems to be ignored (or maybe this is by design, or I've done something wrong).
When using @KafkaListener it works correctly, but when creating listeners manually my message isn't converted to the desired object and I'm getting an error:
Caused by: java.lang.ClassCastException: class java.lang.String cannot be cast to class com.my.company.model.MyPojo (java.lang.String is in module java.base of loader 'bootstrap'; com.my.company.model.MyPojo is in unnamed module of loader 'app')
at com.my.company.config.MyPojo.kafka.KafkaConfig.lambda$createListenerContainers$2(KafkaConfig.java:142)
My configuration:
@Bean
ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory() {
    var factory = new ConcurrentKafkaListenerContainerFactory<String, Object>();
    factory.setConsumerFactory(consumerFactory());
    factory.setMessageConverter(new StringJsonMessageConverter());
    return factory;
}
@Bean
MessageListenerContainer createListenerContainer1() {
    ContainerProperties containerProperties = new ContainerProperties(topicConfig("my_topic"));
    var container = new KafkaMessageListenerContainer<>(consumerFactory(), containerProperties);
    // tried this too...
    // var container = kafkaListenerContainerFactory().createContainer(topicConfig("my_topic"));
    container.setupMessageListener((MessageListener<String, MyPojo>) data -> getDataService.process(data.value()));
    container.start();
    return container;
}
The WORKING Kafka listener:
@KafkaListener(id = "1", topics = "my_topic")
public void listenGetDataTopic(@Payload MyPojo message) {
    log.info(message);
}
I've tried a lot of different configs and debugged it thoroughly, and of course I can see the difference in how messages are handled with @KafkaListener versus manually created listeners, but I couldn't figure out how to apply message conversion to manually created listeners. Is there a way to achieve this?
The message converter is not a property of the container; it is a property of the listener adapter used to invoke the POJO method for the @KafkaListener.
When using a container directly, your listener must implement MessageListener or one of its sub-interfaces.
You can either invoke the converter yourself in your listener (e.g. create a lightweight adapter, as sketched below the links) or use another technique for dynamically creating @KafkaListeners.
See
Kafka Spring: How to create Listeners dynamically or in a loop?
Kafka Consumer in spring can I re-assign partitions programmatically?
Can i add topics to my #kafkalistener at runtime
for some examples of those techniques.
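A minimal sketch of the first option, converting the payload in the listener itself; a plain Jackson ObjectMapper stands in here for a lightweight adapter, and MyPojo and getDataService are carried over from the question:
ObjectMapper mapper = new ObjectMapper();

container.setupMessageListener((MessageListener<String, String>) record -> {
    try {
        // Do by hand roughly what the @KafkaListener adapter does for you:
        // turn the raw JSON String into the target type before processing.
        MyPojo pojo = mapper.readValue(record.value(), MyPojo.class);
        getDataService.process(pojo);
    } catch (JsonProcessingException e) {
        throw new RuntimeException(e);
    }
});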

Kafka consumer picking up topics dynamically

I have a Kafka consumer configured in Spring Boot. Here's the config class:
@EnableKafka
@Configuration
@PropertySource({"classpath:kafka.properties"})
public class KafkaConsumerConfig {

    @Autowired
    private Environment env;

    @Bean
    public ConsumerFactory<String, GenericData.Record> consumerFactory() {
        Map<String, Object> dataRiverProps = new HashMap<>();
        dataRiverProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, env.getProperty("bootstrap.servers"));
        dataRiverProps.put(ConsumerConfig.GROUP_ID_CONFIG, env.getProperty("group.id"));
        dataRiverProps.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, env.getProperty("enable.auto.commit"));
        dataRiverProps.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, env.getProperty("auto.commit.interval.ms"));
        dataRiverProps.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, env.getProperty("session.timeout.ms"));
        dataRiverProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, env.getProperty("auto.offset.reset"));
        dataRiverProps.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, env.getProperty("schema.registry.url"));
        dataRiverProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class.getName());
        dataRiverProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class.getName());
        return new DefaultKafkaConsumerFactory<>(dataRiverProps);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, GenericData.Record> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, GenericData.Record> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
And here's the consumer:
@Component
public class KafkaConsumer {

    @Autowired
    private MessageProcessor messageProcessor;

    @KafkaListener(topics = "#{'${kafka.topics}'.split(',')}", containerFactory = "kafkaListenerContainerFactory")
    public void consumeAvro(GenericData.Record message) {
        messageProcessor.process();
    }
}
Please note that I am using topics = "#{'${kafka.topics}'.split(',')}" to pick up the topics from a properties file.
And this is what my kafka.properties file looks like:
kafka.topics=pwdChange,pwdCreation
bootstrap.servers=aaa.bbb.com:37900
group.id=pwdManagement
enable.auto.commit=true
auto.commit.interval.ms=1000
session.timeout.ms=30000
schema.registry.url=http://aaa.bbb.com:37800
Now if I am to add a new topic to the subscription, say pwdExpire, and modify the prop files as follows:
kafka.topics=pwdChange,pwdCreation,pwdExpire
Is there a way for my consumer to start subscribe to this new topic without restarting the server?
I have found this post Spring Kafka - Subscribe new topics during runtime, but the documentation has this to say about metadata.max.age.ms:
The period of time in milliseconds after which we force a refresh of
metadata even if we haven't seen any partition leadership changes to
proactively discover any new brokers or partitions.
It sounds to me like it won't work. Thanks for your help!
No; the only way to do that is to use a topic pattern; as new topics that match the pattern are added, they will be picked up and added to the subscription after 5 minutes by default (see the sketch after the javadoc excerpt below).
You can, however, add new listener container(s) for the new topic(s) at runtime.
Another option would be to load the @KafkaListener bean in a child application context and re-create the context each time the topic(s) change.
EDIT
See the javadocs for KafkaConsumer.subscribe(Pattern pattern)...
/**
* Subscribe to all topics matching specified pattern to get dynamically assigned partitions.
* The pattern matching will be done periodically against topics existing at the time of check.
* <p>
...
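A minimal sketch of the topic-pattern approach, assuming all relevant topics share the pwd prefix (the listener id and the shorter metadata refresh interval are illustrative choices, not requirements):
// In consumerFactory(): optionally check for new matching topics more often than the 5-minute default.
dataRiverProps.put(ConsumerConfig.METADATA_MAX_AGE_CONFIG, "30000");

@KafkaListener(id = "pwdTopics", topicPattern = "pwd.*", containerFactory = "kafkaListenerContainerFactory")
public void consumeAvro(GenericData.Record message) {
    messageProcessor.process();
}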

Kafka listener receiving List<ConsumerRecord<String, String>>, is it possible to consume?

I am quite new to Kafka and frankly have no idea about this type of consumer (as far as I understand, it looks like this because it is batch-ready), so I am struggling to figure out how to consume the list of these events.
I have something like this:
@KafkaListener(topics = "#{'${kafka.listener.list-of-topics}'.split(',')}")
public void readMessage(List<ConsumerRecord<String, String>> records,
                        final Acknowledgment acknowledgment) {
    try {
        ....
I know that when I receive an event (at least a single one) it is of type MyObject, so I can handle it fine when I get a single message.
I believe there must be a way to read/cast this List<ConsumerRecord<String, String>>, but I cannot figure out how.
Any ideas?
See the reference manual: Batch Listeners.
Starting with version 1.1, #KafkaListener methods can be configured to receive the entire batch of consumer records received from the consumer poll. To configure the listener container factory to create batch listeners, set the batchListener property:
@Bean
public KafkaListenerContainerFactory<?> batchFactory() {
    ConcurrentKafkaListenerContainerFactory<Integer, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setBatchListener(true); // <<<<<<<<<<<<<<<<<<<<<<<<<
    return factory;
}
...
You can also receive a list of ConsumerRecord<?, ?> objects but it must be the only parameter (aside from optional Acknowledgment, when using manual commits, and/or Consumer<?, ?> parameters) defined on the method:
...
When using Spring Boot, set the property spring.kafka.listener.type=batch.
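A minimal sketch of a batch listener method wired to that factory; the topic SpEL and manual acknowledgment are carried over from the question (manual acks also require a manual ack mode on the container), and the per-record handling is only illustrative:
@KafkaListener(topics = "#{'${kafka.listener.list-of-topics}'.split(',')}", containerFactory = "batchFactory")
public void readMessage(List<ConsumerRecord<String, String>> records, Acknowledgment acknowledgment) {
    for (ConsumerRecord<String, String> record : records) {
        String payload = record.value();   // one message from the poll; map it to MyObject here
        // ... process payload ...
    }
    acknowledgment.acknowledge();          // acknowledge the whole batch once processed
}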

Spring Boot JMS Listener ActiveMQ is very slow

I have a Spring Boot application which consumes my custom serializable messages from an ActiveMQ queue. So far it has worked; however, the consumption rate is very poor, only 1-20 msg/sec.
@JmsListener(destination = "${channel.consumer.destination}", concurrency = "${channel.consumer.maxConcurrency}")
public void receive(IMessage message) {
    processor.process(message);
}
The above is a snippet from my channel consumer class. It has an injected (autowired) processor instance, and inside that processor I have an @Async service, so I assume the listener thread is released as soon as the message enters the @Async method. It also uses the Spring Boot ActiveMQ default connection factory, which I configure from the application properties:
# ACTIVEMQ (ActiveMQProperties)
spring.activemq.broker-url= tcp://localhost:61616?keepAlive=true
spring.activemq.in-memory=true
spring.activemq.pool.enabled=true
spring.activemq.pool.expiry-timeout=1
spring.activemq.pool.idle-timeout=30000
spring.activemq.pool.max-connections=50
A few things worth mentioning:
1. I run everything (Eclipse, ActiveMQ, MySQL) on my local laptop.
2. Before this, I also tried using a custom connection factory (default AMQ, pooled, and caching) equipped with a custom thread-pool task executor, but I still got the same result. Below is a snapshot of the performance capture I took, updating every second.
3. I also notice in JVM Monitor that the used heap keeps growing.
I want to know:
1. Is there something wrong or missing in my setup? I can't even reach hundreds in my message rate.
2. Does an annotated @JmsListener method execute the processing asynchronously or synchronously?
3. If possible and supported, how do I use a traditional synchronous receive() with Spring Boot properly and elegantly?
Thank you
I'm just checking something similar. I have defined DefaultJmsListenerContainerFactory in my JMSConfiguration class (Spring configuration) like this:
@Bean
public DefaultJmsListenerContainerFactory jmsListenerContainerFactory(CachingConnectionFactory connectionFactory) {
    // settings made based on https://bsnyderblog.blogspot.sk/2010/05/tuning-jms-message-consumption-in.html
    DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory() {
        @Override
        protected void initializeContainer(DefaultMessageListenerContainer container) {
            super.initializeContainer(container);
            container.setIdleConsumerLimit(5);
            container.setIdleTaskExecutionLimit(10);
        }
    };
    factory.setConnectionFactory(connectionFactory);
    factory.setConcurrency("10-50");
    factory.setCacheLevel(CACHE_CONSUMER);
    factory.setReceiveTimeout(5000L);
    factory.setDestinationResolver(new BeanFactoryDestinationResolver(beanFactory));
    return factory;
}
As you can see, I took those values from https://bsnyderblog.blogspot.sk/2010/05/tuning-jms-message-consumption-in.html. It's from 2010 but I could not find anything newer / better so far.
I have also defined Spring's CachingConnectionFactory Bean as a ConnectionFactory:
@Bean
public CachingConnectionFactory buildCachingConnectionFactory(@Value("${activemq.url}") String brokerUrl) {
    // settings based on https://bsnyderblog.blogspot.sk/2010/02/using-spring-jmstemplate-to-send-jms.html
    ActiveMQConnectionFactory activeMQConnectionFactory = new ActiveMQConnectionFactory();
    activeMQConnectionFactory.setBrokerURL(brokerUrl);
    CachingConnectionFactory cachingConnectionFactory = new CachingConnectionFactory(activeMQConnectionFactory);
    cachingConnectionFactory.setSessionCacheSize(10);
    return cachingConnectionFactory;
}
This setting will help JmsTemplate with sending.
So my answer to you is: set the values of your connection pool as described in the link. Also, I guess you can delete spring.activemq.in-memory=true because (based on the documentation) if you specify a custom broker URL, the "in-memory" property is ignored.
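On your third question (a traditional synchronous receive), a minimal sketch using a JmsTemplate built on the same CachingConnectionFactory; the queue name "myQueue" and the pullOne method are placeholders:
@Bean
public JmsTemplate jmsTemplate(CachingConnectionFactory connectionFactory) {
    JmsTemplate template = new JmsTemplate(connectionFactory);
    template.setReceiveTimeout(5000L);      // don't block forever on an empty queue
    return template;
}

// In a service: pulls one message synchronously, or returns null when the timeout expires.
public IMessage pullOne(JmsTemplate jmsTemplate) {
    return (IMessage) jmsTemplate.receiveAndConvert("myQueue");
}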
Let me know if this helped.
G.

Spring JMS listener connecting to multiple regions

I have a listener which listens to a queue whose name is the same across regions.
I need to establish connections to multiple regions, with different connection URLs, for the same listener.
Could you let me know whether this is possible using DefaultJmsListenerContainerFactory?
As of now, the code below establishes a connection to only one region.
@Bean
public DefaultJmsListenerContainerFactory containerFactory(ConnectionFactory connectionFactory) {
    DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
    factory.setConnectionFactory(connectionFactory);
    factory.setDestinationResolver(new BeanFactoryDestinationResolver(springContextBeanFactory));
    factory.setConcurrency(concurrency);
    return factory;
}

@JmsListener(containerFactory = "containerFactory", destination = "TestQueue")
public void qpidMessages(String msg) {
    System.out.println(msg);
}
I need a mechanism to specify a list of connections.
I want the configuration to be dynamic, based on a properties file or read from a database, since the number of connections differs.
Could you please let me know if there is something similar to AbstractRoutingDataSource for JMS listeners?
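A minimal sketch of one possible direction, assuming two broker URLs exposed as region1.broker-url and region2.broker-url (the property names and factory bean names are assumptions): since @JmsListener is repeatable, you can declare one container factory per region and attach the same method to each of them.
@Bean
public DefaultJmsListenerContainerFactory regionOneFactory(@Value("${region1.broker-url}") String brokerUrl) {
    DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
    // Swap in your provider's ConnectionFactory here (ActiveMQ shown purely as an example).
    factory.setConnectionFactory(new ActiveMQConnectionFactory(brokerUrl));
    factory.setDestinationResolver(new BeanFactoryDestinationResolver(springContextBeanFactory));
    factory.setConcurrency(concurrency);
    return factory;
}

// regionTwoFactory is defined the same way using ${region2.broker-url}.

@JmsListener(containerFactory = "regionOneFactory", destination = "TestQueue")
@JmsListener(containerFactory = "regionTwoFactory", destination = "TestQueue")
public void qpidMessages(String msg) {
    System.out.println(msg);
}
If the number of regions is only known at runtime (for example, read from a database), registering the endpoints programmatically through a JmsListenerConfigurer and JmsListenerEndpointRegistrar is the usual alternative, since annotations are fixed at compile time.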
