Spring Kafka and exactly once delivery guarantee

I use Spring Kafka and Spring Boot and am wondering how to configure my consumer, for example:

@KafkaListener(topics = "${kafka.topic.post.send}", containerFactory = "postKafkaListenerContainerFactory")
public void sendPost(ConsumerRecord<String, Post> consumerRecord, Acknowledgment ack) {
    // do some logic
    ack.acknowledge();
}

to use the exactly-once delivery guarantee?
Should I just add the org.springframework.transaction.annotation.Transactional annotation over the sendPost method and that's it, or do I need to perform some extra steps to achieve this?
UPDATED

This is my current config:

@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(KafkaProperties kafkaProperties, KafkaTransactionManager<Object, Object> transactionManager) {
    kafkaProperties.getProperties().put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, kafkaConsumerMaxPollIntervalMs);
    kafkaProperties.getProperties().put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, kafkaConsumerMaxPollRecords);
    ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
    //factory.getContainerProperties().setAckMode(AckMode.MANUAL_IMMEDIATE);
    factory.getContainerProperties().setTransactionManager(transactionManager);
    factory.setConsumerFactory(consumerFactory(kafkaProperties));
    return factory;
}
@Bean
public Map<String, Object> producerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, 15000000);
    return props;
}

@Bean
public ProducerFactory<String, Post> postProducerFactory() {
    return new DefaultKafkaProducerFactory<>(producerConfigs());
}

@Bean
public KafkaTemplate<String, Post> postKafkaTemplate() {
    return new KafkaTemplate<>(postProducerFactory());
}

@Bean
public ProducerFactory<String, Update> updateProducerFactory() {
    return new DefaultKafkaProducerFactory<>(producerConfigs());
}

@Bean
public KafkaTemplate<String, Update> updateKafkaTemplate() {
    return new KafkaTemplate<>(updateProducerFactory());
}

@Bean
public ProducerFactory<String, Message> messageProducerFactory() {
    return new DefaultKafkaProducerFactory<>(producerConfigs());
}

@Bean
public KafkaTemplate<String, Message> messageKafkaTemplate() {
    return new KafkaTemplate<>(messageProducerFactory());
}
but it fails with the following error:
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 0 of method kafkaTransactionManager in org.springframework.boot.autoconfigure.kafka.KafkaAutoConfiguration required a single bean, but 3 were found:
- postProducerFactory: defined by method 'postProducerFactory' in class path resource [com/example/domain/configuration/messaging/KafkaProducerConfig.class]
- updateProducerFactory: defined by method 'updateProducerFactory' in class path resource [com/example/domain/configuration/messaging/KafkaProducerConfig.class]
- messageProducerFactory: defined by method 'messageProducerFactory' in class path resource [com/example/domain/configuration/messaging/KafkaProducerConfig.class]
What am I doing wrong?

You should not use manual acknowledgments. Instead, inject a KafkaTransactionManager into the listener container; the container will then send the offsets to the transaction when the listener method exits normally (and roll back otherwise). You should not commit offsets via the consumer for exactly-once semantics.
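For reference, here is a minimal sketch of declaring the transaction manager bean yourself; when spring.kafka.producer.transaction-id-prefix is set (as in the yml below), Spring Boot auto-configures an equivalent bean, so this is only needed for manual wiring:

// org.springframework.kafka.transaction.KafkaTransactionManager
@Bean
public KafkaTransactionManager<Object, Object> kafkaTransactionManager(
        ProducerFactory<Object, Object> producerFactory) {
    // the producer factory must be transactional, i.e. configured
    // with a transactionIdPrefix
    return new KafkaTransactionManager<>(producerFactory);
}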
EDIT

application.yml:

spring:
  kafka:
    consumer:
      auto-offset-reset: earliest
      enable-auto-commit: false
      properties:
        isolation:
          level: read_committed
    producer:
      transaction-id-prefix: myTrans.
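The transaction-id-prefix switches the auto-configured producer factory into transactional mode, and isolation.level=read_committed ensures the consumers only see records from committed transactions.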
App:

@SpringBootApplication
public class So52570118Application {

    public static void main(String[] args) {
        SpringApplication.run(So52570118Application.class, args);
    }

    @Bean // override Boot's auto-config to add the transaction manager
    public ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory(
            ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
            ConsumerFactory<Object, Object> kafkaConsumerFactory,
            KafkaTransactionManager<Object, Object> transactionManager) {
        ConcurrentKafkaListenerContainerFactory<Object, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
        configurer.configure(factory, kafkaConsumerFactory);
        factory.getContainerProperties().setTransactionManager(transactionManager);
        return factory;
    }

    @Autowired
    private KafkaTemplate<String, String> template;

    @KafkaListener(id = "so52570118", topics = "so52570118")
    public void listen(String in) throws Exception {
        System.out.println(in);
        Thread.sleep(5_000);
        this.template.send("so52570118out", in.toUpperCase());
        System.out.println("sent");
    }

    @KafkaListener(id = "so52570118out", topics = "so52570118out")
    public void listenOut(String in) {
        System.out.println(in);
    }

    @Bean
    public ApplicationRunner runner() {
        return args -> this.template.executeInTransaction(t -> t.send("so52570118", "foo"));
    }

    @Bean
    public NewTopic topic1() {
        return new NewTopic("so52570118", 1, (short) 1);
    }

    @Bean
    public NewTopic topic2() {
        return new NewTopic("so52570118out", 1, (short) 1);
    }
}
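Because the consumers run with read_committed, listenOut() only receives the uppercased record after listen() returns and the transaction commits; the five-second sleep makes that delay easy to observe.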

Related

Listener not consuming message on Test with EmbeddedKafka Spring

I was trying to verify that a service is invoked when the consumer receives a message from a Kafka topic, but the test is not passing and the consumer is not even receiving the message.
My test:

@SpringBootTest
@DirtiesContext
@EmbeddedKafka(partitions = 1, brokerProperties = { "listeners=PLAINTEXT://localhost:9092", "port=9092" })
class ConsumerTest {

    @Autowired
    Consumer consumer;

    @Mock
    private Service service;

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Test
    public void givenEmbeddedKafkaBroker_whenSendingtoSimpleProducer_thenMessageReceived()
            throws Exception {
        String message = "hi";
        kafkaTemplate.send("topic", message);
        consumer.getLatch().await(10000, TimeUnit.MILLISECONDS);
        verify(service, times(1)).addMessage();
    }
}
The consumer, in the main package, is a normal consumer with @KafkaListener(topics = "topic").
Then I have a configuration file:

@EnableKafka
@Configuration
public class KafkaConfig {

    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "group");
        return props;
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs(), new StringDeserializer(),
                new StringDeserializer());
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
And also in application.properties (inside the test package) I put this:

spring:
  kafka:
    consumer:
      auto-offset-reset: earliest
      group-id: group

EmbeddedKafka does not invoke listener when using avro schema

I have been trying to write a simple Kafka listener unit test using KafkaEmbedded, but my listener never gets invoked. I have been using this link for inspiration, since I also need an Avro serializer/deserializer.
Below is how my test class looks:
@ExtendWith(SpringExtension.class)
public class KafkaTest {

    public static final String TOPIC_2 = "topic";

    @Autowired
    private Service listener;

    @Autowired
    private SomeClient someClient;

    @Autowired
    private KafkaTemplate<String, Avro> template;

    @Test
    void testSimple() {
        template.send(TOPIC_2, "test", Avro.newBuilder()
                .setGameProvider("abcd")
                .setMessageName("someMessageName")
                .setRequestId(UUID.randomUUID().toString())
                .setBody(UUID.randomUUID().toString())
                .build());
        verify(someClient).register(anyString(), anyString(), anyString(), anyString());
        template.flush();
    }

    @Configuration
    @EnableKafka
    public static class Config {

        @Bean
        public EmbeddedKafkaBroker kafkaEmbedded() {
            return new EmbeddedKafkaBroker(1, true, 1, TOPIC_2);
        }

        @Bean
        public ConsumerFactory<String, Avro> createConsumerFactory() {
            Map<String, Object> props = new HashMap<>();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaEmbedded().getBrokersAsString());
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "group-1");
            props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, CustomKafkaAvroDeserializer.class);
            props.put("schema.registry.url", "not-used");
            props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
            return new DefaultKafkaConsumerFactory<>(props);
        }

        @Bean
        public ConcurrentKafkaListenerContainerFactory<String, Avro> kafkaListenerContainerFactory() {
            ConcurrentKafkaListenerContainerFactory<String, Avro> factory = new ConcurrentKafkaListenerContainerFactory<>();
            factory.setConsumerFactory(createConsumerFactory());
            return factory;
        }

        @MockBean
        private SomeClient client;

        @MockBean
        private Marshaller marshaller;

        @Bean
        public SomeService listener() {
            return new SomeService(client, marshaller, kafkaTemplate());
        }

        @Bean
        public ProducerFactory<String, Avro> producerFactory() {
            Map<String, Object> props = new HashMap<>();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaEmbedded().getBrokersAsString());
            props.put(ProducerConfig.RETRIES_CONFIG, 1);
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, CustomKafkaAvroSerializer.class);
            props.put("schema.registry.url", "not-used");
            return new DefaultKafkaProducerFactory<>(props);
        }

        @Bean
        public KafkaTemplate<String, Avro> kafkaTemplate() {
            return new KafkaTemplate<>(producerFactory());
        }
    }
}
public class CustomKafkaAvroDeserializer extends KafkaAvroDeserializer {

    @Override
    public Object deserialize(String topic, byte[] bytes) {
        if (topic.equals("topic")) {
            this.schemaRegistry = getMockClient(SafeReport.SCHEMA$);
        }
        return super.deserialize(topic, bytes);
    }

    private static SchemaRegistryClient getMockClient(final Schema schema$) {
        return new MockSchemaRegistryClient() {
            @Override
            public synchronized Schema getById(int id) {
                return schema$;
            }
        };
    }
}

public class CustomKafkaAvroSerializer extends KafkaAvroSerializer {

    public CustomKafkaAvroSerializer() {
        super();
        super.schemaRegistry = new MockSchemaRegistryClient();
    }

    public CustomKafkaAvroSerializer(SchemaRegistryClient client) {
        super(new MockSchemaRegistryClient());
    }

    public CustomKafkaAvroSerializer(SchemaRegistryClient client, Map<String, ?> props) {
        super(new MockSchemaRegistryClient(), props);
    }
}
@Service
@EnableBinding(Sink.class)
public class SomeService {

    @KafkaListener(topics = "topic", groupId = "group-1")
    public void listen(ConsumerRecord<String, Avro> cr) {
        System.out.println(String.format("#### -> Consumed message -> %s", cr.toString()));
    }
}
My listener is never invoked when I run this test.

How to access MessageHeaders in Kafka batch listener

I have the following auto-configuration class for Kafka:
@Configuration
@EnableKafka
@ConditionalOnClass(KafkaReceiver.class)
@ConditionalOnProperty({"spring.kafka.bootstrap-servers"})
public class KafkaAutoConfiguration<T> {

    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    private KafkaProperties kafkaConfig;

    private String groupId;

    @Autowired
    public void setKafkaProperties(KafkaProperties properties) {
        this.kafkaConfig = properties;
    }

    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                org.apache.kafka.common.serialization.ByteArrayDeserializer.class);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, kafkaConfig.getGroupId());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return props;
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs());
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaBatchListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        factory.setBatchListener(true);
        return factory;
    }

    @Bean
    public DefaultKafkaHeaderMapper headerMapper() {
        return new DefaultKafkaHeaderMapper();
    }

    @Bean("simpleReceiver")
    @ConditionalOnMissingBean(name = "simpleReceiver")
    @ConditionalOnProperty({"service.kafka.consumer.topics"})
    public KafkaReceiver simpleReceiver() {
        return new KafkaSimpleReceiver();
    }

    @Bean("batchReceiver")
    @ConditionalOnMissingBean(name = "batchReceiver")
    @ConditionalOnProperty({"service.kafka.consumer.batch-topics"})
    public KafkaReceiver batchReceiver() {
        return new KafkaBatchReceiver();
    }
}
Simple listener bean:

public class KafkaSimpleReceiver implements KafkaReceiver {

    @KafkaListener(topics = "#{'${service.kafka.consumer.topics}'.split(',')}", containerFactory = "kafkaListenerContainerFactory")
    public void receive(ConsumerRecord record, @Headers MessageHeaders headers) throws KafkaException {
    }
}

Batch listener bean:

public class KafkaBatchReceiver implements KafkaReceiver {

    @KafkaListener(topics = "#{'${service.kafka.consumer.batch-topics}'.split(',')}", containerFactory = "kafkaBatchListenerContainerFactory")
    public void receive(List<ConsumerRecord> records, @Headers MessageHeaders headers) throws KafkaException {
    }
}
The simple listener is working fine, but I am getting the following error for the batch listener. How can we access MessageHeaders in this case?
A parameter of type 'List<ConsumerRecord>' must be the only parameter (except for an optional 'Acknowledgment' and/or 'Consumer')
EDIT

This is what I did to convert each ConsumerRecord's headers to MessageHeaders:

public void receive(List<ConsumerRecord> records) {
    for (ConsumerRecord record : records) {
        Map<String, Object> headersList = new HashMap<>();
        for (final Header h : record.headers()) {
            headersList.put(h.key(), new String(h.value()));
        }
        MessageHeaders headers = new MessageHeaders(headersList);
    }
}
You can get a list of Message<?>:

@KafkaListener(topics = "so54086076", id = "so54086076")
public void listen(List<Message<?>> records) {
    System.out.println(records.size() + ":" + records);
}
The message payloads will be the ConsumerRecord.value(); the other ConsumerRecord properties will be in the headers.
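For example, here is a sketch of pulling individual record properties out of each Message's headers; KafkaHeaders.RECEIVED_TOPIC and KafkaHeaders.OFFSET are standard spring-kafka header constants, while the listener id and topic are just placeholders:

@KafkaListener(topics = "so54086076", id = "so54086076WithHeaders")
public void listenWithHeaders(List<Message<?>> records) {
    for (Message<?> record : records) {
        MessageHeaders headers = record.getHeaders();
        // standard headers populated by the batch message converter
        String topic = (String) headers.get(KafkaHeaders.RECEIVED_TOPIC);
        Long offset = (Long) headers.get(KafkaHeaders.OFFSET);
        System.out.println(topic + "@" + offset + ": " + record.getPayload());
    }
}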

Unable to read same message on same topic by multiple consumers

I am publishing a message to topicX. To consume this message with multiple consumers, I have written the following ConsumerConfig (note: each consumer has a different group_id).
@EnableKafka
@Configuration
public class ConsumerConfig {

    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Bean
    public Map<String, Object> consumerConfigs(String groupId) {
        Map<String, Object> props = new HashMap<>();
        props.put(BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(GROUP_ID_CONFIG, "my.groupid." + groupId);
        props.put(AUTO_OFFSET_RESET_CONFIG, "earliest");
        return props;
    }

    @Bean
    public ConsumerFactory<String, Car> consumerFactory(String groupId) {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs(groupId), new StringDeserializer(),
                new JsonDeserializer<>(Car.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Car> consumerOneKafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Car> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory("one"));
        return factory;
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Car> consumerTwoKafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Car> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory("two"));
        return factory;
    }
}
And then on a specific consumer, I have added the following annotation (specifying the group id and container factory):

@KafkaListener(id = "my.groupid.one", topics = "${app.topic.topicname}", containerFactory = "consumerOneKafkaListenerContainerFactory")
But when I publish the message, only one consumer consumes it. How can I configure Kafka so that each published message is consumed by multiple consumers?
I have tried this code with 1 partition and with n partitions (where n = the number of consumers), and I am still having this issue.
UPDATE

Here is my topic producer class:

@Service
@Slf4j
public class TopicProducer {

    @Autowired
    private KafkaTemplate<String, TopicPayload> kafkaTemplate;

    public void send(String topicName, TopicPayload payload) {
        log.info("sending message={} to topic={}", payload, topicName);
        kafkaTemplate.send(topicName, payload);
    }
}
And here is my TopicProducerConfig:

@Configuration
public class TopicProducerConfig {

    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return props;
    }

    @Bean
    public ProducerFactory<String, TopicPayload> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public KafkaTemplate<String, TopicPayload> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
I post a message by calling producer.send(topicName, payload).

Problems adding multiple KafkaListenerContainerFactories

Hi, I'm currently dabbling in Spring Kafka and have succeeded in adding a single KafkaListenerContainerFactory to my listener. Now I'd like to add multiple KafkaListenerContainerFactories (one for a topic whose messages are JSON, another for strings). See the code below:
@EnableKafka
@Configuration
public class KafkaConsumersConfig {

    private final KafkaConfiguration kafkaConfiguration;

    @Autowired
    public KafkaConsumersConfig(KafkaConfiguration kafkaConfiguration) {
        this.kafkaConfiguration = kafkaConfiguration;
    }

    @Bean
    public KafkaListenerContainerFactory<?> kafkaJsonListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Record> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(jsonConsumerFactory());
        factory.setConcurrency(3);
        factory.setAutoStartup(true);
        return factory;
    }

    @Bean
    public ConsumerFactory<String, Record> jsonConsumerFactory() {
        JsonDeserializer<Record> jsonDeserializer = new JsonDeserializer<>(Record.class);
        return new DefaultKafkaConsumerFactory<>(jsonConsumerConfigs(), new StringDeserializer(), jsonDeserializer);
    }

    @Bean
    public Map<String, Object> jsonConsumerConfigs() {
        Map<String, Object> propsMap = new HashMap<>();
        propsMap.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaConfiguration.getBrokerAddress());
        propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, kafkaConfiguration.getJsonGroupId());
        propsMap.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, kafkaConfiguration.getAutoCommit());
        propsMap.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, kafkaConfiguration.getAutoCommitInterval());
        propsMap.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, kafkaConfiguration.getSessionTimeout());
        return propsMap;
    }

    @Bean
    public KafkaListenerContainerFactory<?> kafkaFileListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(fileConsumerFactory());
        factory.setConcurrency(3);
        factory.setAutoStartup(true);
        return factory;
    }

    @Bean
    public ConsumerFactory<String, String> fileConsumerFactory() {
        return new DefaultKafkaConsumerFactory<>(fileConsumerConfigs());
    }

    @Bean
    public Map<String, Object> fileConsumerConfigs() {
        Map<String, Object> propsMap = new HashMap<>();
        propsMap.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaConfiguration.getBrokerAddress());
        propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, kafkaConfiguration.getFileGroupId());
        propsMap.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, kafkaConfiguration.getAutoCommit());
        propsMap.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, kafkaConfiguration.getAutoCommitInterval());
        propsMap.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, kafkaConfiguration.getSessionTimeout());
        return propsMap;
    }
}
Running this gives me the following error:
Description:
Parameter 1 of method kafkaListenerContainerFactory in org.springframework.boot.autoconfigure.kafka.KafkaAnnotationDrivenConfiguration required a bean of type 'org.springframework.kafka.core.ConsumerFactory' that could not be found.
- Bean method 'kafkaConsumerFactory' in 'KafkaAutoConfiguration' not loaded because #ConditionalOnMissingBean (types: org.springframework.kafka.core.ConsumerFactory; SearchStrategy: all) found beans 'jsonConsumerFactory', 'fileConsumerFactory'
Action:
Consider revisiting the conditions above or defining a bean of type 'org.springframework.kafka.core.ConsumerFactory' in your configuration.
What am I doing wrong?
It looks like you are not going to rely on Spring Boot's Kafka auto-configuration.
Spring Boot provides this in KafkaAutoConfiguration:

@Bean
@ConditionalOnMissingBean(ConsumerFactory.class)
public ConsumerFactory<?, ?> kafkaConsumerFactory() {

Since you have jsonConsumerFactory and fileConsumerFactory, they override the one provided by the auto-configuration.
But, on the other hand, in KafkaAnnotationDrivenConfiguration, none of your factories can be applied:

@Bean
@ConditionalOnMissingBean(name = "kafkaListenerContainerFactory")
public ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory(
        ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
        ConsumerFactory<Object, Object> kafkaConsumerFactory) {

because your ConsumerFactory beans are not of the ConsumerFactory<Object, Object> type.
So, either:
- exclude KafkaAutoConfiguration from the Spring Boot auto-configuration by adding the following to the application properties file:
  spring.autoconfigure.exclude=org.springframework.boot.autoconfigure.kafka.KafkaAutoConfiguration
- or rename one of your KafkaListenerContainerFactory beans to kafkaListenerContainerFactory to override Boot's,
- or make one of the ConsumerFactory beans a ConsumerFactory<Object, Object> (as sketched below).
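For instance, here is a minimal sketch of the third option, reusing jsonConsumerConfigs() from the question; note that with this variant the deserializers have to be supplied via the config map rather than as constructor arguments:

@Bean
public ConsumerFactory<Object, Object> kafkaConsumerFactory() {
    // Typed as ConsumerFactory<Object, Object> so that Boot's
    // KafkaAnnotationDrivenConfiguration can inject it into its default
    // kafkaListenerContainerFactory bean. Key/value deserializer classes
    // must then be present in the config map.
    return new DefaultKafkaConsumerFactory<>(jsonConsumerConfigs());
}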
I achieved it with the code below, and it's working fine for me.
// LISTENER 1
@Bean
@ConditionalOnMissingBean(name = "yourListenerFactory1")
public ConsumerFactory<String, YourCustomObject1> yourConsumerFactory1() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "YOUR-GROUP-1");
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(),
            new JsonDeserializer<>(YourCustomObject1.class));
}

@Bean(name = "yourListenerFactory1")
public ConcurrentKafkaListenerContainerFactory<String, YourCustomObject1> yourListenerFactory1() {
    ConcurrentKafkaListenerContainerFactory<String, YourCustomObject1> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(yourConsumerFactory1());
    ContainerProperties containerProperties = factory.getContainerProperties();
    containerProperties.setPollTimeout(...);
    containerProperties.setAckMode(AckMode...);
    return factory;
}

// LISTENER 2
@Bean
@ConditionalOnMissingBean(name = "yourListenerFactory2")
public ConsumerFactory<String, YourCustomObject2> yourConsumerFactory2() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "YOUR-GROUP-2");
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(),
            new JsonDeserializer<>(YourCustomObject2.class));
}

@Bean(name = "yourListenerFactory2")
public ConcurrentKafkaListenerContainerFactory<String, YourCustomObject2> yourListenerFactory2() {
    ConcurrentKafkaListenerContainerFactory<String, YourCustomObject2> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(yourConsumerFactory2());
    ContainerProperties containerProperties = factory.getContainerProperties();
    containerProperties.setPollTimeout(...);
    containerProperties.setAckMode(AckMode...);
    return factory;
}
Also, I have set the spring.autoconfigure.exclude property; this is a must:

spring.autoconfigure.exclude=org.springframework.boot.autoconfigure.kafka.KafkaAutoConfiguration
This is my consumer config.

Consumer 1:

@KafkaListener(id = "your-consumer-1",
        topicPattern = "your-topic-1",
        containerFactory = "yourListenerFactory1")
public void consumer1(YourCustomObject1 data,
        Acknowledgment acknowledgment,
        @Header(KafkaHeaders.RECEIVED_PARTITION_ID) List<Integer> partitions,
        @Header(KafkaHeaders.RECEIVED_TOPIC) List<String> topics,
        @Header(KafkaHeaders.OFFSET) List<Long> offsets) throws Exception { ... }

Consumer 2:

@KafkaListener(id = "your-consumer-2",
        topicPattern = "your-topic-2",
        containerFactory = "yourListenerFactory2")
public void consumer2(YourCustomObject2 data,
        Acknowledgment acknowledgment,
        @Header(KafkaHeaders.RECEIVED_PARTITION_ID) List<Integer> partitions,
        @Header(KafkaHeaders.RECEIVED_TOPIC) List<String> topics,
        @Header(KafkaHeaders.OFFSET) List<Long> offsets) throws Exception { ... }
Also, my Kafka template was:

@Autowired
KafkaTemplate<String, Object> kafkaTemplate;
You can specify the container factory for each listener in its @KafkaListener definition, as follows:

@KafkaListener(topics = "fileTopic", containerFactory = "kafkaFileListenerContainerFactory")
public void fileConsumer(...) {...}

@KafkaListener(topics = "jsonTopic", containerFactory = "kafkaJsonListenerContainerFactory")
public void jsonConsumer(...) {...}
