Autowire in Kafka Configuration - Spring

I am not able to autowire MyRestcall in the MySerializer class.
It's throwing an NPE because MyRestcall isn't getting autowired. How can I autowire it?
MyRestcall is annotated with the @Service annotation.
@Configuration
public class KafkaConfiguration {

    @Bean
    public ProducerFactory<String, MyMessage> producerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, MySerializer.class);
        return new DefaultKafkaProducerFactory<>(config);
    }
}
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Serializer;

public class MySerializer implements Serializer {
    @Autowired
    private MyRestcall myRestcall;

    @Override
    public byte[] serialize(String topic, Object data) {
        try {
            myRestcall.m1();
            return out.toByteArray(); // 'out' elided in the original snippet
        } catch (Exception e) {
            throw new SerializationException("Error serializing " + topic, e);
        }
    }
}

config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, MySerializer.class);
With this configuration, Kafka creates the serializer itself, so Spring never injects its dependencies.
You need to define MySerializer as a @Bean and pass the instance into the producer factory directly, as shown below.
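A minimal sketch of that approach (the bootstrap address is a placeholder; MyRestcall and MyMessage come from the question):
@Configuration
public class KafkaConfiguration {

    @Bean
    public MySerializer mySerializer() {
        return new MySerializer(); // created by Spring, so the @Autowired MyRestcall gets injected
    }

    @Bean
    public ProducerFactory<String, MyMessage> producerFactory(MySerializer mySerializer) {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        // pass serializer instances instead of classes, so the Spring-managed instance is used
        return new DefaultKafkaProducerFactory<>(config, new StringSerializer(), mySerializer);
    }
}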

Related

Error while Deserializing object when sending to Kafka topic

I am new to Kafka. I am trying to send a message to a Kafka topic which contains a header and a payload.
Below is the error:
org.apache.kafka.common.errors.SerializationException: Can't convert value of class com.cabservice.request.CabLocationPayload to class org.apache.kafka.common.serialization.StringSerializer specified in value.serializer
Caused by: java.lang.ClassCastException: class com.cabservice.request.CabLocationPayload cannot be cast to class java.lang.String
Payload:
{
  "header": {
    "eventName": "CAB-LOCATION",
    "eventId": "3b1i333kiwoskl",
    "timestamp": 1615205167470
  },
  "payload": {
    "cabId": "cc8",
    "driverId": "test@gmail.com",
    "geoLocation": {
      "id": "1234",
      "latitude": 78.12,
      "longitude": 45.23
    }
  }
}
I have CabLocationPayload which has fields Header and Payload.
public class CabLocationPayload {
    private Header header;
    private Payload payload;
    // getters and setters
}
In the controller:
@PostMapping(value = "/publish")
public void sendMessageToKafkaTopic(@RequestBody CabLocationPayload cabLocationPayload) {
    // ...
}
Header and Payload have fields matching the JSON.
After changing VALUE_SERIALIZER_CLASS_CONFIG in the producer, I am able to see the data, but it is still failing with a ClassCastException.
@Configuration
public class KafkaConfiguration {

    @Bean
    public ProducerFactory<String, String> producerFactoryString() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplateString() {
        return new KafkaTemplate<>(producerFactoryString());
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        configProps.put(ConsumerConfig.GROUP_ID_CONFIG, "group_id");
        configProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        configProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(configProps);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
The current error is:
2021-04-22 09:41:20.108  INFO 22561 --- [ad | producer-1] org.apache.kafka.clients.Metadata : [Producer clientId=producer-1] Cluster ID: lWghv-b_RG-_hO-qOp_cjA
2021-04-22 09:41:20.123 ERROR 22561 --- [nio-9080-exec-2] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.apache.kafka.common.errors.SerializationException: Can't convert value of class com.cabservice.request.CabLocationPayload to class org.apache.kafka.common.serialization.StringSerializer specified in value.serializer] with root cause
java.lang.ClassCastException: class com.cabservice.request.CabLocationPayload cannot be cast to class java.lang.String (com.cabservice.request.CabLocationPayload is in unnamed module of loader org.springframework.boot.devtools.restart.classloader.RestartClassLoader @1144043d; java.lang.String is in module java.base of loader 'bootstrap')
at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:28) ~[kafka-clients-2.6.0.jar:na]
Any help is really appreciated.
The Kafka 'value.serializer' config should be a Serializer subclass, not your object type.
For example:
key: VALUE_SERIALIZER_CLASS_CONFIG, value: JsonSerializer.class (from org.springframework.kafka.support.serializer)
Example producer config:
@EnableKafka
@Configuration
public class KafkaProducerConfiguration {

    @Bean
    KafkaTemplate<String, Object> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean
    public ProducerFactory<String, Object> producerFactory() {
        return new DefaultKafkaProducerFactory<>(getConfig());
    }

    private Map<String, Object> getConfig() {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "brokers");
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return config;
    }
}
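With JsonSerializer in place, the payload object can be handed to the template directly; a hypothetical usage (topic name assumed):
@Autowired
private KafkaTemplate<String, Object> kafkaTemplate;

public void publish(CabLocationPayload payload) {
    // JsonSerializer converts the object to JSON; no manual toString() needed
    kafkaTemplate.send("cab-location", payload);
}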
Example consumer config:
You have to replace YourClass with the class you want to consume (for this example: CabLocationPayload).
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer;
import org.springframework.kafka.support.serializer.JsonDeserializer;

import java.util.HashMap;
import java.util.Map;

@Configuration
public class KafkaConsumerConfiguration {

    private Map<String, Object> consumerConfigs() {
        final Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "your brokers");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "consumer-group-id");
        return props;
    }

    @Bean
    public ConsumerFactory<String, YourClass> kafkaListenerConsumerFactory() {
        final ErrorHandlingDeserializer<YourClass> errorHandlingDeserializer =
                new ErrorHandlingDeserializer<>(new JsonDeserializer<>(YourClass.class, false));
        return new DefaultKafkaConsumerFactory<>(this.consumerConfigs(), new StringDeserializer(), errorHandlingDeserializer);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, YourClass> kafkaListenerContainerFactory() {
        final ConcurrentKafkaListenerContainerFactory<String, YourClass> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(this.kafkaListenerConsumerFactory());
        return factory;
    }
}
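For completeness, a listener wired to this factory might look like the following (the topic name is an assumption):
@KafkaListener(topics = "cab-location", containerFactory = "kafkaListenerContainerFactory")
public void listen(YourClass payload) {
    // payload arrives already deserialized by JsonDeserializer
}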
Moved Kafka configs to application.properties
spring.kafka.consumer.bootstrap-servers: localhost:9092
spring.kafka.consumer.group-id: group-id
spring.kafka.consumer.auto-offset-reset: earliest
spring.kafka.consumer.key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.trusted.packages=*
spring.kafka.producer.bootstrap-servers: localhost:9092
spring.kafka.producer.key-serializer: org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
Not sure if spring.kafka.consumer.properties.spring.json.trusted.packages=* is making the difference.

Setting ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG to IntegerSerializer in Spring Boot Kafka gives a ClassCastException

I am using Spring Boot with Kafka to set up my project, but when I run it and send a request using the KafkaTemplate, it throws this exception:
org.apache.kafka.common.errors.SerializationException: Can't convert key of class java.lang.Integer to class org.apache.kafka.common.serialization.StringSerializer specified in key.serializer
Caused by: java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.String
@EmbeddedKafka(partitions = 1)
@SpringBootTest
class KafkaApplicationTests {

    @Autowired
    private MyKafkaListener listener;

    @Autowired
    private KafkaTemplate<Integer, String> template;

    @Autowired
    private EmbeddedKafkaBroker embeddedKafka;

    @Test
    public void testSimple() throws Exception {
        template.send("annotated1", 0, "foo");
        template.flush();
        assertTrue(this.listener.latch1.await(10, TimeUnit.SECONDS));
    }

    @Configuration
    @EnableKafka
    public class Config {

        @Bean
        ConcurrentKafkaListenerContainerFactory<Integer, String> kafkaListenerContainerFactory() {
            ConcurrentKafkaListenerContainerFactory<Integer, String> factory =
                    new ConcurrentKafkaListenerContainerFactory<>();
            factory.setConsumerFactory(consumerFactory());
            return factory;
        }

        @Bean
        public ConsumerFactory<Integer, String> consumerFactory() {
            return new DefaultKafkaConsumerFactory<>(consumerConfigs());
        }

        @Bean
        public Map<String, Object> consumerConfigs() {
            Map<String, Object> props = new HashMap<>();
            // props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafka.getBrokersAsString());
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            return props;
        }

        @Bean
        public ProducerFactory<Integer, String> producerFactory() {
            return new DefaultKafkaProducerFactory<>(producerConfigs());
        }

        @Bean
        public Map<String, Object> producerConfigs() {
            Map<String, Object> props = new HashMap<>();
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class);
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class);
            // props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafka.getBrokersAsString());
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            return props;
        }

        @Bean
        public KafkaTemplate<Integer, String> kafkaTemplate() {
            return new KafkaTemplate<Integer, String>(producerFactory());
        }
    }
}
But when I change it to template.send("annotated1", "key-foo", "foo"); it works. I have used
@Bean
public Map<String, Object> producerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class);
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    return props;
}
this configuration as well, but it is still giving me:
Caused by: java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.String
Can someone please help me? Thanks.
You need to add the main application class and the Config class to the @SpringBootTest:
@SpringBootTest(classes = { So61985794Application.class, So61985794ApplicationTests.Config.class })
to override Boot's normal configuration.

How to use "li-apache-kafka-clients" in spring boot app to send large message (above 1MB) from Kafka producer?

How to use li-apache-kafka-clients in a Spring Boot app to send large messages (above 1 MB) from a Kafka producer to a Kafka consumer? Below is the GitHub link for li-apache-kafka-clients:
https://github.com/linkedin/li-apache-kafka-clients
I have imported the .jar file of li-apache-kafka-clients and set the below configuration for the producer:
props.put("large.message.enabled", "true");
props.put("max.message.segment.bytes", 1000 * 1024);
props.put("segment.serializer", DefaultSegmentSerializer.class.getName());
and for the consumer:
message.assembler.buffer.capacity,
max.tracked.messages.per.partition,
exception.on.message.dropped,
segment.deserializer.class
but I am still getting an error for large messages. Please help me solve it.
Below is my code; please let me know where I need to create the LiKafkaProducer:
@Configuration
public class KafkaProducerConfig {

    @Value("${kafka.boot.server}")
    private String kafkaServer;

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfig());
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<String, String>(producerFactory());
    }

    @Bean
    public Map<String, Object> producerConfig() {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put("bootstrap.servers", "localhost:9092");
        config.put("acks", "all");
        config.put("retries", 0);
        config.put("batch.size", 16384);
        config.put("linger.ms", 1);
        config.put("buffer.memory", 33554432);
        // The following properties are used by LiKafkaProducerImpl
        config.put("large.message.enabled", "true");
        config.put("max.message.segment.bytes", 1000 * 1024);
        config.put("segment.serializer", DefaultSegmentSerializer.class.getName());
        config.put("auditor.class", LoggingAuditor.class.getName());
        return config;
    }
}
@RestController
@RequestMapping("/kafkaProducer")
public class KafkaProducerController {

    @Autowired
    private KafkaSender sender;

    @PostMapping
    public ResponseEntity<List<Student>> sendData(@RequestBody List<Student> student) {
        sender.sendData(student);
        return new ResponseEntity<List<Student>>(student, HttpStatus.OK);
    }
}
@Service
public class KafkaSender {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaSender.class);

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Value("${kafka.topic.name}")
    private String topicName;

    public void sendData(List<Student> student) {
        Map<String, Object> headers = new HashMap<>();
        headers.put(KafkaHeaders.TOPIC, topicName);
        headers.put("payload", student.get(0));

        // Construct a JSONObject from a Map.
        JSONObject headerObject = new JSONObject(headers);
        System.out.println("\nMethod-2: Using new JSONObject() ==> " + headerObject);

        final String record = headerObject.toString();
        Message<String> message = MessageBuilder.withPayload(record)
                .setHeader(KafkaHeaders.TOPIC, topicName)
                .setHeader(KafkaHeaders.MESSAGE_KEY, "Message")
                .build();
        kafkaTemplate.send(topicName, message.toString());
    }
}
You would need to implement your own ConsumerFactory and ProducerFactory to create the LiKafkaConsumer and LiKafkaProducer, respectively.
You should be able to subclass the default factories provided by the framework, as sketched below.
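A rough sketch of the producer side, assuming LiKafkaProducerImpl accepts a Properties object as shown in the project's README (untested; the String generic types are illustrative):
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.producer.Producer;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import com.linkedin.kafka.clients.producer.LiKafkaProducerImpl;

public class LiKafkaProducerFactory extends DefaultKafkaProducerFactory<String, String> {

    public LiKafkaProducerFactory(Map<String, Object> configs) {
        super(configs);
    }

    @Override
    public Producer<String, String> createProducer() {
        // Hand Spring a LiKafkaProducer so large-message segmentation applies.
        Properties props = new Properties();
        props.putAll(getConfigurationProperties());
        return new LiKafkaProducerImpl<>(props);
    }
}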

Spring Kafka and exactly once delivery guarantee

I use Spring Kafka and Spring Boot and am just wondering how to configure my consumer, for example:
@KafkaListener(topics = "${kafka.topic.post.send}", containerFactory = "postKafkaListenerContainerFactory")
public void sendPost(ConsumerRecord<String, Post> consumerRecord, Acknowledgment ack) {
    // do some logic
    ack.acknowledge();
}
to use the exactly once delivery guarantee?
Should I only add the org.springframework.transaction.annotation.Transactional annotation over the sendPost method and that's it, or do I need to perform some extra steps to achieve this?
UPDATED
This is my current config
@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(KafkaProperties kafkaProperties, KafkaTransactionManager<Object, Object> transactionManager) {
    kafkaProperties.getProperties().put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, kafkaConsumerMaxPollIntervalMs);
    kafkaProperties.getProperties().put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, kafkaConsumerMaxPollRecords);
    ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
    //factory.getContainerProperties().setAckMode(AckMode.MANUAL_IMMEDIATE);
    factory.getContainerProperties().setTransactionManager(transactionManager);
    factory.setConsumerFactory(consumerFactory(kafkaProperties));
    return factory;
}

@Bean
public Map<String, Object> producerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, 15000000);
    return props;
}

@Bean
public ProducerFactory<String, Post> postProducerFactory() {
    return new DefaultKafkaProducerFactory<>(producerConfigs());
}

@Bean
public KafkaTemplate<String, Post> postKafkaTemplate() {
    return new KafkaTemplate<>(postProducerFactory());
}

@Bean
public ProducerFactory<String, Update> updateProducerFactory() {
    return new DefaultKafkaProducerFactory<>(producerConfigs());
}

@Bean
public KafkaTemplate<String, Update> updateKafkaTemplate() {
    return new KafkaTemplate<>(updateProducerFactory());
}

@Bean
public ProducerFactory<String, Message> messageProducerFactory() {
    return new DefaultKafkaProducerFactory<>(producerConfigs());
}

@Bean
public KafkaTemplate<String, Message> messageKafkaTemplate() {
    return new KafkaTemplate<>(messageProducerFactory());
}
but it fails with the following error:
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 0 of method kafkaTransactionManager in org.springframework.boot.autoconfigure.kafka.KafkaAutoConfiguration required a single bean, but 3 were found:
- postProducerFactory: defined by method 'postProducerFactory' in class path resource [com/example/domain/configuration/messaging/KafkaProducerConfig.class]
- updateProducerFactory: defined by method 'updateProducerFactory' in class path resource [com/example/domain/configuration/messaging/KafkaProducerConfig.class]
- messageProducerFactory: defined by method 'messageProducerFactory' in class path resource [com/example/domain/configuration/messaging/KafkaProducerConfig.class]
What am I doing wrong?
You should not use manual acknowledgments. Instead, inject a KafkaTransactionManager into the listener container, and the container will send the offsets to the transaction when the listener method exits normally (and roll back otherwise).
You should not do acks via the consumer for exactly once.
EDIT
application.yml
spring:
  kafka:
    consumer:
      auto-offset-reset: earliest
      enable-auto-commit: false
      properties:
        isolation:
          level: read_committed
    producer:
      transaction-id-prefix: myTrans.
App
@SpringBootApplication
public class So52570118Application {

    public static void main(String[] args) {
        SpringApplication.run(So52570118Application.class, args);
    }

    @Bean // override Boot's auto-config to add the transaction manager
    public ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory(
            ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
            ConsumerFactory<Object, Object> kafkaConsumerFactory,
            KafkaTransactionManager<Object, Object> transactionManager) {
        ConcurrentKafkaListenerContainerFactory<Object, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
        configurer.configure(factory, kafkaConsumerFactory);
        factory.getContainerProperties().setTransactionManager(transactionManager);
        return factory;
    }

    @Autowired
    private KafkaTemplate<String, String> template;

    @KafkaListener(id = "so52570118", topics = "so52570118")
    public void listen(String in) throws Exception {
        System.out.println(in);
        Thread.sleep(5_000);
        this.template.send("so52570118out", in.toUpperCase());
        System.out.println("sent");
    }

    @KafkaListener(id = "so52570118out", topics = "so52570118out")
    public void listenOut(String in) {
        System.out.println(in);
    }

    @Bean
    public ApplicationRunner runner() {
        return args -> this.template.executeInTransaction(t -> t.send("so52570118", "foo"));
    }

    @Bean
    public NewTopic topic1() {
        return new NewTopic("so52570118", 1, (short) 1);
    }

    @Bean
    public NewTopic topic2() {
        return new NewTopic("so52570118out", 1, (short) 1);
    }
}
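As for the startup failure in the question: Boot's auto-configured kafkaTransactionManager needs exactly one ProducerFactory bean, and the updated config defines three. One way to resolve it (an illustrative sketch, not part of the original answer) is to declare the transaction manager yourself against one factory, which makes Boot's conditional bean back off:
@Bean
public KafkaTransactionManager<String, Post> kafkaTransactionManager(
        @Qualifier("postProducerFactory") ProducerFactory<String, Post> postProducerFactory) {
    return new KafkaTransactionManager<>(postProducerFactory);
}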

Problems adding multiple KafkaListenerContainerFactories

Hi, I'm currently dabbling in Spring Kafka and succeeded in adding a single KafkaListenerContainerFactory to my listener. Now I'd like to add multiple KafkaListenerContainerFactories (one for a topic with messages in JSON, another one for strings). See the code below:
@EnableKafka
@Configuration
public class KafkaConsumersConfig {

    private final KafkaConfiguration kafkaConfiguration;

    @Autowired
    public KafkaConsumersConfig(KafkaConfiguration kafkaConfiguration) {
        this.kafkaConfiguration = kafkaConfiguration;
    }

    @Bean
    public KafkaListenerContainerFactory<?> kafkaJsonListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Record> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(jsonConsumerFactory());
        factory.setConcurrency(3);
        factory.setAutoStartup(true);
        return factory;
    }

    @Bean
    public ConsumerFactory<String, Record> jsonConsumerFactory() {
        JsonDeserializer<Record> jsonDeserializer = new JsonDeserializer<>(Record.class);
        return new DefaultKafkaConsumerFactory<>(jsonConsumerConfigs(), new StringDeserializer(), jsonDeserializer);
    }

    @Bean
    public Map<String, Object> jsonConsumerConfigs() {
        Map<String, Object> propsMap = new HashMap<>();
        propsMap.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaConfiguration.getBrokerAddress());
        propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, kafkaConfiguration.getJsonGroupId());
        propsMap.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, kafkaConfiguration.getAutoCommit());
        propsMap.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, kafkaConfiguration.getAutoCommitInterval());
        propsMap.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, kafkaConfiguration.getSessionTimeout());
        return propsMap;
    }

    @Bean
    public KafkaListenerContainerFactory<?> kafkaFileListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(fileConsumerFactory());
        factory.setConcurrency(3);
        factory.setAutoStartup(true);
        return factory;
    }

    @Bean
    public ConsumerFactory<String, String> fileConsumerFactory() {
        return new DefaultKafkaConsumerFactory<>(fileConsumerConfigs());
    }

    @Bean
    public Map<String, Object> fileConsumerConfigs() {
        Map<String, Object> propsMap = new HashMap<>();
        propsMap.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaConfiguration.getBrokerAddress());
        propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, kafkaConfiguration.getFileGroupId());
        propsMap.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, kafkaConfiguration.getAutoCommit());
        propsMap.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, kafkaConfiguration.getAutoCommitInterval());
        propsMap.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, kafkaConfiguration.getSessionTimeout());
        return propsMap;
    }
}
Running this gives me the following error:
Description:
Parameter 1 of method kafkaListenerContainerFactory in org.springframework.boot.autoconfigure.kafka.KafkaAnnotationDrivenConfiguration required a bean of type 'org.springframework.kafka.core.ConsumerFactory' that could not be found.
- Bean method 'kafkaConsumerFactory' in 'KafkaAutoConfiguration' not loaded because @ConditionalOnMissingBean (types: org.springframework.kafka.core.ConsumerFactory; SearchStrategy: all) found beans 'jsonConsumerFactory', 'fileConsumerFactory'
Action:
Consider revisiting the conditions above or defining a bean of type 'org.springframework.kafka.core.ConsumerFactory' in your configuration.
What am I doing wrong?
It looks like you are not going to rely on Spring Boot's Kafka auto-configuration.
Spring Boot provides this in the KafkaAutoConfiguration:
@Bean
@ConditionalOnMissingBean(ConsumerFactory.class)
public ConsumerFactory<?, ?> kafkaConsumerFactory() {
Since you have jsonConsumerFactory and fileConsumerFactory, they override the one provided by the auto-config.
But on the other hand, in the KafkaAnnotationDrivenConfiguration, none of your factories can be applied:
@Bean
@ConditionalOnMissingBean(name = "kafkaListenerContainerFactory")
public ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory(
        ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
        ConsumerFactory<Object, Object> kafkaConsumerFactory) {
because your ConsumerFactory beans are not of the ConsumerFactory<Object, Object> type.
So:
Just exclude KafkaAutoConfiguration from the Spring Boot auto-configuration by adding the following to the application properties file:
spring.autoconfigure.exclude=org.springframework.boot.autoconfigure.kafka.KafkaAutoConfiguration
or rename one of your KafkaListenerContainerFactory beans to kafkaListenerContainerFactory to override it in Boot (see the sketch after these options),
or make one of the ConsumerFactory beans a ConsumerFactory<Object, Object> type.
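For instance, the renaming option might look like this (reusing the question's jsonConsumerFactory; only a sketch):
@Bean(name = "kafkaListenerContainerFactory")
public KafkaListenerContainerFactory<?> kafkaJsonListenerContainerFactory() {
    // Using the default bean name makes Boot's auto-configured factory back off.
    ConcurrentKafkaListenerContainerFactory<String, Record> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(jsonConsumerFactory());
    return factory;
}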
I have achieved it with the below code and it's working fine for me.
// LISTENER 1
@Bean
@ConditionalOnMissingBean(name = "yourListenerFactory1")
public ConsumerFactory<String, YourCustomObject1> yourConsumerFactory1() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "YOUR-GROUP-1");
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(),
            new JsonDeserializer<>(YourCustomObject1.class));
}

@Bean(name = "yourListenerFactory1")
public ConcurrentKafkaListenerContainerFactory<String, YourCustomObject1> yourListenerFactory1() {
    ConcurrentKafkaListenerContainerFactory<String, YourCustomObject1> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(yourConsumerFactory1());
    ContainerProperties containerProperties = factory.getContainerProperties();
    containerProperties.setPollTimeout(...);
    containerProperties.setAckMode(AckMode...);
    return factory;
}

// LISTENER 2
@Bean
@ConditionalOnMissingBean(name = "yourListenerFactory2")
public ConsumerFactory<String, YourCustomObject2> yourConsumerFactory2() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "YOUR-GROUP-2");
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(),
            new JsonDeserializer<>(YourCustomObject2.class));
}

@Bean(name = "yourListenerFactory2")
public ConcurrentKafkaListenerContainerFactory<String, YourCustomObject2> yourListenerFactory2() {
    ConcurrentKafkaListenerContainerFactory<String, YourCustomObject2> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(yourConsumerFactory2());
    ContainerProperties containerProperties = factory.getContainerProperties();
    containerProperties.setPollTimeout(...);
    containerProperties.setAckMode(AckMode...);
    return factory;
}
Also, I have set the spring.autoconfigure.exclude property, as it's a must:
spring.autoconfigure.exclude=org.springframework.boot.autoconfigure.kafka.KafkaAutoConfiguration
This is my Consumer config
Consumer 1
@KafkaListener(id = "your-consumer-1",
        topicPattern = "your-topic-1",
        containerFactory = "yourListenerFactory1")
public void consumer1(YourCustomObject1 data,
        Acknowledgment acknowledgment,
        @Header(KafkaHeaders.RECEIVED_PARTITION_ID) List<Integer> partitions,
        @Header(KafkaHeaders.RECEIVED_TOPIC) List<String> topics,
        @Header(KafkaHeaders.OFFSET) List<Long> offsets) throws Exception { ... }
Consumer 2
@KafkaListener(id = "your-consumer-2",
        topicPattern = "your-topic-2",
        containerFactory = "yourListenerFactory2")
public void consumer2(YourCustomObject2 data,
        Acknowledgment acknowledgment,
        @Header(KafkaHeaders.RECEIVED_PARTITION_ID) List<Integer> partitions,
        @Header(KafkaHeaders.RECEIVED_TOPIC) List<String> topics,
        @Header(KafkaHeaders.OFFSET) List<Long> offsets) throws Exception { ... }
Also, my Kafka template was:
@Autowired
KafkaTemplate<String, Object> kafkaTemplate;
You can define each container factory in the @KafkaListener definition as follows:
@KafkaListener(topics = "fileTopic", containerFactory = "kafkaFileListenerContainerFactory")
public void fileConsumer(...) { ... }

@KafkaListener(topics = "jsonTopic", containerFactory = "kafkaJsonListenerContainerFactory")
public void jsonConsumer(...) { ... }
