Producer Kafka throws deserialization exception - spring

I have one topic and a producer/consumer pair.
Dependencies (Spring Initializr):
Producer: Apache Kafka
Consumer: Apache Kafka Streams, Spring Cloud Stream
Producer:
KafkaProducerConfig
@Configuration
public class KafkaProducerConfig {

    @Bean
    public KafkaTemplate<String, Person> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean
    public ProducerFactory<String, Person> producerFactory() {
        Map<String, Object> configs = new HashMap<>();
        configs.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        configs.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configs.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(configs);
    }
}
Controller:
@RestController
public class KafkaProducerApplication {

    private KafkaTemplate<String, Person> kafkaTemplate;

    public KafkaProducerApplication(KafkaTemplate<String, Person> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @GetMapping("/persons")
    public Mono<List<Person>> findAll() {
        var personList = Mono.just(Arrays.asList(new Person("Name1", 15),
                new Person("Name2", 10)));
        personList.subscribe(dataList -> kafkaTemplate.send("topic_test_spring", dataList.get(0)));
        return personList;
    }
}
It works correctly when the endpoint is accessed and does not throw any exception in the IntelliJ console.
Consumer (application.yml):
spring:
  cloud:
    stream:
      function:
        definition: personService
      bindings:
        personService-in-0:
          destination: topic_test_spring
      kafka:
        bindings:
          personService-in-0:
            consumer:
              configuration:
                value:
                  deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
        binders:
          brokers:
            - localhost:9091
            - localhost:9092
  kafka:
    consumer:
      properties:
        spring:
          json:
            trusted:
              packages: "*"
PersonKafkaConsumer
@Configuration
public class PersonKafkaConsumer {

    @Bean
    public Consumer<KStream<String, Person>> personService() {
        return kstream -> kstream.foreach((key, person) -> {
            System.out.println(person.getName());
        });
    }
}
Here I get the exception when I run the project.
org.apache.kafka.streams.errors.StreamsException: Deserialization exception handler is set to fail upon a deserialization error. If you would rather have the streaming pipeline continue after a deserialization error, please set the default.deserialization.exception.handler appropriately.
Caused by: java.lang.IllegalArgumentException: The class 'com.example.producer.model.Person' is not in the trusted packages: [java.util, java.lang, com.nttdata.bootcamp.yanki.model, com.nttdata.bootcamp.yanki.model.*]. If you believe this class is safe to deserialize, please provide its name. If the serialization is only done by a trusted source, you can also enable trust all (*).
The package indicated in the exception is the entity's package, but in the producer project. The producer's properties file has no Kafka configuration.
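A possible fix (a minimal sketch, not from the original thread): the spring.kafka.consumer.properties block above configures Boot's own Kafka consumers, not the Kafka Streams binder, so the trusted-packages setting never reaches the streams consumer. Assuming the Kafka Streams binder's configuration map, it could be set like this:
spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            configuration:
              # Trust all packages for the binder's JsonDeserializer/JsonSerde
              spring.json.trusted.packages: "*"
Alternatively, the producer could set JsonSerializer.ADD_TYPE_INFO_HEADERS to false in its producer configs, so no producer-package type header is sent and the consumer deserializes into its own Person type.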

Related

Spring Cloud Stream producer error handler does not work

I'm using Spring Cloud Stream version 3.1.4, and this is my producer:
@Component
public class Producer {

    @Autowired
    private StreamBridge streamBridge;

    public void produce(int messageId, Object message) {
        Message<Object> msg = MessageBuilder
                .withPayload(message)
                .setHeader("partitionKey", messageId)
                .build();
        streamBridge.send("outputchannel-out-0", msg);
    }

    @ServiceActivator(inputChannel = "errorchannel.errors")
    public void errorHandler(ErrorMessage em) {
        log.info("Error: {}", em);
    }
}
In application.yaml I set errorChannelEnabled:
spring:
  cloud:
    stream:
      bindings:
        # Channel name
        outputchannel-out-0:
          destination: my-topic
          contentType: application/json
          producer:
            partitionKeyExpression: headers['partitionKey']
            partitionCount: 1
            errorChannelEnabled: true
Now, if I change the produce() method as follows, in order to test the error handler:
public void produce(int messageId, Object message) {
    throw new RuntimeException("Producer error");
}
Nothing happens.
Error handler is not triggered.
I'm not sure that this is the right way to set up the error handler in Spring Cloud Stream 3.1.4.
Can you help me?
errorchannel.errors does not exist.
There are two error channels: errorChannel is the global error channel; the binding-specific error channel is named <destination>.<group>.errors. You don't currently have a group.
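As a minimal sketch (not from the original answer), the handler can subscribe to the global channel, which exists without any extra configuration:
// Listens on the global error channel instead of the non-existent
// "errorchannel.errors"; the payload is typically a MessagingException
// whose failedMessage carries the original outbound message.
@ServiceActivator(inputChannel = "errorChannel")
public void errorHandler(ErrorMessage em) {
    log.info("Error: {}", em);
}
Once a consumer binding declares a group (e.g. spring.cloud.stream.bindings.<binding>.group: mygroup, an assumed name), the binding-specific channel follows the <destination>.<group>.errors pattern above.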

Spring Kafka 2.6.x ErrorHandler and DeadLetterPublishingRecoverer with ConcurrentKafkaListenerContainerFactory

We are trying to use the DLT feature in Spring Kafka 2.6.x. This is the config yml:
kafka:
  bootstrap-servers: localhost:9092
  auto-offset-reset: earliest
  consumer:
    key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
    value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
    enable-auto-commit: false
    properties:
      isolation.level: read_committed
      fetch.max.wait.ms: 100
      spring.json.value.default.type: 'com.sample.entity.Event'
      spring.json.trusted.packages: 'com.sample.entity.*'
      spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
      spring.deserializer.value.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
  producer:
    bootstrap-servers: localhost:9092
    key-serializer: org.apache.kafka.common.serialization.StringSerializer
    value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
And here is the KafkaConfig class:
@EnableKafka
@Configuration
@Log4j2
public class KafkaConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Event> kafkaListenerContainerFactory(
            ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
            ConsumerFactory<Object, Object> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, Event> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        return factory;
    }

    @Bean
    public SeekToCurrentErrorHandler errorHandler(DeadLetterPublishingRecoverer deadLetterPublishingRecoverer) {
        SeekToCurrentErrorHandler handler = new SeekToCurrentErrorHandler(deadLetterPublishingRecoverer);
        handler.addNotRetryableExceptions(UnprocessableException.class);
        return handler;
    }

    @Bean
    public DeadLetterPublishingRecoverer publisher(KafkaOperations kafkaOperations) {
        return new DeadLetterPublishingRecoverer(kafkaOperations);
    }
}
It is fine without the ConcurrentKafkaListenerContainerFactory, but since we want to scale the number of instances up and down, we want to use the ConcurrentKafkaListenerContainer.
What is the proper way to do this?
Also, I found that if it is a deserialization exception, the message in .DLT is not sent properly (not proper JSON), while if it is UnprocessableException (our custom exception thrown within the listener) it is proper JSON in .DLT.
Since you are wiring up your own container factory, you have to set the error handler on it.
but since we want to scale the number of instances up and down, we want to use the ConcurrentKafkaListenerContainer.
Boot's auto configuration wires up a concurrent container with concurrency=1 (if there is no ....listener.concurrency property); so you can use Boot's factory.
For deserialization exceptions (all exceptions), the record.value() is whatever came in originally. If that's not what you are seeing, please provide an example of what's in the original record and the DLT.
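A hedged sketch of that wiring, reusing the beans from the question (setErrorHandler is the 2.6.x API):
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Event> kafkaListenerContainerFactory(
        ConsumerFactory<Object, Object> consumerFactory,
        SeekToCurrentErrorHandler errorHandler) {
    ConcurrentKafkaListenerContainerFactory<String, Event> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    // Without this line, failed records are never handed to the
    // DeadLetterPublishingRecoverer behind the error handler.
    factory.setErrorHandler(errorHandler);
    return factory;
}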

Spring-kafka error handling with DeadLetterPublishingRecoverer

I am trying to implement error handling in Spring Boot Kafka. In my Kafka listener I am throwing a runtime exception, as per below:
@KafkaListener(topics = "Kafka-springboot-example", groupId = "group-employee-json")
public void consumeEmployeeJson(Employee employee) {
    logger.info("Consumed Employee JSON: " + employee);
    if (null == employee.getEmployeeId()) {
        throw new RuntimeException("failed");
        //throw new ListenerExecutionFailedException("failed");
    }
}
And I have configured error handling as per below:
@Configuration
@EnableKafka
public class KafkaConfiguration {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<Object, Object> containerFactory(
            ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
            ConsumerFactory<Object, Object> kafkaConsumerFactory,
            KafkaTemplate<Object, Object> template) {
        ConcurrentKafkaListenerContainerFactory<Object, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
        configurer.configure(factory, kafkaConsumerFactory);
        factory.setErrorHandler(new SeekToCurrentErrorHandler(
                new DeadLetterPublishingRecoverer(template)));
        return factory;
    }
}
And my listener for DLT is as per below:
@KafkaListener(topics = "Kafka-springboot-example.DLT", groupId = "group-employee-json")
public void consumeEmployeeErrorJson(Employee employee) {
    logger.info("Consumed Employee JSON from DLT topic: " + employee);
}
But my message is not getting published to the DLT topic.
Any idea what I am doing wrong?
Edited:
application.properties
server.port=8088
#kafka-producer-config
spring.kafka.producer.bootstrap-servers=localhost:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
#Kafka consumer properties
spring.kafka.consumer.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=group-employee-json
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.trusted.packages=*
public ConcurrentKafkaListenerContainerFactory<Object, Object> containerFactory(
If you use a non-standard bean name for the container factory, you need to set it on the @KafkaListener in its containerFactory property.
The default bean name is kafkaListenerContainerFactory, which is auto-configured by Boot. You need to either override that bean or configure the listener to point to your non-standard bean name.
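For example, a sketch that keeps the question's bean name:
@KafkaListener(topics = "Kafka-springboot-example",
        groupId = "group-employee-json",
        containerFactory = "containerFactory") // point at the non-default factory bean
public void consumeEmployeeJson(Employee employee) {
    logger.info("Consumed Employee JSON: " + employee);
}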

Spring Kafka @SendTo throws exception: a KafkaTemplate is required to support replies

I'm trying to get the consumer result, according to the Spring Kafka docs.
Based on this Stack Overflow question, it should be possible to do this only by using the @SendTo annotation, because Spring Boot "also auto configures a kafka template if there is not one already in the context."
But I can't get it to work; I still get
java.lang.IllegalStateException: a KafkaTemplate is required to support replies
at org.springframework.util.Assert.state(Assert.java:73) ~[spring-core-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at org.springframework.kafka.config.MethodKafkaListenerEndpoint.createMessageListener(MethodKafkaListenerEndpoint.java:156)
...
This is my listener method
@KafkaListener(topics = "t_invoice")
@SendTo("t_ledger")
public List<LedgerEntry> consume(Invoice invoice) throws IOException {
    // do some processing
    var ledgerCredit = new LedgerEntry(invoice.getAmount(), "Credit side", 0, "");
    var ledgerDebit = new LedgerEntry(0, "", invoice.getAmount(), "Debit side");
    return List.of(ledgerCredit, ledgerDebit);
}
What did I miss?
This is the only @Configuration file I have on the consumer.
Consumer and producer are separate systems (e.g., a payment system produces invoices to Kafka; my program is the accounting system that takes the data and creates ledger entries).
@Configuration
public class KafkaConfig {

    @Autowired
    private KafkaProperties kafkaProperties;

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        var properties = kafkaProperties.buildConsumerProperties();
        properties.put(ConsumerConfig.METADATA_MAX_AGE_CONFIG, "600000");
        return new DefaultKafkaConsumerFactory<>(properties);
    }

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        var factory = new ConcurrentKafkaListenerContainerFactory<String, String>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
application.yml
spring:
  kafka:
    consumer:
      group-id: default-spring-consumer
      auto-offset-reset: earliest
Trial-Error 1
If I disable the KafkaConfig, or enable debug during the run, this error appears:
org.apache.kafka.common.errors.SerializationException: Can't convert value of class com.accounting.kafkaconsumer.entity.LedgerEntry to class org.apache.kafka.common.serialization.StringSerializer specified in value.serializer
Caused by: java.lang.ClassCastException: class com.accounting.kafkaconsumer.entity.LedgerEntry cannot be cast to class java.lang.String (com.accounting.kafkaconsumer.entity.LedgerEntry is in unnamed module of loader 'app'; java.lang.String is in module java.base of loader 'bootstrap')
at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:28) ~[kafka-clients-2.0.1.jar:na]
at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:65) ~[kafka-clients-2.0.1.jar:na]
at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:55) ~[kafka-clients-2.0.1.jar:na]
...
Trial-Error 2
If I disable KafkaConfig and use this signature (returning String), it works. But this is not what I want, since my configuration is in KafkaConfig:
@KafkaListener(topics = "t_invoice")
@SendTo("t_ledger")
public String consume(Invoice invoice) throws IOException {
    // do some processing
    var listLedger = List.of(ledgerCredit, ledgerDebit);
    return objectMapper.writeValueAsString(listLedger);
}
I think the problem is here (KafkaConfig): since I create a new instance of KafkaListenerContainerFactory, the replyTemplate is null.
What is the correct way to set up my KafkaConfig?
@Bean
public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
    var factory = new ConcurrentKafkaListenerContainerFactory<String, String>();
    factory.setConsumerFactory(consumerFactory());
    return factory;
}
If you override Boot's auto-configured container factory then it won't be... auto-configured, including applying the template. When you define your own factory, you are responsible for configuring it. It's not clear why you are overriding Boot's kafkaListenerContainerFactory bean, since all you are doing is injecting the consumer factory. Just remove that @Bean and use Boot's.
If you override Boot's kafkaListenerContainerFactory, make sure that you set the reply template:
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory(KafkaTemplate<String, Object> kafkaTemplate) {
    ConcurrentKafkaListenerContainerFactory<String, Object> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setReplyTemplate(kafkaTemplate); // <============
    return factory;
}

Spring Boot Auto Configuration Failed Loading Spring Kafka Properties

Spring Boot failed to load properties. Here are the properties that I am using, via the YAML file:
spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      auto-commit-interval: 100
      enable-auto-commit: true
      group-id: ********************
      auto-offset-reset: earliest
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
    producer:
      batch-size: 16384
      buffer-memory: 33554432
      retries: 0
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
    listener:
      poll-timeout: 20000
The exception I am getting is this:
Caused by: java.lang.IllegalAccessException: Class org.apache.kafka.common.utils.Utils can not access a member of class org.springframework.kafka.support.serializer.JsonDeserializer with modifiers "protected"
I think the constructor is protected. Please provide a way to instantiate this.
That's correct. See:
protected JsonDeserializer() {
    this((Class<T>) null);
}

protected JsonDeserializer(ObjectMapper objectMapper) {
    this(null, objectMapper);
}

public JsonDeserializer(Class<T> targetType) {
    this(targetType, new ObjectMapper());
    this.objectMapper.configure(MapperFeature.DEFAULT_VIEW_INCLUSION, false);
    this.objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
}
The JsonDeserializer isn't designed to be instantiated by the default constructor, because it needs to know the targetType to deserialize.
You can extend this class for your particular type:
public class FooJsonDeserializer extends JsonDeserializer<Foo> { }
and then use this class as the value for the value-deserializer property.
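For example (a sketch; com.example stands in for whatever package FooJsonDeserializer lives in):
spring.kafka.consumer.value-deserializer=com.example.FooJsonDeserializer
This works because the subclass's implicit public constructor can call the protected super constructor, and the target type is resolved from the generic parameter.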
Or you can consider customizing the DefaultKafkaConsumerFactory:
@Bean
public ConsumerFactory<?, ?> kafkaConsumerFactory(KafkaProperties properties) {
    Map<String, Object> consumerProperties = properties.buildConsumerProperties();
    consumerProperties.put(CommonClientConfigs.METRIC_REPORTER_CLASSES_CONFIG,
            MyConsumerMetricsReporter.class);
    DefaultKafkaConsumerFactory<Object, Object> consumerFactory =
            new DefaultKafkaConsumerFactory<>(consumerProperties);
    consumerFactory.setValueDeserializer(new JsonDeserializer<>(Foo.class));
    return consumerFactory;
}
