How to configure a custom Kafka deserializer and get the consumed JSON data using a KafkaListener - spring

I am trying to consume a JSON message using Spring Kafka. The message consumed by the consumer looks like this:
{
  "EventHeader": {
    "entityName": "Account",
    "changedFields": ["Id", "Name"]
  },
  "NewFields": {
    "Id": "001",
    "Name": "Test Account"
  },
  "OldFields": {}
}
So far I have created classes for "EventHeader", "NewFields", "OldFields", and "KafkaPayload", and I have also created a custom deserializer to deserialize this JSON payload. Here is my custom deserializer:
public class CustomDeserializer<T extends Serializable> implements Deserializer<T> {

    public static final String VALUE_CLASS_NAME_CONFIG = "value.class.name";

    private ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        Deserializer.super.configure(configs, isKey);
    }

    @Override
    public T deserialize(String topic, byte[] objectData) {
        return (objectData == null) ? null : (T) SerializationUtils.deserialize(objectData);
    }

    @Override
    public T deserialize(String topic, Headers headers, byte[] data) {
        return Deserializer.super.deserialize(topic, headers, data);
    }

    @Override
    public void close() {
        Deserializer.super.close();
    }
}
I have set the consumer configurations as below.
public class KafkaConfig {

    @Bean
    public KafkaConsumer<String, KafkaPayload> consumerFactory() {
        Properties config = new Properties();
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        config.put(ConsumerConfig.GROUP_ID_CONFIG, "groupId");
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, CustomDeserializer.class);
        return new KafkaConsumer<>(config);
    }
}
Now I need to show the consumed message through a @KafkaListener, setting the consumer into a ConsumerFactory, but I don't understand how to do that. This is my first time using Kafka, so could anyone give me some idea about this?
This is how I am trying to do that.
@Bean
public ConcurrentKafkaListenerContainerFactory<String, KafkaPayload> kafkaListener() {
    ConcurrentKafkaListenerContainerFactory<String, KafkaPayload> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    return factory;
}
This is my KafkaListener:
public class ConsumerService {

    @KafkaListener(topics = "Topic", groupId = "sample-group", containerFactory = "kafkaListener")
    public void consume(KafkaPayload kafkaPayload) {
        System.out.println("Consumed Message :" + kafkaPayload);
    }
}

Since you are using Spring Boot, just set the value deserializer class name as a property and Boot will automatically wire it into the container factory used for your @KafkaListener. There is no need to define your own consumer factory or container factory.
spring.kafka.consumer.value-deserializer=com.acme.CustomDeserializer
https://docs.spring.io/spring-boot/docs/current/reference/html/application-properties.html#application-properties.integration.spring.kafka.consumer.value-deserializer
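Note that the custom deserializer shown in the question uses Java serialization (SerializationUtils), which cannot parse the JSON payload above; the deserializer has to go through Jackson. A minimal sketch of a Jackson-based version, assuming KafkaPayload is a POJO mirroring the JSON in the question:

import java.io.IOException;

import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Deserializer;

import com.fasterxml.jackson.databind.ObjectMapper;

public class CustomDeserializer implements Deserializer<KafkaPayload> {

    private final ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public KafkaPayload deserialize(String topic, byte[] data) {
        if (data == null) {
            return null;
        }
        try {
            // Map the raw JSON bytes onto the KafkaPayload POJO.
            return objectMapper.readValue(data, KafkaPayload.class);
        } catch (IOException e) {
            throw new SerializationException("Failed to deserialize KafkaPayload", e);
        }
    }
}

Alternatively, Spring Kafka's built-in JsonDeserializer can do the same job without any custom code: set spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer together with spring.kafka.consumer.properties.spring.json.value.default.type=com.acme.KafkaPayload (the com.acme package is a placeholder, as in the answer above).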

Related

Spring Boot Kafka Configure DefaultErrorHandler?

I created a batch-consumer following the Spring Kafka docs:
@SpringBootApplication
public class ApplicationConsumer {

    private static final Logger LOGGER = LoggerFactory.getLogger(ApplicationConsumer.class);
    private static final String TOPIC = "foo";

    public static void main(String[] args) {
        ConfigurableApplicationContext context = SpringApplication.run(ApplicationConsumer.class, args);
    }

    @Bean
    public RecordMessageConverter converter() {
        return new JsonMessageConverter();
    }

    @Bean
    public BatchMessagingMessageConverter batchConverter() {
        return new BatchMessagingMessageConverter(converter());
    }

    @KafkaListener(topics = TOPIC)
    public void listen(List<Name> ps) {
        LOGGER.info("received name beans: {}", Arrays.toString(ps.toArray()));
    }
}
I was able to get the consumer running successfully by defining the following additional configuration environment variables, which Spring automatically picks up:
export SPRING_KAFKA_BOOTSTRAP-SERVERS=...
export SPRING_KAFKA_CONSUMER_GROUP-ID=...
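For reference, the same settings in application.properties form (property names from Spring Boot's documented spring.kafka.* namespace):
spring.kafka.bootstrap-servers=...
spring.kafka.consumer.group-id=...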
So the above code works. But now I want to customize the default error handler to use exponential backoff. Following the reference docs, I tried adding the following to the ApplicationConsumer class:
@Bean
public ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setCommonErrorHandler(new DefaultErrorHandler(new ExponentialBackOffWithMaxRetries(10)));
    factory.setConsumerFactory(consumerFactory());
    return factory;
}

@Bean
public ConsumerFactory<String, Object> consumerFactory() {
    return new DefaultKafkaConsumerFactory<>(consumerConfigs());
}

@Bean
public Map<String, Object> consumerConfigs() {
    Map<String, Object> props = new HashMap<>();
    return props;
}
But now I get errors saying that it can't find some of the configuration. It looks like I'm stuck having to redefine in consumerConfigs() all of the properties that were being defined automatically before, everything from the bootstrap server URIs to the JSON deserialization config.
Is there a good way to update my first version of the code to just override the default-error handler?
Just define the error handler as a @Bean and Boot will automatically wire it into its auto-configured container factory.
EDIT
This works as expected for me:
@SpringBootApplication
public class So70884203Application {

    public static void main(String[] args) {
        SpringApplication.run(So70884203Application.class, args);
    }

    @Bean
    DefaultErrorHandler eh() {
        return new DefaultErrorHandler((rec, ex) -> {
            System.out.println("Recovered: " + rec);
        }, new FixedBackOff(0L, 0L));
    }

    @KafkaListener(id = "so70884203", topics = "so70884203")
    void listen(String in) {
        System.out.println(in);
        throw new RuntimeException("test");
    }

    @Bean
    NewTopic topic() {
        return TopicBuilder.name("so70884203").partitions(1).replicas(1).build();
    }
}
foo
Recovered: ConsumerRecord(topic = so70884203, partition = 0, leaderEpoch = 0, offset = 0, CreateTime = 1643316625291, serialized key size = -1, serialized value size = 3, headers = RecordHeaders(headers = [], isReadOnly = false), key = null, value = foo)
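The example above uses a FixedBackOff for brevity; for the exponential backoff the question asked about, the same bean pattern applies. A sketch, assuming spring-kafka 2.7+ where ExponentialBackOffWithMaxRetries is available (the intervals are illustrative):

import org.springframework.context.annotation.Bean;
import org.springframework.kafka.listener.DefaultErrorHandler;
import org.springframework.kafka.support.ExponentialBackOffWithMaxRetries;

@Bean
DefaultErrorHandler errorHandler() {
    // Retry failed records up to 10 times, doubling the delay each
    // attempt (1s, 2s, 4s, ...) and capping it at 10 seconds.
    ExponentialBackOffWithMaxRetries backOff = new ExponentialBackOffWithMaxRetries(10);
    backOff.setInitialInterval(1_000L);
    backOff.setMultiplier(2.0);
    backOff.setMaxInterval(10_000L);
    return new DefaultErrorHandler(backOff);
}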

Kafka Streams programmatically configuration not working

I'm building a Spring Boot application and trying to configure Kafka programmatically, but for some reason it is still picking up the properties from application.yaml instead of the ones I set programmatically:
@Configuration
public class KafkaConfiguration {

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(kafkaConsumerFactory());
        factory.setConcurrency(1);
        factory.getContainerProperties().setPollTimeout(30000);
        return factory;
    }

    public ConsumerFactory<String, String> kafkaConsumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "aaa"); // should crash since this is not a valid broker address
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "app1");
        return new DefaultKafkaConsumerFactory<>(props);
    }
}
@Component
public class StreamListener {

    @StreamListener(TestStreams.TEST_STREAM_IN)
    public void testStream(@Payload GenericCustomEvent response, @Headers MessageHeaders headers) throws Exception {
        log.debug("Received generic event {} with headers {}", response, headers);
    }
}

public interface TestStreams {

    String TEST_STREAM_IN = "test-stream-in";

    @Input(TEST_STREAM_IN)
    SubscribableChannel inputTestStream();
}

@EnableBinding({TestStreams.class})
@SpringBootApplication
public class KafkaApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaApplication.class, args);
    }
}
The binder doesn't use a KafkaListenerContainerFactory; it creates the container itself from the YAML.
You can modify the container(s) by adding a ListenerContainerCustomizer bean, as sketched below.
There is an example here: Can I apply graceful shutdown when using Spring Cloud Stream Kafka 3.0.3.RELEASE?
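A minimal sketch of such a customizer, assuming the Kafka binder (so the container type is Spring Kafka's AbstractMessageListenerContainer); the poll timeout is just an illustrative stand-in for the settings the question configures programmatically:

import org.springframework.cloud.stream.config.ListenerContainerCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.listener.AbstractMessageListenerContainer;

@Bean
public ListenerContainerCustomizer<AbstractMessageListenerContainer<?, ?>> containerCustomizer() {
    // Invoked by the binder for every listener container it creates;
    // destination and group identify the binding being configured.
    return (container, destination, group) -> container.getContainerProperties().setPollTimeout(30000);
}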

Spring boot JMS using different messages class

I'm using Spring Boot.
I want to use different model classes on the sender and the receiver so that they don't depend on the same model (the receiver doesn't need the sender's model on its classpath).
1) Should I do that?
2) And how can I do that?
Sender:
AccountEvent accountEvent = new com.dspc.account.domain.dto.AccountEvent(createdAccount.getId(), EventType.CREATED);
jmsMessagingTemplate.convertAndSend(new ActiveMQTopic("VirtualTopic.ACCOUNT-EVENT-TOPIC"), accountEvent);
Receiver:
@JmsListener(destination = "Consumer.AgentGenerator.VirtualTopic.ACCOUNT-EVENT-TOPIC")
public void receive(com.dspc.devicemgmt.domain.dto.AccountEvent accountEvent) {
    System.out.println(accountEvent);
}
JMS config of both sender and receiver:
@Bean // Serialize message content to JSON using TextMessage
public MessageConverter jacksonJmsMessageConverter() {
    MappingJackson2MessageConverter converter = new MappingJackson2MessageConverter();
    converter.setTargetType(MessageType.TEXT);
    converter.setTypeIdPropertyName("_type");
    return converter;
}
I get this exception when receiving a message:
[com.dspc.account.domain.dto.AccountEvent]; nested exception is
java.lang.ClassNotFoundException:
com.dspc.account.domain.dto.AccountEvent
Note that there are 2 different packages:
- com.dspc.account.domain.dto.AccountEvent
- com.dspc.devicemgmt.domain.dto.AccountEvent
I'm thinking about creating a common message model. What do you think?
public class DspcCommonMessage {

    private Map<String, String> properties;

    private Optional<byte[]> payLoad = Optional.empty();

    public Map<String, String> getProperties() {
        return properties;
    }

    public void setProperties(Map<String, String> properties) {
        this.properties = properties;
    }

    public Optional<byte[]> getPayLoad() {
        return payLoad;
    }

    public void setPayLoad(Optional<byte[]> payLoad) {
        this.payLoad = payLoad;
    }
}
Sender and receiver:
public void publishMessage(com.dspc.account.domain.dto.AccountEvent accountEvent) throws JsonProcessingException {
    ObjectMapper objectMapper = new ObjectMapper();
    String messageAsString = objectMapper.writeValueAsString(accountEvent);
    DspcCommonMessage dspcMessage = new DspcCommonMessage();
    dspcMessage.setPayLoad(Optional.of(messageAsString.getBytes()));
    jmsMessagingTemplate.convertAndSend(new ActiveMQTopic("VirtualTopic.ACCOUNT-EVENT-TOPIC"), dspcMessage);
}

@JmsListener(destination = "Consumer.AgentGenerator.VirtualTopic.ACCOUNT-EVENT-TOPIC")
public void receive(com.dspc.common.domain.DspcCommonMessage dspcCommonMessage) throws IOException {
    String jsonBody = new String(dspcCommonMessage.getPayLoad().orElse(new byte[0]));
    ObjectMapper objectMapper = new ObjectMapper();
    com.dspc.devicemgmt.domain.dto.AccountEvent accountEvent = objectMapper.readValue(jsonBody,
            com.dspc.devicemgmt.domain.dto.AccountEvent.class);
    System.out.println(accountEvent);
}
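One alternative to wrapping the payload in a byte[] carrier, offered as a sketch rather than a confirmed fix: MappingJackson2MessageConverter supports type-id mappings, so both sides can exchange a logical token in the _type property instead of a class name, and each side maps that token to its own local AccountEvent class.

import java.util.HashMap;
import java.util.Map;

import org.springframework.context.annotation.Bean;
import org.springframework.jms.support.converter.MappingJackson2MessageConverter;
import org.springframework.jms.support.converter.MessageConverter;
import org.springframework.jms.support.converter.MessageType;

@Bean
public MessageConverter jacksonJmsMessageConverter() {
    MappingJackson2MessageConverter converter = new MappingJackson2MessageConverter();
    converter.setTargetType(MessageType.TEXT);
    converter.setTypeIdPropertyName("_type");
    // The token "accountEvent" travels in the _type property; on the sender
    // this maps to com.dspc.account.domain.dto.AccountEvent, on the receiver
    // to com.dspc.devicemgmt.domain.dto.AccountEvent.
    Map<String, Class<?>> typeIdMappings = new HashMap<>();
    typeIdMappings.put("accountEvent", AccountEvent.class);
    converter.setTypeIdMappings(typeIdMappings);
    return converter;
}

With this in place, neither side needs the other's DTO on its classpath, and the listener can keep its typed receive(AccountEvent) signature.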

Spring-Rabbitmq MessageConverter - not invoking custom object handleMessage

I am implementing a consumer class that binds to a fanout exchange in RabbitMQ and receives messages published as JSON. For some reason, the receiveMessage method within the Consumer class is not being invoked when its argument is a custom object. The same code works when receiveMessage is changed to take Object. Would appreciate your help in identifying the missing piece.
Here are the configuration and consumer classes. This is not a Spring Boot application; my configuration class has the @Configuration annotation, not @SpringBootApplication.
@Bean
public SimpleMessageListenerContainer messageListenerContainer() {
    SimpleMessageListenerContainer container = new SimpleMessageListenerContainer();
    container.setConnectionFactory(rabbitConnectionFactory());
    container.setQueueNames(QUEUE_NAME);
    container.setMessageListener(listenerAdapter());
    container.setMessageConverter(new Jackson2JsonMessageConverter());
    container.setMissingQueuesFatal(false);
    return container;
}

@Bean
public AmqpAdmin amqpAdmin() {
    return new RabbitAdmin(rabbitConnectionFactory());
}

@Bean
public Queue queue() {
    return new Queue(QUEUE_NAME, false, false, false);
}

@Bean
public FanoutExchange exchange() {
    return new FanoutExchange(EXCHANGE_NAME, false, false);
}

@Bean
public Binding inboundEmailExchangeBinding() {
    return BindingBuilder.bind(queue()).to(exchange());
}

@Bean
public ConnectionFactory rabbitConnectionFactory() {
    return new CachingConnectionFactory("localhost");
}

@Bean
public RabbitTemplate rabbitTemplate() {
    RabbitTemplate rabbitTemplate = new RabbitTemplate(rabbitConnectionFactory());
    rabbitTemplate.setExchange(EXCHANGE_NAME);
    return rabbitTemplate;
}

@Bean
MessageListenerAdapter listenerAdapter() {
    return new MessageListenerAdapter(new Consumer(), "receiveMessage");
}
Here is the consumer ...
public class Consumer {

    // This works:
    /*
    public void receiveMessage(Object message) {
        System.out.println("Received <" + message + ">");
    }
    */

    // This does not work, whereas I expect it to:
    public void receiveMessage(CustomObject message) {
        System.out.println("Received <" + message + ">");
    }
}
where the CustomObject class is a plain POJO.
Here is an example of what is being published in RabbitMQ.
{
  "state": "stable",
  "ip": "1.2.3.4"
}
It's being published with a JSON content type:
exchange.publish(message_json, :content_type => "application/json")
Appreciate all your help in making me understand the problem. Thanks.
The Jackson2JsonMessageConverter needs to be told what object to map the json to.
This can be provided via information in a __TypeId__ header (which would be the case if Spring was on the sending side); the header can either contain the full class name, or a token that is configured to map to the class name.
Or, you need to configure the converter with a class mapper.
For convenience there is a DefaultClassMapper that can be configured with your target class:
Jackson2JsonMessageConverter converter = new Jackson2JsonMessageConverter();
DefaultClassMapper classMapper = new DefaultClassMapper();
classMapper.setDefaultType(CustomObject.class);
converter.setClassMapper(classMapper);
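Worth noting, as a sketch on top of this answer rather than part of it: with a MessageListenerAdapter it is the adapter, not the container, that typically performs the conversion before invoking receiveMessage, so the converter (with its class mapper) belongs on the adapter bean. Assuming the Consumer and CustomObject classes from the question:

import org.springframework.amqp.rabbit.listener.adapter.MessageListenerAdapter;
import org.springframework.amqp.support.converter.DefaultClassMapper;
import org.springframework.amqp.support.converter.Jackson2JsonMessageConverter;
import org.springframework.context.annotation.Bean;

@Bean
MessageListenerAdapter listenerAdapter() {
    Jackson2JsonMessageConverter converter = new Jackson2JsonMessageConverter();
    DefaultClassMapper classMapper = new DefaultClassMapper();
    // The publisher sends no __TypeId__ header, so map all incoming
    // JSON to CustomObject by default.
    classMapper.setDefaultType(CustomObject.class);
    converter.setClassMapper(classMapper);

    MessageListenerAdapter adapter = new MessageListenerAdapter(new Consumer(), "receiveMessage");
    // The adapter converts the message before invoking receiveMessage.
    adapter.setMessageConverter(converter);
    return adapter;
}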

Spring Boot: how to use FilteringMessageListenerAdapter

I have a Spring Boot application which listens to messages on a Kafka topic. To filter those messages, I have the following two classes:
@Component
public class Listener implements MessageListener {

    private final CountDownLatch latch1 = new CountDownLatch(1);

    @Override
    @KafkaListener(topics = "${spring.kafka.topic.boot}")
    public void onMessage(Object o) {
        System.out.println("LISTENER received payload *****");
        this.latch1.countDown();
    }
}
@Configuration
@EnableKafka
public class KafkaConfig {

    @Autowired
    private Listener listener;

    @Bean
    public FilteringMessageListenerAdapter filteringReceiver() {
        return new FilteringMessageListenerAdapter(listener, recordFilterStrategy());
    }

    public RecordFilterStrategy recordFilterStrategy() {
        return new RecordFilterStrategy() {
            @Override
            public boolean filter(ConsumerRecord consumerRecord) {
                System.out.println("IN FILTER");
                return false;
            }
        };
    }
}
While messages are being processed by the Listener class, the RecordFilterStrategy implementation is not being invoked. What is the correct way to use FilteringMessageListenerAdapter?
Thanks
The solution was as follows:
There is no need for the FilteringMessageListenerAdapter class. Instead, create your own ConcurrentKafkaListenerContainerFactory rather than relying on the one Spring Boot provides out of the box, and set the RecordFilterStrategy implementation on it.
@Bean
ConcurrentKafkaListenerContainerFactory<Integer, String> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<Integer, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setRecordFilterStrategy(recordFilterStrategy());
    return factory;
}
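A detail that is easy to trip over (an editorial note, not part of the original answer): RecordFilterStrategy.filter returns true to discard a record and false to deliver it, so the strategy in the question passes everything through. A minimal lambda-based sketch that, purely for illustration, drops records with null values:

import org.springframework.context.annotation.Bean;
import org.springframework.kafka.listener.adapter.RecordFilterStrategy;

@Bean
public RecordFilterStrategy<Integer, String> recordFilterStrategy() {
    // true = filter the record OUT (the listener never sees it); false = deliver it.
    return consumerRecord -> consumerRecord.value() == null;
}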
