CommonErrorHandler not present in Spring Kafka?

I have created a simple Kafka consumer configuration as follows:
@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "group_id");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
This is the Kafka consumer:
@Component
public class KafkaConsumer {

    @KafkaListener(topics = "NewTopic", groupId = "group_id")
    public void consume(String message) {
        System.out.println("message = " + message);
    }
}
When I run the application, I get the following error:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.boot.autoconfigure.kafka.KafkaAnnotationDrivenConfiguration': Unexpected exception during bean creation; nested exception is java.lang.TypeNotPresentException: Type org.springframework.kafka.listener.CommonErrorHandler not present
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:555) ~[spring-beans-5.3.21.jar:5.3.21]
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335) ~[spring-beans-5.3.21.jar:5.3.21]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-5.3.21.jar:5.3.21]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333) ~[spring-beans-5.3.21.jar:5.3.21]

This is a version-compatibility problem.
Please consider not overriding the spring-kafka version; rely on the one Spring Boot provides for us.
CommonErrorHandler was introduced in Spring for Apache Kafka 2.8. According to the Spring Framework version 5.3.21 in your logs, you are using Spring Boot 2.6.x or even 2.7.x, but at the same time an older Spring for Apache Kafka version, apparently pinned to an explicit version.
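For example, in a Gradle build that applies the Spring Boot plugin, a sketch of the fix is to declare spring-kafka without any version, so that Boot's dependency management selects the compatible release (the surrounding dependency list here is illustrative):

dependencies {
    implementation "org.springframework.boot:spring-boot-starter"
    // No explicit version: the Spring Boot BOM supplies the spring-kafka
    // release that matches your Boot version (a 2.8.x line for Boot 2.6/2.7).
    implementation "org.springframework.kafka:spring-kafka"
}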

Related

Problem initializing Reactor Redis with Spring Boot

I am trying to get Reactive Redis working with an existing application that has a normal synchronous Redis implementation running. I can't change the whole implementation at once, so I'm trying to get both to work at the same time.
This is what I have previously.
@Configuration
@ConfigurationProperties(prefix = "app.redis")
public class MemoryCacheConfiguration {

    private String endpoint;
    private int port;

    @Bean
    JedisConnectionFactory jedisConnectionFactory() {
        return new JedisConnectionFactory(new RedisStandaloneConfiguration(this.endpoint, this.port));
    }

    @Bean
    public RedisTemplate<String, String> redisTemplate() {
        final RedisTemplate<String, String> template = new RedisTemplate<>();
        template.setConnectionFactory(jedisConnectionFactory());
        template.setKeySerializer(new StringRedisSerializer());
        template.setHashValueSerializer(new GenericToStringSerializer<>(Serializable.class));
        template.setValueSerializer(new GenericToStringSerializer<>(Serializable.class));
        return template;
    }
}
This is what I'm adding to the file:
@Bean
public ReactiveRedisConnectionFactory reactiveRedisConnectionFactory() {
    return new LettuceConnectionFactory(endpoint, port);
}

@Bean
public ReactiveRedisTemplate<String, String> reactiveRedisTemplate(ReactiveRedisConnectionFactory factory) {
    RedisSerializationContext.RedisSerializationContextBuilder<String, String> builder =
            RedisSerializationContext.newSerializationContext(new StringRedisSerializer());
    RedisSerializationContext<String, String> context =
            builder.value(new StringRedisSerializer()).build();
    return new ReactiveRedisTemplate<>(factory, context);
}
If I have the ReactiveRedisTemplate bean defined in the file, I get the following error about a duplicate:
org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type 'org.springframework.data.redis.core.ReactiveRedisTemplate<java.lang.String, java.lang.String>' available: expected single matching bean but found 2: reactiveRedisTemplate,reactiveStringRedisTemplate
However, if I remove the definition, I get the following:
org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'org.springframework.data.redis.core.ReactiveRedisTemplate<java.lang.String, java.lang.String>' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {}
What might be the reason for this? I haven't defined any reactiveStringRedisTemplate and there is no reference to it in my project, yet if I remove my custom bean, no qualifying bean is found at all.
I am using Spring Boot version 2.7.3.
What helped was naming the injection point after the bean where it was used, so that the correct template was detected.
So I now have this:
private final ReactiveRedisTemplate<String, String> redisReactorTemplate;

@Autowired
public RedisService(ReactiveRedisTemplate<String, String> redisReactorTemplate) {
    this.redisReactorTemplate = redisReactorTemplate;
}
Instead of this:
private final ReactiveRedisTemplate<String, String> reactiveRedisTemplate;

@Autowired
public RedisService(ReactiveRedisTemplate<String, String> reactiveRedisTemplate) {
    this.reactiveRedisTemplate = reactiveRedisTemplate;
}
I don't know why there are two candidate beans at that point, though (expected single matching bean but found 2: reactiveRedisTemplate, reactiveStringRedisTemplate).
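The second candidate appears to come from Spring Boot's own RedisReactiveAutoConfiguration, which contributes a reactiveStringRedisTemplate bean; its type, ReactiveStringRedisTemplate, extends ReactiveRedisTemplate<String, String>, so both beans match the injection point. A minimal sketch of disambiguating explicitly with @Qualifier instead of relying on the parameter name (the service class here is illustrative):

@Service
public class RedisService {

    private final ReactiveRedisTemplate<String, String> template;

    // Select the custom bean by name; Boot's reactiveStringRedisTemplate
    // also matches the type, so the qualifier removes the ambiguity.
    public RedisService(@Qualifier("reactiveRedisTemplate") ReactiveRedisTemplate<String, String> template) {
        this.template = template;
    }
}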

Spring Boot Kafka Consumer throwing error in loop

I'm new to Kafka and am trying a sample scenario where a Kafka producer sends user details in JSON format to a consumer. I've visited similar questions, but I couldn't find the answer I needed.
I don't face any problem if I run either the producer or the consumer in a terminal and the other in Spring Boot. The error occurs in an infinite loop when both the producer and consumer are started from different Spring Boot projects:
Consumer exception
java.lang.IllegalStateException: This error handler cannot process 'SerializationException's directly; please consider configuring an 'ErrorHandlingDeserializer' in the value and/or key deserializer
at org.springframework.kafka.listener.SeekUtils.seekOrRecover(SeekUtils.java:145) ~[spring-kafka-2.6.7.jar:2.6.7]
at org.springframework.kafka.listener.SeekToCurrentErrorHandler.handle(SeekToCurrentErrorHandler.java:113) ~[spring-kafka-2.6.7.jar:2.6.7]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.handleConsumerException(KafkaMessageListenerContainer.java:1427) ~[spring-kafka-2.6.7.jar:2.6.7]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1124) ~[spring-kafka-2.6.7.jar:2.6.7]
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[na:na]
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[na:na]
at java.base/java.lang.Thread.run(Thread.java:832) ~[na:na]
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition Example3-0 at offset 0. If needed, please seek past the record to continue consumption.
Caused by: java.lang.IllegalArgumentException: The class 'edu.kafka.producer.model.User' is not in the trusted packages: [java.util, java.lang, edu.consumer.test.model, edu.consumer.test.model.*]. If you believe this class is safe to deserialize, please provide its name. If the serialization is only done by a trusted source, you can also enable trust all (*).
at org.springframework.kafka.support.converter.DefaultJackson2JavaTypeMapper.getClassIdType(DefaultJackson2JavaTypeMapper.java:126) ~[spring-kafka-2.6.7.jar:2.6.7]
at org.springframework.kafka.support.converter.DefaultJackson2JavaTypeMapper.toJavaType(DefaultJackson2JavaTypeMapper.java:100) ~[spring-kafka-2.6.7.jar:2.6.7]
at org.springframework.kafka.support.serializer.JsonDeserializer.deserialize(JsonDeserializer.java:504) ~[spring-kafka-2.6.7.jar:2.6.7]
at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:1365) ~[kafka-clients-2.6.0.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.access$3400(Fetcher.java:130) ~[kafka-clients-2.6.0.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.fetchRecords(Fetcher.java:1596) ~[kafka-clients-2.6.0.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.access$1700(Fetcher.java:1432) ~[kafka-clients-2.6.0.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:684) ~[kafka-clients-2.6.0.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:635) ~[kafka-clients-2.6.0.jar:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1283) ~[kafka-clients-2.6.0.jar:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1237) ~[kafka-clients-2.6.0.jar:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1210) ~[kafka-clients-2.6.0.jar:na]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doPoll(KafkaMessageListenerContainer.java:1271) ~[spring-kafka-2.6.7.jar:2.6.7]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1162) ~[spring-kafka-2.6.7.jar:2.6.7]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1075) ~[spring-kafka-2.6.7.jar:2.6.7]
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[na:na]
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[na:na]
at java.base/java.lang.Thread.run(Thread.java:832) ~[na:na]
I've specified the deserialization and trusted packages in the consumer configuration below:
@EnableKafka
@Configuration
public class TestConfig {

    @Bean
    public ConsumerFactory<String, User> consumerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        config.put(ConsumerConfig.GROUP_ID_CONFIG, "group_json");
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        config.put(ErrorHandlingDeserializer.KEY_DESERIALIZER_CLASS, StringDeserializer.class);
        config.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class);
        config.put(JsonDeserializer.TRUSTED_PACKAGES, "edu.kafka.producer.model.User, java.util, java.lang, edu.consumer.test.model, edu.consumer.test.model.*");
        return new DefaultKafkaConsumerFactory<String, User>(config, new StringDeserializer(), new JsonDeserializer<>(User.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, User> kafkaLister() {
        ConcurrentKafkaListenerContainerFactory<String, User> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
I believe I'm missing something in the configuration. I want to print the message received from Kafka to my Spring Boot console (I understand printing to the console isn't recommended; this is a practice project). Below is the consumer's listener:
@Service
public class TestListener {

    @KafkaListener(topics = "Example3", groupId = "group_json", containerFactory = "kafkaLister")
    public void post(User user) {
        System.out.println("Consumed Message: " + user);
    }
}
The JSON I'm trying to consume:
{"name":"qaz","dept":"Aero"}
Spring Boot version: 2.4.4
spring-kafka version (according to the console): 2.6.7
Thank you so much in advance.
Caused by: java.lang.IllegalArgumentException: The class 'edu.kafka.producer.model.User' is not in the trusted packages: [java.util, java.lang, edu.consumer.test.model, edu.consumer.test.model.*]. If you believe this class is safe to deserialize, please provide its name. If the serialization is only done by a trusted source, you can also enable trust all (*).
Looks like the deserializer is getting its properties from somewhere else.
config.put(JsonDeserializer.TRUSTED_PACKAGES, "edu.kafka.producer.model.User, java.util, java.lang, edu.consumer.test.model, edu.consumer.test.model.*" );
'edu.kafka.producer.model.User'
You are trying to deserialize a ...producer.model.User, not a ...consumer.model.User.
The ...producer... is coming from type information in headers; if you want to map a ...producer... object to a ...consumer... object, you need to configure type mapping as described in the documentation.
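A minimal sketch of such a mapping, assuming the producer and consumer classes from this question: both sides bind the same token to their local class, so the type header no longer has to carry a class name that exists on the consumer's classpath.

// Producer side: the type header carries the token "user" instead of the class name.
config.put(JsonSerializer.TYPE_MAPPINGS, "user:edu.kafka.producer.model.User");

// Consumer side: the token "user" resolves to the local class.
config.put(JsonDeserializer.TYPE_MAPPINGS, "user:edu.consumer.test.model.User");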
If you are only deserializing User objects, you can set use type info to false and set the default value type. See the configuration options...
https://docs.spring.io/spring-kafka/docs/current/reference/html/#serdes-json-config
Configuration Properties
JsonSerializer.ADD_TYPE_INFO_HEADERS (default true): You can set it to false to disable this feature on the JsonSerializer (sets the addTypeInfo property).
JsonSerializer.TYPE_MAPPINGS (default empty): See Mapping Types.
JsonDeserializer.USE_TYPE_INFO_HEADERS (default true): You can set it to false to ignore headers set by the serializer.
JsonDeserializer.REMOVE_TYPE_INFO_HEADERS (default true): You can set it to false to retain headers set by the serializer.
JsonDeserializer.KEY_DEFAULT_TYPE: Fallback type for deserialization of keys if no header information is present.
JsonDeserializer.VALUE_DEFAULT_TYPE: Fallback type for deserialization of values if no header information is present.
JsonDeserializer.TRUSTED_PACKAGES (default java.util, java.lang): Comma-delimited list of package patterns allowed for deserialization. * means deserialize all.
JsonDeserializer.TYPE_MAPPINGS (default empty): See Mapping Types.
JsonDeserializer.KEY_TYPE_METHOD (default empty): See Using Methods to Determine Types.
JsonDeserializer.VALUE_TYPE_METHOD (default empty): See Using Methods to Determine Types.
The default type's package is always trusted.
config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
config.put(ErrorHandlingDeserializer.KEY_DESERIALIZER_CLASS, StringDeserializer.class);
config.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class);
The key and value deserializers have to be the ErrorHandlingDeserializer; you still have the native deserializers there.
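In property form, that means the ErrorHandlingDeserializer is what the Kafka client instantiates, and the real deserializers become its delegates; a sketch against the configuration above:

// The deserializers Kafka instantiates must be the error-handling wrappers...
config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);
config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);
// ...and the actual key/value deserializers are configured as their delegates.
config.put(ErrorHandlingDeserializer.KEY_DESERIALIZER_CLASS, StringDeserializer.class);
config.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class);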
Based on Mr. Gary Russell's answer, below is the configuration that resolved the issue.
Producer Configuration:
@Bean
public ProducerFactory<String, User> producerFactory() {
    Map<String, Object> config = new HashMap<>();
    config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
    config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    config.put(JsonSerializer.ADD_TYPE_INFO_HEADERS, false);
    config.put(JsonSerializer.TYPE_MAPPINGS, "user:edu.kafka.test.model.User");
    return new DefaultKafkaProducerFactory<>(config);
}

@Bean
public KafkaTemplate<String, User> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}
Consumer Configuration:
@Configuration
@EnableKafka
public class TestConfig {

    @Bean
    public ConsumerFactory<String, User> consumerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        config.put(ConsumerConfig.GROUP_ID_CONFIG, "group_json");
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        config.put(JsonSerializer.TYPE_MAPPINGS, "user:edu.kafka.test.model.User");
        config.put(JsonDeserializer.VALUE_DEFAULT_TYPE, "edu.kafka.test.model.User");
        config.put(ErrorHandlingDeserializer.KEY_DESERIALIZER_CLASS, StringDeserializer.class);
        config.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class);
        return new DefaultKafkaConsumerFactory<String, User>(config, new StringDeserializer(), new JsonDeserializer<>(User.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, User> kafkaLister() {
        ConcurrentKafkaListenerContainerFactory<String, User> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setMissingTopicsFatal(false);
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}

Spring-kafka error handling with DeadLetterPublishingRecoverer

I am trying to implement error handling in Spring Boot Kafka. In my Kafka listener I am throwing a runtime exception, as shown below:
@KafkaListener(topics = "Kafka-springboot-example", groupId = "group-employee-json")
public void consumeEmployeeJson(Employee employee) {
    logger.info("Consumed Employee JSON: " + employee);
    if (null == employee.getEmployeeId()) {
        throw new RuntimeException("failed");
        //throw new ListenerExecutionFailedException("failed");
    }
}
And I have configured error handling as shown below:
@Configuration
@EnableKafka
public class KafkaConfiguration {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<Object, Object> containerFactory(
            ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
            ConsumerFactory<Object, Object> kafkaConsumerFactory,
            KafkaTemplate<Object, Object> template) {
        ConcurrentKafkaListenerContainerFactory<Object, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
        configurer.configure(factory, kafkaConsumerFactory);
        factory.setErrorHandler(new SeekToCurrentErrorHandler(
                new DeadLetterPublishingRecoverer(template)));
        return factory;
    }
}
And my listener for the DLT is shown below:
@KafkaListener(topics = "Kafka-springboot-example.DLT", groupId = "group-employee-json")
public void consumeEmployeeErrorJson(Employee employee) {
    logger.info("Consumed Employee JSON from DLT topic: " + employee);
}
But my message is not getting published to DLT topic.
Any idea what I am doing wrong?
Edited:
application.properties
server.port=8088
#kafka-producer-config
spring.kafka.producer.bootstrap-servers=localhost:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
#Kafka consumer properties
spring.kafka.consumer.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=group-employee-json
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.trusted.packages=*
public ConcurrentKafkaListenerContainerFactory<Object, Object> containerFactory(
If you use a non-standard bean name for the container factory, you need to set it on the @KafkaListener in the containerFactory property.
The default bean name is kafkaListenerContainerFactory, which is auto-configured by Boot. You need to either override that bean or configure the listener to point to your non-standard bean name.
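A minimal sketch against the listener from this question, pointing it at the containerFactory bean declared in the configuration above:

@KafkaListener(topics = "Kafka-springboot-example",
        groupId = "group-employee-json",
        containerFactory = "containerFactory")
public void consumeEmployeeJson(Employee employee) {
    // The listener body is unchanged; only the containerFactory
    // attribute is needed to pick up the non-default factory bean.
    logger.info("Consumed Employee JSON: " + employee);
}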

Spring Kafka @SendTo throws exception: a KafkaTemplate is required to support replies

I'm trying to get the consumer result, following the Spring Kafka docs.
Based on this Stack Overflow question, it should be possible to do this only by using the @SendTo annotation, because Spring Boot "also auto configures a kafka template if there is not one already in the context."
But I can't get it to work; I still get:
java.lang.IllegalStateException: a KafkaTemplate is required to support replies
at org.springframework.util.Assert.state(Assert.java:73) ~[spring-core-5.1.8.RELEASE.jar:5.1.8.RELEASE]
at org.springframework.kafka.config.MethodKafkaListenerEndpoint.createMessageListener(MethodKafkaListenerEndpoint.java:156)
...
This is my listener method:
@KafkaListener(topics = "t_invoice")
@SendTo("t_ledger")
public List<LedgerEntry> consume(Invoice invoice) throws IOException {
    // do some processing
    var ledgerCredit = new LedgerEntry(invoice.getAmount(), "Credit side", 0, "");
    var ledgerDebit = new LedgerEntry(0, "", invoice.getAmount(), "Debit side");
    return List.of(ledgerCredit, ledgerDebit);
}
What did I miss?
This is the only @Configuration file I have on the consumer side.
The consumer and producer are separate systems (e.g. a payment system produces invoices to Kafka; my program is an accounting system that takes the data and creates ledger entries).
@Configuration
public class KafkaConfig {

    @Autowired
    private KafkaProperties kafkaProperties;

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        var properties = kafkaProperties.buildConsumerProperties();
        properties.put(ConsumerConfig.METADATA_MAX_AGE_CONFIG, "600000");
        return new DefaultKafkaConsumerFactory<>(properties);
    }

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        var factory = new ConcurrentKafkaListenerContainerFactory<String, String>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
application.yml
spring:
  kafka:
    consumer:
      group-id: default-spring-consumer
      auto-offset-reset: earliest
Trial-Error 1
If I disable KafkaConfig, or enable debug during the run, this error appears:
org.apache.kafka.common.errors.SerializationException: Can't convert value of class com.accounting.kafkaconsumer.entity.LedgerEntry to class org.apache.kafka.common.serialization.StringSerializer specified in value.serializer
Caused by: java.lang.ClassCastException: class com.accounting.kafkaconsumer.entity.LedgerEntry cannot be cast to class java.lang.String (com.accounting.kafkaconsumer.entity.LedgerEntry is in unnamed module of loader 'app'; java.lang.String is in module java.base of loader 'bootstrap')
at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:28) ~[kafka-clients-2.0.1.jar:na]
at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:65) ~[kafka-clients-2.0.1.jar:na]
at org.apache.kafka.common.serialization.ExtendedSerializer$Wrapper.serialize(ExtendedSerializer.java:55) ~[kafka-clients-2.0.1.jar:na]
...
Trial-Error 2
If I disable KafkaConfig and use this signature (returning String), it works. But this is not what I expect, since my configuration is in KafkaConfig:
@KafkaListener(topics = "t_invoice")
@SendTo("t_ledger")
public String consume(Invoice invoice) throws IOException {
    // do some processing
    var listLedger = List.of(ledgerCredit, ledgerDebit);
    return objectMapper.writeValueAsString(listLedger);
}
I think the problem is here (KafkaConfig): since I create a new instance of KafkaListenerContainerFactory, the replyTemplate is null.
What is the correct way to set up my KafkaConfig?
@Bean
public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
    var factory = new ConcurrentKafkaListenerContainerFactory<String, String>();
    factory.setConsumerFactory(consumerFactory());
    return factory;
}
If you override Boot's auto-configured container factory then it won't be... auto-configured, including applying the template. When you define your own factory, you are responsible for configuring it. It's not clear why you are overriding Boot's kafkaListenerContainerFactory bean since all you are doing is injecting the consumer factory. Just remove that @Bean and use Boot's.
If you override Boot's kafkaListenerContainerFactory, make sure that you set the reply template:
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory(KafkaTemplate<String, Object> kafkaTemplate) {
    ConcurrentKafkaListenerContainerFactory<String, Object> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setReplyTemplate(kafkaTemplate); // <============
    return factory;
}

With Spring Boot and Integration DSL, getting error ClassNotFoundException integration.history.TrackableComponent

Trying a very basic JMS receiver using Spring Boot, Spring Integration, and the Java DSL. I have worked with XML-based Spring Integration, but am new to Spring Boot and the DSL.
This is the code sample I have so far:
@SpringBootApplication
@IntegrationComponentScan
@EnableJms
public class JmsReceiver {

    static String mailboxDestination = "RETRY.QUEUE";

    @Configuration
    @EnableJms
    @IntegrationComponentScan
    @EnableIntegration
    public class MessageReceiver {

        @Bean
        public IntegrationFlow jmsMessageDrivenFlow() {
            return IntegrationFlows
                    .from(Jms.messageDriverChannelAdapter(this.connectionFactory())
                            .destination(mailboxDestination))
                    .transform((String s) -> s.toUpperCase())
                    .get();
        }

        // for sending messages
        @Bean
        ConnectionFactory connectionFactory() {
            ActiveMQConnectionFactory acFac = new ActiveMQConnectionFactory();
            acFac.setBrokerURL("tcp://crsvcdevlnx01:61616");
            acFac.setUserName("admin");
            acFac.setPassword("admin");
            return new CachingConnectionFactory(acFac);
        }
    }

    // Message send code
    public static void main(String args[]) throws Throwable {
        AnnotationConfigApplicationContext context =
                new AnnotationConfigApplicationContext(JmsReceiver.class);
        JmsTemplate jmsTemplate = context.getBean(JmsTemplate.class);
        System.out.println("Sending a new message.");
        MessageCreator messageCreator = new MessageCreator() {
            @Override
            public Message createMessage(Session session) throws JMSException {
                return session.createTextMessage("ping!");
            }
        };
        jmsTemplate.send(mailboxDestination, messageCreator);
        context.close();
    }
}
And I get this error when running with Gradle:
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.integration.dsl.IntegrationFlow]: Factory method 'inboundFlow' threw exception; nested exception is java.lang.NoClassDefFoundError: org/springframework/integration/history/TrackableComponent
reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
Caused by: java.lang.ClassNotFoundException: org.springframework.integration.history.TrackableComponent
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
My Gradle dependencies:
compile "org.springframework.boot:spring-boot-starter-jersey",
"org.springframework.boot:spring-boot-starter-actuator",
"org.springframework.boot:spring-boot-configuration-processor",
"org.springframework.boot:spring-boot-starter-integration",
"org.springframework.integration:spring-integration-jms",
"org.springframework.integration:spring-integration-java-dsl:1.1.1.RELEASE",
"org.springframework.integration:spring-integration-flow:1.0.0.RELEASE",
"org.springframework.integration:spring-integration-core:4.2.2.RELEASE",
"org.springframework.integration:spring-integration-java-dsl:1.1.0.RELEASE",
"org.springframework.integration:spring-integration-flow:1.0.0.RELEASE",
"org.apache.activemq:activemq-spring:5.11.2",
UPDATE, SOLVED: Thanks much. I changed two things:
Cleaned up the Gradle dependencies based on your advice. The new ones look like this:
compile "org.springframework.boot:spring-boot-starter-jersey",
"org.springframework.boot:spring-boot-starter-actuator",
"org.springframework.boot:spring-boot-configuration-processor",
"org.springframework.boot:spring-boot-starter-integration",
"org.springframework.integration:spring-integration-jms",
"org.springframework.integration:spring-integration-java-dsl:1.1.0.RELEASE",
"org.apache.activemq:activemq-spring:5.11.2"
The code was throwing a constructor error about not being able to instantiate <init> in the inner class. I changed the inner class to static. New code:
@SpringBootApplication
@IntegrationComponentScan
@EnableJms
public class JmsReceiver {

    static String lsamsErrorQueue = "Queue.LSAMS.retryMessage";
    static String fatalErrorsQueue = "Queue.LSAMS.ManualCheck";

    // receiver
    @EnableJms
    @EnableIntegration
    @Configuration
    public static class MessageReceiver {

        @Bean
        public IntegrationFlow jmsMessageDrivenFlow() {
            return IntegrationFlows
                    .from(Jms.messageDriverChannelAdapter(this.connectionFactory())
                            .destination(lsamsErrorQueue))
                    // call LSAMS REST service with the payload received
                    .transform((String s) -> s.toUpperCase())
                    .handle(Jms.outboundGateway(this.connectionFactory())
                            .requestDestination(fatalErrorsQueue))
                    .get();
        }

        @Bean
        ConnectionFactory connectionFactory() {
            ActiveMQConnectionFactory acFac = new ActiveMQConnectionFactory();
            acFac.setBrokerURL("tcp://crsvcdevlnx01:61616");
            acFac.setUserName("admin");
            acFac.setPassword("admin");
            return new CachingConnectionFactory(acFac);
        }
    }

    // Message send code
    public static void main(String args[]) throws Throwable {
        AnnotationConfigApplicationContext context =
                new AnnotationConfigApplicationContext(JmsReceiver.class);
        JmsTemplate jmsTemplate = context.getBean(JmsTemplate.class);
        System.out.println("Sending a new message.");
        MessageCreator messageCreator = new MessageCreator() {
            @Override
            public Message createMessage(Session session) throws JMSException {
                return session.createTextMessage("ping!");
            }
        };
        jmsTemplate.send(lsamsErrorQueue, messageCreator);
        context.close();
    }
}
Well, that definitely looks like a version mess in your classpath.
First of all, you shouldn't declare the same artifacts manually more than once, as you have with spring-integration-java-dsl and spring-integration-flow. BTW, do you really need the last one? I mean, is there some reason to keep spring-integration-flow? That project is about Modular Flows.
On the other hand, you don't need to specify spring-integration-core if you are based on Spring Boot (spring-boot-starter-integration in your case).
And yes: TrackableComponent was moved to org.springframework.integration.support.management in Spring Integration 4.2 (https://jira.spring.io/browse/INT-3799).
From here it looks like you are somehow using an older Spring Integration version:
- either from Spring Boot 1.2.x
- or it is really a side effect of a transitive dependency from spring-integration-flow...
