Spring integration email pop3 inbound adapter not working/starting - spring-boot

Using Java Spring Integration, I wrote the code below to read email from Gmail.
According to the logs the connection to Gmail is established, but when a new email arrives it is not read and the flow never reaches the handle() method. Please help.
o.s.i.m.AbstractMailReceiver - attempting to receive mail from folder [INBOX]
import lombok.extern.log4j.Log4j2;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.channel.QueueChannel;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.mail.MailReceiver;
import org.springframework.integration.mail.dsl.Mail;
import org.springframework.integration.mail.support.DefaultMailHeaderMapper;
import org.springframework.integration.mapping.HeaderMapper;
import org.springframework.messaging.Message;
import org.springframework.messaging.PollableChannel;
import javax.mail.internet.MimeMessage;
@Log4j2
@Configuration
@EnableIntegration
public class EmailReceiver {

    @Autowired
    private PollableChannel pop3Channel;

    @Bean
    public PollableChannel receivedChannel() {
        return new QueueChannel();
    }

    @Bean
    public IntegrationFlow pop3MailFlow() {
        return IntegrationFlows
                .from(Mail.pop3InboundAdapter("pop.gmail.com", 995, "userName", "password")
                                .javaMailProperties(p -> {
                                    p.put("mail.debug", "true");
                                    p.put("mail.pop3.socketFactory.fallback", "false");
                                    p.put("mail.pop3.port", 995);
                                    p.put("mail.pop3.socketFactory.class", "javax.net.ssl.SSLSocketFactory");
                                    p.put("mail.pop3.socketFactory.port", 995);
                                })
                                .headerMapper(mailHeaderMapper()),
                        e -> e.poller(Pollers.fixedRate(5000).maxMessagesPerPoll(1)))
                .handle((payload, header) -> logMail(payload))
                .get();
    }

    public Message logMail(Object payload) {
        Message message = (Message) payload;
        log.info("*******Email[TEST]********* ", payload);
        return message;
    }

    @Bean
    public HeaderMapper<MimeMessage> mailHeaderMapper() {
        return new DefaultMailHeaderMapper();
    }
}

The problem is here:
.maxFetchSize(1)
By default the AbstractMailReceiver tries to fetch all the messages from the mailbox, and that appears to take a lot of time.
Another problem is that with .headerMapper(mailHeaderMapper()) a Mail.pop3InboundAdapter does not produce a Message payload but rather a byte[]. So your .handle((payload, header) -> logMail(payload)) is not only questionable as a request-reply definition at the end of the flow, it also fails with a ClassCastException like this, because you expect a type that is not produced for you:
Caused by: java.lang.ClassCastException: class [B cannot be cast to class org.springframework.messaging.Message ([B is in module java.base of loader 'bootstrap'; org.springframework.messaging.Message is in unnamed module of loader 'app')
at com.firm.demo.EmailReceiver.logMail(EmailReceiver.java:59)
at com.firm.demo.EmailReceiver.lambda$pop3MailFlow$2(EmailReceiver.java:53)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
Well, no: better to say you are missing the fact that this .handle((payload, header) -> logMail(payload)) signature deals with the payload, not the whole message. So your expectation in logMail() is wrong: the payload has to be a byte[]. And consider using a one-way handler at the end of the flow anyway.
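If the goal is simply to log the whole message (headers included) at the end of the flow, a one-way MessageHandler lambda is enough; a minimal sketch against the flow above, not a required change:
.handle(message -> log.info("*******Email[TEST]********* " + message))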
UPDATE
The working code is like this:
@Log4j2
@Configuration
@EnableIntegration
public class EmailReceiver {

    @Autowired
    private PollableChannel pop3Channel;

    private MailReceiver receiver;

    @Bean
    public PollableChannel receivedChannel() {
        return new QueueChannel();
    }

    @Bean
    public IntegrationFlow pop3MailFlow() {
        return IntegrationFlows
                .from(Mail.pop3InboundAdapter("pop.gmail.com", 995, "userName", "password")
                                .javaMailProperties(p -> {
                                    p.put("mail.debug", "false");
                                    p.put("mail.pop3.socketFactory.fallback", "false");
                                    p.put("mail.pop3.port", 995);
                                    p.put("mail.pop3.socketFactory.class", "javax.net.ssl.SSLSocketFactory");
                                    p.put("mail.pop3.socketFactory.port", 995);
                                })
                                .maxFetchSize(1)
                                .headerMapper(mailHeaderMapper()),
                        e -> e.poller(Pollers.fixedRate(5000).maxMessagesPerPoll(1)))
                .handle(this, "logMail")
                .get();
    }

    public void logMail(String payload) {
        log.info("*******Email[TEST]********* \n" + payload);
    }

    @Bean
    public HeaderMapper<MimeMessage> mailHeaderMapper() {
        return new DefaultMailHeaderMapper();
    }
}

Related

Azure Service Bus - Spring Boot disable autostart (com.microsoft.azure : azure-servicebus-spring-boot-starter)

I have the following implementation to consume messages from Azure Service Bus in a Spring Boot application. However, I want to be able to stop the ServiceBusConsumer from automatically listening to the Topic via a Spring Boot profile property, something like this in application.yaml:
servicebus.consumer.enable=false
It should disable the ServiceBusConsumer from listening to the Topic(s), and I should also be able to start the ServiceBusConsumer through a REST API, e.g. ./api/servicebus/consumer/start.
import com.microsoft.azure.servicebus.ExceptionPhase;
import com.microsoft.azure.servicebus.IMessage;
import com.microsoft.azure.servicebus.IMessageHandler;
import com.microsoft.azure.servicebus.ISubscriptionClient;
import lombok.extern.log4j.Log4j2;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.event.EventListener;
import org.springframework.core.Ordered;
import org.springframework.stereotype.Component;
import java.util.concurrent.CompletableFuture;
@Log4j2
@Component
class ServiceBusConsumer implements Ordered {

    private final ISubscriptionClient iSubscriptionClient;

    ServiceBusConsumer(ISubscriptionClient isc) {
        this.iSubscriptionClient = isc;
    }

    @EventListener(ApplicationReadyEvent.class)
    public void consume() throws Exception {
        this.iSubscriptionClient.registerMessageHandler(new IMessageHandler() {

            @Override
            public CompletableFuture<Void> onMessageAsync(IMessage message) {
                log.info("received message " + new String(message.getBody()) + " with body ID " + message.getMessageId());
                return CompletableFuture.completedFuture(null);
            }

            @Override
            public void notifyException(Throwable exception, ExceptionPhase phase) {
                log.error("eeks!", exception);
            }
        });
    }

    @Override
    public int getOrder() {
        return Ordered.HIGHEST_PRECEDENCE;
    }
}
You can create the ServiceBusConsumer bean conditionally by adding the @ConditionalOnProperty annotation like so, to make sure the bean is created only when servicebus.consumer.enabled=true:
@Log4j2
@Component
@ConditionalOnProperty(prefix = "servicebus.consumer", name = "enabled")
class ServiceBusConsumer implements Ordered {
    ...
}
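For the REST-triggered start from the question, one hypothetical option (names and endpoint path assumed, not part of the answer above) is to remove the @EventListener auto-start from consume() and invoke it from a controller instead:
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
class ServiceBusConsumerController {

    private final ServiceBusConsumer consumer;

    ServiceBusConsumerController(ServiceBusConsumer consumer) {
        this.consumer = consumer;
    }

    @PostMapping("/api/servicebus/consumer/start")
    public String start() throws Exception {
        consumer.consume(); // registers the message handler on demand
        return "started";
    }
}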

Get response body from NoFallbackAvailableException in spring cloud circuit breaker resilience4j

I want to call a third-party API. I use Spring Cloud Circuit Breaker with Resilience4j.
Here is my service class:
package ir.co.isc.resilience4jservice.service;
import ir.co.isc.resilience4jservice.model.Employee;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.client.circuitbreaker.CircuitBreaker;
import org.springframework.cloud.client.circuitbreaker.CircuitBreakerFactory;
import org.springframework.cloud.client.circuitbreaker.NoFallbackAvailableException;
import org.springframework.stereotype.Service;
import org.springframework.web.client.RestTemplate;
@Service
public class EmployeeService {

    @Autowired
    private RestTemplate restTemplate;

    @Autowired
    private CircuitBreakerFactory circuitBreakerFactory;

    public Employee getEmployee() {
        try {
            String url = "http://localhost:8090/employee";
            CircuitBreaker circuitBreaker = circuitBreakerFactory.create("circuit-breaker");
            return circuitBreaker.run(() -> restTemplate.getForObject(url, Employee.class));
        } catch (NoFallbackAvailableException e) {
            //I should extract error response body and do right action then return correct answer
            return null;
        }
    }
}
ResilienceConfig:
package ir.co.isc.resilience4jservice.config;
import io.github.resilience4j.circuitbreaker.CircuitBreakerConfig;
import io.github.resilience4j.timelimiter.TimeLimiterConfig;
import org.springframework.cloud.circuitbreaker.resilience4j.Resilience4JCircuitBreakerFactory;
import org.springframework.cloud.circuitbreaker.resilience4j.Resilience4JConfigBuilder;
import org.springframework.cloud.client.circuitbreaker.Customizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import java.time.Duration;
@Configuration
public class CircuitBreakerConfiguration {

    @Bean
    public Customizer<Resilience4JCircuitBreakerFactory> defaultCustomizer() {
        CircuitBreakerConfig circuitBreakerConfig = CircuitBreakerConfig.custom()
                .slidingWindowType(CircuitBreakerConfig.SlidingWindowType.COUNT_BASED)
                .slidingWindowSize(10)
                .minimumNumberOfCalls(10)
                .failureRateThreshold(25)
                .permittedNumberOfCallsInHalfOpenState(3)
                .build();

        TimeLimiterConfig timeLimiterConfig = TimeLimiterConfig.custom()
                .timeoutDuration(Duration.ofSeconds(4))
                .build();

        return factory ->
                factory.configureDefault(id -> new Resilience4JConfigBuilder(id)
                        .circuitBreakerConfig(circuitBreakerConfig)
                        .timeLimiterConfig(timeLimiterConfig)
                        .build());
    }
}
In some situations the third-party API returns a ResponseEntity with statusCode = 500 and
body = {"errorCode":"CCBE"}.
The response looks like this:
[503] during [POST] to [http://localhost:8090/employee]:[{"errorCode":"CCBE"}]
When I call this API and get an internal server error with a body, my catch block catches the API response.
In the catch block I need to retrieve the response body and take the appropriate action according to the errorCode.
But I cannot do this.
How can I extract the body in this situation?
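One way to get at the body, sketched under the assumption that the cause of the NoFallbackAvailableException is the RestTemplate exception from the failed call above, is to unwrap it in the catch block:
} catch (NoFallbackAvailableException e) {
    Throwable cause = e.getCause();
    if (cause instanceof RestClientResponseException) {
        // org.springframework.web.client.RestClientResponseException carries the raw response body
        String body = ((RestClientResponseException) cause).getResponseBodyAsString();
        // e.g. {"errorCode":"CCBE"} -- parse it here and branch on errorCode
    }
    return null;
}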

Spring Integration Flow route based on file name

I have a requirement to unzip a file and process its contents. Inside the zip file there can be two types of files, individual or firm, distinguishable by file name. After all files are processed, it should call another program module and also archive the processed files in a different location. I would like to use Spring Integration for this. I am trying to achieve it with the following code, but routing based on file name is causing problems. I am using JDK 8 and Spring 5.
.<File, Boolean>route(new Function<File, Boolean>() {

    @Override
    public Boolean apply(File f) {
        return f.getName().contains("individual");
    }
}, m -> m
        .subFlowMapping(true, sf -> sf.gateway(individualProcessor()))
        .subFlowMapping(false, sf -> sf.gateway(firmProcessor()))
)
Exception
Caused by: java.lang.IllegalArgumentException: Found ambiguous parameter type [interface java.util.function.Function] for method match: [public default <V> java.util.function.Function<V, R> java.util.function.Function.compose(java.util.function.Function<? super V, ? extends T>), public static <T> java.util.function.Function<T, T> java.util.function.Function.identity(), public java.lang.Boolean com.xxx.thirdpatysystem.config.IntegrationConfig$1.apply(java.io.File)]
at org.springframework.util.Assert.isNull(Assert.java:155) ~[spring-core-5.0.4.RELEASE.jar:5.0.4.RELEASE]
at org.springframework.integration.util.MessagingMethodInvokerHelper.findHandlerMethodsForTarget(MessagingMethodInvokerHelper.java:843) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.util.MessagingMethodInvokerHelper.<init>(MessagingMethodInvokerHelper.java:362) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.util.MessagingMethodInvokerHelper.<init>(MessagingMethodInvokerHelper.java:231) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.util.MessagingMethodInvokerHelper.<init>(MessagingMethodInvokerHelper.java:225) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.handler.MethodInvokingMessageProcessor.<init>(MethodInvokingMessageProcessor.java:60) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.router.MethodInvokingRouter.<init>(MethodInvokingRouter.java:46) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.dsl.IntegrationFlowDefinition.route(IntegrationFlowDefinition.java:1922) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.dsl.IntegrationFlowDefinition.route(IntegrationFlowDefinition.java:1895) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
I have even tried the below:
.<File, Boolean>route(f -> f.getName().contains("individual"), m -> m
.subFlowMapping(true, sf -> sf.gateway(individualProcessor()))
.subFlowMapping(false, sf -> sf.gateway(firmProcessor()))
)
Exception
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.integration.dsl.IntegrationFlow]: Factory method 'thirdpatysystemFlow' threw exception; nested exception is java.lang.UnsupportedOperationException
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:185) ~[spring-beans-5.0.4.RELEASE.jar:5.0.4.RELEASE]
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:579) ~[spring-beans-5.0.4.RELEASE.jar:5.0.4.RELEASE]
... 17 common frames omitted
Caused by: java.lang.UnsupportedOperationException: null
at org.springframework.integration.dsl.StandardIntegrationFlow.configure(StandardIntegrationFlow.java:89) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.dsl.IntegrationFlowDefinition.gateway(IntegrationFlowDefinition.java:2172) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.dsl.IntegrationFlowDefinition.gateway(IntegrationFlowDefinition.java:2151) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
The entire code snippet is below:
import java.io.File;
import java.util.concurrent.TimeUnit;
import java.util.function.Function;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.TaskExecutor;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.core.MessageSource;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.dsl.channel.MessageChannels;
import org.springframework.integration.file.FileReadingMessageSource;
import org.springframework.integration.file.FileWritingMessageHandler;
import org.springframework.integration.file.filters.AcceptOnceFileListFilter;
import org.springframework.integration.file.filters.ChainFileListFilter;
import org.springframework.integration.file.filters.RegexPatternFileListFilter;
import org.springframework.integration.zip.splitter.UnZipResultSplitter;
import org.springframework.integration.zip.transformer.UnZipTransformer;
import org.springframework.integration.zip.transformer.ZipResultType;
import org.springframework.messaging.MessageHandler;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;
/**
 * @author dpoddar
 *
 */
@Configuration
@EnableIntegration
public class IntegrationConfig {

    @Value("${input.directory}")
    private String inputDir;

    @Value("${outputDir.directory}")
    private String outputDir;

    @Value("${input.scan.frequency: 100}")
    private long scanFrequency;

    @Bean
    public MessageSource<File> inputFileSource() {
        FileReadingMessageSource src = new FileReadingMessageSource();
        src.setDirectory(new File(inputDir));
        src.setAutoCreateDirectory(true);
        ChainFileListFilter<File> chainFileListFilter = new ChainFileListFilter<>();
        chainFileListFilter.addFilter(new AcceptOnceFileListFilter<>());
        chainFileListFilter.addFilter(new RegexPatternFileListFilter("(?i)^.+\\.zip$"));
        src.setFilter(chainFileListFilter);
        return src;
    }

    @Bean
    public UnZipTransformer unZipTransformer() {
        UnZipTransformer unZipTransformer = new UnZipTransformer();
        unZipTransformer.setExpectSingleResult(false);
        unZipTransformer.setZipResultType(ZipResultType.FILE);
        //unZipTransformer.setWorkDirectory(new File("/usr/tmp/uncompress"));
        unZipTransformer.setDeleteFiles(true);
        return unZipTransformer;
    }

    @Bean
    public UnZipResultSplitter splitter() {
        UnZipResultSplitter splitter = new UnZipResultSplitter();
        return splitter;
    }

    @Bean
    public DirectChannel outputChannel() {
        return new DirectChannel();
    }

    @Bean
    public MessageHandler fileOutboundChannelAdapter() {
        FileWritingMessageHandler adapter = new FileWritingMessageHandler(new File(outputDir));
        adapter.setDeleteSourceFiles(true);
        adapter.setAutoCreateDirectory(true);
        adapter.setExpectReply(false);
        return adapter;
    }

    @Bean
    public TaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(5);
        executor.setMaxPoolSize(10);
        executor.setQueueCapacity(25);
        return executor;
    }

    @Autowired
    DirectChannel outputChannel;

    @Autowired
    MessageHandler fileOutboundChannelAdapter;

    @Bean
    public IntegrationFlow individualProcessor() {
        return flow -> flow.handle("thirdpatysystemprocessor", "processfile").channel(outputChannel).handle(fileOutboundChannelAdapter);
    }

    @Bean
    public IntegrationFlow firmProcessor() {
        return flow -> flow.handle("thirdpatysystemprocessor", "processfile").channel(outputChannel).handle(fileOutboundChannelAdapter);
    }

    @Bean
    public IntegrationFlow thirdpatysystemAgentDemographicFlow() {
        return IntegrationFlows
                .from(inputFileSource(), spec -> spec.poller(Pollers.fixedDelay(scanFrequency, TimeUnit.SECONDS)))
                .transform(unZipTransformer())
                .split(splitter())
                .channel(MessageChannels.executor(taskExecutor()))
                .<File, Boolean>route(new Function<File, Boolean>() {

                    @Override
                    public Boolean apply(File f) {
                        return f.getName().contains("individual");
                    }
                }, m -> m
                        .subFlowMapping(true, sf -> sf.gateway(individualProcessor()))
                        .subFlowMapping(false, sf -> sf.gateway(firmProcessor()))
                )
                .aggregate()
                /*.handle("thirdpatysystemprocessor","processfile")
                .channel(outputChannel())
                .handle(fileOutboundChannelAdapter())*/
                .get();
    }
}
The java.lang.IllegalArgumentException: Found ambiguous parameter type [interface java.util.function.Function] has been fixed in Spring Integration 5.0.5: https://jira.spring.io/browse/INT-4456. So now, with an explicit Function implementation, we do this:
MethodInvokingRouter methodInvokingRouter = isLambda(router)
? new MethodInvokingRouter(new LambdaMessageProcessor(router, payloadType))
: new MethodInvokingRouter(router, ClassUtils.FUNCTION_APPLY_METHOD);
We explicitly point to the apply() method.
The re-use of existing IntegrationFlow beans in the sub-flows (gateway()) has been fixed in version 5.0.4: https://jira.spring.io/browse/INT-4434
So all you need is to upgrade your project to the latest dependencies, in particular Spring Integration 5.0.7: https://spring.io/projects/spring-integration#learn

Spring Integration HTTP Outbound Gateway Post Request with Java DSL

I am trying to consume a REST service, receive JSON back, and convert it to a list of objects, but I am receiving the error below. I am new to EIP and there aren't many tutorials for doing this in the Java DSL. I have configured two channels, one for sending the request and one for receiving the payload back.
Exception in thread "main" org.springframework.beans.factory.BeanNotOfRequiredTypeException: Bean named 'httpPostAtms' is expected to be of type 'org.springframework.messaging.MessageChannel' but was actually of type 'org.springframework.integration.dsl.StandardIntegrationFlow'
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:378)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
at org.springframework.integration.support.channel.BeanFactoryChannelResolver.resolveDestination(BeanFactoryChannelResolver.java:89)
at org.springframework.integration.support.channel.BeanFactoryChannelResolver.resolveDestination(BeanFactoryChannelResolver.java:46)
at org.springframework.integration.gateway.MessagingGatewaySupport.getRequestChannel(MessagingGatewaySupport.java:344)
at org.springframework.integration.gateway.MessagingGatewaySupport.doSendAndReceive(MessagingGatewaySupport.java:433)
at org.springframework.integration.gateway.MessagingGatewaySupport.sendAndReceive(MessagingGatewaySupport.java:422)
at org.springframework.integration.gateway.GatewayProxyFactoryBean.invokeGatewayMethod(GatewayProxyFactoryBean.java:474)
at org.springframework.integration.gateway.GatewayProxyFactoryBean.doInvoke(GatewayProxyFactoryBean.java:429)
at org.springframework.integration.gateway.GatewayProxyFactoryBean.invoke(GatewayProxyFactoryBean.java:420)
at org.springframework.integration.gateway.GatewayCompletableFutureProxyFactoryBean.invoke(GatewayCompletableFutureProxyFactoryBean.java:65)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213)
at com.sun.proxy.$Proxy70.getAllAtms(Unknown Source)
at com.backbase.atm.IngAtmApplication.main(IngAtmApplication.java:25)
I am using SI with Spring Boot
@IntegrationComponentScan
@Configuration
@EnableIntegration
@ComponentScan
public class InfrastructorConfig {

    @Bean
    public PollableChannel requestChannel() {
        return new PriorityChannel();
    }

    @Bean
    public MessageChannel replyChannel() {
        return new DirectChannel();
    }

    @Bean(name = PollerMetadata.DEFAULT_POLLER)
    public PollerMetadata poller() {
        return Pollers.fixedRate(500).get();
    }

    @Bean
    public IntegrationFlow httpPostAtms() {
        return IntegrationFlows.from("requestChannel")
                .handle(Http.outboundGateway("https://www.ing.nl/api/locator/atms/")
                        .httpMethod(HttpMethod.POST)
                        .extractPayload(true))
                .<String, String>transform(p -> p.substring(5))
                .transform(Transformers.fromJson(Atm[].class))
                .channel("responseChannel")
                .get();
    }
}
The Gateway
package com.backbase.atm.service;
import java.util.List;
import org.springframework.integration.annotation.Gateway;
import org.springframework.integration.annotation.MessagingGateway;
import org.springframework.messaging.handler.annotation.Payload;
import com.backbase.atm.model.Atm;
@MessagingGateway
public interface IntegrationService {

    @Gateway(requestChannel = "httpPostAtms")
    @Payload("new java.util.Date()")
    List<Atm> getAllAtms();
}
Application Start
package com.backbase.atm;
import java.util.ArrayList;
import java.util.List;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.Bean;
import com.backbase.atm.service.IntegrationService;
@SpringBootApplication
public class IngAtmApplication {

    public static void main(String[] args) {
        ConfigurableApplicationContext ctx = SpringApplication.run(IngAtmApplication.class, args);
        ctx.getBean(IntegrationService.class).getAllAtms();
        ctx.close();
    }
}
You have to use the requestChannel bean name in the gateway definition. Right now you have the IntegrationFlow bean name there, which is wrong.
Always remember that everything in Spring Integration is connected via channels.
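That is, point the @Gateway at the channel the flow starts from; for the gateway in the question that would be:
@Gateway(requestChannel = "requestChannel")
@Payload("new java.util.Date()")
List<Atm> getAllAtms();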

Spring Boot with Apache Kafka: Messages not being read

I am currently setting up a Spring Boot application with a Kafka listener.
I am coding only the consumer; for the producer, I am manually sending messages from the Kafka console for now.
I followed this example:
http://www.source4code.info/2016/09/spring-kafka-consumer-producer-example.html
I tried running this as a Spring Boot application but I am not able to see any messages being received, even though there are already some messages in my local Kafka topic.
C:\software\kafka_2.11-0.10.1.0\kafka_2.11-0.10.1.0\kafka_2.11-0.10.1.0\bin\windows>kafka-console-producer.bat --broker-list localhost:9092 --topic test
this is a message
testing again
My Spring Boot application is:
@EnableDiscoveryClient
@SpringBootApplication
public class KafkaApplication {

    /**
     * Run the application using Spring Boot and an embedded servlet engine.
     *
     * @param args
     *            Program arguments - ignored.
     */
    public static void main(String[] args) {
        // Tell server to look for registration.properties or registration.yml
        System.setProperty("spring.config.name", "kafka-server");
        SpringApplication.run(KafkaApplication.class, args);
    }
}
And Kafka configuration is:
package kafka;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.IntegerDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.config.KafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;
import java.util.HashMap;
import java.util.Map;
@Configuration
@EnableKafka
public class KafkaConsumerConfig {

    @Bean
    KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory();
        factory.setConsumerFactory(consumerFactory());
        //factory.setConcurrency(1);
        //factory.getContainerProperties().setPollTimeout(3000);
        return factory;
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory(consumerConfigs());
    }

    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> propsMap = new HashMap();
        propsMap.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        //propsMap.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        //propsMap.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "100");
        //propsMap.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "15000");
        propsMap.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, IntegerDeserializer.class);
        propsMap.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        //propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, "group1");
        //propsMap.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return propsMap;
    }

    @Bean
    public Listener listener() {
        return new Listener();
    }
}
And Kafka listener is:
package kafka;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.util.concurrent.CountDownLatch;
import java.util.logging.Logger;
public class Listener {

    protected Logger logger = Logger.getLogger(Listener.class.getName());

    public CountDownLatch getCountDownLatch1() {
        return countDownLatch1;
    }

    private CountDownLatch countDownLatch1 = new CountDownLatch(1);

    @KafkaListener(topics = "test")
    public void listen(ConsumerRecord<?, ?> record) {
        logger.info("Received message: " + record);
        System.out.println("Received message: " + record);
        countDownLatch1.countDown();
    }
}
I am trying this for the first time. Please let me know if I am doing anything wrong. Any help will be greatly appreciated.
You did not set ConsumerConfig.AUTO_OFFSET_RESET_CONFIG so the default is "latest". Set it to "earliest" so the consumer will receive messages already in the topic.
ConsumerConfig.AUTO_OFFSET_RESET_CONFIG takes effect only if the consumer group does not already have an offset for a topic partition. If you already ran the consumer with the "latest" setting, then running the consumer again with a different setting does not change the offset. The consumer must use a different group so Kafka will assign offsets for that group.
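Concretely, in the consumerConfigs() map from the question that would look something like:
propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, "group1"); // a group with no committed offsets yet
propsMap.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");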
I observed that you commented out the consumer group.id property:
//propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, "group1");
Here is how it is described in the official Kafka documentation:
A unique string that identifies the consumer group this consumer belongs to. This property is required if the consumer uses either the group management functionality by using subscribe(topic) or the Kafka-based offset management strategy.
I uncommented that line and the consumer worked.
You will need to annotate your Listener class with either @Service or @Component so that Spring Boot can load the Kafka listener.
package kafka;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.util.concurrent.CountDownLatch;
import java.util.logging.Logger;
@Component
public class Listener {

    protected Logger logger = Logger.getLogger(Listener.class.getName());

    public CountDownLatch getCountDownLatch1() {
        return countDownLatch1;
    }

    private CountDownLatch countDownLatch1 = new CountDownLatch(1);

    @KafkaListener(topics = "test")
    public void listen(ConsumerRecord<?, ?> record) {
        logger.info("Received message: " + record);
        System.out.println("Received message: " + record);
        countDownLatch1.countDown();
    }
}
The above suggestions are good. If you have followed all of them and it still does not work, check whether lazy initialization is enabled for your application.
Lazy initialization is disabled by default. However, if your application has an explicit setting like the one below,
spring.main.lazy-initialization=true
comment it out or set it to false.
