Validate Kafka producer message delivery - Spring

This question builds on the discussion here: How to verify spring kafka producer has successfully sent message or not?. Below is my code to check whether the Kafka producer was able to send the record to the expected topic. To force an exception, I used a topic name that does not exist at all.
@RestController
public class TestController {

    @Autowired
    MailProcessor processor;

    private static final Logger logger = LoggerFactory.getLogger(TestController.class);

    @GetMapping(path = "/mailman/{command}")
    public void testApp(@PathVariable("command") String action) {
        try {
            Envelope message = new Envelope();
            message.setAction(action);
            message.setValue("this is the sample message for testing purpose only");
            processor.sendMessage("notAvailableTopic", message);
        } catch (Exception e) {
            logger.error("Exception in the test controller", e);
        }
    }
}
Here is the sendMessage method implementation:
public void sendMessage(String topic, Envelope message) {
    try {
        ListenableFuture<SendResult<String, Envelope>> future = kafkaTemplate.send(topic, message);
        SendResult<String, Envelope> result = future.get(65000, TimeUnit.MILLISECONDS);
        logger.info("Successful delivery of {}", result.getProducerRecord());
    } catch (Exception ex) {
        logger.error("Exception while sending to {} topic", topic, ex);
    }
}
The kafkaTemplate is instantiated as below:
@Bean
public List<String> consumerBootstrapServers(@Value("${kafka.bootstrap-servers}") String bootstrapServers) {
    return Arrays.asList(bootstrapServers.split(","));
}

@Bean
public ProducerFactory<String, Envelope> producerFactory(List<String> consumerBootstrapServers) {
    Map<String, Object> config = new HashMap<>();
    config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, consumerBootstrapServers);
    config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    return new DefaultKafkaProducerFactory<>(config);
}

@Bean
public KafkaTemplate<String, Envelope> kafkaTemplate(ProducerFactory<String, Envelope> producerFactory) {
    return new KafkaTemplate<>(producerFactory);
}
As mentioned in the previous post, get() takes 60 seconds to fail, so I blocked the calling thread for 65 seconds. I could see the logger statements below.
2020-08-12 16:58:35.273 INFO 11471 --- [nio-8080-exec-1] o.a.kafka.common.utils.AppInfoParser : Kafka version: 2.5.0
2020-08-12 16:58:35.273 INFO 11471 --- [nio-8080-exec-1] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 66563e712b0b9f84
2020-08-12 16:58:35.273 INFO 11471 --- [nio-8080-exec-1] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1597269515273
2020-08-12 16:58:35.466 WARN 11471 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Error while fetching metadata with correlation id 2 : {notAvailableTopic=LEADER_NOT_AVAILABLE}
2020-08-12 16:58:35.467 INFO 11471 --- [ad | producer-4] org.apache.kafka.clients.Metadata : [Producer clientId=producer-4] Cluster ID: KQOZN8MkRVqke4J4H8PDpA
2020-08-12 16:58:35.879 INFO 11471 --- [nio-8080-exec-1] c.w.gioda.po.worker.KafkaProducer : Successful delivery of ProducerRecord(topic=notAvailableTopic, partition=null, headers=RecordHeaders(headers = [RecordHeader(key = __TypeId__, value = [99, 111, 109, 46, 119, 97, 108, 109, 97, 114, 116, 108, 97, 98, 115, 46, 103, 105, 111, 100, 97, 46, 112, 111, 46, 109, 111, 100, 101, 108, 46, 69, 110, 118, 101, 108, 111, 112, 101])], isReadOnly = true), key=null, value=Envelope [action=updateService, value=this is the sample message for testing purpose only], timestamp=null)
2020-08-12 17:00:17.984 INFO 11471 --- [uterTopic-0-C-1] o.a.kafka.clients.FetchSessionHandler : [Consumer clientId=consumer-postOfficeGrp-7, groupId=postOfficeGrp] Node 244026236 was unable to process the fetch request with (sessionId=1472063313, epoch=179): FETCH_SESSION_ID_NOT_FOUND.
2020-08-12 17:00:18.655 INFO 11471 --- [uterTopic-0-C-1] o.a.kafka.clients.FetchSessionHandler : [Consumer clientId=consumer-postOfficeGrp-7, groupId=postOfficeGrp] Node 1712770852 was unable to process the fetch request with (sessionId=1493387199, epoch=179): FETCH_SESSION_ID_NOT_FOUND.
2020-08-12 17:00:20.485 INFO 11471 --- [ntainer#0-0-C-1] o.a.kafka.clients.FetchSessionHandler : [Consumer clientId=consumer-postOfficeGrp-8, groupId=postOfficeGrp] Node 457669866 was unable to process the fetch request with (sessionId=1173363358, epoch=179): FETCH_SESSION_ID_NOT_FOUND.
It did not print the log statement from the catch block. How can I validate whether the message was successfully delivered to the Kafka topic? Am I missing something?

Please provide your complete test case.
I get the error as expected...
@SpringBootApplication
public class So63385353Application {

    public static void main(String[] args) {
        SpringApplication.run(So63385353Application.class, args);
    }

    @Bean
    public ApplicationRunner runner(KafkaTemplate<String, String> template) {
        return args -> {
            try {
                template.send("missing", "foo").get(10, TimeUnit.SECONDS);
            }
            catch (Exception e) {
                e.printStackTrace();
            }
        };
    }
}
spring.kafka.producer.properties.max.block.ms=5000
2020-08-13 09:49:08.653 ERROR 14921 --- [ main] o.s.k.support.LoggingProducerListener : Exception thrown when sending a message with key='null' and payload='foo' to topic missing:
org.apache.kafka.common.errors.TimeoutException: Topic missing not present in metadata after 5000 ms.
org.springframework.kafka.KafkaException: Send failed; nested exception is org.apache.kafka.common.errors.TimeoutException: Topic missing not present in metadata after 5000 ms.
at org.springframework.kafka.core.KafkaTemplate.doSend(KafkaTemplate.java:573)
at org.springframework.kafka.core.KafkaTemplate.send(KafkaTemplate.java:363)
at com.example.demo.So63385353Application.lambda$0(So63385353Application.java:22)
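As an aside, if blocking the calling thread is undesirable, the ListenableFuture returned by send() also accepts a callback, so failures surface asynchronously; a minimal sketch reusing the question's kafkaTemplate and logger:

// Non-blocking alternative to get(): failures (e.g. the missing topic after
// max.block.ms expires) arrive in onFailure() instead of as a thrown exception.
ListenableFuture<SendResult<String, Envelope>> future = kafkaTemplate.send(topic, message);
future.addCallback(new ListenableFutureCallback<SendResult<String, Envelope>>() {

    @Override
    public void onSuccess(SendResult<String, Envelope> result) {
        logger.info("Successful delivery of {}", result.getProducerRecord());
    }

    @Override
    public void onFailure(Throwable ex) {
        logger.error("Delivery to {} failed", topic, ex);
    }
});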

ExecutionException due to org.springframework.kafka.requestreply.KafkaReplyTimeoutException: Reply timed out, when using ReplyingKafkaTemplate

I am using Kafka to publish both async and sync messages to the broker. One listener listens to the topic and responds to both sync and async calls; I am using the same request topic for both templates.
When using fire-and-forget (async) I don't see any issues, since the listener picks the messages up from the topic. When using the synchronous call I get a timeout exception.
Do I need to maintain multiple listeners for different templates?
With the same topic for both synchronous and async operations, would there be any issues?
KafkaConfig.java
// Template for synchronous calls
@Bean
public ReplyingKafkaTemplate<String, Model, Model> replyingKafkaTemplate(
        ProducerFactory<String, Model> pf,
        ConcurrentMessageListenerContainer<String, Model> repliesContainer) {
    ReplyingKafkaTemplate<String, Model, Model> replyTemplate =
            new ReplyingKafkaTemplate<>(pf, repliesContainer);
    replyTemplate.setSharedReplyTopic(true);
    return replyTemplate;
}

@Bean // register the ConcurrentMessageListenerContainer bean
public ConcurrentMessageListenerContainer<String, Model> repliesContainer(
        ConcurrentKafkaListenerContainerFactory<String, Model> containerFactory) {
    ConcurrentMessageListenerContainer<String, Model> repliesContainer =
            containerFactory.createContainer("responseTopic");
    repliesContainer.getContainerProperties().setGroupId(UUID.randomUUID().toString());
    repliesContainer.setAutoStartup(false);
    return repliesContainer;
}

// Template for asynchronous calls
@Bean
@Qualifier("kafkaTemplate")
public KafkaTemplate<String, Model> kafkaTemplate(
        ProducerFactory<String, Model> pf,
        ConcurrentKafkaListenerContainerFactory<String, Model> factory) {
    KafkaTemplate<String, Model> kafkaTemplate = new KafkaTemplate<>(pf);
    factory.setReplyTemplate(kafkaTemplate);
    return kafkaTemplate;
}
Here is the service class:
@Service
public class KafkaService {

    @Autowired
    private ReplyingKafkaTemplate<String, Model, Model> replyingKafkaTemplate;

    @Autowired
    private KafkaTemplate<String, Model> kafkaTemplate;

    @Autowired
    private KafkaConfig config;

    public Object sendAndReceive(Model model) throws Exception {
        ProducerRecord<String, Model> producerRecord =
                new ProducerRecord<>("requestTopic", model);
        producerRecord.headers()
                .add(new RecordHeader(KafkaHeaders.REPLY_TOPIC, "replyTopic".getBytes())); // header value must be bytes
        RequestReplyFuture<String, Model, Model> replyFuture =
                replyingKafkaTemplate.sendAndReceive(producerRecord, Duration.ofSeconds(timeout));
        ConsumerRecord<String, Model> consumerRecord =
                replyFuture.get(timeout, TimeUnit.SECONDS);
        return consumerRecord.value();
    }

    public ResponseEntity<Object> send(final Model model) throws Exception {
        final ProducerRecord<String, Model> producerRecord =
                new ProducerRecord<>("requestTopic", model);
        final ListenableFuture<SendResult<String, Model>> future =
                kafkaTemplate.send(producerRecord);
        final SendResult<String, Model> sendResult = future.get(timeout, TimeUnit.SECONDS);
        return new ResponseEntity<>(sendResult, HttpStatus.ACCEPTED);
    }
}
Here is the listener class.
@Slf4j
@Service
public class MessageListener {

    @KafkaListener(groupId = "${group.id}", topics = "requestTopic",
            errorHandler = "customKafkaListenerErrorHandler",
            containerFactory = "customKafkaListenerContainerFactory")
    @SendTo
    public Model consumer(Model model) {
        switch (model.getType()) {
            case "async":
                System.out.println("Async messages are retrieved");
                break;
            case "sync":
                System.out.println("Sync messages are retrieved");
                return model;
        }
        return model;
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<?, ?> customKafkaListenerContainerFactory(
            ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
            ConsumerFactory<Object, Object> kafkaConsumerFactory) {
        ConcurrentKafkaListenerContainerFactory<Object, Object> concurrentKafkaListenerContainerFactory =
                new ConcurrentKafkaListenerContainerFactory<>();
        concurrentKafkaListenerContainerFactory.setConsumerFactory(kafkaConsumerFactory);
        concurrentKafkaListenerContainerFactory.getContainerProperties()
                .setAckMode(ContainerProperties.AckMode.RECORD);
        concurrentKafkaListenerContainerFactory.setCommonErrorHandler(errorHandler());
        configurer.configure(concurrentKafkaListenerContainerFactory, kafkaConsumerFactory);
        concurrentKafkaListenerContainerFactory.setReplyTemplate(kafkaTemplate);
        return concurrentKafkaListenerContainerFactory;
    }
}
application.properties
spring.kafka.consumer.enable-auto-commit=false
spring.kafka.consumer.auto-offset-reset=earliest
Debug Logs:
2022-09-15 15:48:07.771 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : Processing [GenericMessage [payload=com.sample.Model@37a32ae0, headers={kafka_offset=239, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@59ff0b21, kafka_timestampType=CREATE_TIME, kafka_receivedPartitionId=0, kafka_receivedTopic=requestTopic, kafka_receivedTimestamp=1663282080306, kafka_groupId=consumer_group_new22}]]
2022-09-15 15:48:07.774 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : Listener method returned result [com.sample.Model@37a32ae0] - generating response message for it
2022-09-15 15:48:07.780 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : No replyTopic to handle the reply: com.sample.Model@37a32ae0
2022-09-15 15:50:54.760 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : Processing [GenericMessage [payload=com.sample.Model@3f766126, headers={kafka_offset=240, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@59ff0b21, kafka_timestampType=CREATE_TIME, kafka_receivedPartitionId=0, kafka_receivedTopic=requestTopic, kafka_receivedTimestamp=1663282254296, kafka_groupId=consumer_group_new22}]]
2022-09-15 15:50:54.760 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : Listener method returned result [com.sample.Model@3f766126] - generating response message for it
2022-09-15 15:50:54.761 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : No replyTopic to handle the reply: com.sample.Model@3f766126
2022-09-15 15:51:44.482 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : Processing [GenericMessage [payload=com.sample.Model@56c68983, headers={kafka_offset=241, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@59ff0b21, kafka_timestampType=CREATE_TIME, kafka_receivedPartitionId=0, kafka_receivedTopic=requestTopic, kafka_receivedTimestamp=1663282304204, kafka_groupId=consumer_group_new22}]]
2022-09-15 15:51:44.483 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : Listener method returned result [com.sample.Model@56c68983] - generating response message for it
2022-09-15 15:51:44.483 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : No replyTopic to handle the reply: com.sample.Model@56c68983
2022-09-15 15:52:03.237 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : Processing [GenericMessage [payload=com.sample.Model@6682bf3c, headers={kafka_offset=242, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@59ff0b21, kafka_correlationId=[B@65f4dd3b, kafka_timestampType=CREATE_TIME, kafka_replyTopic=[B@79cca97, kafka_receivedPartitionId=0, kafka_receivedTopic=requestTopic, kafka_receivedTimestamp=1663282322947, kafka_groupId=consumer_group_new22}]]
2022-09-15 15:52:03.237 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : Listener method returned result [com.sample.Model@6682bf3c] - generating response message for it
2022-09-15 15:52:42.585 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : Processing [GenericMessage [payload=com.sample.Model@78a4382d, headers={kafka_offset=243, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@59ff0b21, kafka_timestampType=CREATE_TIME, kafka_receivedPartitionId=0, kafka_receivedTopic=requestTopic, kafka_receivedTimestamp=1663282362320, kafka_groupId=consumer_group_new22}]]
2022-09-15 15:52:42.585 DEBUG 35380 --- [ntainer#0-0-C-1] .a.RecordMessagingMessageListenerAdapter : Listener method returned result [com.sample.Model@78a4382d] - generating response message for it
This works exactly as I expected...
@SpringBootApplication
public class So73657031Application {

    public static void main(String[] args) {
        SpringApplication.run(So73657031Application.class, args);
    }

    @Bean
    ReplyingKafkaTemplate<String, String, String> rkt(ProducerFactory<String, String> pf,
            ConcurrentKafkaListenerContainerFactory<String, String> factory,
            KafkaTemplate<String, String> template) {
        factory.setReplyTemplate(template);
        ConcurrentMessageListenerContainer<String, String> container = factory.createContainer("so73657031-replies");
        container.getContainerProperties().setGroupId("so73657031-replies");
        return new ReplyingKafkaTemplate<>(pf, container);
    }

    @Bean
    KafkaTemplate<String, String> template(ProducerFactory<String, String> pf) {
        return new KafkaTemplate<>(pf);
    }

    @Bean
    NewTopic topic1() {
        return TopicBuilder.name("so73657031").partitions(1).replicas(1).build();
    }

    @Bean
    NewTopic topic2() {
        return TopicBuilder.name("so73657031-replies").partitions(1).replicas(1).build();
    }

    @Bean
    public ApplicationRunner runner(ReplyingKafkaTemplate<String, String, String> rTemplate,
            KafkaTemplate<String, String> template) {
        return args -> {
            RequestReplyFuture<String, String, String> future =
                    rTemplate.sendAndReceive(new ProducerRecord<String, String>("so73657031", 0, null, "test"),
                            Duration.ofSeconds(30));
            System.out.println(future.getSendFuture().get(10, TimeUnit.SECONDS).getRecordMetadata());
            System.out.println(future.get(30, TimeUnit.SECONDS).value());
            ListenableFuture<SendResult<String, String>> future2 = template.send("so73657031", "oneWay");
            System.out.println(future2.get(10, TimeUnit.SECONDS).getRecordMetadata());
        };
    }
}

@Component
class Listener {

    @KafkaListener(id = "so73657031", topics = "so73657031")
    @SendTo
    String listen(String in) {
        System.out.println(in);
        return in.toUpperCase();
    }
}
logging.level.root=warn
logging.level.org.springframework.kafka.listener.adapter=debug
so73657031-0@2
2022-09-15 15:36:34.496 DEBUG 71184 --- [o73657031-0-C-1] .a.RecordMessagingMessageListenerAdapter : Processing [GenericMessage [payload=test, headers={kafka_offset=2, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@1582e8e4, kafka_correlationId=[B@2a266829, kafka_timestampType=CREATE_TIME, kafka_deliveryAttempt=1, kafka_replyTopic=[B@3dad3e81, kafka_receivedPartitionId=0, kafka_receivedTopic=so73657031, kafka_receivedTimestamp=1663270594381, kafka_groupId=so73657031}]]
test
2022-09-15 15:36:34.499 DEBUG 71184 --- [o73657031-0-C-1] .a.RecordMessagingMessageListenerAdapter : Listener method returned result [TEST] - generating response message for it
TEST
so73657031-0@3
2022-09-15 15:36:34.519 DEBUG 71184 --- [o73657031-0-C-1] .a.RecordMessagingMessageListenerAdapter : Processing [GenericMessage [payload=oneWay, headers={kafka_offset=3, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@1582e8e4, kafka_timestampType=CREATE_TIME, kafka_deliveryAttempt=1, kafka_receivedPartitionId=0, kafka_receivedTopic=so73657031, kafka_receivedTimestamp=1663270594514, kafka_groupId=so73657031}]]
oneWay
2022-09-15 15:36:34.519 DEBUG 71184 --- [o73657031-0-C-1] .a.RecordMessagingMessageListenerAdapter : Listener method returned result [ONEWAY] - generating response message for it
2022-09-15 15:36:34.519 DEBUG 71184 --- [o73657031-0-C-1] .a.RecordMessagingMessageListenerAdapter : No replyTopic to handle the reply: ONEWAY
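The "No replyTopic to handle the reply" DEBUG line is harmless for the fire-and-forget sends; the record simply carries no reply-topic header. If you would rather not even attempt a reply for async traffic on the shared topic, one option (an assumption on my part, not something shown above) is to return null for that type, since a null listener result generates no reply:

// Hypothetical variant of the shared listener: a null return value means
// no response message is generated, so async sends skip the reply path.
@KafkaListener(id = "requestListener", topics = "requestTopic")
@SendTo
public Model consume(Model model) {
    if ("async".equals(model.getType())) {
        System.out.println("Async message received; no reply generated");
        return null;
    }
    System.out.println("Sync message received; replying");
    return model;
}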

How to detect that a topic does not exist within Spring when using @KafkaListener

When trying to subscribe to a non-existent topic with @KafkaListener, it logs a warning:
2021-04-22 13:03:56.710 WARN 20188 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-gg-2, groupId=gg] Error while fetching metadata with correlation id 174 : {not_exist=UNKNOWN_TOPIC_OR_PARTITION}
How can I detect and handle this? I tried an errorHandler, but it does not get called:
@KafkaListener(topics = "not_exist", groupId = "gg", errorHandler = "onError")
public void receive(String m) {
    log.info("Rcd: " + m);
}
...
@Bean
public KafkaListenerErrorHandler onError() {
    return new KafkaListenerErrorHandler() {

        @Override
        public Object handleError(Message<?> message, ListenerExecutionFailedException e) {
            log.error("handleError Error: {} message: {}", e.toString(), message);
            return message;
        }
    };
}
I think you can find the answer in org/springframework/kafka/listener/KafkaListenerErrorHandler.java:
* @return the return value is ignored unless the annotated method has a {@code @SendTo} annotation.
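For an up-front check (an approach not covered in the answer above), you could hypothetically ask the broker directly with the plain Kafka AdminClient before the listener starts; a minimal sketch, assuming kafka-clients is on the classpath and the broker runs at localhost:9092:

// Hypothetical pre-flight check: list the broker's topics and verify the one
// we want to listen to is present before subscribing.
public boolean topicExists(String topic) throws Exception {
    Map<String, Object> props = new HashMap<>();
    props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    try (AdminClient admin = AdminClient.create(props)) {
        return admin.listTopics().names().get(10, TimeUnit.SECONDS).contains(topic);
    }
}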

Spring Integration call another handler method after aggregation

I am developing a system that reads and processes files from a directory. Once all the files have been processed, it calls a method which in turn generates a file. It should also route/process the files based on the file name, for which I used a Spring Integration router. Below is the code snippet of the integration. My question is: this does not work if I remove either .channel(aggregatorOutputChannel()) or .channel(confirmChannel()), and I have to keep the same channel .channel(aggregatorOutputChannel()) both before and after the aggregator. Why do I need all three channel declarations? If this is wrong, how do I correct it?
I am using JDK 8, Spring 5, Spring boot 2.0.4.
@Configuration
@EnableIntegration
public class IntegrationConfig {

    @Value("${agent.demographic.input.directory}")
    private String inputDir;

    @Value("${agent.demographic.output.directory}")
    private String outputDir;

    @Value("${confirmationfile.directory}")
    private String confirmDir;

    @Value("${input.scan.frequency: 2}")
    private long scanFrequency;

    @Value("${processing.waittime: 6000}")
    private long messageGroupWaiting;

    @Value("${thread.corepoolsize: 10}")
    private int corepoolsize;

    @Value("${thread.maxpoolsize: 20}")
    private int maxpoolsize;

    @Value("${thread.queuecapacity: 1000}")
    private int queuedepth;

    @Bean
    public MessageSource<File> inputFileSource() {
        FileReadingMessageSource src = new FileReadingMessageSource();
        src.setDirectory(new File(inputDir));
        src.setAutoCreateDirectory(true);
        ChainFileListFilter<File> chainFileListFilter = new ChainFileListFilter<>();
        chainFileListFilter.addFilter(new AcceptOnceFileListFilter<>());
        chainFileListFilter.addFilter(new RegexPatternFileListFilter("(?i)^.+\\.xml$"));
        src.setFilter(chainFileListFilter);
        return src;
    }

    @Bean
    public UnZipTransformer unZipTransformer() {
        UnZipTransformer unZipTransformer = new UnZipTransformer();
        unZipTransformer.setExpectSingleResult(false);
        unZipTransformer.setZipResultType(ZipResultType.FILE);
        unZipTransformer.setDeleteFiles(true);
        return unZipTransformer;
    }

    @Bean("agentdemographicsplitter")
    public UnZipResultSplitter splitter() {
        return new UnZipResultSplitter();
    }

    @Bean
    public DirectChannel outputChannel() {
        return new DirectChannel();
    }

    @Bean
    public DirectChannel aggregatorOutputChannel() {
        return new DirectChannel();
    }

    @Bean("confirmChannel")
    public DirectChannel confirmChannel() {
        return new DirectChannel();
    }

    @Bean
    public MessageHandler fileOutboundChannelAdapter() {
        FileWritingMessageHandler adapter = new FileWritingMessageHandler(new File(outputDir));
        adapter.setDeleteSourceFiles(true);
        adapter.setAutoCreateDirectory(true);
        adapter.setExpectReply(true);
        adapter.setLoggingEnabled(true);
        return adapter;
    }

    @Bean
    public MessageHandler confirmationfileOutboundChannelAdapter() {
        FileWritingMessageHandler adapter = new FileWritingMessageHandler(new File(confirmDir));
        adapter.setDeleteSourceFiles(true);
        adapter.setAutoCreateDirectory(true);
        adapter.setExpectReply(false);
        adapter.setFileNameGenerator(defaultFileNameGenerator());
        return adapter;
    }

    @Bean
    public TaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(corepoolsize);
        executor.setMaxPoolSize(maxpoolsize);
        executor.setQueueCapacity(queuedepth);
        return executor;
    }

    @Bean
    public DefaultFileNameGenerator defaultFileNameGenerator() {
        DefaultFileNameGenerator defaultFileNameGenerator = new DefaultFileNameGenerator();
        defaultFileNameGenerator.setExpression("payload.name");
        return defaultFileNameGenerator;
    }

    @Bean
    public IntegrationFlow confirmGeneration() {
        return IntegrationFlows
                .from("confirmChannel")
                .handle(confirmationfileOutboundChannelAdapter())
                .get();
    }

    @Bean
    public IntegrationFlow individualProcessor() {
        return flow -> flow.handle("thirdpartyIndividualAgentProcessor", "processfile")
                .channel(outputChannel())
                .handle(fileOutboundChannelAdapter());
    }

    @Bean
    public IntegrationFlow firmProcessor() {
        return flow -> flow.handle("thirdpartyFirmAgentProcessor", "processfile")
                .channel(outputChannel())
                .handle(fileOutboundChannelAdapter());
    }

    @Bean
    public IntegrationFlow thirdpartyAgentDemographicFlow() {
        return IntegrationFlows
                .from(inputFileSource(), spec -> spec.poller(Pollers.fixedDelay(scanFrequency, TimeUnit.SECONDS)))
                .channel(MessageChannels.executor(taskExecutor()))
                .<File, Boolean>route(f -> f.getName().contains("individual"), m -> m
                        .subFlowMapping(true, sf -> sf.gateway(individualProcessor()))
                        .subFlowMapping(false, sf -> sf.gateway(firmProcessor())))
                .channel(aggregatorOutputChannel())
                .aggregate(aggregator -> aggregator
                        .groupTimeout(messageGroupWaiting)
                        .correlationStrategy(message -> "xyz"))
                .channel(aggregatorOutputChannel())
                .handle("agentDemograpicOutput", "generateAgentDemographicFile")
                .channel(confirmChannel())
                .get();
    }
}
Below is the log
2018-09-07 17:29:20.003 DEBUG 10060 --- [ taskExecutor-2] o.s.integration.channel.DirectChannel : preSend on channel 'outputChannel', message: GenericMessage [payload=C:\thirdpartyintg\input\18237232_firm.xml, headers={replyChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel@1a867ae7, errorChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel@1a867ae7, file_name=18237232_firm.xml, file_originalFile=C:\thirdpartyintg\input\18237232_firm.xml, id=dd70999a-8b8d-93d2-1a43-a961ac2c339f, file_relativePath=18237232_firm.xml, timestamp=1536366560003}]
2018-09-07 17:29:20.003 DEBUG 10060 --- [ taskExecutor-2] o.s.i.file.FileWritingMessageHandler : fileOutboundChannelAdapter received message: GenericMessage [payload=C:\thirdpartyintg\input\18237232_firm.xml, headers={replyChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel@1a867ae7, errorChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel@1a867ae7, file_name=18237232_firm.xml, file_originalFile=C:\thirdpartyintg\input\18237232_firm.xml, id=dd70999a-8b8d-93d2-1a43-a961ac2c339f, file_relativePath=18237232_firm.xml, timestamp=1536366560003}]
2018-09-07 17:29:20.006 DEBUG 10060 --- [ taskExecutor-2] o.s.integration.channel.DirectChannel : postSend (sent=true) on channel 'outputChannel', message: GenericMessage [payload=C:\thirdpartyintg\input\18237232_firm.xml, headers={replyChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel@1a867ae7, errorChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel@1a867ae7, file_name=18237232_firm.xml, file_originalFile=C:\thirdpartyintg\input\18237232_firm.xml, id=dd70999a-8b8d-93d2-1a43-a961ac2c339f, file_relativePath=18237232_firm.xml, timestamp=1536366560003}]
2018-09-07 17:29:20.006 DEBUG 10060 --- [ taskExecutor-2] o.s.integration.channel.DirectChannel : postSend (sent=true) on channel 'firmProcessor.input', message: GenericMessage [payload=C:\thirdpartyintg\input\18237232_firm.xml, headers={replyChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel@1a867ae7, errorChannel=org.springframework.messaging.core.GenericMessagingTemplate$TemporaryReplyChannel@1a867ae7, file_name=18237232_firm.xml, file_originalFile=C:\thirdpartyintg\input\18237232_firm.xml, id=0e6dcb75-db99-1740-7b58-e9b42bfbf603, file_relativePath=18237232_firm.xml, timestamp=1536366559761}]
2018-09-07 17:29:20.007 DEBUG 10060 --- [ taskExecutor-2] o.s.integration.channel.DirectChannel : preSend on channel 'thirdpartyintgAgentDemographicFlow.channel#2', message: GenericMessage [payload=C:\thirdpartyintg\output\18237232_firm.xml, headers={file_originalFile=C:\thirdpartyintg\input\18237232_firm.xml, id=e6e2a30a-60b9-7cdd-84cc-4977d4c21c97, file_name=18237232_firm.xml, file_relativePath=18237232_firm.xml, timestamp=1536366560007}]
2018-09-07 17:29:20.008 DEBUG 10060 --- [ taskExecutor-2] o.s.integration.channel.DirectChannel : postSend (sent=true) on channel 'thirdpartyintgAgentDemographicFlow.channel#2', message: GenericMessage [payload=C:\thirdpartyintg\output\18237232_firm.xml, headers={file_originalFile=C:\thirdpartyintg\input\18237232_firm.xml, id=e6e2a30a-60b9-7cdd-84cc-4977d4c21c97, file_name=18237232_firm.xml, file_relativePath=18237232_firm.xml, timestamp=1536366560007}]
2018-09-07 17:29:20.009 DEBUG 10060 --- [ taskExecutor-2] o.s.integration.channel.DirectChannel : postSend (sent=true) on channel 'thirdpartyintgAgentDemographicFlow.subFlow#1.channel#0', message: GenericMessage [payload=C:\thirdpartyintg\input\18237232_firm.xml, headers={file_originalFile=C:\thirdpartyintg\input\18237232_firm.xml, id=13713de8-91ce-b1fa-f52d-450d3038cf9c, file_name=18237232_firm.xml, file_relativePath=18237232_firm.xml, timestamp=1536366559757}]
2018-09-07 17:29:26.009 INFO 10060 --- [ask-scheduler-9] o.s.i.a.AggregatingMessageHandler : Expiring MessageGroup with correlationKey[processdate]
2018-09-07 17:29:26.011 DEBUG 10060 --- [ask-scheduler-9] o.s.integration.channel.NullChannel : message sent to null channel: GenericMessage [payload=C:\thirdpartyintg\output\17019222_individual.xml, headers={file_originalFile=C:\thirdpartyintg\input\17019222_individual.xml, id=c654076b-696f-25d4-bded-0a43d1a8ca97, file_name=17019222_individual.xml, file_relativePath=17019222_individual.xml, timestamp=1536366559927}]
2018-09-07 17:29:26.011 DEBUG 10060 --- [ask-scheduler-9] o.s.integration.channel.NullChannel : message sent to null channel: GenericMessage [payload=C:\thirdpartyintg\output\18237232_firm.xml, headers={file_originalFile=C:\thirdpartyintg\input\18237232_firm.xml, id=e6e2a30a-60b9-7cdd-84cc-4977d4c21c97, file_name=18237232_firm.xml, file_relativePath=18237232_firm.xml, timestamp=1536366560007}]
First of all, the RegexPatternFileListFilter should come first in the ChainFileListFilter. That way you won't waste memory in the AcceptOnceFileListFilter on files you are not interested in.
You need .channel(confirmChannel()) at the end of the thirdpartyAgentDemographicFlow, because it is the input to your confirmGeneration flow.
I don't think you need .channel(aggregatorOutputChannel()) at all; it can be implicit.
You also don't need the .channel(outputChannel()) in the sub-flows.
this is not working
Please elaborate: what error do you get, how does it behave, etc.
You can also share some DEBUG logs for org.springframework.integration to show how your messages travel.
It would also help a lot if you shared a simple Spring Boot project on GitHub, so we can play with it and reproduce the issue following your instructions.
UPDATE
Also, I've noticed that your aggregator is based on groupTimeout(). To make it send the aggregated message downstream, you also need to configure this:
/**
 * @param sendPartialResultOnExpiry the sendPartialResultOnExpiry.
 * @return the handler spec.
 * @see AbstractCorrelatingMessageHandler#setSendPartialResultOnExpiry(boolean)
 */
public S sendPartialResultOnExpiry(boolean sendPartialResultOnExpiry) {
It is false by default, so your messages are indeed sent to the NullChannel.
See more info in the Docs: https://docs.spring.io/spring-integration/docs/current/reference/html/messaging-routing-chapter.html#agg-and-group-to
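Applied to the thirdpartyAgentDemographicFlow above, the aggregator definition would look something like this sketch (same group timeout and correlation key as in the question):

.aggregate(aggregator -> aggregator
        .groupTimeout(messageGroupWaiting)
        // without this, groups expired by the timeout go to the nullChannel
        .sendPartialResultOnExpiry(true)
        .correlationStrategy(message -> "xyz"))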

Remove file from remote using streaming inbound channel adapter spring boot implementation

I am trying to remove a file from the remote host using a streaming inbound channel adapter, but the connection is closed before the adviceChain runs.
CODE:
@Bean
public SessionFactory<LsEntry> sftpSessionFactory() {
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    factory.setHost(sftpHost);
    factory.setPort(sftpPort);
    factory.setUser(sftpUser);
    factory.setPassword(sftpPwd);
    factory.setAllowUnknownKeys(true);
    return new CachingSessionFactory<LsEntry>(factory);
}

@Bean
@InboundChannelAdapter(channel = "stream", poller = @Poller(cron = "2 * * * * ?"))
public MessageSource<InputStream> sftpMessageSource() {
    SftpStreamingMessageSource messageSource = new SftpStreamingMessageSource(template());
    messageSource.setRemoteDirectory(remoteDirecotry);
    messageSource.setFilter(new AcceptAllFileListFilter<>());
    return messageSource;
}

@Bean
public SftpRemoteFileTemplate template() {
    return new SftpRemoteFileTemplate(sftpSessionFactory());
}

@Bean
@Transformer(inputChannel = "stream", outputChannel = "data")
public org.springframework.integration.transformer.Transformer transformer() {
    return new StreamTransformer("UTF-8");
}

@ServiceActivator(inputChannel = "data", adviceChain = "afterChain")
@Bean
public MessageHandler handler() {
    return new MessageHandler() {

        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            String fileName = message.getHeaders().get("file_remoteFile").toString();
            if (!StringUtils.isEmpty(message.toString())) {
                // process the streamed file content (body elided in the original post)
            }
            else {
                log.info("No file found in the Remote location");
            }
        }
    };
}

@Bean
public ExpressionEvaluatingRequestHandlerAdvice afterChain() {
    ExpressionEvaluatingRequestHandlerAdvice advice = new ExpressionEvaluatingRequestHandlerAdvice();
    advice.setOnSuccessExpression(
            "@template.remove(headers['file_remoteDirectory'] + headers['file_remoteFile'])");
    // advice.setOnSuccessExpressionString("@template.remove(headers['file_remoteFile'])");
    advice.setPropagateEvaluationFailures(true);
    return advice;
}
Wherever I search, everyone suggests implementing ExpressionEvaluatingRequestHandlerAdvice, but it throws the error below.
2018-03-27 12:32:02.618 INFO 23216 --- [ask-scheduler-1] o.s.b.c.l.support.SimpleJobLauncher : Job: [FlowJob: [name=starsBatchJob]] completed with the following parameters: [{JobID=1522168322277}] and the following status: [COMPLETED]
2018-03-27 12:32:02.618 INFO 23216 --- [ask-scheduler-1] c.f.u.config.ParentBatchConfiguration : Job Status Completed
2018-03-27 12:32:02.618 INFO 23216 --- [ask-scheduler-1] c.f.u.config.ParentBatchConfiguration : Total time took for Stars Batch execution: 0 seconds.
2018-03-27 12:32:02.618 INFO 23216 --- [ask-scheduler-1] c.f.u.config.ParentBatchConfiguration : Batch Job lock is released
2018-03-27 12:32:02.633 INFO 23216 --- [ask-scheduler-1] com.jcraft.jsch : Disconnecting from hpchd1e.hpc.ford.com port 22
2018-03-27 12:32:02.633 ERROR 23216 --- [ask-scheduler-1] o.s.integration.handler.LoggingHandler : org.springframework.messaging.MessagingException: Dispatcher failed to deliver Message; nested exception is org.springframework.messaging.MessagingException: Failed to execute on session; nested exception is org.springframework.core.NestedIOException: Failed to remove file: 2: No such file; nested exception is 2
I had this problem. My path to the remote file was incorrect: I needed a trailing /. It is a little difficult to see, since the path is built inside a SpEL expression. You can see the path using the following in the handleMessage() method:
String remoteDirectory = (String) message.getHeaders().get("file_remoteDirectory");
String remoteFile = (String) message.getHeaders().get("file_remoteFile");
I did have to use advice.setOnSuccessExpressionString("@template.remove(headers['file_remoteFile'])"); (the line commented out above) instead of advice.setOnSuccessExpression("@template.remove(headers['file_remoteDirectory'] + headers['file_remoteFile'])");
It is incorrect in the documentation https://docs.spring.io/spring-integration/reference/html/sftp.html#sftp-streaming, which I believe is why people who struggle with this lose faith in the doc. But this seems to be the only error.
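If you do need the directory prefix, here is a hedged sketch of one way to guard against the missing trailing slash inside the SpEL expression itself (header names as in the question; this is an illustration, not the documented approach):

// Hypothetical variant: normalize the path so a missing trailing '/' in
// file_remoteDirectory cannot produce a "No such file" error.
advice.setOnSuccessExpressionString(
        "@template.remove(headers['file_remoteDirectory'].endsWith('/') ? " +
        "headers['file_remoteDirectory'] + headers['file_remoteFile'] : " +
        "headers['file_remoteDirectory'] + '/' + headers['file_remoteFile'])");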

My TCP client using Spring Integration is not able to get a response

I have created a TCP client using Spring Integration and I am able to receive a response for my sent message. But when I use LocalDateTime.now() to log the time, I no longer receive the response to the sent message. I know this can be solved by making the thread wait, but as I am new to Spring Integration, kindly help me with how to do it.
@Configuration
@ComponentScan
@EnableAutoConfiguration
public class Test {

    protected final Log logger = LogFactory.getLog(this.getClass());

    // **************** Client **********************************************

    @Bean
    public MessageChannel replyChannel() {
        return new DirectChannel();
    }

    @Bean
    public MessageChannel sendChannel() {
        return new DirectChannel();
    }

    @EnableIntegration
    @IntegrationComponentScan
    @Configuration
    public static class config {

        @MessagingGateway(defaultRequestChannel = "sendChannel", defaultReplyChannel = "replyChannel")
        public interface Gateway {

            String Send(String in);
        }
    }

    @Bean
    AbstractClientConnectionFactory tcpNetClientConnectionFactory() {
        AbstractClientConnectionFactory tcpNetClientConnectionFactory =
                new TcpNetClientConnectionFactory("localhost", 9999);
        tcpNetClientConnectionFactory.setSerializer(new UCCXImprovedSerializer());
        tcpNetClientConnectionFactory.setDeserializer(new UCCXImprovedSerializer());
        tcpNetClientConnectionFactory.setSingleUse(true);
        tcpNetClientConnectionFactory.setMapper(new TcpMessageMapper());
        return tcpNetClientConnectionFactory;
    }

    @Bean
    @ServiceActivator(inputChannel = "sendChannel")
    TcpOutboundGateway tcpOutboundGateway() {
        TcpOutboundGateway tcpOutboundGateway = new TcpOutboundGateway();
        tcpOutboundGateway.setConnectionFactory(tcpNetClientConnectionFactory());
        tcpOutboundGateway.setReplyChannel(replyChannel());
        return tcpOutboundGateway;
    }

    public static void main(String args[]) {
        // new LegaServer();
        ConfigurableApplicationContext applicationContext = SpringApplication.run(Test.class, args);
        String temp = applicationContext.getBean(Gateway.class).Send("kksingh");
        System.out.println(LocalDateTime.now() + "output" + temp);
        applicationContext.stop();
    }
}
My custom serializer and deserializer, the UCCXImprovedSerializer class, after updating as per @Gary:
public class UCCXImprovedSerializer implements Serializer<String>, Deserializer<String> {

    @Override
    public String deserialize(InputStream initialStream) throws IOException {
        System.out.println("deserializer called");
        StringBuilder sb = new StringBuilder();
        try (BufferedReader rdr = new BufferedReader(new InputStreamReader(initialStream))) {
            for (int c; (c = rdr.read()) != -1;) {
                sb.append((char) c);
            }
        }
        return sb.toString();
    }

    @Override
    public void serialize(String msg, OutputStream os) throws IOException {
        System.out.println(msg + "---serialize---" + Thread.currentThread().getName());
        os.write(msg.getBytes());
    }
}
My server code, listening on port 9999:
try {
    clientSocket = echoServer.accept();
    System.out.println("client connection established..");
    is = new DataInputStream(clientSocket.getInputStream());
    os = new PrintStream(clientSocket.getOutputStream());
    String tempString = "kksingh";
    byte[] tempStringByte = tempString.getBytes();
    byte[] temp = new byte[tempString.getBytes().length];
    while (true) {
        is.read(temp);
        System.out.println(new String(temp) + "--received msg is--- " + LocalDateTime.now());
        System.out.println(LocalDateTime.now() + "sending value");
        os.write(tempStringByte);
        break;
    }
}
catch (IOException e) {
    System.out.println(e);
}
My log file for the TCP client:
2017-06-04 23:10:14.771 INFO 15568 --- [ main] o.s.i.endpoint.EventDrivenConsumer : started org.springframework.integration.endpoint.EventDrivenConsumer@1f12e153
kksingh---serialize---main
pool-1-thread-1---deserialize----
pool-1-thread-1---deserialize----
pool-1-thread-1---deserialize----
pool-1-thread-1---deserialize----
2017-06-04 23:10:14.812 ERROR 15568 --- [pool-1-thread-1] o.s.i.ip.tcp.TcpOutboundGateway : Cannot correlate response - no pending reply for localhost:9999:57622:bc98ee29-8957-47bd-bd8a-f734c3ec3f9d
2017-06-04T23:10:14.809output
2017-06-04 23:10:14.821 INFO 15568 --- [ main] o.s.c.support.DefaultLifecycleProcessor : Stopping beans in phase 0
My log file for server side
client connection established..
kksingh--received msg is--- 2017-06-04T23:10:14.899
2017-06-04T23:10:14.899sending value
When I removed the LocalDateTime.now() calls from the server and the TCP client, I was able to get the response as outputkksingh:
o.s.i.endpoint.EventDrivenConsumer : Adding {logging-channel-adapter:_org.springframework.integration.errorLogger} as a subscriber to the 'errorChannel' channel
2017-06-05 12:46:32.494 INFO 29076 --- [ main] o.s.i.channel.PublishSubscribeChannel : Channel 'application.errorChannel' has 1 subscriber(s).
2017-06-05 12:46:32.495 INFO 29076 --- [ main] o.s.i.endpoint.EventDrivenConsumer : started _org.springframework.integration.errorLogger
2017-06-05 12:46:32.746 INFO 29076 --- [ main] s.b.c.e.t.TomcatEmbeddedServletContainer : Tomcat started on port(s): 8080 (http)
2017-06-05 12:46:32.753 INFO 29076 --- [ main] o.s.i.samples.tcpclientserver.Test : Started Test in 2.422 seconds (JVM running for 2.716)
2017-06-05 12:46:32.761 INFO 29076 --- [ main] o.s.i.endpoint.EventDrivenConsumer : Adding {bridge:null} as a subscriber to the 'replyChannel' channel
2017-06-05 12:46:32.762 INFO 29076 --- [ main] o.s.integration.channel.DirectChannel : Channel 'application.replyChannel' has 1 subscriber(s).
2017-06-05 12:46:32.763 INFO 29076 --- [ main] o.s.i.endpoint.EventDrivenConsumer : started org.springframework.integration.endpoint.EventDrivenConsumer@1f12e153
kksingh---serialize---main
pool-1-thread-1---deserialize----kksingh
outputkksingh
2017-06-05 12:46:32.837 INFO 29076 --- [ main] o.s.c.support.DefaultLifecycleProcessor : Stopping beans in phase 0
2017-06-05 12:46:32.839 INFO 29076 --- [ main] o.s.i.endpoint.EventDrivenConsumer : Removing {bridge:null} as a subscriber to the 'replyChannel' channel
2017-06-05 12:46:32.839 INFO 29076 --- [
Your deserializer is deserializing multiple packets...
pool-1-thread-1---deserialize----
pool-1-thread-1---deserialize----
pool-1-thread-1---deserialize----
pool-1-thread-1---deserialize----
Which produces 4 reply messages; the gateway can only handle one reply, which is why you see that ERROR message.
Your deserializer needs to be smarter than just capturing "available" bytes. You need something in the message to indicate the end of the data (or close the socket to indicate the end).
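One option, sketched below under the assumption that CRLF framing suits your protocol: Spring Integration ships framing-aware (de)serializers such as ByteArrayCrLfSerializer, which treats \r\n as the end of a message, so the gateway sees exactly one reply per request (the server must then terminate each reply with \r\n):

// Hedged sketch: swap the custom (de)serializer for a CRLF-framed one so
// each read yields exactly one complete message.
@Bean
AbstractClientConnectionFactory tcpNetClientConnectionFactory() {
    TcpNetClientConnectionFactory factory = new TcpNetClientConnectionFactory("localhost", 9999);
    ByteArrayCrLfSerializer codec = new ByteArrayCrLfSerializer();
    factory.setSerializer(codec);   // appends \r\n when sending
    factory.setDeserializer(codec); // reads until \r\n when receiving
    factory.setSingleUse(true);
    return factory;
}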
