I am trying to get data from Redis, but when it deserializes I get this error:
"Could not read JSON: Cannot deserialize instance of `com.xxxxx.adapters.output.cache.redis.data.flightdetail.FlightDetailCache` out of START_ARRAY token\n at [Source: (byte[])\"[{\"numSeatsLeft\":null,\"segments\":[{\"cabinClass\":\"N\",\"airlineClass\":\"Economy\". etc
This is my code:

RedisConfig:

@Bean
public ReactiveRedisOperations<String, FlightDetailCache> flightDetailRedisOperations(
        @Qualifier("connectionFactoryScheduleCluster") LettuceConnectionFactory lettuceConnectionFactory) {
    return this.reactiveRedisScheduleOperations(lettuceConnectionFactory, FlightDetailCache.class);
}

@Bean("fareRedisOperations")
public ReactiveRedisOperations<String, List<FareDetailCache>> fareRedisOperations(
        @Qualifier("connectionFactoryScheduleCluster") LettuceConnectionFactory lettuceConnectionFactory,
        ObjectMapper objectMapper) {
    JavaType type = objectMapper.getTypeFactory().constructCollectionType(ArrayList.class, FareDetailCache.class);
    return reactiveRedisScheduleOperations(lettuceConnectionFactory, type);
}
My Redis adapter:

@Autowired
public FareDetailRedisAdapter(
        @Qualifier("fareRedisOperations") ReactiveRedisOperations<String, List<FareDetailCache>> redisFare,
        @Qualifier("fareRedisOperations") ReactiveRedisOperations<String, List<FareDetailCache>> redisFareHash,
        FareDetailCacheMapper fareDetailCacheMapper) {
    this.redisFare = redisFare;
    this.redisFareHash = redisFareHash.opsForHash();
    this.fareDetailCacheMapper = fareDetailCacheMapper;
}

@Override
public Mono<List<FareDetail>> getFareDetail(String redisKey, String flightClass) {
    return redisFareHash.get(redisKey, flightClass)
            .map(fareDetailCaches -> fareDetailCacheMapper.fareDetailToDomain(fareDetailCaches))
            .switchIfEmpty(Mono.empty());
}
I expected this to deserialize to a FareDetailCache object, but it always tries to deserialize to FlightDetailCache.
Can anyone help me find the issue? I'd really appreciate it. Thank you.
Stack trace:
at com.xxx.xxx.air.xxx.xxx.xxx.application.ports.input.FareDetailServiceImpl.lambda$getFareDetailByFlightDetail$5(FareDetailServiceImpl.java:53) ~[main/:na]
at reactor.core.publisher.Mono.lambda$onErrorMap$30(Mono.java:3325) ~[reactor-core-3.3.17.RELEASE.jar:3.3.17.RELEASE]
at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onError(FluxOnErrorResume.java:88) ~[reactor-core-3.3.17.RELEASE.jar:3.3.17.RELEASE]
at org.springframework.cloud.sleuth.instrument.reactor.ScopePassingSpanSubscriber.onError(ScopePassingSpanSubscriber.java:97) ~[spring-cloud-sleuth-core-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onError(FluxMapFuseable.java:134) ~[reactor-core-3.3.17.RELEASE.jar:3.3.17.RELEASE]
at org.springframework.cloud.sleuth.instrument.reactor.ScopePassingSpanSubscriber.onError(ScopePassingSpanSubscriber.java:97) ~[spring-cloud-sleuth-core-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onError(MonoFlatMap.java:165) ~[reactor-core-3.3.17.RELEASE.jar:3.3.17.RELEASE]
at org.springframework.cloud.sleuth.instrument.reactor.ScopePassingSpanSubscriber.onError(ScopePassingSpanSubscriber.java:97) ~[spring-cloud-sleuth-core-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.onError(Operators.java:2059) ~[reactor-core-3.3.17.RELEASE.jar:3.3.17.RELEASE]
at org.springframework.cloud.sleuth.instrument.reactor.ScopePassingSpanSubscriber.onError(ScopePassingSpanSubscriber.java:97) ~[spring-cloud-sleuth-core-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at reactor.core.publisher.FluxMap$MapSubscriber.onError(FluxMap.java:126) ~[reactor-core-3.3.17.RELEASE.jar:3.3.17.RELEASE]
at org.springframework.cloud.sleuth.instrument.reactor.ScopePassingSpanSubscriber.onError(ScopePassingSpanSubscriber.java:97) ~[spring-cloud-sleuth-core-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at reactor.core.publisher.MonoNext$NextSubscriber.onError(MonoNext.java:87) ~[reactor-core-3.3.17.RELEASE.jar:3.3.17.RELEASE]
at org.springframework.cloud.sleuth.instrument.reactor.ScopePassingSpanSubscriber.onError(ScopePassingSpanSubscriber.java:97) ~[spring-cloud-sleuth-core-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at reactor.core.publisher.FluxUsingWhen$UsingWhenSubscriber.deferredError(FluxUsingWhen.java:408) ~[reactor-core-3.3.17.RELEASE.jar:3.3.17.RELEASE]
at reactor.core.publisher.FluxUsingWhen$RollbackInner.onComplete(FluxUsingWhen.java:485) ~[reactor-core-3.3.17.RELEASE.jar:3.3.17.RELEASE]
at org.springframework.cloud.sleuth.instrument.reactor.ScopePassingSpanSubscriber.onComplete(ScopePassingSpanSubscriber.java:104) ~[spring-cloud-sleuth-core-2.2.8.RELEASE.jar:2.2.8.RELEASE]
at reactor.core.publisher.MonoIgnoreElements$IgnoreElementsSubscriber.onComplete(MonoIgnoreElements.java:81) ~[reactor-core-3.3.17.RELEASE.jar:3.3.17.RELEASE]
at org.springframework.cloud.sleuth.instrument.reactor.ScopePassingSpanSubscriber.onComplete(ScopePassingSpanSubscriber.java:104) ~[spring-cloud-sleuth-core-2.2.8.RELEASE.jar:2.2.8.RELEASE]
Related
I have a Spring Boot Kafka consumer with the configuration below. I was trying manual acknowledgment instead of auto-commit, and with manual acknowledgment I started getting an error.
Spring Boot version is 2.7.2.
kafka.consumer.groupId=mcs-ccp-event
message.topic.name=mcs_ccp_test
kafka.bootstrapAddress=kafka-dev-app-a1.com:9092
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.listener.ack-mode=MANUAL_IMMEDIATE
and this is my consumer configuration:

@EnableKafka
@Configuration
@Slf4j
public class KafkaConsumerConfig {

    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    @Value(value = "${kafka.consumer.groupId}")
    private String groupId;

    public ConsumerFactory<String, Event> eventConsumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        //props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, "com.xxx.mcsccpkafkaconsumer.vo.Event");
        props.put(JsonDeserializer.USE_TYPE_INFO_HEADERS, false);
        return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(Event.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Event> eventKafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Event> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(eventConsumerFactory());
        return factory;
    }
}
This is my listener:

@KafkaListener(topics = "${message.topic.name}", containerFactory = "eventKafkaListenerContainerFactory", groupId = "${kafka.consumer.groupId}")
public void eventListener(@Payload Event event, @Header(KafkaHeaders.RECEIVED_PARTITION_ID) int partition,
        Acknowledgment acknowledgment) {
    log.info("Received event message: {} from partition : {}", event, partition);
    persistEventToDB(event);
    acknowledgment.acknowledge();
    this.eventLatch.countDown();
}
Whenever the consumer receives a message from the producer, it always throws this error:
2022-08-06 11:16:11.749 ERROR 37700 --- [ntainer#0-0-C-1] o.s.kafka.listener.DefaultErrorHandler : Backoff none exhausted for mcs__ccp-1@122
org.springframework.kafka.listener.ListenerExecutionFailedException: invokeHandler Failed; nested exception is java.lang.IllegalStateException: No Acknowledgment available as an argument, the listener container must have a MANUAL AckMode to populate the Acknowledgment.; nested exception is java.lang.IllegalStateException: No Acknowledgment available as an argument, the listener container must have a MANUAL AckMode to populate the Acknowledgment.
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.decorateException(KafkaMessageListenerContainer.java:2713) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:2683) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:2643) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:2570) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:2451) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:2329) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:2000) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeIfHaveRecords(KafkaMessageListenerContainer.java:1373) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1364) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1255) ~[spring-kafka-2.8.8.jar:2.8.8]
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[na:na]
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[na:na]
at java.base/java.lang.Thread.run(Thread.java:829) ~[na:na]
Suppressed: org.springframework.kafka.listener.ListenerExecutionFailedException: Restored Stack Trace
at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.checkAckArg(MessagingMessageListenerAdapter.java:369) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:352) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:92) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:53) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:2663) ~[spring-kafka-2.8.8.jar:2.8.8]
Caused by: java.lang.IllegalStateException: No Acknowledgment available as an argument, the listener container must have a MANUAL AckMode to populate the Acknowledgment.
at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.checkAckArg(MessagingMessageListenerAdapter.java:369) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:352) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:92) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:53) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:2663) ~[spring-kafka-2.8.8.jar:2.8.8]
... 11 common frames omitted
Caused by: org.springframework.messaging.converter.MessageConversionException: Cannot handle message; nested exception is org.springframework.messaging.converter.MessageConversionException: Cannot convert from [com.xxx.mcsccpkafkaconsumer.vo.Event] to [org.springframework.kafka.support.Acknowledgment] for GenericMessage [payload=Event(eventType=Download, timestamp=2022-08-05 19:11:12, username=xxxxx, browser=Chrome, eventDetails=EventDetails(objectName=VW_Attachment, recordType=ELA,EULA, agreementStatus=null, searchCategory=null, searchKeyword=null, downloadType=PDF, templateId=null, fileName=null, agreementNumber=null)), headers={kafka_offset=122, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer#5c5b32a5, kafka_timestampType=CREATE_TIME, kafka_receivedPartitionId=1, kafka_receivedTopic=mcs_ccp_test, kafka_receivedTimestamp=1659764771420, kafka_groupId=mcs-ccp-event}], failedMessage=GenericMessage [payload=Event(eventType=Download, timestamp=2022-08-05 19:11:12, username=xxxx, browser=Chrome, eventDetails=EventDetails(objectName=VW_Attachment, recordType=ELA,EULA, agreementStatus=null, searchCategory=null, searchKeyword=null, downloadType=PDF, templateId=null, fileName=null, agreementNumber=null)), headers={kafka_offset=122, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer#5c5b32a5, kafka_timestampType=CREATE_TIME, kafka_receivedPartitionId=1, kafka_receivedTopic=mcs_ccp_test, kafka_receivedTimestamp=1659764771420, kafka_groupId=mcs-ccp-event}]
... 15 common frames omitted
Caused by: org.springframework.messaging.converter.MessageConversionException: Cannot convert from [com.xxx.mcsccpkafkaconsumer.vo.Event] to [org.springframework.kafka.support.Acknowledgment] for GenericMessage [payload=Event(eventType=Download, timestamp=2022-08-05 19:11:12, username=xxxxxx, browser=Chrome, eventDetails=EventDetails(objectName=VW_Attachment, recordType=ELA,EULA, agreementStatus=null, searchCategory=null, searchKeyword=null, downloadType=PDF, templateId=null, fileName=null, agreementNumber=null)), headers={kafka_offset=122, kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer#5c5b32a5, kafka_timestampType=CREATE_TIME, kafka_receivedPartitionId=1, kafka_receivedTopic=mcs_ccp_test, kafka_receivedTimestamp=1659764771420, kafka_groupId=mcs-ccp-event}]
at org.springframework.messaging.handler.annotation.support.PayloadMethodArgumentResolver.resolveArgument(PayloadMethodArgumentResolver.java:145) ~[spring-messaging-5.3.22.jar:5.3.22]
at org.springframework.kafka.annotation.KafkaNullAwarePayloadArgumentResolver.resolveArgument(KafkaNullAwarePayloadArgumentResolver.java:46) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.messaging.handler.invocation.HandlerMethodArgumentResolverComposite.resolveArgument(HandlerMethodArgumentResolverComposite.java:118) ~[spring-messaging-5.3.22.jar:5.3.22]
at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.getMethodArgumentValues(InvocableHandlerMethod.java:147) ~[spring-messaging-5.3.22.jar:5.3.22]
at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:115) ~[spring-messaging-5.3.22.jar:5.3.22]
at org.springframework.kafka.listener.adapter.HandlerAdapter.invoke(HandlerAdapter.java:56) ~[spring-kafka-2.8.8.jar:2.8.8]
at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:347) ~[spring-kafka-2.8.8.jar:2.8.8]
... 14 common frames omitted
You're setting the ack-mode and auto-offset-reset in the properties file, which Spring Boot's auto-configuration uses to set up its own KafkaListenerContainerFactory.
But since you declare your own KafkaListenerContainerFactory bean, auto-configuration backs off and your programmatic configuration is used instead.
You can either keep all consumer settings in the properties file and let Spring Boot create the beans (then the KafkaConsumerConfig class is not needed at all), or set the ack mode and auto-offset-reset directly in the factory bean you're declaring instead of in the properties file.
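For the second option, a minimal sketch (reusing the factory bean from the question; ContainerProperties comes from spring-kafka) might look like this:

```java
// Sketch: set the ack mode programmatically so the Acknowledgment
// argument is populated, since the property-file setting is ignored
// once auto-configuration backs off.
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Event> eventKafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, Event> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(eventConsumerFactory());
    // Equivalent of spring.kafka.listener.ack-mode=MANUAL_IMMEDIATE:
    factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
    return factory;
}
```

Likewise, auto-offset-reset would need to be added to the consumer factory's props map (props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")), because the spring.kafka.consumer.* properties are not applied to a hand-built DefaultKafkaConsumerFactory.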
I'm trying to get messages from a Kafka topic, but for some reason I get the following error:
2022-06-28 14:17:52.044 INFO 1 --- [ntainer#0-0-C-1] o.a.k.clients.consumer.KafkaConsumer : [Consumer clientId=consumer-api1-1, groupId=api1] Seeking to offset 1957 for partition ActiveProxySources-0
2022-06-28 14:17:52.687 ERROR 1 --- [ntainer#0-0-C-1] o.s.kafka.listener.DefaultErrorHandler : Backoff none exhausted for ActiveProxySources-0@1957

org.springframework.kafka.listener.ListenerExecutionFailedException: Listener failed; nested exception is org.springframework.kafka.support.serializer.DeserializationException: failed to deserialize; nested exception is org.springframework.messaging.converter.MessageConversionException: failed to resolve class name. Class not found [com.freeproxy.parser.model.kafka.KafkaMessage]; nested exception is java.lang.ClassNotFoundException: com.freeproxy.parser.model.kafka.KafkaMessage
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.decorateException(KafkaMessageListenerContainer.java:2691) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.checkDeser(KafkaMessageListenerContainer.java:2738) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:2612) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:2544) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:2429) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:2307) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:1981) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeIfHaveRecords(KafkaMessageListenerContainer.java:1365) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1356) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1251) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[na:na]
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[na:na]
at java.base/java.lang.Thread.run(Thread.java:829) ~[na:na]
Caused by: org.springframework.kafka.support.serializer.DeserializationException: failed to deserialize; nested exception is org.springframework.messaging.converter.MessageConversionException: failed to resolve class name. Class not found [com.freeproxy.parser.model.kafka.KafkaMessage]; nested exception is java.lang.ClassNotFoundException: com.freeproxy.parser.model.kafka.KafkaMessage
at org.springframework.kafka.support.serializer.SerializationUtils.deserializationException(SerializationUtils.java:150) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.support.serializer.ErrorHandlingDeserializer.deserialize(ErrorHandlingDeserializer.java:204) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:1420) ~[kafka-clients-3.0.1.jar!/:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.access$3400(Fetcher.java:134) ~[kafka-clients-3.0.1.jar!/:na]
at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.fetchRecords(Fetcher.java:1652) ~[kafka-clients-3.0.1.jar!/:na]
at org.apache.kafka.clients.consumer.internals.Fetcher$CompletedFetch.access$1800(Fetcher.java:1488) ~[kafka-clients-3.0.1.jar!/:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:721) ~[kafka-clients-3.0.1.jar!/:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:672) ~[kafka-clients-3.0.1.jar!/:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1304) ~[kafka-clients-3.0.1.jar!/:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1238) ~[kafka-clients-3.0.1.jar!/:na]
at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1211) ~[kafka-clients-3.0.1.jar!/:na]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollConsumer(KafkaMessageListenerContainer.java:1521) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doPoll(KafkaMessageListenerContainer.java:1511) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1339) ~[spring-kafka-2.8.5.jar!/:2.8.5]
... 4 common frames omitted
Caused by: org.springframework.messaging.converter.MessageConversionException: failed to resolve class name. Class not found [com.freeproxy.parser.model.kafka.KafkaMessage]; nested exception is java.lang.ClassNotFoundException: com.freeproxy.parser.model.kafka.KafkaMessage
at org.springframework.kafka.support.mapping.DefaultJackson2JavaTypeMapper.getClassIdType(DefaultJackson2JavaTypeMapper.java:142) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.support.mapping.DefaultJackson2JavaTypeMapper.toJavaType(DefaultJackson2JavaTypeMapper.java:103) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.support.serializer.JsonDeserializer.deserialize(JsonDeserializer.java:572) ~[spring-kafka-2.8.5.jar!/:2.8.5]
at org.springframework.kafka.support.serializer.ErrorHandlingDeserializer.deserialize(ErrorHandlingDeserializer.java:201) ~[spring-kafka-2.8.5.jar!/:2.8.5]
... 16 common frames omitted
Caused by: java.lang.ClassNotFoundException: com.freeproxy.parser.model.kafka.KafkaMessage
at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:476) ~[na:na]
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589) ~[na:na]
at org.springframework.boot.loader.LaunchedURLClassLoader.loadClass(LaunchedURLClassLoader.java:151) ~[java.jar:na]
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522) ~[na:na]
at java.base/java.lang.Class.forName0(Native Method) ~[na:na]
at java.base/java.lang.Class.forName(Class.java:398) ~[na:na]
at org.springframework.util.ClassUtils.forName(ClassUtils.java:284) ~[spring-core-5.3.19.jar!/:5.3.19]
at org.springframework.kafka.support.mapping.DefaultJackson2JavaTypeMapper.getClassIdType(DefaultJackson2JavaTypeMapper.java:138) ~[spring-kafka-2.8.5.jar!/:2.8.5]
... 19 common frames omitted
I have other applications that send and read messages on Kafka topics with the same settings, and they all work fine, but not this application. Ideally I want to read messages from two Kafka topics (messages in both topics look the same and contain the same objects), but even when I try to read from a single topic I get the error shown above.
The settings are as follows:
class KafkaMessage {
    String id
    IdStatus status
}

@Service
@Slf4j
class ConsumerService {
    Set<String> activeProxies = []

    int getActiveProxiesNumber() {
        activeProxies.size()
    }

    Set<String> activeProxySources = []

    int getActiveProxySourcesNumber() {
        activeProxySources.size()
    }

    @KafkaListener(topics = "ActiveProxies"/*, containerFactory = "KafkaListenerContainerFactoryActiveProxies"*/)
    public void consumeProxyId(KafkaMessage message) {
        log.info("Consuming ${message.id}: ${message.status}")
        if (message.status == IdStatus.ADD) {
            activeProxies.add(message.id)
        }
        if (message.status == IdStatus.DELETE) {
            activeProxies.remove(message.id)
        }
    }

    @KafkaListener(topics = "ActiveProxySources"/*, containerFactory = "KafkaListenerContainerFactoryActiveProxySources"*/)
    public void consumeProxySourceId(KafkaMessage message) {
        log.info("Consuming ${message.id}: ${message.status}")
        if (message.status == IdStatus.ADD) {
            activeProxySources.add(message.id)
        }
        if (message.status == IdStatus.DELETE) {
            activeProxySources.remove(message.id)
        }
    }
}
TopicConfig:
@Configuration
public class TopicConfig {
    @Value(value = "kafka:9092")
    private String bootstrapAddress

    @Value(value = "ActiveProxies")
    private String activeProxies

    @Value(value = "ActiveProxySources")
    private String activeProxySources

    // @Bean
    // public KafkaAdmin kafkaAdmin() {
    //     Map<String, Object> configs = new HashMap<>();
    //     configs.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
    //     return new KafkaAdmin(configs);
    // }

    @Bean
    public NewTopic ActiveProxiesTopic() {
        return TopicBuilder.name(activeProxies)
                .partitions(1)
                .replicas(1)
                .config(org.apache.kafka.common.config.TopicConfig.RETENTION_MS_CONFIG, "60000")
                .build()
    }

    @Bean
    public NewTopic ActiveProxySourcesTopic() {
        return TopicBuilder.name(activeProxySources)
                .partitions(1)
                .replicas(1)
                .config(org.apache.kafka.common.config.TopicConfig.RETENTION_MS_CONFIG, "60000")
                .build()
    }
}
application.properties file:
server.port=30329
spring.data.mongodb.database=free-proxy-engine
spring.kafka.bootstrap-servers=kafka:9092
spring.kafka.consumer.group-id=consumer-Api1
spring.kafka.consumer.properties.spring.json.trusted.packages=*
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.enable-auto-commit=false
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
spring.kafka.consumer.properties.spring.deserializer.value.delegate.class=org.springframework.kafka.support.serializer.JsonDeserializer
I use Docker Compose to run Kafka and all the other applications.
docker-compose.yaml:
version: '2'
services:
mongodb:
image: mongo:5.0.9
restart: unless-stopped
api:
image: openjdk:11
depends_on:
- mongodb
- kafka
restart: unless-stopped
volumes:
- ./libs/api-0.0.1-SNAPSHOT.jar:/gjava/java.jar
environment:
spring_data_mongodb_host: mongodb
spring_kafka_consumer_group-id: api1
command: /bin/bash -c "cd /gjava && chmod +x /gjava/*.jar && java -jar /gjava/java.jar"
ports:
- 30329:30329
zookeeper:
image: confluentinc/cp-zookeeper
container_name: zookeeper
ports:
- "2181:2181"
environment:
ZOOKEEPER_CLIENT_PORT: 2181
ZOOKEEPER_TICK_TIME: 2000
kafka:
image: confluentinc/cp-kafka
restart: always
hostname: kafka
depends_on:
- zookeeper
container_name: kafka
ports:
- "9092:9092"
environment:
KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092
KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
I also created my own consumer configuration for Kafka, but the error remained, whether I read messages from two topics or from one.
@EnableKafka
@Configuration
class KafkaConsumerConfig {
    @Value(value = "kafka:9092")
    private String bootstrapAddress

    @Bean
    public ConsumerFactory<String, KafkaMessage> ConsumerFactoryActiveProxies() {
        Map<String, Object> props = new HashMap<>()
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress)
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "Api-1")
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false")
        props.put(JsonDeserializer.TRUSTED_PACKAGES, "*")
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class)
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class)
        props.put(ErrorHandlingDeserializer.KEY_DESERIALIZER_CLASS, JsonDeserializer.class)
        props.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class.getName())
        return new DefaultKafkaConsumerFactory<>(props/*,
                new StringDeserializer(),
                new JsonDeserializer<>(KafkaMessage.class)*/)
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, KafkaMessage> KafkaListenerContainerFactoryActiveProxies() {
        ConcurrentKafkaListenerContainerFactory<String, KafkaMessage> factory
                = new ConcurrentKafkaListenerContainerFactory<>()
        factory.setConsumerFactory(ConsumerFactoryActiveProxies())
        factory.setMessageConverter(new StringJsonMessageConverter())
        return factory
    }

    @Bean
    public ConsumerFactory<String, KafkaMessage> ConsumerFactoryActiveProxySources() {
        Map<String, Object> props = new HashMap<>()
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress)
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "Api-2")
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false")
        props.put(JsonDeserializer.TRUSTED_PACKAGES, "*")
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class)
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class)
        props.put(ErrorHandlingDeserializer.KEY_DESERIALIZER_CLASS, JsonDeserializer.class)
        props.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class.getName())
        return new DefaultKafkaConsumerFactory<>(props/*,
                new StringDeserializer(),
                new JsonDeserializer<>(KafkaMessage.class)*/)
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, KafkaMessage> KafkaListenerContainerFactoryActiveProxySources() {
        ConcurrentKafkaListenerContainerFactory<String, KafkaMessage> factory
                = new ConcurrentKafkaListenerContainerFactory<>()
        factory.setConsumerFactory(ConsumerFactoryActiveProxySources())
        factory.setMessageConverter(new StringJsonMessageConverter())
        return factory
    }
}
I will be grateful for your help.
By default, the JsonDeserializer uses type information from the record headers to determine which type to create.
Caused by: java.lang.ClassNotFoundException: com.freeproxy.parser.model.kafka.KafkaMessage
Most likely, KafkaMessage is in a different package on the sending side.
There are a couple of solutions (see https://docs.spring.io/spring-kafka/docs/current/reference/html/#serdes-json-config):
Set JsonDeserializer.USE_TYPE_INFO_HEADERS to false and JsonDeserializer.VALUE_DEFAULT_TYPE to com.new.package.kafka.KafkaMessage (the fully qualified name of KafkaMessage on the receiving side).
Or use type mapping: https://docs.spring.io/spring-kafka/docs/current/reference/html/#serdes-mapping-types
I suggest you read the whole JSON serde section: https://docs.spring.io/spring-kafka/docs/current/reference/html/#json-serde
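A minimal sketch of the first option, applied to the consumer factory props from the question (using KafkaMessage.class.getName() so the receiving-side package is picked up automatically):

```java
// Sketch: ignore the sender's type headers and fall back to the local
// KafkaMessage class, so the producer's package name no longer matters.
props.put(JsonDeserializer.USE_TYPE_INFO_HEADERS, false);
props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, KafkaMessage.class.getName());
```

The same can be expressed purely in application.properties, which fits the property-driven setup also shown in the question:
spring.kafka.consumer.properties.spring.json.use.type.headers=false
spring.kafka.consumer.properties.spring.json.value.default.type=<fully qualified name of the receiving-side KafkaMessage>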
When I add the @Transactional annotation to a service suspend function that is called by the handler, I get the following error. If I leave the annotation off, the code works as expected, but then it cannot roll back on error.
either comes from arrow-kt core.
asHandlerFunction is used as a bridge to be able to document the APIs.
Any idea what is happening?
Entities and repositories are placed under io.x.a. The service is inside io.x. Repository scanning covers only io.x.a.
Error:
java.lang.IllegalArgumentException: object is not an instance of declaring class
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException:
Error has been observed at the following site(s):
|_ checkpoint ⇢ org.springframework.boot.actuate.metrics.web.reactive.server.MetricsWebFilter [DefaultWebFilterChain]
|_ checkpoint ⇢ HTTP POST "/api/assets/130473/one-second-resolution" [ExceptionHandlingWebHandler]
Stack trace:
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at kotlin.reflect.jvm.internal.calls.InlineClassAwareCaller.call(InlineClassAwareCaller.kt:134)
at kotlin.reflect.jvm.internal.KCallableImpl.call(KCallableImpl.kt:108)
at kotlin.reflect.full.KCallables.callSuspend(KCallables.kt:55)
at org.springframework.core.CoroutinesUtils$invokeSuspendingFunction$mono$1.invokeSuspend(CoroutinesUtils.kt:64)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.internal.DispatchedContinuationKt.resumeCancellableWith(DispatchedContinuation.kt:377)
at kotlinx.coroutines.intrinsics.CancellableKt.startCoroutineCancellable(Cancellable.kt:30)
at kotlinx.coroutines.intrinsics.CancellableKt.startCoroutineCancellable$default(Cancellable.kt:25)
at kotlinx.coroutines.CoroutineStart.invoke(CoroutineStart.kt:110)
at kotlinx.coroutines.AbstractCoroutine.start(AbstractCoroutine.kt:126)
at kotlinx.coroutines.reactor.MonoKt.monoInternal$lambda-2(Mono.kt:90)
at reactor.core.publisher.MonoCreate.subscribe(MonoCreate.java:57)
at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64)
at reactor.core.publisher.MonoUsingWhen.subscribe(MonoUsingWhen.java:87)
at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64)
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:157)
at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onNext(FluxSwitchIfEmpty.java:73)
at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:127)
at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1815)
at reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1815)
at reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1815)
at reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.complete(MonoIgnoreThen.java:284)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onNext(MonoIgnoreThen.java:187)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:232)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onComplete(MonoIgnoreThen.java:203)
at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onComplete(MonoPeekTerminal.java:299)
at reactor.core.publisher.MonoIgnoreElements$IgnoreElementsSubscriber.onComplete(MonoIgnoreElements.java:88)
at reactor.core.publisher.MonoIgnoreElements$IgnoreElementsSubscriber.onComplete(MonoIgnoreElements.java:88)
at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.onComplete(Operators.java:2057)
at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1816)
at reactor.core.publisher.MonoFlatMap$FlatMapInner.onNext(MonoFlatMap.java:249)
at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onNext(FluxOnErrorResume.java:79)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.complete(MonoIgnoreThen.java:284)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onNext(MonoIgnoreThen.java:187)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:232)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onComplete(MonoIgnoreThen.java:203)
at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onComplete(MonoPeekTerminal.java:299)
at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onComplete(MonoIgnoreThen.java:209)
at reactor.core.publisher.FluxPeek$PeekSubscriber.onComplete(FluxPeek.java:259)
at reactor.core.publisher.MonoIgnoreElements$IgnoreElementsSubscriber.onComplete(MonoIgnoreElements.java:88)
at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1816)
at reactor.core.publisher.MonoCompletionStage.lambda$subscribe$0(MonoCompletionStage.java:82)
at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at com.github.jasync.sql.db.util.FutureUtilsKt.success(FutureUtils.kt:16)
at com.github.jasync.sql.db.mysql.MySQLConnection$succeedQueryPromise$1.accept(MySQLConnection.kt:344)
at com.github.jasync.sql.db.mysql.MySQLConnection$succeedQueryPromise$1.accept(MySQLConnection.kt:54)
at java.base/java.util.Optional.ifPresent(Optional.java:183)
at com.github.jasync.sql.db.mysql.MySQLConnection.succeedQueryPromise(MySQLConnection.kt:343)
at com.github.jasync.sql.db.mysql.MySQLConnection.onOk(MySQLConnection.kt:218)
at com.github.jasync.sql.db.mysql.codec.MySQLConnectionHandler.channelRead0(MySQLConnectionHandler.kt:119)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:324)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:296)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:719)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:655)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:581)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at java.base/java.lang.Thread.run(Thread.java:829)
Router:
@Bean
fun router(...): RouterFunction<ServerResponse> {
return route()
.POST(
"...",
asHandlerFunction { createNewOneSecondResolutionSession(it) })
.build()
}
private fun asHandlerFunction(init: suspend (ServerRequest) -> ServerResponse) = HandlerFunction {
mono(Dispatchers.Unconfined) {
init(it)
}
}
Handler:
private suspend fun theFun(req: ServerRequest): ServerResponse {
val a = ...
val b = ...
return service.theFun(a, b).fold(
{ error ->
internalServerErrorResponse("Client user already exists.")
},
{ ServerResponse.ok().bodyValueAndAwait(it) }
)
}
Service:
@Transactional("tm1")
suspend fun theFun(
    a: A,
    b: B
): Either<Error, Result> = either {
    val user = userService.createNewUser(username = "test", password = "pw")
        .mapLeft {
            log
            Error
        }
        .bind()
    throw RuntimeException("xx")
}
Persistence Configuration:
@Configuration
@EnableR2dbcRepositories(
    basePackages = ["io.x.a"],
    entityOperationsRef = "operations1"
)
class PersistenceConfig1(
    @Value("\${spring.datasource.d1.r2dbcUrl}") private val r2dbcUrl: String
) {
    @Bean
    @Qualifier("d1")
    fun connectionFactory(): ConnectionFactory {
        return ConnectionFactories.get(ConnectionFactoryOptions.parse(r2dbcUrl))
    }
    @Bean
    fun r2dbcEntityOperations(@Qualifier("d1") connectionFactory: ConnectionFactory): R2dbcEntityOperations {
        val databaseClient = DatabaseClient.create(connectionFactory)
        return R2dbcEntityTemplate(databaseClient, DefaultReactiveDataAccessStrategy(MySqlDialect.INSTANCE))
    }
    @Bean("tm1")
    fun transactionManager(@Qualifier("d1") connectionFactory: ConnectionFactory): ReactiveTransactionManager {
        return R2dbcTransactionManager(connectionFactory)
    }
}
I found a solution using TransactionalOperator.
@Bean("transactionalOperator1")
fun transactionalOperator1(@Qualifier("tm1") transactionManager: ReactiveTransactionManager): TransactionalOperator {
    return TransactionalOperator.create(transactionManager)
}
@Bean("transactionalOperator2")
fun transactionalOperator2(@Qualifier("tm2") transactionManager: ReactiveTransactionManager): TransactionalOperator {
    return TransactionalOperator.create(transactionManager)
}
@Qualifier("transactionalOperator1") private val transactionalOperator1: TransactionalOperator,
@Qualifier("transactionalOperator2") private val transactionalOperator2: TransactionalOperator,
suspend fun theFun(...): Either<Error, Result> =
    transactionalOperator1.executeAndAwait { t1 ->
        transactionalOperator2.executeAndAwait { t2 ->
            ...
                .handleErrorWith {
                    t1.setRollbackOnly()
                    t2.setRollbackOnly()
                }
        }
    }!!
At the end of executeAndAwait it commits the code. If the rollbackOnly flag is set then it will instruct the ReactiveTransactionManager to roll back the changes.
I created a Spring Boot multi-module project with Lettuce.
My config:
@Bean
public LettuceConnectionFactory lettuceConnectionFactory() {
    return new LettuceConnectionFactory(host, port);
}
@Bean
public RedisTemplate<String, FullMessage> redisTemplate() {
    RedisTemplate<String, FullMessage> template = new RedisTemplate<>();
    template.setConnectionFactory(lettuceConnectionFactory());
    return template;
}
I use the RedisTemplate from another module asynchronously:
@Async
@EventListener(ApplicationReadyEvent.class)
public void check() {
while (true){
try{
List<ObjectRecord<String, FullMessage>> records = redisTemplate
.opsForStream()
.read(FullMessage.class, StreamOffset.fromStart(key));
...
}
This code doesn't work in the multi-module project, but it works in another (non-multi-module) project.
I am getting the following error, repeating indefinitely:
java.lang.IllegalStateException: Cannot connect, Event executor group is terminated.
at io.lettuce.core.AbstractRedisClient.initializeChannelAsync(AbstractRedisClient.java:283)
at io.lettuce.core.RedisClient.connectStatefulAsync(RedisClient.java:314)
at io.lettuce.core.RedisClient.connectStandaloneAsync(RedisClient.java:271)
at io.lettuce.core.RedisClient.connect(RedisClient.java:204)
at org.springframework.data.redis.connection.lettuce.StandaloneConnectionProvider.lambda$getConnection$1(StandaloneConnectionProvider.java:115)
at java.base/java.util.Optional.orElseGet(Optional.java:369)
at org.springframework.data.redis.connection.lettuce.StandaloneConnectionProvider.getConnection(StandaloneConnectionProvider.java:115)
at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory$SharedConnection.getNativeConnection(LettuceConnectionFactory.java:1197)
at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory$SharedConnection.getConnection(LettuceConnectionFactory.java:1178)
at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory.getSharedConnection(LettuceConnectionFactory.java:942)
at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory.getConnection(LettuceConnectionFactory.java:353)
at org.springframework.data.redis.core.RedisConnectionUtils.doGetConnection(RedisConnectionUtils.java:132)
at org.springframework.data.redis.core.RedisConnectionUtils.getConnection(RedisConnectionUtils.java:95)
at org.springframework.data.redis.core.RedisConnectionUtils.getConnection(RedisConnectionUtils.java:82)
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:215)
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:188)
at org.springframework.data.redis.core.AbstractOperations.execute(AbstractOperations.java:96)
at org.springframework.data.redis.core.DefaultStreamOperations.read(DefaultStreamOperations.java:217)
at org.springframework.data.redis.core.StreamOperations.read(StreamOperations.java:319)
at org.springframework.data.redis.core.StreamOperations.read(StreamOperations.java:290)
at uz.test.websocket.config.MainFIFOListener.check(MainFIFOListener.java:49)
at uz.test.websocket.config.MainFIFOListener$$FastClassBySpringCGLIB$$a659b30d.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:749)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
at org.springframework.aop.interceptor.AsyncExecutionInterceptor.lambda$invoke$0(AsyncExecutionInterceptor.java:115)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
I am trying to filter incoming mail messages: if the mail body contains, for example, 'github.com', the application will not reply automatically. So I tried writing an IntegrationFlow for that (please see the code below). I am not sure how to handle this, since it seems the Transformer cannot open the inbox folder, which looks like a Java Mail API issue?
@Bean
open fun flow(): IntegrationFlow
{
return IntegrationFlows
.from("emailReceiveChannel")
.transform(transformer())
.filter("#messageFilter.containsDomainNames('payload')")
.handle(MessageHandler(MailServiceImpl(javaMailSender(), mailStore())))
.get()
}
where transformer() is:
@Bean
@Transformer(inputChannel = "emailReceiveChannel", outputChannel = "outputChannel")
open fun transformer(): org.springframework.integration.transformer.Transformer
{
return MailToStringTransformer()
}
and the messageFilter:
@Component
class MessageFilter
{
@Filter
open fun containsDomainNames(messageBody: String): Boolean
{
return messageBody.contains("github.com") ||
messageBody.contains("trello.com") ||
messageBody.contains("bitbucket.com")
}
}
The rest of the configuration:
@Bean
@InboundChannelAdapter(autoStartup = "true", value = "emailReceiveChannel", poller = (arrayOf(Poller(fixedDelay = "10000", maxMessagesPerPoll = "10"))))
open fun mailReceivingMessageSource(mailReceiver: ImapMailReceiver): MailReceivingMessageSource
{
return MailReceivingMessageSource(mailReceiver)
}
@Bean
open fun pollingConsumer(): PollingConsumer
{
return PollingConsumer(emailReceiveChannel(), MessageHandler(MailServiceImpl(javaMailSender(), mailStore())))
}
@Bean
open fun outputChannel(): PollableChannel
{
return QueueChannel()
}
@Bean
open fun emailReceiveChannel(): PollableChannel
{
return QueueChannel(10)
}
Stacktrace:
2018-01-20 13:05:53.840 ERROR 8204 --- [ask-scheduler-6] o.s.integration.handler.LoggingHandler : org.springframework.integration.transformer.MessageTransformationException: failed to transform mail message; nested exception is javax.mail.FolderClosedException, failedMessage=GenericMessage [payload=org.springframework.integration.mail.AbstractMailReceiver$IntegrationMimeMessage#3f0ef4a2, headers={id=e9eff05d-e9af-70e4-73f7-c4d39c740291, timestamp=1516449953838}]
at org.springframework.integration.mail.transformer.AbstractMailMessageTransformer.transform(AbstractMailMessageTransformer.java:83)
at org.springframework.integration.transformer.MessageTransformingHandler.handleRequestMessage(MessageTransformingHandler.java:89)
at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:109)
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:131)
at org.springframework.integration.endpoint.PollingConsumer.handleMessage(PollingConsumer.java:129)
at org.springframework.integration.endpoint.AbstractPollingEndpoint.doPoll(AbstractPollingEndpoint.java:271)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$Poller.lambda$run$0(AbstractPollingEndpoint.java:372)
at org.springframework.integration.util.ErrorHandlingTaskExecutor.lambda$execute$0(ErrorHandlingTaskExecutor.java:53)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
at org.springframework.integration.util.ErrorHandlingTaskExecutor.execute(ErrorHandlingTaskExecutor.java:51)
at org.springframework.integration.endpoint.AbstractPollingEndpoint$Poller.run(AbstractPollingEndpoint.java:366)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at org.springframework.scheduling.concurrent.ReschedulingRunnable.run(ReschedulingRunnable.java:83)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: javax.mail.FolderClosedException
at com.sun.mail.imap.IMAPMessage.getProtocol(IMAPMessage.java:153)
at com.sun.mail.imap.IMAPBodyPart.loadHeaders(IMAPBodyPart.java:390)
at com.sun.mail.imap.IMAPBodyPart.getNonMatchingHeaderLines(IMAPBodyPart.java:371)
at javax.mail.internet.MimeBodyPart.writeTo(MimeBodyPart.java:1536)
at javax.mail.internet.MimeBodyPart.writeTo(MimeBodyPart.java:948)
at javax.mail.internet.MimeMultipart.writeTo(MimeMultipart.java:538)
at org.springframework.integration.mail.transformer.MailToStringTransformer.doTransform(MailToStringTransformer.java:62)
at org.springframework.integration.mail.transformer.AbstractMailMessageTransformer.transform(AbstractMailMessageTransformer.java:80)
... 19 more
I had the same problem in the mail-attachments project of https://github.com/spring-projects/spring-integration.
I resolved it with
<dependency>
<groupId>org.springframework.integration</groupId>
<artifactId>spring-integration-bom</artifactId>
<version>4.3.19.RELEASE</version>
<scope>import</scope>
<type>pom</type>
</dependency>
instead of
<dependency>
<groupId>org.springframework.integration</groupId>
<artifactId>spring-integration-bom</artifactId>
<version>5.1.3.RELEASE</version>
<scope>import</scope>
<type>pom</type>
</dependency>
EDIT: It also works with version 5.1.3.RELEASE, but you must set simple-content="true" on the mail:inbound-channel-adapter.
This issue corresponds to the integration library update, which is why downgrading the library helps; the right way, though, is to set one more mail receiver property: simpleContent.
Please read the docs: https://docs.spring.io/spring-integration/api/org/springframework/integration/mail/AbstractMailReceiver.html#setSimpleContent-boolean-
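For the ImapMailReceiver-bean style used in the question, the same flag can be set directly on the receiver. A sketch (the IMAP URL is a placeholder):

```java
// simpleContent(true) makes the transformer work on a self-contained copy of the
// message content, so it no longer needs the (already closed) IMAP folder.
ImapMailReceiver receiver = new ImapMailReceiver("imaps://user:password@imap.example.com/INBOX");
receiver.setSimpleContent(true);
```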
Also, my code snippet:
@Bean
public IntegrationFlow userEmailFlow(EmailProperties props,
EmailToUserTransformer emailToUserTransformer,
UserMessageHandler userMessageHandler) {
return IntegrationFlows
.from(Mail.imapInboundAdapter(props.getImapUrl())
.javaMailProperties(p -> p.put("mail.debug", "false")
.put("mail.imap.socketFactory.class", "javax.net.ssl.SSLSocketFactory")
.put("mail.imap.socketFactory.fallback", "false")
.put("mail.store.protocol", "imaps"))
.shouldMarkMessagesAsRead(false)
.shouldDeleteMessages(false)
.simpleContent(true),
e -> e.poller(Pollers.fixedDelay(props.getPollRate())))
.transform(emailToUserTransformer)
.handle(userMessageHandler)
.get();
}