I have the following endpoint:
@GetMapping(value = "/mypath/{mychoice}")
public ResponseClass generateEndpoint(
        @PathVariable("mychoice") FormatEnum format) {
    ...
}
and the following enum, annotated with Jackson:
@JsonNaming(PropertyNamingStrategies.KebabCaseStrategy.class)
public enum Format {
    AVRO,
    ORC,
    PARQUET,
    PROTOBUF
}
I hoped the @JsonNaming annotation would tell Swagger to display the cases in lowercase, but it doesn't.
Adding @JsonProperty to each case doesn't help either.
It also doesn't accept a lowercase URL, failing with this error:
org.springframework.beans.TypeMismatchException: Failed to convert value of type 'java.lang.String' to required type 'FormatEnum'; nested exception is org.springframework.core.convert.ConversionFailedException: Failed to convert from type [java.lang.String] to type [@org.springframework.web.bind.annotation.RequestParam FormatEnum] for value 'avro'; nested exception is java.lang.IllegalArgumentException: No enum constant FormatEnum.avro
Setting
spring.jackson.mapper.ACCEPT_CASE_INSENSITIVE_ENUMS = true
has no effect (and it is true in code).
It looks like Spring just doesn't use Jackson to deserialize the enum!
Answering the part about accepting the enum case-insensitively:
The reason why setting spring.jackson.mapper.ACCEPT_CASE_INSENSITIVE_ENUMS=true does not work can be found in Konstantin Zyubin's answer.
Inspired by that answer, here is a more generic approach to handling case-insensitive enums in request parameters.
Converter class:
import org.springframework.core.convert.converter.Converter;

public class CaseInsensitiveEnumConverter<T extends Enum<T>> implements Converter<String, T> {

    private final Class<T> enumClass;

    public CaseInsensitiveEnumConverter(Class<T> enumClass) {
        this.enumClass = enumClass;
    }

    @Override
    public T convert(String from) {
        // Enum constants are upper case, so normalize the incoming value first.
        return Enum.valueOf(enumClass, from.toUpperCase());
    }
}
Add it in the configuration:
import com.example.enums.EnumA;
import com.example.enums.EnumB;
import com.example.enums.FormatEnum;
import org.springframework.context.annotation.Configuration;
import org.springframework.format.FormatterRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;
import java.util.List;

@Configuration
public class AppConfig implements WebMvcConfigurer {

    @Override
    public void addFormatters(FormatterRegistry registry) {
        List<Class<? extends Enum>> enums = List.of(EnumA.class, EnumB.class, FormatEnum.class);
        enums.forEach(enumClass -> registry.addConverter(String.class, enumClass,
                new CaseInsensitiveEnumConverter<>(enumClass)));
    }
}
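As a quick sanity check: the converter simply maps any casing onto the upper-case enum constants. A minimal sketch (assuming the enum is named FormatEnum, as in the controller signature from the question):

CaseInsensitiveEnumConverter<FormatEnum> converter =
        new CaseInsensitiveEnumConverter<>(FormatEnum.class);
converter.convert("avro");    // FormatEnum.AVRO
converter.convert("Parquet"); // FormatEnum.PARQUET

With the converter registered, a request to /mypath/avro now binds to FormatEnum.AVRO instead of failing with the TypeMismatchException above.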
Related: Jackson databind enum case insensitive
In my Spring Boot application, I have been using an external commons library for handling exceptions. The library defines an aspect for this, something like the following:
@Aspect
@Order(0)
public class InternalExceptionAspect {

    public InternalExceptionAspect() {
    }

    @Pointcut("@within(org.springframework.stereotype.Service)")
    public void applicationServicePointcut() {
    }

    @AfterThrowing(
            pointcut = "applicationServicePointcut()",
            throwing = "e"
    )
    public void translate(JoinPoint joinPoint, Throwable e) {
        String resourceId = this.getResourceId();
        if (e instanceof BadInputException) {
            BadInputException inputException = (BadInputException) e;
            throw new BadRequestAlertException(inputException.getErrorCode().getDefaultMessage(),
                    inputException.getMessage(), inputException.getErrorCode().getHttpStatusCode(),
                    resourceId, inputException.getErrorCode().getCode());
        } else if (!(e instanceof BadServerStateException) && !(e instanceof InternalException)) {
            String message;
            if (e instanceof JDBCException) {
                ...
                throw new BadServerStateException(message, resourceId, "20");
            } else {
                ...
                throw new BadServerStateException(message, resourceId, "10");
            }
        } else {
            InternalException serverStateException = (InternalException) e;
            throw new BadServerStateException(serverStateException.getErrorCode().getDefaultMessage(),
                    serverStateException.getMessage(), resourceId,
                    serverStateException.getErrorCode().getHttpStatusCode(),
                    serverStateException.getErrorCode().getCode(),
                    serverStateException.getErrorCode().getErrorType().name());
        }
    }

    String getResourceId() {
        RequestHeaders requestHeaders = RequestResponseContext.getRequestHeaders();
        return requestHeaders.getResourceId();
    }
}
Here I would like to introduce another else-if block to handle DuplicateKeyException for my application.
The problem is that the above code, being part of the commons library, is used by multiple other applications, and I would like the change to apply only to my application.
I have been thinking of inheriting from the aspect class inside my application, something like this:
@Aspect
@Order(0)
public class MyInternalExceptionAspect extends InternalExceptionAspect {

    public MyInternalExceptionAspect() {
    }

    @Pointcut("@within(org.springframework.stereotype.Service)")
    public void applicationServicePointcut() {
    }

    @AfterThrowing(
            pointcut = "applicationServicePointcut()",
            throwing = "e"
    )
    public void translate(JoinPoint joinPoint, Throwable e) {
        if (e instanceof DuplicateKeyException) {
            ...
        }
        super.translate(joinPoint, e);
    }
}
But I am not sure if this is the correct approach. Could anyone please advise on the best way to achieve this? Thanks.
You cannot extend a concrete aspect using class MyInternalExceptionAspect extends InternalExceptionAspect. It will cause an exception in Spring AOP:
...AopConfigException:
[...MyInternalExceptionAspect] cannot extend concrete aspect
[...InternalExceptionAspect]
Only abstract aspects are meant to be extended.
But you can simply create a new aspect without inheritance and make sure that it has a lower priority than the original aspect.
Why lower priority?
According to the @Order javadoc, "lower values have higher priority".
You want your own aspect's @AfterThrowing advice to kick in before the original aspect possibly transforms the exception of interest into something else, before you have a chance to handle it. But according to the Spring manual: "The highest precedence advice runs first "on the way in" (so, given two pieces of before advice, the one with highest precedence runs first). "On the way out" from a join point, the highest precedence advice runs last (so, given two pieces of after advice, the one with the highest precedence will run second)."
Therefore, your own aspect should have @Order(1), giving it lower priority than the original aspect, but making its @AfterThrowing advice run before the original one. Sorry for the reverse logic, even though it makes sense. You just need to be aware of it.
Here is an MCVE, simulating your situation in a simplified way:
package de.scrum_master.spring.q69862121;
import org.springframework.stereotype.Service;
@Service
public class MyService {
public void doSomething(Throwable throwable) throws Throwable {
if (throwable != null)
throw throwable;
}
}
package de.scrum_master.spring.q69862121;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.Configuration;
import java.util.Arrays;
import java.util.List;

@SpringBootApplication
@Configuration
public class DemoApplication {
private static final Logger log = LoggerFactory.getLogger(DemoApplication.class.getName());
public static void main(String[] args) throws Throwable {
try (ConfigurableApplicationContext appContext = SpringApplication.run(DemoApplication.class, args)) {
doStuff(appContext);
}
}
private static void doStuff(ConfigurableApplicationContext appContext) {
MyService myService = appContext.getBean(MyService.class);
List<Throwable> throwables = Arrays.asList(
null, // No exception -> no aspect should kick in
new Exception("oops"), // Not covered by any aspects -> no translation
new IllegalArgumentException("uh-oh"), // Original aspect translates to RuntimeException
new NullPointerException("null"), // Custom aspect translates to RuntimeException
new ArithmeticException("argh") // Custom aspect translates to IllegalArgumentException,
// then original aspect translates to RuntimeException
);
for (Throwable originalThrowable : throwables) {
try {
myService.doSomething(originalThrowable);
}
catch (Throwable translatedThrowable) {
log.info(translatedThrowable.toString());
}
}
}
}
As you can see, the application calls the service, first with null, which causes no exception, and then with several types of exceptions that the aspects are meant to either ignore or translate.
package de.scrum_master.spring.q69862121;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.AfterThrowing;
import org.aspectj.lang.annotation.Aspect;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Component;
@Aspect
@Component
@Order(0)
public class InternalExceptionAspect {
private static final Logger log = LoggerFactory.getLogger(InternalExceptionAspect.class.getName());
@AfterThrowing(pointcut = "@within(org.springframework.stereotype.Service)", throwing = "e")
public void translate(JoinPoint joinPoint, Throwable e) {
log.info(joinPoint + " -> " + e);
if (e instanceof IllegalArgumentException)
throw new RuntimeException("Transformed by InternalExceptionAspect", e);
}
}
package de.scrum_master.spring.q69862121;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.AfterThrowing;
import org.aspectj.lang.annotation.Aspect;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Component;
@Aspect
@Component
@Order(1)
public class MyInternalExceptionAspect {
private static final Logger log = LoggerFactory.getLogger(MyInternalExceptionAspect.class.getName());
@AfterThrowing(pointcut = "@within(org.springframework.stereotype.Service)", throwing = "e")
public void translate(JoinPoint joinPoint, Throwable e) {
log.info(joinPoint + " -> " + e);
if (e instanceof NullPointerException)
throw new RuntimeException("Transformed by MyInternalExceptionAspect", e);
if (e instanceof ArithmeticException)
throw new IllegalArgumentException("Transformed by MyInternalExceptionAspect", e);
}
}
The console log proves that everything works as expected, also with regard to invocation order:
d.s.s.q.MyInternalExceptionAspect : execution(void de.scrum_master.spring.q69862121.MyService.doSomething(Throwable)) -> java.lang.Exception: oops
d.s.s.q69862121.InternalExceptionAspect : execution(void de.scrum_master.spring.q69862121.MyService.doSomething(Throwable)) -> java.lang.Exception: oops
d.s.spring.q69862121.DemoApplication : java.lang.Exception: oops
d.s.s.q.MyInternalExceptionAspect : execution(void de.scrum_master.spring.q69862121.MyService.doSomething(Throwable)) -> java.lang.IllegalArgumentException: uh-oh
d.s.s.q69862121.InternalExceptionAspect : execution(void de.scrum_master.spring.q69862121.MyService.doSomething(Throwable)) -> java.lang.IllegalArgumentException: uh-oh
d.s.spring.q69862121.DemoApplication : java.lang.RuntimeException: Transformed by InternalExceptionAspect
d.s.s.q.MyInternalExceptionAspect : execution(void de.scrum_master.spring.q69862121.MyService.doSomething(Throwable)) -> java.lang.NullPointerException: null
d.s.s.q69862121.InternalExceptionAspect : execution(void de.scrum_master.spring.q69862121.MyService.doSomething(Throwable)) -> java.lang.RuntimeException: Transformed by MyInternalExceptionAspect
d.s.spring.q69862121.DemoApplication : java.lang.RuntimeException: Transformed by MyInternalExceptionAspect
d.s.s.q.MyInternalExceptionAspect : execution(void de.scrum_master.spring.q69862121.MyService.doSomething(Throwable)) -> java.lang.ArithmeticException: argh
d.s.s.q69862121.InternalExceptionAspect : execution(void de.scrum_master.spring.q69862121.MyService.doSomething(Throwable)) -> java.lang.IllegalArgumentException: Transformed by MyInternalExceptionAspect
d.s.spring.q69862121.DemoApplication : java.lang.RuntimeException: Transformed by InternalExceptionAspect
I have been facing the exception below on the Kafka consumer side. Surprisingly, the issue is not consistent, and an older version of the code (with the exact same configuration, just without some new unrelated features) works as expected. Could anyone help me determine what could be causing this?
[ERROR][938f3c68-f481-4224-b2c6-43af5fb27ada-0-C-1][org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer] - Error handler threw an exception
org.springframework.kafka.KafkaException: Seek to current after exception; nested exception is org.springframework.kafka.listener.ListenerExecutionFailedException: Listener method could not be invoked with the incoming message
Endpoint handler details:
Method [public void com.mycompany.listener.KafkaBatchListener.onMessage(java.lang.Object,org.springframework.kafka.support.Acknowledgment)]
Bean [com.mycompany.listener.KafkaBatchListener@7a59780b]; nested exception is org.springframework.messaging.handler.invocation.MethodArgumentResolutionException: Could not resolve method parameter at index 0 in public void com.mycompany.listener.KafkaBatchListener.onMessage(java.util.List<org.apache.kafka.clients.consumer.ConsumerRecord<K, V>>,org.springframework.kafka.support.Acknowledgment): Could not resolve parameter [0] in public void com.mycompany.listener.KafkaBatchListener.onMessage(java.util.List<org.apache.kafka.clients.consumer.ConsumerRecord<K, V>>,org.springframework.kafka.support.Acknowledgment): No suitable resolver, failedMessage=GenericMessage [payload=[[B@21bc784f, MyPOJO(), [B@33bb5851], headers={kafka_offset=[4046, 4047, 4048], kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@4871203f, kafka_timestampType=[CREATE_TIME, CREATE_TIME, CREATE_TIME], kafka_receivedPartitionId=[0, 0, 0], kafka_receivedMessageKey=[[B@295620f1, MyPOJOKey(id=0), [B@5d3d6361], kafka_batchConvertedHeaders=[{myFirstHeader=[B@1f011689, myUUIDHeader=[B@7691bce8, myMetadataHeader=[B@6e585b63, myRequestIdHeader=[B@58c81ba2, myMetricsHeader=[B@4f6aeb6c, myTargetHeader=[B@34677895}, {myUUIDHeader=[B@1848ae39, myMetadataHeader=[B@c5b399, myRequestIdHeader=[B@186c1966, myMetricsHeader=[B@1740692e, myTargetHeader=[B@4a242499}, {myUUIDHeader=[B@67d01f3f, myMetadataHeader=[B@1f0f9d8a, myRequestIdHeader=[B@b928e5c, isLastMessage=[B@6079735b, myMetricsHeader=[B@7b7b18c, myTargetHeader=[B@64378f3d}], kafka_receivedTopic=[my_topic, my_topic, my_topic], kafka_receivedTimestamp=[1623420136620, 1623420137255, 1623420137576], kafka_acknowledgment=Acknowledgment for org.apache.kafka.clients.consumer.ConsumerRecords@7bc81d89, kafka_groupId=dev-consumer-grp}]
at org.springframework.kafka.listener.SeekToCurrentBatchErrorHandler.handle(SeekToCurrentBatchErrorHandler.java:77) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.RecoveringBatchErrorHandler.handle(RecoveringBatchErrorHandler.java:124) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.ContainerAwareBatchErrorHandler.handle(ContainerAwareBatchErrorHandler.java:56) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeBatchErrorHandler(KafkaMessageListenerContainer.java:2010) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeBatchListener(KafkaMessageListenerContainer.java:1854) [spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeBatchListener(KafkaMessageListenerContainer.java:1720) [spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:1699) [spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeIfHaveRecords(KafkaMessageListenerContainer.java:1272) [spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1264) [spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1161) [spring-kafka-2.7.1.jar:2.7.1]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.lang.Thread.run(Thread.java:834) [?:?]
Caused by: org.springframework.kafka.listener.ListenerExecutionFailedException: Listener method could not be invoked with the incoming message
Endpoint handler details:
Method [public void com.mycompany.listener.KafkaBatchListener.onMessage(java.lang.Object,org.springframework.kafka.support.Acknowledgment)]
Bean [com.mycompany.listener.KafkaBatchListener@7a59780b]; nested exception is org.springframework.messaging.handler.invocation.MethodArgumentResolutionException: Could not resolve method parameter at index 0 in public void com.mycompany.listener.KafkaBatchListener.onMessage(java.util.List<org.apache.kafka.clients.consumer.ConsumerRecord<K, V>>,org.springframework.kafka.support.Acknowledgment): Could not resolve parameter [0] in public void com.mycompany.listener.KafkaBatchListener.onMessage(java.util.List<org.apache.kafka.clients.consumer.ConsumerRecord<K, V>>,org.springframework.kafka.support.Acknowledgment): No suitable resolver, failedMessage=GenericMessage [payload=[[B@21bc784f, MyPOJO(), [B@33bb5851], headers={kafka_offset=[4046, 4047, 4048], kafka_consumer=org.apache.kafka.clients.consumer.KafkaConsumer@4871203f, kafka_timestampType=[CREATE_TIME, CREATE_TIME, CREATE_TIME], kafka_receivedPartitionId=[0, 0, 0], kafka_receivedMessageKey=[[B@295620f1, MyPOJOKey(id=0), [B@5d3d6361], kafka_batchConvertedHeaders=[{myFirstHeader=[B@1f011689, myUUIDHeader=[B@7691bce8, myMetadataHeader=[B@6e585b63, myRequestIdHeader=[B@58c81ba2, myMetricsHeader=[B@4f6aeb6c, myTargetHeader=[B@34677895}, {myUUIDHeader=[B@1848ae39, myMetadataHeader=[B@c5b399, myRequestIdHeader=[B@186c1966, myMetricsHeader=[B@1740692e, myTargetHeader=[B@4a242499}, {myUUIDHeader=[B@67d01f3f, myMetadataHeader=[B@1f0f9d8a, myRequestIdHeader=[B@b928e5c, isLastMessage=[B@6079735b, myMetricsHeader=[B@7b7b18c, myTargetHeader=[B@64378f3d}], kafka_receivedTopic=[my_topic, my_topic, my_topic], kafka_receivedTimestamp=[1623420136620, 1623420137255, 1623420137576], kafka_acknowledgment=Acknowledgment for org.apache.kafka.clients.consumer.ConsumerRecords@7bc81d89, kafka_groupId=dev-consumer-grp}]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.decorateException(KafkaMessageListenerContainer.java:2367) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeBatchOnMessage(KafkaMessageListenerContainer.java:2003) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeBatchOnMessageWithRecordsOrList(KafkaMessageListenerContainer.java:1973) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeBatchOnMessage(KafkaMessageListenerContainer.java:1925) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeBatchListener(KafkaMessageListenerContainer.java:1837) ~[spring-kafka-2.7.1.jar:2.7.1]
... 8 more
Caused by: org.springframework.messaging.handler.invocation.MethodArgumentResolutionException: Could not resolve method parameter at index 0 in public void com.mycompany.listener.KafkaBatchListener.onMessage(java.util.List<org.apache.kafka.clients.consumer.ConsumerRecord<K, V>>,org.springframework.kafka.support.Acknowledgment): Could not resolve parameter [0] in public void com.mycompany.listener.KafkaBatchListener.onMessage(java.util.List<org.apache.kafka.clients.consumer.ConsumerRecord<K, V>>,org.springframework.kafka.support.Acknowledgment): No suitable resolver
at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.getMethodArgumentValues(InvocableHandlerMethod.java:145) ~[spring-messaging-5.2.12.RELEASE.jar:5.2.12.RELEASE]
at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:116) ~[spring-messaging-5.2.12.RELEASE.jar:5.2.12.RELEASE]
at org.springframework.kafka.listener.adapter.HandlerAdapter.invoke(HandlerAdapter.java:56) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:339) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.adapter.BatchMessagingMessageListenerAdapter.invoke(BatchMessagingMessageListenerAdapter.java:180) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.adapter.BatchMessagingMessageListenerAdapter.onMessage(BatchMessagingMessageListenerAdapter.java:172) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.adapter.BatchMessagingMessageListenerAdapter.onMessage(BatchMessagingMessageListenerAdapter.java:61) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeBatchOnMessage(KafkaMessageListenerContainer.java:1983) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeBatchOnMessageWithRecordsOrList(KafkaMessageListenerContainer.java:1973) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeBatchOnMessage(KafkaMessageListenerContainer.java:1925) ~[spring-kafka-2.7.1.jar:2.7.1]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeBatchListener(KafkaMessageListenerContainer.java:1837) ~[spring-kafka-2.7.1.jar:2.7.1]
... 8 more
My app uses the following:
A custom listener class com.mycompany.listener.KafkaBatchListener<K, V>, which implements org.springframework.kafka.listener.BatchAcknowledgingMessageListener<K, V> and overrides onMessage(List<ConsumerRecord<K, V>> consumerRecords, Acknowledgment acknowledgment) with a custom marker annotation @MyKafkaListener
A custom container factory, which extends org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory<K, V> and configures setConsumerFactory(consumerFactory), setBatchErrorHandler(errorHandler), setBatchListener(true) and ContainerProperties.setOnlyLogRecordMetadata(true).
A Spring Boot @Configuration class, which implements org.springframework.kafka.annotation.KafkaListenerConfigurer and is responsible for configuring org.springframework.kafka.core.DefaultKafkaConsumerFactory<K, V>, org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory<K, V> and org.springframework.kafka.config.MethodKafkaListenerEndpoint<String, String> (used by @MyKafkaListener)
Spring Kafka 2.7.1
Additional query:
Even though ContainerProperties.setOnlyLogRecordMetadata(true) is set, the exception stack trace still contains the full payload, which I have omitted. Any idea why?
Thanks in advance!
UPDATE:
KafkaBatchListener
package com.mycompany.listener;
import java.util.List;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.listener.BatchAcknowledgingMessageListener;
import org.springframework.kafka.support.Acknowledgment;
public class KafkaBatchListener<K, V> implements BatchAcknowledgingMessageListener<K, V> {
@Override
@com.mycompany.listener.KafkaListener
public void onMessage(final List<ConsumerRecord<K, V>> consumerRecords, final Acknowledgment acknowledgment) {
// process batch using MyService<K, V>.process(consumerRecords)
acknowledgment.acknowledge();
}
}
Custom Annotation
package com.mycompany.listener;
import static java.lang.annotation.ElementType.METHOD;
import static java.lang.annotation.RetentionPolicy.RUNTIME;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;
@Retention(RUNTIME)
@Target(METHOD)
public @interface KafkaListener {
}
Listener Container Factory
package com.mycompany.factory;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ContainerProperties;
import com.mycompany.errorhandler.ListenerContainerRecoveringBatchErrorHandler;
public class KafkaBatchListenerContainerFactory<K, V>
extends ConcurrentKafkaListenerContainerFactory<K, V> {
public KafkaBatchListenerContainerFactory(final DefaultKafkaConsumerFactory<K, V> consumerFactory,
final ListenerContainerRecoveringBatchErrorHandler errorHandler, final int concurrency) {
super.setConsumerFactory(consumerFactory);
super.setBatchErrorHandler(errorHandler);
super.setConcurrency(concurrency);
super.setBatchListener(true);
super.setAutoStartup(true);
final ContainerProperties containerProperties = super.getContainerProperties();
containerProperties.setAckMode(ContainerProperties.AckMode.MANUAL);
containerProperties.setOnlyLogRecordMetadata(true);
}
}
Batch Error Handler
package com.mycompany.errorhandler;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.listener.RecoveringBatchErrorHandler;
import org.springframework.stereotype.Component;
import org.springframework.util.backoff.FixedBackOff;
@Component
public class ListenerContainerRecoveringBatchErrorHandler extends RecoveringBatchErrorHandler {
public ListenerContainerRecoveringBatchErrorHandler(
#Value("${spring.kafka.consumer.properties.backOffMS:0}") final int backOffTimeMS,
#Value("${spring.kafka.consumer.properties.retries:3}") final int retries) {
super(new FixedBackOff(backOffTimeMS, retries));
}
}
Kafka Listener Configurer
package com.mycompany.config;
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.UUID;
import org.springframework.beans.factory.BeanFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.KafkaListenerConfigurer;
import org.springframework.kafka.config.KafkaListenerEndpointRegistrar;
import org.springframework.kafka.config.MethodKafkaListenerEndpoint;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.messaging.handler.annotation.support.DefaultMessageHandlerMethodFactory;
import com.mycompany.errorhandler.ListenerContainerRecoveringBatchErrorHandler;
import com.mycompany.factory.KafkaBatchListenerContainerFactory;
import com.mycompany.listener.KafkaBatchListener;
@Configuration
public class KafkaBatchListenerConfigurer<K, V> implements KafkaListenerConfigurer {
private final List<KafkaBatchListener<K, V>> listeners;
private final BeanFactory beanFactory;
private final ListenerContainerRecoveringBatchErrorHandler errorHandler;
private final int concurrency;
@Autowired
public KafkaBatchListenerConfigurer(final List<KafkaBatchListener<K, V>> listeners, final BeanFactory beanFactory,
final ListenerContainerRecoveringBatchErrorHandler errorHandler,
#Value("${spring.kafka.listener.concurrency:1}") final int concurrency) {
this.listeners = listeners;
this.beanFactory = beanFactory;
this.errorHandler = errorHandler;
this.concurrency = concurrency;
}
@Override
public void configureKafkaListeners(final KafkaListenerEndpointRegistrar registrar) {
final Method listenerMethod = lookUpBatchListenerMethod();
listeners.forEach(listener -> {
registerListenerEndpoint(listener, listenerMethod, registrar);
});
}
private void registerListenerEndpoint(final KafkaBatchListener<K, V> listener, final Method listenerMethod,
final KafkaListenerEndpointRegistrar registrar) {
// final Map<String, Object> consumerConfig = get ConsumerConfig from a custom provider;
registrar.setContainerFactory(createContainerFactory(consumerConfig));
registrar.registerEndpoint(createListenerEndpoint(listener, listenerMethod, consumerConfig));
}
private KafkaBatchListenerContainerFactory<K, V> createContainerFactory(final Map<String, Object> consumerConfig) {
final DefaultKafkaConsumerFactory<K, V> consumerFactory = new DefaultKafkaConsumerFactory<>(consumerConfig);
final KafkaBatchListenerContainerFactory<K, V> containerFactory = new KafkaBatchListenerContainerFactory<>(
consumerFactory, errorHandler, concurrency);
return containerFactory;
}
private MethodKafkaListenerEndpoint<String, String> createListenerEndpoint(final KafkaBatchListener<K, V> listener,
final Method listenerMethod, final Map<String, Object> consumerConfig) {
final MethodKafkaListenerEndpoint<String, String> endpoint = new MethodKafkaListenerEndpoint<>();
endpoint.setId(UUID.randomUUID().toString());
endpoint.setBean(listener);
endpoint.setMethod(listenerMethod);
endpoint.setBeanFactory(beanFactory);
endpoint.setGroupId("my-group-id");
endpoint.setMessageHandlerMethodFactory(new DefaultMessageHandlerMethodFactory());
// final String topicName = get TopicName for this key-value from a custom utility;
endpoint.setTopics(topicName);
final Properties properties = new Properties();
properties.putAll(consumerConfig);
endpoint.setConsumerProperties(properties);
return endpoint;
}
private Method lookUpBatchListenerMethod() {
return Arrays.stream(com.mycompany.listener.KafkaBatchListener.class.getMethods())
.filter(m -> m.isAnnotationPresent(com.mycompany.listener.KafkaListener.class))
.findAny()
.orElseThrow(() -> new IllegalStateException(
String.format("[%s] class should have at least 1 method with [%s] annotation.",
com.mycompany.listener.KafkaBatchListener.class.getCanonicalName(),
com.mycompany.listener.KafkaListener.class.getCanonicalName())));
}
}
You don't need all the standard @KafkaListener method-invoking infrastructure when your listener already implements one of the message listener interfaces; instead of registering endpoints for each listener, just create a container for each one from the factory and add the listener to its container properties.
var container = containerFactory.createContainer("topic1");
container.getContainerProperties().set...
...
container.getContainerProperties().setMessageListener(myListenerInstance);
...
container.start();
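Applied to the configurer from the question, registerListenerEndpoint could then shrink to something like this sketch (assuming the factory and listener beans from the question are in scope; "my_topic" stands in for the topic name resolved by the custom utility):

private void startContainer(final KafkaBatchListener<K, V> listener,
        final Map<String, Object> consumerConfig) {
    // Build the factory exactly as before, but skip the endpoint registration.
    final KafkaBatchListenerContainerFactory<K, V> factory = createContainerFactory(consumerConfig);
    final ConcurrentMessageListenerContainer<K, V> container = factory.createContainer("my_topic");
    container.getContainerProperties().setGroupId("my-group-id");
    // The listener already implements BatchAcknowledgingMessageListener, so the
    // container invokes onMessage directly and no argument resolution is involved.
    container.getContainerProperties().setMessageListener(listener);
    container.start();
}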
I have a requirement to unzip a file and process its contents. Inside the zip file there can be two types of files, individual or firm, distinguishable by file name. After processing all files, it should call another program module and also archive the processed files in a different location. I would like to use Spring Integration for this. I am trying to achieve it with the following code, but it causes a problem while routing based on file name. I am using JDK 8 and Spring 5.
.<File, Boolean>route(new Function<File, Boolean>() {
@Override
public Boolean apply(File f) {
return f.getName().contains("individual");
}
}, m -> m
.subFlowMapping(true, sf -> sf.gateway(individualProcessor()))
.subFlowMapping(false, sf -> sf.gateway(firmProcessor()))
)
Exception
Caused by: java.lang.IllegalArgumentException: Found ambiguous parameter type [interface java.util.function.Function] for method match: [public default <V> java.util.function.Function<V, R> java.util.function.Function.compose(java.util.function.Function<? super V, ? extends T>), public static <T> java.util.function.Function<T, T> java.util.function.Function.identity(), public java.lang.Boolean com.xxx.thirdpatysystem.config.IntegrationConfig$1.apply(java.io.File)]
at org.springframework.util.Assert.isNull(Assert.java:155) ~[spring-core-5.0.4.RELEASE.jar:5.0.4.RELEASE]
at org.springframework.integration.util.MessagingMethodInvokerHelper.findHandlerMethodsForTarget(MessagingMethodInvokerHelper.java:843) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.util.MessagingMethodInvokerHelper.<init>(MessagingMethodInvokerHelper.java:362) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.util.MessagingMethodInvokerHelper.<init>(MessagingMethodInvokerHelper.java:231) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.util.MessagingMethodInvokerHelper.<init>(MessagingMethodInvokerHelper.java:225) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.handler.MethodInvokingMessageProcessor.<init>(MethodInvokingMessageProcessor.java:60) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.router.MethodInvokingRouter.<init>(MethodInvokingRouter.java:46) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.dsl.IntegrationFlowDefinition.route(IntegrationFlowDefinition.java:1922) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.dsl.IntegrationFlowDefinition.route(IntegrationFlowDefinition.java:1895) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
I have also tried the following:
.<File, Boolean>route(f -> f.getName().contains("individual"), m -> m
.subFlowMapping(true, sf -> sf.gateway(individualProcessor()))
.subFlowMapping(false, sf -> sf.gateway(firmProcessor()))
)
Exception
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.integration.dsl.IntegrationFlow]: Factory method 'thirdpatysystemFlow' threw exception; nested exception is java.lang.UnsupportedOperationException
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:185) ~[spring-beans-5.0.4.RELEASE.jar:5.0.4.RELEASE]
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:579) ~[spring-beans-5.0.4.RELEASE.jar:5.0.4.RELEASE]
... 17 common frames omitted
Caused by: java.lang.UnsupportedOperationException: null
at org.springframework.integration.dsl.StandardIntegrationFlow.configure(StandardIntegrationFlow.java:89) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.dsl.IntegrationFlowDefinition.gateway(IntegrationFlowDefinition.java:2172) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
at org.springframework.integration.dsl.IntegrationFlowDefinition.gateway(IntegrationFlowDefinition.java:2151) ~[spring-integration-core-5.0.3.RELEASE.jar:5.0.3.RELEASE]
The entire code snippet is below:
import java.io.File;
import java.util.concurrent.TimeUnit;
import java.util.function.Function;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.TaskExecutor;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.core.MessageSource;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.dsl.channel.MessageChannels;
import org.springframework.integration.file.FileReadingMessageSource;
import org.springframework.integration.file.FileWritingMessageHandler;
import org.springframework.integration.file.filters.AcceptOnceFileListFilter;
import org.springframework.integration.file.filters.ChainFileListFilter;
import org.springframework.integration.file.filters.RegexPatternFileListFilter;
import org.springframework.integration.zip.splitter.UnZipResultSplitter;
import org.springframework.integration.zip.transformer.UnZipTransformer;
import org.springframework.integration.zip.transformer.ZipResultType;
import org.springframework.messaging.MessageHandler;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;
/**
* @author dpoddar
*
*/
@Configuration
@EnableIntegration
public class IntegrationConfig {
@Value("${input.directory}")
private String inputDir;
@Value("${outputDir.directory}")
private String outputDir;
@Value("${input.scan.frequency: 100}")
private long scanFrequency;
@Bean
public MessageSource<File> inputFileSource() {
FileReadingMessageSource src = new FileReadingMessageSource();
src.setDirectory(new File(inputDir));
src.setAutoCreateDirectory(true);
ChainFileListFilter<File> chainFileListFilter = new ChainFileListFilter<>();
chainFileListFilter.addFilter(new AcceptOnceFileListFilter<>() );
chainFileListFilter.addFilter(new RegexPatternFileListFilter("(?i)^.+\\.zip$"));
src.setFilter(chainFileListFilter);
return src;
}
@Bean
public UnZipTransformer unZipTransformer() {
UnZipTransformer unZipTransformer = new UnZipTransformer();
unZipTransformer.setExpectSingleResult(false);
unZipTransformer.setZipResultType(ZipResultType.FILE);
//unZipTransformer.setWorkDirectory(new File("/usr/tmp/uncompress"));
unZipTransformer.setDeleteFiles(true);
return unZipTransformer;
}
@Bean
public UnZipResultSplitter splitter() {
UnZipResultSplitter splitter = new UnZipResultSplitter();
return splitter;
}
@Bean
public DirectChannel outputChannel() {
return new DirectChannel();
}
@Bean
public MessageHandler fileOutboundChannelAdapter() {
FileWritingMessageHandler adapter = new FileWritingMessageHandler(new File(outputDir));
adapter.setDeleteSourceFiles(true);
adapter.setAutoCreateDirectory(true);
adapter.setExpectReply(false);
return adapter;
}
@Bean
public TaskExecutor taskExecutor() {
ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
executor.setCorePoolSize(5);
executor.setMaxPoolSize(10);
executor.setQueueCapacity(25);
return executor;
}
@Autowired
DirectChannel outputChannel;
@Autowired
MessageHandler fileOutboundChannelAdapter;
@Bean
public IntegrationFlow individualProcessor() {
return flow -> flow.handle("thirdpatysystemprocessor","processfile").channel(outputChannel).handle(fileOutboundChannelAdapter);
}
@Bean
public IntegrationFlow firmProcessor() {
return flow -> flow.handle("thirdpatysystemprocessor","processfile").channel(outputChannel).handle(fileOutboundChannelAdapter);
}
@Bean
public IntegrationFlow thirdpatysystemAgentDemographicFlow() {
return IntegrationFlows
.from(inputFileSource(), spec -> spec.poller(Pollers.fixedDelay(scanFrequency,TimeUnit.SECONDS)))
.transform(unZipTransformer())
.split(splitter())
.channel(MessageChannels.executor(taskExecutor()))
.<File, Boolean>route(new Function<File, Boolean>() {
@Override
public Boolean apply(File f) {
return f.getName().contains("individual");
}
}, m -> m
.subFlowMapping(true, sf -> sf.gateway(individualProcessor()))
.subFlowMapping(false, sf -> sf.gateway(firmProcessor()))
)
.aggregate()
/*.handle("thirdpatysystemprocessor","processfile")
.channel(outputChannel())
.handle(fileOutboundChannelAdapter())*/
.get()
;
}
}
The java.lang.IllegalArgumentException: Found ambiguous parameter type [interface java.util.function.Function] has been fixed in Spring Integration 5.0.5: https://jira.spring.io/browse/INT-4456. So now, with an explicit Function implementation, we do this:
MethodInvokingRouter methodInvokingRouter = isLambda(router)
? new MethodInvokingRouter(new LambdaMessageProcessor(router, payloadType))
: new MethodInvokingRouter(router, ClassUtils.FUNCTION_APPLY_METHOD);
We explicitly point to the apply() method.
The re-use of existing IntegrationFlow beans in sub-flows (gateway()) has been fixed in version 5.0.4: https://jira.spring.io/browse/INT-4434
So all you need to do is upgrade your project to the latest dependencies, in particular Spring Integration 5.0.7: https://spring.io/projects/spring-integration#learn
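With those versions in place, the concise lambda variant from the question should work as-is, for example:

.<File, Boolean>route(f -> f.getName().contains("individual"), m -> m
        .subFlowMapping(true, sf -> sf.gateway(individualProcessor()))
        .subFlowMapping(false, sf -> sf.gateway(firmProcessor()))
)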
I am getting the following error in my WebLogic console when I start my server.
SEVERE: Missing dependency for constructor
public com.test.mine.exception.JsonExceptionMapper(java.lang.String,com.fasterxml.jackson.core.JsonLocation) at parameter index 0
SEVERE: Missing dependency for constructor public com.test.mine.exception.JsonExceptionMapper(java.lang.String,com.fasterxml.jackson.core.JsonLocation) at parameter index 1
Below is my Java code.
package com.test.mine.exception;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.Response.Status;
import javax.ws.rs.ext.ExceptionMapper;
import javax.ws.rs.ext.Provider;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Service;
import com.fasterxml.jackson.core.JsonLocation;
import com.fasterxml.jackson.core.JsonParseException;
@Provider
@Service
public class JsonExceptionMapper extends JsonParseException implements ExceptionMapper {
public JsonExceptionMapper(String msg, JsonLocation loc) {
super(msg, loc);
// TODO Auto-generated constructor stub
}
private static final Logger LOGGER = LoggerFactory.getLogger(JsonExceptionMapper.class);
protected Logger getLogger() {
return LOGGER;
}
public Status getStatus(JsonParseException thr) {
return Status.BAD_REQUEST;
}
@Override
public Response toResponse(Throwable arg0) {
// TODO Auto-generated method stub
return Response.status(Status.BAD_REQUEST).type(MediaType.APPLICATION_JSON_TYPE).build();
}
}
The @Service annotation tells Spring to create a singleton of the annotated class. At startup, Spring tries to create that instance and to provide the required constructor arguments String msg, JsonLocation loc, which it does not find, hence the exception.
JsonExceptionMapper does not look like a service, and it should not be a singleton. Instead, it must be created whenever an exception is created.
I have never worked with that class, so sorry, I cannot give you any advice on how to do that.
I bumped into a similar problem while configuring Swagger to work with Jersey. After searching various forums, I found that Jersey scanning requires a constructor without parameters. I added one and it worked for me.
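For illustration, a minimal sketch of how such a mapper could look (this is an assumption, not the exact code from the question: a no-arg constructor, and the class implements ExceptionMapper<JsonParseException> instead of extending the exception type):

@Provider
public class JsonExceptionMapper implements ExceptionMapper<JsonParseException> {

    // No-arg constructor so Jersey scanning (and Spring) can instantiate the provider.
    public JsonExceptionMapper() {
    }

    @Override
    public Response toResponse(JsonParseException exception) {
        return Response.status(Status.BAD_REQUEST)
                .type(MediaType.APPLICATION_JSON_TYPE)
                .build();
    }
}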