spring-data-redis Jackson serialization

I'm attempting to use the Jackson serialization feature of spring-data-redis. I am building an ObjectMapper and using GenericJackson2JsonRedisSerializer as the serializer for the redisTemplate:
@Configuration
public class SampleModule {

    @Bean
    public ObjectMapper objectMapper() {
        return Jackson2ObjectMapperBuilder.json()
                .serializationInclusion(JsonInclude.Include.NON_NULL) // don't include null values
                .featuresToDisable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS) // write dates as ISO-8601
                .build();
    }

    @Bean
    public RedisTemplate getRedisTemplate(ObjectMapper objectMapper, RedisConnectionFactory redisConnectionFactory) {
        RedisTemplate redisTemplate = new RedisTemplate();
        redisTemplate.setDefaultSerializer(new GenericJackson2JsonRedisSerializer(objectMapper));
        redisTemplate.setConnectionFactory(redisConnectionFactory);
        return redisTemplate;
    }
}
I have a SampleBean I am attempting to save:
@RedisHash("sampleBean")
public class SampleBean {

    @Id
    String id;
    String value;
    Date date;

    public SampleBean(String value, Date date) {
        this.value = value;
        this.date = date;
    }
}
And a repository for that bean:
public interface SampleBeanRepository extends CrudRepository<SampleBean, String> {
}
I am then trying to write the bean to Redis:
ConfigurableApplicationContext context = SpringApplication.run(SampleRedisApplication.class, args);
SampleBean helloSampleBean = new SampleBean("hello", new Date());
ObjectMapper objectMapper = context.getBean(ObjectMapper.class);
logger.info("Expecting date to be written as: " + objectMapper.writeValueAsString(helloSampleBean.date));
SampleBeanRepository repository = context.getBean(SampleBeanRepository.class);
repository.save(helloSampleBean);
context.close();
I expect the redisTemplate to use the serializer to write the Date inside the SampleBean as an ISO-8601 date string, however it is written as a long.
The relevant spring-data-redis reference: http://docs.spring.io/spring-data/data-redis/docs/current/reference/html/#redis:serializer
Full code sample: https://github.com/bandyguy/spring-redis-jackson-sample-broken

The serializer/mapper used by the template does not affect the one used by the repository, since the repository operates directly on the byte[], using Converter implementations to read and write data based on the domain type metadata.
Please refer to the Object-to-Hash Mapping section of the reference manual for guidance on how to write and register a custom Converter; a sketch follows.
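A minimal sketch of such a converter pair, assuming Spring Data Redis 2.x (where a RedisCustomConversions bean is picked up by the repository support) and that an ISO-8601 string is the desired hash representation; class and bean names are illustrative, not from the question:

import java.time.Instant;
import java.time.format.DateTimeFormatter;
import java.util.Arrays;
import java.util.Date;
import org.springframework.context.annotation.Bean;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.ReadingConverter;
import org.springframework.data.convert.WritingConverter;
import org.springframework.data.redis.core.convert.RedisCustomConversions;

// Writing side: Date -> ISO-8601 String stored in the hash.
@WritingConverter
class DateToStringConverter implements Converter<Date, String> {
    public String convert(Date source) {
        return DateTimeFormatter.ISO_INSTANT.format(source.toInstant());
    }
}

// Reading side: ISO-8601 String from the hash -> Date.
@ReadingConverter
class StringToDateConverter implements Converter<String, Date> {
    public Date convert(String source) {
        return Date.from(Instant.parse(source));
    }
}

// Register both converters with the repository mapping layer.
@Bean
public RedisCustomConversions redisCustomConversions() {
    return new RedisCustomConversions(
            Arrays.asList(new DateToStringConverter(), new StringToDateConverter()));
}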

Have you tried disabling the serialization feature SerializationFeature.WRITE_DATES_AS_TIMESTAMPS?

Related

SpEL KafkaListener. How can I inject a custom deserializer through properties?

I am using Spring.
I have an ObjectMapper configured for the entire project, and I use it to set up a Kafka deserializer.
I then need that custom Kafka deserializer to be used in a KafkaListener.
I'm configuring the KafkaListener via autoconfiguration, not via a @Configuration class.
@Component
@RequiredArgsConstructor
public class CustomMessageDeserializer implements Deserializer<MyMessage> {

    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public MyMessage deserialize(String topic, byte[] data) {
        return objectMapper.readValue(data, MyMessage.class);
    }
}
If I do this:
@KafkaListener(
        topics = {"${topics.invite-user-topic}"},
        properties = {"value.deserializer=com.service.deserializer.CustomMessageDeserializer"}
)
public void receiveInviteUserMessages(MyMessage myMessage) {}
I receive a KafkaException: Could not find a public no-argument constructor.
But with a public no-argument constructor in the CustomMessageDeserializer class, I get an NPE because the ObjectMapper is null: Kafka creates and uses a new instance, not the Spring component.
@KafkaListener supports SpEL expressions, and I think this problem can be solved using SpEL.
Do you have any idea how to inject the Spring bean CustomMessageDeserializer with SpEL?
There are no easy ways to do it with SpEL.
Analysis
To get started, see the JavaDoc for @KafkaListener#properties:
/**
 *
 * SpEL expressions must resolve to a String ...
 */
The value of value.deserializer is used to instantiate the specified deserializer class. Let's follow the call chain:
1. You specify this value in the @KafkaListener annotation, and you are probably not creating a ConsumerFactory bean yourself, so Spring creates that bean itself - see KafkaAutoConfiguration#kafkaConsumerFactory.
2. Next is the creation of the returned object new DefaultKafkaConsumerFactory(...) as ConsumerFactory<?,?>, using the constructor whose default deserializer suppliers are keyDeserializer/valueDeserializer = () -> null.
3. This factory is used to create the Kafka consumer (the entry point is the constructor KafkaMessageListenerContainer#ListenerConsumer, then KafkaMessageListenerContainer.this.consumerFactory.createConsumer...).
4. In the KafkaConsumer constructor, the valueDeserializer object is created, because it is null (for the default factory of point 2 above):
if (valueDeserializer == null) {
    this.valueDeserializer = config.getConfiguredInstance(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, Deserializer.class);
}
The implementation of config.getConfiguredInstance instantiates your deserializer class via its parameterless constructor, using reflection and your String "com.service.deserializer.CustomMessageDeserializer" class name.
Solutions
To use value.deserializer with your customized ObjectMapper, you must create the ConsumerFactory bean yourself and call its setValueDeserializer(...) method. This is also mentioned in the second Important callout of the JSON Mapping Types documentation.
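A minimal sketch of that first solution, assuming a Spring Boot setup where KafkaProperties is available and keys are plain Strings; the wiring is illustrative:

import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.boot.autoconfigure.kafka.KafkaProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

// Defining the ConsumerFactory yourself means the Spring-managed
// CustomMessageDeserializer (with its injected ObjectMapper) is used
// instead of a reflectively created instance.
@Bean
public ConsumerFactory<String, MyMessage> consumerFactory(KafkaProperties properties,
        CustomMessageDeserializer valueDeserializer) {
    DefaultKafkaConsumerFactory<String, MyMessage> factory =
            new DefaultKafkaConsumerFactory<>(properties.buildConsumerProperties()); // Boot's spring.kafka.consumer.* settings
    factory.setValueDeserializer(valueDeserializer); // the Spring bean, not a new instance
    return factory;
}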
If you don't want to create a ConsumerFactory bean, and you also don't have complicated logic in your deserializer (you only have return objectMapper.readValue(data, MyMessage.class);), then register a DefaultKafkaConsumerFactoryCustomizer:
@Bean
// inject your custom ObjectMapper
public DefaultKafkaConsumerFactoryCustomizer customizeJsonDeserializer(ObjectMapper objectMapper) {
    return consumerFactory ->
            consumerFactory.setValueDeserializerSupplier(() ->
                    new org.springframework.kafka.support.serializer.JsonDeserializer<>(objectMapper));
}
In this case, you don't need your own CustomMessageDeserializer class (remove it), and Spring will automatically parse the message into your MyMessage.
The @KafkaListener annotation should then no longer contain the property properties = {"value.deserializer=com.my.kafka_test.component.CustomMessageDeserializer"}. The DefaultKafkaConsumerFactoryCustomizer bean will automatically be used to configure the default ConsumerFactory<?, ?> (see the implementation of the KafkaAutoConfiguration#kafkaConsumerFactory method); a sketch of the resulting listener follows.
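For completeness, a hypothetical listener after that cleanup (the topic property name is taken from the question):

import org.springframework.kafka.annotation.KafkaListener;

// No properties attribute needed: the JsonDeserializer configured by the
// customizer above handles the payload conversion.
@KafkaListener(topics = {"${topics.invite-user-topic}"})
public void receiveInviteUserMessages(MyMessage myMessage) {
    // myMessage was deserialized with the injected ObjectMapper
}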
Here is how it works for me:
@KafkaListener(topics = "${solr.kafka.topic}", containerFactory = "batchFactory")
public void listen(List<SolrInputDocument> docs,
        @Header(KafkaHeaders.BATCH_CONVERTED_HEADERS) List<Map<String, Object>> headers,
        Acknowledgment ack) throws IOException {...}
And then I have two beans defined in my configuration:
@Profile("!test")
@Bean
@Autowired
public ConsumerFactory<String, SolrInputDocument> consumerFactory(KafkaProperties properties) {
    Map<String, Object> props = properties.buildConsumerProperties();
    props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
    DefaultKafkaConsumerFactory<String, SolrInputDocument> result = new DefaultKafkaConsumerFactory<>(props);
    String validatedKeyDeserializerName = KafkaMessageType.valueOf(keyDeserializerName).toString();
    ZiDeserializer<SolrInputDocument> deserializer = ZiDeserializerFactory.getInstance(validatedKeyDeserializerName);
    result.setValueDeserializer(deserializer);
    return result;
}
@Profile("!test")
@Bean
@Autowired
public ConcurrentKafkaListenerContainerFactory<String, SolrInputDocument> batchFactory(ConsumerFactory<String, SolrInputDocument> consumerFactory) {
    ConcurrentKafkaListenerContainerFactory<String, SolrInputDocument> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    factory.setBatchListener(true);
    factory.setConcurrency(2);
    ExponentialBackOffWithMaxRetries backoff = new ExponentialBackOffWithMaxRetries(10);
    backoff.setMultiplier(3); // default is 1.5 but this seems more reasonable
    factory.setCommonErrorHandler(new DefaultErrorHandler(null, backoff));
    // needed for manual commits
    factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
    return factory;
}
Note that ZiDeserializer<SolrInputDocument> is my own interface, ZiDeserializerFactory.getInstance(validatedKeyDeserializerName) returns my custom implementation of it, and ZiDeserializer extends org.apache.kafka.common.serialization.Deserializer. This works for me.

Configured ObjectMapper not used in spring-boot-webflux

I have mixins configured in my ObjectMapperBuilder config; using a regular Spring Web controller, the data is output according to the mixins.
However, using WebFlux, a controller method returning a Flux or Mono has its data serialized as if the ObjectMapper were a default one.
How do I get WebFlux to use the configured ObjectMapper?
sample config:
@Bean
JavaTimeModule javatimeModule() {
    return new JavaTimeModule();
}

@Bean
Jackson2ObjectMapperBuilderCustomizer jackson2ObjectMapperBuilderCustomizer() {
    return jacksonObjectMapperBuilder -> jacksonObjectMapperBuilder
            .featuresToEnable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS)
            .mixIn(MyClass.class, MyClassMixin.class);
}
I actually found my solution by stepping through the init code:
@Configuration
public class Config {

    @Bean
    JavaTimeModule javatimeModule() {
        return new JavaTimeModule();
    }

    @Bean
    Jackson2ObjectMapperBuilderCustomizer jackson2ObjectMapperBuilderCustomizer() {
        return jacksonObjectMapperBuilder -> jacksonObjectMapperBuilder
                .featuresToEnable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS)
                .mixIn(MyClass.class, MyClassMixin.class);
    }

    @Bean
    Jackson2JsonEncoder jackson2JsonEncoder(ObjectMapper mapper) {
        return new Jackson2JsonEncoder(mapper);
    }

    @Bean
    Jackson2JsonDecoder jackson2JsonDecoder(ObjectMapper mapper) {
        return new Jackson2JsonDecoder(mapper);
    }

    @Bean
    WebFluxConfigurer webFluxConfigurer(Jackson2JsonEncoder encoder, Jackson2JsonDecoder decoder) {
        return new WebFluxConfigurer() {
            @Override
            public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
                configurer.defaultCodecs().jackson2JsonEncoder(encoder);
                configurer.defaultCodecs().jackson2JsonDecoder(decoder);
            }
        };
    }
}
I translated the solution of @Alberto Galiana to Java and injected the configured ObjectMapper for convenience, so you avoid having to do multiple configurations:
@Configuration
@RequiredArgsConstructor
public class WebFluxConfig implements WebFluxConfigurer {

    private final ObjectMapper objectMapper;

    @Override
    public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
        configurer.defaultCodecs().jackson2JsonEncoder(
                new Jackson2JsonEncoder(objectMapper));
        configurer.defaultCodecs().jackson2JsonDecoder(
                new Jackson2JsonDecoder(objectMapper));
    }
}
Just implement WebFluxConfigurer and override the method configureHttpMessageCodecs.
Sample code for Spring Boot 2 + Kotlin
@Configuration
@EnableWebFlux
class WebConfiguration : WebFluxConfigurer {

    override fun configureHttpMessageCodecs(configurer: ServerCodecConfigurer) {
        configurer.defaultCodecs().jackson2JsonEncoder(Jackson2JsonEncoder(ObjectMapper()
                .setSerializationInclusion(JsonInclude.Include.NON_EMPTY)))
        configurer.defaultCodecs().jackson2JsonDecoder(Jackson2JsonDecoder(ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES)))
    }
}
Make sure all your data classes to be encoded/decoded have all their properties annotated with @JsonProperty, even if the property name is the same in the class and in the JSON data:
data class MyClass(
        @NotNull
        @JsonProperty("id")
        val id: String,

        @NotNull
        @JsonProperty("my_name")
        val name: String)
In my case, I was trying to use a customized ObjectMapper while inheriting all of the behavior from my app's default WebClient.
I found that I had to use WebClient.Builder.codecs: when I used WebClient.Builder.exchangeStrategies, the provided overrides were ignored. I'm not sure whether this behavior is specific to using WebClient.mutate, but it is the only solution I found that worked.
WebClient customizedWebClient = webClient.mutate()
        .codecs(clientCodecConfigurer ->
                clientCodecConfigurer.defaultCodecs()
                        .jackson2JsonDecoder(new Jackson2JsonDecoder(customObjectMapper)))
        .build();
I have tried all the different solutions (@Primary @Bean for the ObjectMapper, configureHttpMessageCodecs(), etc.). What worked for me in the end was specifying a MIME type. Here's an example:
@Configuration
class WebConfig : WebFluxConfigurer {

    override fun configureHttpMessageCodecs(configurer: ServerCodecConfigurer) {
        val encoder = Jackson2JsonEncoder(objectMapper, MimeTypeUtils.APPLICATION_JSON)
        val decoder = Jackson2JsonDecoder(objectMapper, MimeTypeUtils.APPLICATION_JSON)
        configurer.defaultCodecs().jackson2JsonEncoder(encoder)
        configurer.defaultCodecs().jackson2JsonDecoder(decoder)
    }
}

Jackson - configure override for collections via Jackson2ObjectMapperBuilderCustomizer

I am customizing the treatment of collections in my Jackson ObjectMapper in my Spring Boot config by configuring the injected mapper like so:
@Configuration
public class Config {

    @Autowired(required = true)
    public void objectMapper(ObjectMapper mapper) {
        mapper.configOverride(Collection.class).setInclude(JsonInclude.Value.construct(JsonInclude.Include.NON_EMPTY, null));
        mapper.configOverride(List.class).setInclude(JsonInclude.Value.construct(JsonInclude.Include.NON_EMPTY, null));
        mapper.configOverride(Map.class).setInclude(JsonInclude.Value.construct(JsonInclude.Include.NON_EMPTY, null));
    }
}
While this works, I understand that a more elegant approach is to use a Jackson2ObjectMapperBuilderCustomizer:
@Bean
public Jackson2ObjectMapperBuilderCustomizer customizeJackson2ObjectMapper() {
    return new Jackson2ObjectMapperBuilderCustomizer() {
        @Override
        public void customize(Jackson2ObjectMapperBuilder builder) {
            builder
                    .indentOutput(true)
                    .someOtherMethod(...);
        }
    };
}
How do I implement the ObjectMapper collection tweaks above via Jackson2ObjectMapperBuilder?
You can use a simple Module defined locally, like in this other use case. The SetupContext also has a configOverride() method, just like the ObjectMapper itself; a sketch follows.
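A minimal sketch of that idea, assuming Jackson 2.8+ (which introduced SetupContext#configOverride) and Spring Boot, which registers any Module bean with the auto-configured ObjectMapper; the bean name is illustrative:

import java.util.Collection;
import java.util.List;
import java.util.Map;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.databind.Module;
import com.fasterxml.jackson.databind.module.SimpleModule;
import org.springframework.context.annotation.Bean;

// A locally defined Module: setupModule() applies the config overrides when the
// builder constructs the mapper, instead of mutating an already-built instance.
@Bean
public Module collectionIncludeModule() {
    return new SimpleModule() {
        @Override
        public void setupModule(SetupContext context) {
            super.setupModule(context);
            context.configOverride(Collection.class)
                    .setInclude(JsonInclude.Value.construct(JsonInclude.Include.NON_EMPTY, null));
            context.configOverride(List.class)
                    .setInclude(JsonInclude.Value.construct(JsonInclude.Include.NON_EMPTY, null));
            context.configOverride(Map.class)
                    .setInclude(JsonInclude.Value.construct(JsonInclude.Include.NON_EMPTY, null));
        }
    };
}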
No ideas? I'm interested in doing the same, just to add:
mapper.configOverride(Map.Entry.class).setFormat(forShape(Shape.OBJECT));
because @JsonFormat(shape = JsonFormat.Shape.OBJECT) doesn't work well (https://github.com/FasterXML/jackson-databind/issues/1419), and after Jackson 2.5 it is the only way to restore the previous behavior without writing a custom serializer (but it requires 2.9.x).

Upgrade from Spring Boot 1.3 to Spring Boot 1.4 and Pageable is not working as expected

I am converting an existing Spring Boot application from 1.3.6 to 1.4.1. I would like to have a default page size of 25 for repository and controller responses, but I am not getting the expected behavior in either case: repository methods give me a page size of 20, and controllers give me 0.
I added a new configuration class to define the default page size. I found this code snippet in another article; the debug message does get printed out.
@Configuration
public class RestConfigurationAdapter extends WebMvcConfigurerAdapter {

    private static final int DEFAULT_PAGE_SIZE = 25;

    @Override
    public void addArgumentResolvers(List<HandlerMethodArgumentResolver> argumentResolvers) {
        System.out.println("DEBUG: AddArguments----");
        PageableHandlerMethodArgumentResolver resolver = new PageableHandlerMethodArgumentResolver();
        resolver.setFallbackPageable(new PageRequest(0, DEFAULT_PAGE_SIZE));
        argumentResolvers.add(resolver);
        super.addArgumentResolvers(argumentResolvers);
    }
}
In a custom controller I would like to have a default Pageable populated with a size of 25, however the Pageable object is null in this controller. In 1.3.x the Pageable worked as expected.
public class BatchManagerController {

    @Autowired
    private BatchRepository batchRepository;

    @Autowired
    private PagedResourcesAssembler pagedResourcesAssembler;

    @Transactional(readOnly = true)
    @RequestMapping(value = "/search/managerBatchView", method = RequestMethod.GET, produces = MediaType.APPLICATION_JSON_VALUE)
    @PreAuthorize("hasRole(T(com.nextgearcapital.tms.api.util.AuthorityEnum).MANAGER)")
    public ResponseEntity<?> getManagerBatchListView(BatchListSearchRequest requestDTO, Pageable pageable, PersistentEntityResourceAssembler myAssembler) {
        System.out.println("DEBUG1:---------- " + pageable);
        Page<Batch> batchPage = batchRepository.findBatchesForManager(requestDTO, pageable);
        PagedResources<VaultResource> pagedResources = pagedResourcesAssembler.toResource(batchPage, myAssembler);
        return new ResponseEntity<>(pagedResources, HttpStatus.OK);
    }
}
When calling SDR repository methods with a Pageable parameter, the parameter works correctly, but it has a default page size of 20 rather than 25.
I would appreciate any help and advice on getting the correct configuration for pagination.
You probably have two solutions:
1. Register the PageableHandlerMethodArgumentResolver as a @Bean, which will disable the auto-configuration for Spring Data Web.
2. Create a BeanPostProcessor to do additional configuration on the existing PageableHandlerMethodArgumentResolver.
Using @Bean
@Configuration
public class RestConfigurationAdapter extends WebMvcConfigurerAdapter {

    private static final int DEFAULT_PAGE_SIZE = 25;

    @Bean
    public PageableHandlerMethodArgumentResolver pageableResolver() {
        PageableHandlerMethodArgumentResolver resolver = new PageableHandlerMethodArgumentResolver();
        resolver.setFallbackPageable(new PageRequest(0, DEFAULT_PAGE_SIZE));
        return resolver;
    }

    @Override
    public void addArgumentResolvers(List<HandlerMethodArgumentResolver> argumentResolvers) {
        System.out.println("DEBUG: AddArguments----");
        argumentResolvers.add(pageableResolver());
    }
}
The drawback is that this disables the auto-configuration for Spring Data Web, so you might miss some things.
Using a BeanPostProcessor
@Bean
public BeanPostProcessor pageableProcessor() {
    final int DEFAULT_PAGE_SIZE = 25;
    return new BeanPostProcessor() {
        public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
            if (bean instanceof PageableHandlerMethodArgumentResolver) {
                ((PageableHandlerMethodArgumentResolver) bean).setFallbackPageable(new PageRequest(0, DEFAULT_PAGE_SIZE));
            }
            return bean;
        }

        public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
            return bean;
        }
    };
}
The drawback is that this is a little more complex than registering your own PageableHandlerMethodArgumentResolver instance as a bean. The advantage, however, is that you can use it to add additional configuration to existing beans and leave the auto-configuration intact.
Starting in spring-data-commons version 2.0, there are two new classes that might make this kind of thing easier:
SortHandlerMethodArgumentResolverCustomizer
PageableHandlerMethodArgumentResolverCustomizer
Unfortunately, that's not the version that ships with the current version (1.5.9) of Spring Boot, so replace it at your own risk.
@Bean
PageableHandlerMethodArgumentResolverCustomizer pagingCustomizer() {
    // p is the PageableHandlerMethodArgumentResolver
    return p -> p.setMaxPageSize(25);
}
In this case, one would probably call resolveArgument to manipulate it.
That said, I'm not sure spring-data-rest would use that config. There is a HateoasPageableHandlerMethodArgumentResolver, which seems the more likely source of what SDR would use (note that it extends PageableHandlerMethodArgumentResolver, so the instanceof check in the BeanPostProcessor above would still match it). If that's the case, the BeanPostProcessor @M. Deinum suggested is probably your best option.
Spring Data Web Support

Collisions may occur when using Spring @Cacheable and SimpleKeyGenerator

When I use @Cacheable and call different methods with the same parameter, they generate the same key.
SimpleKeyGenerator generates the key without the cache name.
I use spring-boot 1.3.2 with spring 4.2.4.
Here is a sample:
@Component
public static class CacheableTestClass {

    @Cacheable(cacheNames = "test-cacheproxy-echo1")
    public String echo1(String text) {
        return text;
    }

    @Cacheable(cacheNames = "test-cacheproxy-echo2")
    public String echo2(String text) {
        return "Another " + text;
    }
}
And run a test:
assertEquals("OK", cacheableTestClass.echo1("OK"));
assertEquals("Another OK", cacheableTestClass.echo2("OK")); // Failure: expected 'Another OK', actual 'OK'.
So, is there a way to resolve this issue?
Thanks a lot.
Update
Here is my CacheManager configuration.
@Bean
@ConditionalOnMissingBean(name = "cacheRedisTemplate")
public RedisTemplate<Object, Object> cacheRedisTemplate(RedisConnectionFactory redisConnectionFactory) throws UnknownHostException {
    RedisTemplate<Object, Object> template = new RedisTemplate<>();
    template.setConnectionFactory(redisConnectionFactory);
    template.setValueSerializer(new GenericJackson2JsonRedisSerializer());
    template.setHashKeySerializer(template.getKeySerializer());
    return template;
}

@Bean
public RedisCacheManager cacheManager(@Qualifier("cacheRedisTemplate") RedisTemplate<Object, Object> cacheRedisTemplate) {
    RedisCacheManager cacheManager = new RedisCacheManager(cacheRedisTemplate);
    cacheManager.setDefaultExpiration(redisCacheProperties().getDefaultExpiration());
    cacheManager.setExpires(redisCacheProperties().getExpires());
    return cacheManager;
}
This has nothing to do with SimpleKeyGenerator; it is a Redis-specific issue: the Redis cache implementation does not use the name of the cache as a discriminant for the key under which it stores the value.
You need to invoke setUsePrefix(true) on your RedisCacheManager. This is what Spring Boot does when it auto-configures the cache manager for you. Note that it arguably should have been the default, and we're discussing how we can improve the out-of-the-box experience in a future release.
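Applied to the cacheManager bean from the question, a minimal sketch (only the prefix call is new):

@Bean
public RedisCacheManager cacheManager(@Qualifier("cacheRedisTemplate") RedisTemplate<Object, Object> cacheRedisTemplate) {
    RedisCacheManager cacheManager = new RedisCacheManager(cacheRedisTemplate);
    // Prefix every key with its cache name, so entries for
    // test-cacheproxy-echo1 and test-cacheproxy-echo2 no longer collide.
    cacheManager.setUsePrefix(true);
    cacheManager.setDefaultExpiration(redisCacheProperties().getDefaultExpiration());
    cacheManager.setExpires(redisCacheProperties().getExpires());
    return cacheManager;
}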
