How to create a second RedisTemplate instance in a Spring Boot application - spring-boot

According to this answer, one RedisTemplate cannot support multiple serializers for values. So I want to create multiple RedisTemplates for different needs, specifically one for string operations and one for object-to-JSON serialization, to be used in RedisCacheManager. I'm using Spring Boot and the current RedisTemplate is autowired. What's the correct way to declare a second RedisTemplate instance that shares the same Jedis connection factory but has its own serializers?
I tried something like this in two different components.
Component 1 declares:
@Autowired
private RedisTemplate redisTemplate;
// in an initialization method:
redisTemplate.setValueSerializer(new Jackson2JsonRedisSerializer(Instance.class));
Component 2 declares:
@Autowired
private StringRedisTemplate stringRedisTemplate;
In this case the two templates are actually the same instance. I traced into the Spring code and found that component 1's template got resolved to the auto-configured stringRedisTemplate.
Manually calling RedisTemplate's constructor and then its afterPropertiesSet() won't work either, as it complains that no connection factory can be found.
I know this is probably no different from defining another bean in a Spring app, but I'm not sure what the best way to do it is with the current Spring Data Redis integration. Please help, thanks.

You can follow two approaches to use multiple RedisTemplates within one Spring Boot application:
Named bean injection with @Autowired @Qualifier("beanname") RedisTemplate myTemplate, creating the bean with @Bean(name = "beanname").
Type-safe injection by specifying type parameters on RedisTemplate (e.g. @Autowired RedisTemplate<byte[], byte[]> byteTemplate and @Autowired RedisTemplate<String, String> stringTemplate).
Here's the code to create the two different templates:
@Configuration
public class Config {

    @Bean
    public RedisTemplate<String, String> stringTemplate(RedisConnectionFactory redisConnectionFactory) {
        RedisTemplate<String, String> stringTemplate = new RedisTemplate<>();
        stringTemplate.setConnectionFactory(redisConnectionFactory);
        stringTemplate.setDefaultSerializer(new StringRedisSerializer());
        return stringTemplate;
    }

    @Bean
    public RedisTemplate<byte[], byte[]> byteTemplate(RedisConnectionFactory redisConnectionFactory) {
        RedisTemplate<byte[], byte[]> byteTemplate = new RedisTemplate<>();
        byteTemplate.setConnectionFactory(redisConnectionFactory);
        return byteTemplate;
    }
}
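Both injection styles from the list above then work against these beans; here is a minimal sketch of a consuming component (the class name is illustrative):
@Service
public class CacheClient { // illustrative consumer

    // matched by the generic type parameters
    @Autowired
    private RedisTemplate<String, String> stringTemplate;

    @Autowired
    private RedisTemplate<byte[], byte[]> byteTemplate;

    // alternatively, matched by bean name
    @Autowired
    @Qualifier("stringTemplate")
    private RedisTemplate<String, String> namedStringTemplate;
}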
HTH, Mark

Related

SpEL KafkaListener. How can I inject a custom deserializer through properties?

I am using Spring.
I have an ObjectMapper configured for the entire project and I use it to set up a Kafka deserializer.
I need this custom Kafka deserializer to be used in a KafkaListener.
I'm configuring the KafkaListener via auto-configuration, not via a @Configuration class.
@Component
@RequiredArgsConstructor
public class CustomMessageDeserializer implements Deserializer<MyMessage> {

    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public MyMessage deserialize(String topic, byte[] data) {
        return objectMapper.readValue(data, MyMessage.class);
    }
}
If I do it like this:
@KafkaListener(
    topics = {"${topics.invite-user-topic}"},
    properties = {"value.deserializer=com.service.deserializer.CustomMessageDeserializer"}
)
public void receiveInviteUserMessages(MyMessage myMessage) {}
I received a KafkaException: Could not find a public no-argument constructor.
But with a public no-argument constructor in the CustomMessageDeserializer class I get an NPE, because ObjectMapper is null: Kafka creates and uses a new instance via reflection, not the Spring component.
@KafkaListener supports SpEL expressions.
And I think that this problem can be solved using SpEL.
Do you have any idea how to inject spring bean CustomMessageDeserializer with SpEL?
There is no easy way to do it with SpEL.
Analysis
To get started, see the JavaDoc for @KafkaListener#properties():
/**
*
* SpEL expressions must resolve to a String ...
*/
The value of value.deserializer is used to instantiate the specified deserializer class. Let's follow the call chain:
You specify this value in the @KafkaListener annotation, and you are probably not creating a ConsumerFactory bean yourself. So Spring creates this bean itself - see KafkaAutoConfiguration#kafkaConsumerFactory.
Next, the returned object new DefaultKafkaConsumerFactory(...) is created as a ConsumerFactory<?,?>, using the constructor that defaults the keyDeserializer/valueDeserializer suppliers to () -> null.
This factory is used to create a Kafka consumer (The entry point is the constructor KafkaMessageListenerContainer#ListenerConsumer, then KafkaMessageListenerContainer.this.consumerFactory.createConsumer...)
In the KafkaConsumer constructor, the valueDeserializer object is created because it is null (for the default factory of point 2 above):
if (valueDeserializer == null) {
    this.valueDeserializer = config.getConfiguredInstance(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, Deserializer.class);
}
The implementation of config.getConfiguredInstance instantiates your deserializer class through its parameterless constructor, using reflection and the String class name "com.service.deserializer.CustomMessageDeserializer".
Solutions
To use value.deserializer with your customized ObjectMapper, you must create the ConsumerFactory bean yourself and call its setValueDeserializer(...) method. This is also mentioned in the second Important note of the JSON Mapping Types section of the documentation.
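A minimal sketch of that approach, assuming the key deserializer still comes from the regular Spring Boot consumer properties and reusing the CustomMessageDeserializer from the question:
@Bean
public ConsumerFactory<String, MyMessage> consumerFactory(KafkaProperties properties,
                                                          ObjectMapper objectMapper) {
    // start from the consumer properties Spring Boot would have used anyway
    DefaultKafkaConsumerFactory<String, MyMessage> factory =
            new DefaultKafkaConsumerFactory<>(properties.buildConsumerProperties());
    // hand Kafka an already-constructed deserializer instead of a class name
    factory.setValueDeserializer(new CustomMessageDeserializer(objectMapper));
    return factory;
}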
If you don't want to create a ConsumerFactory bean, and you also don't have complicated logic in your deserializer (you only have return objectMapper.readValue(data, MyMessage.class);), then register a DefaultKafkaConsumerFactoryCustomizer:
@Bean
// inject your custom objectMapper
public DefaultKafkaConsumerFactoryCustomizer customizeJsonDeserializer(ObjectMapper objectMapper) {
    return consumerFactory ->
            consumerFactory.setValueDeserializerSupplier(() ->
                    new org.springframework.kafka.support.serializer.JsonDeserializer<>(objectMapper));
}
In this case, you don't need to create your own CustomMessageDeserializer class (remove it) and Spring will automatically parse the message into your MyMessage.
The @KafkaListener annotation should then no longer contain the property properties = {"value.deserializer=com.my.kafka_test.component.CustomMessageDeserializer"}. The DefaultKafkaConsumerFactoryCustomizer bean will automatically be used to configure the default ConsumerFactory<?, ?> (see the implementation of the KafkaAutoConfiguration#kafkaConsumerFactory method).
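With that customizer in place, the listener sketch becomes the same listener as in the question, minus the properties attribute:
@KafkaListener(topics = {"${topics.invite-user-topic}"})
public void receiveInviteUserMessages(MyMessage myMessage) {
    // MyMessage arrives already deserialized by the customized JsonDeserializer
}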
Here is how it works for me:
@KafkaListener(topics = "${solr.kafka.topic}", containerFactory = "batchFactory")
public void listen(List<SolrInputDocument> docs, @Header(KafkaHeaders.BATCH_CONVERTED_HEADERS) List<Map<String, Object>> headers, Acknowledgment ack) throws IOException {...}
And then I have two beans defined in my configuration:
#Profile("!test")
#Bean
#Autowired
public ConsumerFactory<String, SolrInputDocument> consumerFactory(KafkaProperties properties) {
Map<String, Object> props = properties.buildConsumerProperties();
props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
DefaultKafkaConsumerFactory<String, SolrInputDocument> result = new DefaultKafkaConsumerFactory<>(props);
String validatedKeyDeserializerName = KafkaMessageType.valueOf(keyDeserializerName).toString();
ZiDeserializer<SolrInputDocument> deserializer = ZiDeserializerFactory.getInstance(validatedKeyDeserializerName);
result.setValueDeserializer(deserializer);
return result;
}
#Profile("!test")
#Bean
#Autowired
public ConcurrentKafkaListenerContainerFactory<String, SolrInputDocument> batchFactory(ConsumerFactory<String, SolrInputDocument> consumerFactory) {
ConcurrentKafkaListenerContainerFactory<String, SolrInputDocument> factory = new ConcurrentKafkaListenerContainerFactory<>();
factory.setConsumerFactory(consumerFactory);
factory.setBatchListener(true);
factory.setConcurrency(2);
ExponentialBackOffWithMaxRetries backoff = new ExponentialBackOffWithMaxRetries(10);
backoff.setMultiplier(3); // Default is 1.5 but this seems more reasonable
factory.setCommonErrorHandler(new DefaultErrorHandler(null, backoff));
// Needed for manual commits
factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
return factory;
}
Note that ZiDeserializer<SolrInputDocument> deserializer is my own interface, and ZiDeserializerFactory.getInstance(validatedKeyDeserializerName) returns my custom implementation of ZiDeserializer. ZiDeserializer extends org.apache.kafka.common.serialization.Deserializer. This works for me.

Java JobRunr when using Spring Boot Redis Starter

How do I create and use the Redis connection that spring-boot-starter-data-redis creates? There doesn't seem to be a bean for RedisClient created by the default auto-configuration, so I'm not sure of the best way to do this.
The documentation does state that in this case you need to create the StorageProvider yourself, which is fine, but can you reuse what Spring Boot has already created? I believe this would need to be a pooled connection, which you would also need to enable through Spring Boot.
RedisTemplate offers a high-level abstraction for Redis interactions:
https://docs.spring.io/spring-data/data-redis/docs/current/reference/html/#redis:template
Redis auto-configuration:
@AutoConfiguration
@ConditionalOnClass({RedisOperations.class})
@EnableConfigurationProperties({RedisProperties.class})
@Import({LettuceConnectionConfiguration.class, JedisConnectionConfiguration.class})
public class RedisAutoConfiguration {

    public RedisAutoConfiguration() {
    }

    @Bean
    @ConditionalOnMissingBean(
        name = {"redisTemplate"}
    )
    @ConditionalOnSingleCandidate(RedisConnectionFactory.class)
    public RedisTemplate<Object, Object> redisTemplate(RedisConnectionFactory redisConnectionFactory) {
        RedisTemplate<Object, Object> template = new RedisTemplate();
        template.setConnectionFactory(redisConnectionFactory);
        return template;
    }

    @Bean
    @ConditionalOnMissingBean
    @ConditionalOnSingleCandidate(RedisConnectionFactory.class)
    public StringRedisTemplate stringRedisTemplate(RedisConnectionFactory redisConnectionFactory) {
        return new StringRedisTemplate(redisConnectionFactory);
    }
}
Here you can find the corresponding configuration properties (including the connection pool defaults).
Simple implementation example:
https://www.baeldung.com/spring-data-redis-tutorial
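As a minimal usage sketch of that abstraction (the service class name is illustrative, not from the linked tutorial), the auto-configured StringRedisTemplate can be injected and used directly:
@Service
public class TokenCache { // illustrative name

    // auto-configured by spring-boot-starter-data-redis
    private final StringRedisTemplate redis;

    public TokenCache(StringRedisTemplate redis) {
        this.redis = redis;
    }

    public void put(String key, String value) {
        redis.opsForValue().set(key, value);
    }

    public String get(String key) {
        return redis.opsForValue().get(key);
    }
}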

Getting qualifier names from initialized bean objects

I have two beans of the same type:
@Bean
public RestTemplate jsonTemplate() {
    return new RestTemplate();
}

@Bean
public RestTemplate xmlTemplate() {
    return new RestTemplate();
}
And I autowire both beans into a list as follows:
@Autowired
private List<RestTemplate> templates;
The list templates will have both beans inside with size=2.
From this list, how can I get their names (["jsonTemplate", "xmlTemplate"])?
It was really simple...
Just doing
@Autowired
private Map<String, RestTemplate> templates;
will let Spring insert the bean names as keys and the beans themselves as the values of the map.
It seems Spring just stops keeping track of the naming after the injection, so I don't know if there is any other (or, if even possible, simpler) way than this.
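As a quick illustrative sketch, the keys of that injected map are exactly the names you are after:
@Autowired
private Map<String, RestTemplate> templates;

@PostConstruct
void logTemplateNames() {
    // prints jsonTemplate and xmlTemplate
    templates.keySet().forEach(System.out::println);
}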
You could use a map of beans:
@Bean
public Map<String, RestTemplate> templateMap(RestTemplate jsonTemplate, RestTemplate xmlTemplate) {
    Map<String, RestTemplate> map = new HashMap<>();
    map.put("jsonTemplate", jsonTemplate);
    map.put("xmlTemplate", xmlTemplate);
    return map;
}

@Autowired
private Map<String, RestTemplate> templates;

Spring Auto Configuration prioritization between Lettuce or Jedis

I want to use Lettuce as a Redis client, which is the default dependency for spring-boot-starter-data-redis-reactive. However, I am inheriting Jedis as a dependency from another component written as pure Java code (no Spring). This results in a conflict when initializing LettuceConnectionFactory due to the presence of JedisConnectionFactory.
How can I keep Jedis as a dependency for the other component's use while ensuring that LettuceConnectionFactory is initialised for my own code? The main reason for using LettuceConnectionFactory is reactive programming in my service.
Both connection factories are configured for initialization via RedisAutoConfiguration, with no option of prioritisation.
https://github.com/spring-projects/spring-boot/blob/master/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/data/redis/RedisAutoConfiguration.java
You could override RedisConnectionFactory by creating your own @Configuration class and adding a new @Bean to it.
For example:
@Bean
RedisConnectionFactory myLettuceConnectionFactory() {
    // your setup....
    return new LettuceConnectionFactory();
}
and then use the myLettuceConnectionFactory bean to set up the RedisTemplate @Bean:
@Bean
public RedisTemplate<String, Object> redisTemplate() {
    final RedisTemplate<String, Object> template = new RedisTemplate<>();
    template.setConnectionFactory(myLettuceConnectionFactory());
    // other settings...
    return template;
}

Spring Java Config - Is it possible to create a @Bean dynamically?

Given this configuration class, I want to dynamically create a DataSourceTransactionManager bean for each one of the DataSource objects. Is that possible?
@Configuration
public class SomeConfig {

    @Autowired
    private DataSource[] dataSources;
}
That is to say, I want to loop over the dataSources array and create a @Bean that returns new DataSourceTransactionManager(dataSources[i]) for each element.
In this case I don't want to create a @Bean List<DataSourceTransactionManager> as answered here, but a number of individual @Bean DataSourceTransactionManager instances.
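A rough sketch of one common approach is to register the bean definitions yourself with a BeanDefinitionRegistryPostProcessor (the registrar class name and the generated bean names below are illustrative, and it assumes the registry is the usual DefaultListableBeanFactory so it can be queried for the DataSource bean names):
@Component
public class TxManagerRegistrar implements BeanDefinitionRegistryPostProcessor {

    @Override
    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) {
        // assumes a standard context where the registry is also a ListableBeanFactory
        ListableBeanFactory beanFactory = (ListableBeanFactory) registry;
        for (String dsName : beanFactory.getBeanNamesForType(DataSource.class)) {
            BeanDefinition txManager = BeanDefinitionBuilder
                    .genericBeanDefinition(DataSourceTransactionManager.class)
                    .addConstructorArgReference(dsName) // wire the existing DataSource bean by name
                    .getBeanDefinition();
            registry.registerBeanDefinition(dsName + "TransactionManager", txManager);
        }
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) {
        // no-op
    }
}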
