Dynamic Code Evaluation: Unsafe Deserialization for RedisTemplate - spring-boot

I get this error on the following bean definition. Do you have any idea? How can I solve this?
public RedisTemplate<String, Object> redisObjectTemplate(RedisConnectionFactory connectionFactory) {
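The question body above is truncated to the method signature. For general context only (this may not match the poster's actual code), static analyzers typically raise this finding when a RedisTemplate falls back to JDK serialization for values; a common mitigation is to configure explicit String/JSON serializers, roughly like this:
@Bean
public RedisTemplate<String, Object> redisObjectTemplate(RedisConnectionFactory connectionFactory) {
    RedisTemplate<String, Object> template = new RedisTemplate<>();
    template.setConnectionFactory(connectionFactory);
    // avoid the default JdkSerializationRedisSerializer, which scanners flag as unsafe deserialization
    template.setKeySerializer(new StringRedisSerializer());
    template.setValueSerializer(new GenericJackson2JsonRedisSerializer());
    return template;
}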

Related

springboot configuration redis serialization

Why is the error "Could not autowire. No beans of 'RedisConnectionFactory' type found" reported here?
Serialization doesn't work
I used the new way to inject the object and it didn't solve the problem:
@Configuration
public class RedisConfig {

    @Resource
    private RedisConnectionFactory redisConnectionFactory;

    @Bean
    public RedisTemplate<Object, Object> redisTemplate() {
        RedisTemplate<Object, Object> redisTemplate = new RedisTemplate<>();
        Jackson2JsonRedisSerializer<Object> jackson2JsonRedisSerializer = new Jackson2JsonRedisSerializer<>(Object.class);
        redisTemplate.setConnectionFactory(redisConnectionFactory);
        redisTemplate.setKeySerializer(new StringRedisSerializer());
        redisTemplate.setValueSerializer(jackson2JsonRedisSerializer);
        redisTemplate.setHashKeySerializer(new StringRedisSerializer());
        redisTemplate.setHashValueSerializer(jackson2JsonRedisSerializer);
        redisTemplate.setStringSerializer(new StringRedisSerializer());
        return redisTemplate;
    }
}
When I was implementing a Redis server integration, I solved this issue by writing code like this:
@Configuration
public class RedisConfiguration {

    @Bean
    @ConditionalOnMissingBean(name = "redisTemplate")
    @Primary
    public <T> RedisTemplate<String, T> redisTemplate(RedisConnectionFactory connectionFactory) {
        final RedisTemplate<String, T> template = new RedisTemplate<>();
        template.setConnectionFactory(connectionFactory);
        ObjectMapper om = new ObjectMapper();
        om.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
        om.registerModule(new JavaTimeModule());
        template.setKeySerializer(new StringRedisSerializer());
        template.setHashValueSerializer(new GenericJackson2JsonRedisSerializer(om));
        template.setValueSerializer(new GenericJackson2JsonRedisSerializer(om));
        return template;
    }
}
This code may be a little different from your requirements, but if you look closely you can modify it according to your needs.
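For example, a caller could then inject the generic template with the concrete value type it needs (SessionDto is just an illustrative name, not from the original post):
@Service
public class SessionCache {

    // @Resource injects by bean name, so the generic redisTemplate bean above is picked up
    @Resource
    private RedisTemplate<String, SessionDto> redisTemplate;

    public void save(String key, SessionDto session) {
        redisTemplate.opsForValue().set(key, session);
    }

    public SessionDto load(String key) {
        return redisTemplate.opsForValue().get(key);
    }
}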

Getting ProducerFencedException on producing record in Kafka listener thread

I am getting this exception when producing a message inside the Kafka listener container.
javax.management.InstanceAlreadyExistsException: kafka.producer:type=app-info,id=producer-tx-group.topicA.1
org.apache.kafka.common.errors.ProducerFencedException: The producer has been rejected from the broker because it tried to use an old epoch with the transactionalId
My listener looks like this:
@Transactional
@KafkaListener(...)
listener(topicA, message) {
    process(message);
    produce(topicB, notification); // use KafkaTemplate to send the message
}
My configuration looks like this
@Bean
public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory(KafkaTransactionManager kafkaTransactionManager) {
    ConcurrentKafkaListenerContainerFactory<String, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.getContainerProperties().setTransactionManager(kafkaTransactionManager);
    return factory;
}

public ProducerFactory<String, Object> producerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, enableIdempotence);
    DefaultKafkaProducerFactory<String, Object> factory = new DefaultKafkaProducerFactory<>(props);
    factory.setTransactionIdPrefix(transactionIdPrefix);
    return factory;
}

@Bean
public KafkaTemplate<String, Object> kafkaTemplate() {
    KafkaTemplate<String, Object> template = new KafkaTemplate<>(producerFactory());
    return template;
}

@Bean
public KafkaTransactionManager kafkaTransactionManager() {
    KafkaTransactionManager manager = new KafkaTransactionManager(producerFactory());
    return manager;
}
I know when ProducerFencedException is thrown by Kafka, but what I am trying to figure out here is where the second producer with the same transactional.id comes from.
If I set a unique transaction prefix on the KafkaTemplate it works fine:
@Bean
public KafkaTemplate<String, Object> kafkaTemplate() {
    KafkaTemplate<String, Object> template = new KafkaTemplate<>(producerFactory());
    template.setTransactionIdPrefix(MessageFormat.format("{0}-{1}", transactionIdPrefix, UUID.randomUUID().toString()));
    return template;
}
But I am trying to understand the exception here: where is the other producer being started with the same transactional.id, which for listener-started transactions follows the pattern group.id/topic/partition according to the Spring docs?
I am just trying this locally on a single application instance.
I found the root cause: I was creating two producer factory instances here
public ProducerFactory<String, Object> producerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, enableIdempotence);
    DefaultKafkaProducerFactory<String, Object> factory = new DefaultKafkaProducerFactory<>(props);
    factory.setTransactionIdPrefix(transactionIdPrefix);
    return factory;
}
I was missing the @Bean configuration.
Adding @Bean on the producer factory and properly autowiring it into the template and the transaction manager fixed the issue.
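A sketch of the corrected wiring (using the same property names as above, not necessarily the poster's exact final code): the factory is registered as a bean once, and that single instance is injected into both the template and the transaction manager, so only one producer per transactional.id is created.
@Bean
public ProducerFactory<String, Object> producerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, enableIdempotence);
    DefaultKafkaProducerFactory<String, Object> factory = new DefaultKafkaProducerFactory<>(props);
    factory.setTransactionIdPrefix(transactionIdPrefix);
    return factory;
}

@Bean
public KafkaTemplate<String, Object> kafkaTemplate(ProducerFactory<String, Object> producerFactory) {
    // reuse the single ProducerFactory bean instead of calling producerFactory() again
    return new KafkaTemplate<>(producerFactory);
}

@Bean
public KafkaTransactionManager<String, Object> kafkaTransactionManager(ProducerFactory<String, Object> producerFactory) {
    return new KafkaTransactionManager<>(producerFactory);
}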

RedisTemplate nullException

There was a problem when I used Spring Boot to integrate Redis. I want to customize a RedisTemplate, but when I use it I find that it is always null and cannot be injected. My code is as follows:
@Configuration
public class RedisConfig {

    @Bean(name = "myRedisTemplate")
    public RedisTemplate<String, Object> redisTemplate(RedisConnectionFactory redisConnectionFactory) {
        RedisTemplate<String, Object> template = new RedisTemplate<>();
        template.setConnectionFactory(redisConnectionFactory);
        Jackson2JsonRedisSerializer<Object> jackson2JsonRedisSerializer = new Jackson2JsonRedisSerializer<>(Object.class);
        ObjectMapper om = new ObjectMapper();
        om.setVisibility(PropertyAccessor.ALL, JsonAutoDetect.Visibility.ANY);
        om.activateDefaultTyping(LaissezFaireSubTypeValidator.instance, ObjectMapper.DefaultTyping.NON_FINAL);
        jackson2JsonRedisSerializer.setObjectMapper(om);
        StringRedisSerializer stringRedisSerializer = new StringRedisSerializer();
        template.setKeySerializer(stringRedisSerializer);
        template.setHashKeySerializer(stringRedisSerializer);
        template.setValueSerializer(jackson2JsonRedisSerializer);
        // the original code called setHashKeySerializer twice; the second call was presumably meant to set the hash value serializer
        template.setHashValueSerializer(jackson2JsonRedisSerializer);
        template.afterPropertiesSet();
        return template;
    }
@Autowired
@Qualifier(value = "myRedisTemplate")
private RedisTemplate<String, Object> redisTemplate;
redisTemplate is null in debug mode. I don't know what went wrong.
It could be caused by a missing JedisConnectionFactory. Do you have such a bean?
@Bean
JedisConnectionFactory jedisConnectionFactory() {
    return new JedisConnectionFactory();
}
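If the connection factory bean is present and the template is still null, the usual culprit is that the injected field lives in an object Spring did not create (for example one instantiated with new). A minimal sketch of injecting the qualified bean into a Spring-managed component (class and method names are illustrative):
@Service
public class CacheService {

    private final RedisTemplate<String, Object> redisTemplate;

    // constructor injection fails fast at startup if the bean is missing, instead of leaving a null field
    public CacheService(@Qualifier("myRedisTemplate") RedisTemplate<String, Object> redisTemplate) {
        this.redisTemplate = redisTemplate;
    }

    public void put(String key, Object value) {
        redisTemplate.opsForValue().set(key, value);
    }
}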

Kafka: Topic not present in metadata Exception

I use the Spring KafkaTemplate to send messages to a Kafka topic.
Configuration is:
@Bean
public KafkaAdmin createKafkaAdmin() {
    Map<String, Object> configs = new HashMap<>();
    configs.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:2181");
    return new KafkaAdmin(configs);
}

@Bean
public ProducerFactory<String, String> producerFactory() {
    Map<String, Object> configProps = new HashMap<>();
    configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:2181");
    configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    return new DefaultKafkaProducerFactory<>(configProps);
}

@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}
Then I try to send message:
@Autowired
private KafkaTemplate<String, String> kafkaTemplate;

ListenableFuture<SendResult<String, String>> future =
        kafkaTemplate.send("waiting_for_ack", key, value);
But I receive the following exception:
TimeoutException: Topic waiting_for_ack not present in metadata after 60000 ms.
The target topic exists, which I was able to confirm with:
./kafka-topics.sh --zookeeper localhost:2181 --list _consumer_offsets
waiting_for_ack
What am I doing wrong? How can I determine the cause of this exception?
You need to specify the broker URLs instead of the ZooKeeper URL in the BOOTSTRAP_SERVERS_CONFIG property. You can check for it in the server.properties file available in the /config folder under the Kafka installation. Usually it would be
bootstrap.servers=localhost:9092
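Applied to the configuration above, that means pointing the producer (and the KafkaAdmin) at the broker address rather than ZooKeeper, assuming the default broker port 9092:
@Bean
public ProducerFactory<String, String> producerFactory() {
    Map<String, Object> configProps = new HashMap<>();
    // broker address (default port 9092), not the ZooKeeper address on 2181
    configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    return new DefaultKafkaProducerFactory<>(configProps);
}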

Multi redis connection issue

I am trying to connect my project with two Redis connections, but I got:
Consider defining a bean named 'redisTemplate' in your configuration.
I don't know if my code is correct, but this is my configuration (Spring Boot version 2.4.2):
@Configuration
public class RedisConfiguration {

    @Autowired
    private EditorialRedisPropertyConfiguration editorialRedisConfiguration;

    @Autowired
    private ProductRedisPropertyConfiguration productRedisConfiguration;

    @Bean(name = "editorialRedisTemplate")
    public RedisTemplate<String, ?> editorialRedisTemplate(
            @Qualifier(value = "redisEditorialConnectionFactory") RedisConnectionFactory redisConnectionFactory) {
        RedisTemplate<String, ?> template = new RedisTemplate<>();
        template.setConnectionFactory(redisConnectionFactory);
        return template;
    }

    @Bean(name = "productRedisTemplate")
    public RedisTemplate<String, ?> productRedisTemplate(
            @Qualifier(value = "redisProductConnectionFactory") RedisConnectionFactory redisConnectionFactory) {
        RedisTemplate<String, ?> template = new RedisTemplate<>();
        template.setConnectionFactory(redisConnectionFactory);
        return template;
    }

    @SuppressWarnings("deprecation")
    private Jackson2JsonRedisSerializer<Object> initJackson2JsonRedisSerializer() {
        ...
    }

    @Bean(name = "redisEditorialConnectionFactory")
    @Primary
    public LettuceConnectionFactory redisEditorialConnectionFactory() {
        ...
    }

    @Bean(name = "redisProductConnectionFactory")
    @Primary
    public LettuceConnectionFactory redisProductConnectionFactory() {
        ...
    }
}
with application config:
spring.autoconfigure.exclude=org.springframework.boot.autoconfigure.data.redis.RedisReactiveAutoConfiguration
Not quite sure about your application code, but this can be fixed by simply creating a bean named redisTemplate; given you don't use it directly, you are free to back it with either Redis connection factory.
@Bean(name = "redisTemplate")
public RedisTemplate<String, ?> redisTemplate(
        @Qualifier(value = "redisProductConnectionFactory") RedisConnectionFactory redisConnectionFactory) {
    RedisTemplate<String, ?> template = new RedisTemplate<>();
    template.setConnectionFactory(redisConnectionFactory);
    return template;
}
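Wherever the two templates are used directly, the injection can be qualified so Spring knows which one to wire, for example:
@Autowired
@Qualifier("editorialRedisTemplate")
private RedisTemplate<String, ?> editorialRedisTemplate;

@Autowired
@Qualifier("productRedisTemplate")
private RedisTemplate<String, ?> productRedisTemplate;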
