In my Spring project with REST controllers, I want to store session information in Redis.
In my application.properties I have defined the following:
spring.session.store-type=redis
spring.session.redis.namespace=rdrestcore
com.xyz.redis.host=192.168.201.46
com.xyz.redis.db=0
com.xyz.redis.port=6379
com.xyz.redis.pool.min-idle=5
I also have enabled Http Redis Session with:
@Configuration
@EnableRedisHttpSession
public class SessionConfig extends AbstractHttpSessionApplicationInitializer {
}
I finally have a redis connection factory like this:
@Configuration
@EnableRedisRepositories
public class RdRedisConnectionFactory {

    @Autowired
    private Environment env;

    @Value("${com.xyz.redis.host}")
    private String redisHost;

    @Value("${com.xyz.redis.db}")
    private Integer redisDb;

    @Value("${com.xyz.redis.port}")
    private Integer redisPort;

    @Value("${com.xyz.redis.pool.min-idle}")
    private Integer redisPoolMinIdle;

    @Bean
    JedisPoolConfig jedisPoolConfig() {
        JedisPoolConfig poolConfig = new JedisPoolConfig();
        if (redisPoolMinIdle != null) poolConfig.setMinIdle(redisPoolMinIdle);
        return poolConfig;
    }

    @Bean
    JedisConnectionFactory jedisConnectionFactory() {
        JedisConnectionFactory jedisConFactory = new JedisConnectionFactory();
        if (redisHost != null) jedisConFactory.setHostName(redisHost);
        if (redisPort != null) jedisConFactory.setPort(redisPort);
        if (redisDb != null) jedisConFactory.setDatabase(redisDb);
        jedisConFactory.setPoolConfig(jedisPoolConfig());
        return jedisConFactory;
    }

    @Bean
    public RedisTemplate<String, Object> redisTemplate() {
        final RedisTemplate<String, Object> template = new RedisTemplate<>();
        template.setConnectionFactory(jedisConnectionFactory());
        template.setKeySerializer(new StringRedisSerializer());
        template.setValueSerializer(new GenericJackson2JsonRedisSerializer());
        template.setHashKeySerializer(new StringRedisSerializer());
        template.setHashValueSerializer(new GenericJackson2JsonRedisSerializer());
        return template;
    }
}
With this configuration, the session information gets stored in Redis, but it is serialized strangely. The keys are readable, but the stored values are not (I inspect the data with a program called "Redis Desktop Manager")... for example, for a new session, I get a hash with key:
*spring:session:sessions:c1110241-0aed-4d40-9861-43553b3526cb*
and the keys this hash contains are: maxInactiveInterval, lastAccessedTime, creationTime, sessionAttr:SPRING_SECURITY_CONTEXT
but their values are all encoded, looking something like this:
\xAC\xED\x00\x05sr\x00\x0Ejava.lang.Long;\x8B\xE4\x90\xCC\x8F#\xDF\x02\x00\x01J\x00\x05valuexr\x00\x10java.lang.Number\x86\xAC\x95\x1D\x0B\x94\xE0\x8B\x02\x00\x00xp\x00\x00\x01b$G\x88*
(for the creationTime key)
and if I try to access this information from code, with the redisTemplate, it raises an exception like this one:
Exception occurred in target VM: Cannot deserialize; nested exception is
org.springframework.core.serializer.support.SerializationFailedException:
Failed to deserialize payload. Is the byte array a result of
corresponding serialization for DefaultDeserializer?; nested exception
is java.io.StreamCorruptedException: invalid stream header: 73657373
org.springframework.data.redis.serializer.SerializationException: Cannot deserialize; nested exception is
org.springframework.core.serializer.support.SerializationFailedException:
Failed to deserialize payload. Is the byte array a result of
corresponding serialization for DefaultDeserializer?; nested exception
is java.io.StreamCorruptedException: invalid stream header: 73657373
at org.springframework.data.redis.serializer.JdkSerializationRedisSerializer.deserialize(JdkSerializationRedisSerializer.java:82)
I think it is some kind of problem with the serialization/deserialization of the Spring Session information, but I don't know what else to do to control that.
Does anyone know what I'm doing wrong?
Thank you
You're on the right track; your problem is indeed serialization. Try this configuration (configure your template with these serializers only):
template.setHashValueSerializer(new JdkSerializationRedisSerializer());
template.setHashKeySerializer(new StringRedisSerializer());
template.setKeySerializer(new StringRedisSerializer());
template.setDefaultSerializer(new JdkSerializationRedisSerializer());
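As a sketch, here is the question's template bean rewritten with only those serializers (the connection factory is the one defined in the question; Spring Session stores its hash values with JDK serialization by default, so matching it lets the template read the session entries back):

@Bean
public RedisTemplate<String, Object> redisTemplate() {
    final RedisTemplate<String, Object> template = new RedisTemplate<>();
    template.setConnectionFactory(jedisConnectionFactory()); // factory from the question
    template.setKeySerializer(new StringRedisSerializer());
    template.setHashKeySerializer(new StringRedisSerializer());
    // JDK serialization matches what Spring Session writes, so reads no longer fail
    template.setHashValueSerializer(new JdkSerializationRedisSerializer());
    template.setDefaultSerializer(new JdkSerializationRedisSerializer());
    return template;
}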
I am migrating a Kafka Streams implementation from the plain Kafka APIs to spring-kafka, since it is incorporated in a Spring Boot application.
Everything works fine: the Stream, GlobalKTable, and branching that I have all work perfectly. But I am having a hard time incorporating a ReadOnlyKeyValueStore. Based on the spring-kafka documentation here: https://docs.spring.io/spring-kafka/docs/2.6.10/reference/html/#streams-spring
It says:
If you need to perform some KafkaStreams operations directly, you can
access that internal KafkaStreams instance by using
StreamsBuilderFactoryBean.getKafkaStreams(). You can autowire
StreamsBuilderFactoryBean bean by type, but you should be sure to use
the full type in the bean definition.
Based on that, I tried to incorporate it into my example, as in the fragments below:
@Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_CONFIG_BEAN_NAME)
public KafkaStreamsConfiguration defaultKafkaStreamsConfig() {
    Map<String, Object> props = defaultStreamsConfigs();
    props.put(StreamsConfig.APPLICATION_ID_CONFIG, "quote-stream");
    props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
    props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, SpecificAvroSerde.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "stock-quotes-stream-group");
    return new KafkaStreamsConfiguration(props);
}

@Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_BUILDER_BEAN_NAME)
public StreamsBuilderFactoryBean defaultKafkaStreamsBuilder(KafkaStreamsConfiguration defaultKafkaStreamsConfig) {
    return new StreamsBuilderFactoryBean(defaultKafkaStreamsConfig);
}
...
final GlobalKTable<String, LeveragePrice> leverageBySymbolGKTable = streamsBuilder
        .globalTable(KafkaConfiguration.LEVERAGE_PRICE_TOPIC,
                Materialized.<String, LeveragePrice, KeyValueStore<Bytes, byte[]>>as("leverage-by-symbol-table")
                        .withKeySerde(Serdes.String())
                        .withValueSerde(leveragePriceSerde));
leveragePriceView = myKStreamsBuilder.getKafkaStreams().store("leverage-by-symbol-table", QueryableStoreTypes.keyValueStore());
But adding the StreamsBuilderFactoryBean definition (which seems to be needed to get a reference to KafkaStreams) causes an error:
The bean 'defaultKafkaStreamsBuilder', defined in class path resource [com/resona/springkafkastream/repository/KafkaConfiguration.class], could not be registered. A bean with that name has already been defined in class path resource [org/springframework/kafka/annotation/KafkaStreamsDefaultConfiguration.class] and overriding is disabled.
The issue is that I don't want to control the lifecycle of the stream myself; that's what I had with the plain Kafka APIs. I would like to get a reference to the default managed one, since I want Spring to manage it, but whenever I try to expose the bean it gives the error. Any ideas on the correct approach to this with spring-kafka?
P.S. - I am not interested in solutions using spring-cloud-stream; I am looking for an implementation with spring-kafka.
You don't need to define any new beans; something like this should work...
spring.application.name=quote-stream
spring.kafka.streams.properties.default.key.serde=org.apache.kafka.common.serialization.Serdes$StringSerde
spring.kafka.streams.properties.default.value.serde=org.apache.kafka.common.serialization.Serdes$StringSerde
@SpringBootApplication
@EnableKafkaStreams
public class So69669791Application {

    public static void main(String[] args) {
        SpringApplication.run(So69669791Application.class, args);
    }

    @Bean
    GlobalKTable<String, String> leverageBySymbolGKTable(StreamsBuilder sb) {
        return sb.globalTable("gkTopic",
                Materialized.<String, String, KeyValueStore<Bytes, byte[]>>as("leverage-by-symbol-table"));
    }

    private ReadOnlyKeyValueStore<String, String> leveragePriceView;

    @Bean
    StreamsBuilderFactoryBean.Listener afterStart(StreamsBuilderFactoryBean sbfb,
            GlobalKTable<String, String> leverageBySymbolGKTable) {

        StreamsBuilderFactoryBean.Listener listener = new StreamsBuilderFactoryBean.Listener() {

            @Override
            public void streamsAdded(String id, KafkaStreams streams) {
                leveragePriceView = streams.store("leverage-by-symbol-table", QueryableStoreTypes.keyValueStore());
            }

        };
        sbfb.addListener(listener);
        return listener;
    }

    @Bean
    KStream<String, String> stream(StreamsBuilder builder) {
        KStream<String, String> stream = builder.stream("someTopic");
        stream.to("otherTopic");
        return stream;
    }

}
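Once the streams are running, the store can be queried through that field; a hypothetical lookup (key chosen purely for illustration):

String value = leveragePriceView.get("someKey"); // returns null if the key is absent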
I have the method below, which caches student classes, and I want to clear only that named cache:
@Cacheable(value = "getStudentClasses",
        key = "(new net.student.util.CacheKeyCreator()).createKey(''+#university)",
        cacheManager = "cacheManager")
public List<StudentClass> getStudentClasses(String university) {
    // get student classes
}
I have tried to clear it as below, but it doesn't clear the cache with that specific name:
#Bean(name = "cacheManager")
public CacheManager cacheManager ( RedisTemplate<String, Object> redisTemplate ) {
RedisCacheManager redisCacheManager = new RedisCacheManager( redisTemplate );
redisCacheManager.setDefaultExpiration(0);
redisCacheManager.setUsePrefix( true);
return redisCacheManager;
}
#Autowired
ApplicationContext context;
public void clearStudentClasses(){
CacheManager cacheManager= (CacheManager) context.getBean("cacheManager");
cacheManager.getCache("getStudentClasses").clear(); //exceptionLine
}
I get this exception at the marked line:
> org.springframework.dao.InvalidDataAccessApiUsageException: ERR
> unknown command 'EVAL'; nested exception is
> redis.clients.jedis.exceptions.JedisDataException: ERR unknown command
> 'EVAL'
I am using Spring Boot 2.3.1 and want to use the DeadLetterPublishingRecoverer to publish records that could not be deserialized.
Everything looks fine, except that the original payload isn't written to the DLT as-is; instead I see it Base64 encoded.
In a different posting I read that this is caused by the JsonSerializer used in the KafkaTemplate, so I tried a different template. But now I get a SerializationException:
org.apache.kafka.common.errors.SerializationException: Can't convert value of class [B to class org.apache.kafka.common.serialization.BytesSerializer specified in value.serializer
A similar exception occurs when using the StringSerializer.
My code looks like this:
#Autowired
private KafkaProperties kafkaProperties;
private ProducerFactory<String, String> pf() {
return new DefaultKafkaProducerFactory<>(kafkaProperties.buildProducerProperties());
}
private KafkaTemplate<String, String> stringTemplate() {
return new KafkaTemplate<>(pf(), Collections.singletonMap(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class));
}
#Bean
public SeekToCurrentErrorHandler errorHandler() {
SeekToCurrentErrorHandler eh = new SeekToCurrentErrorHandler(new DeadLetterPublishingRecoverer(stringTemplate()));
eh.setLogLevel(Level.WARN);
return eh;
}
Found it just 5 minutes later.
I had to use the ByteArraySerializer instead.
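For completeness, a sketch of what the corrected template could look like (producer factory and property names carried over from the question; ByteArraySerializer is assumed to receive the failed record's raw payload from the recoverer):

private ProducerFactory<String, byte[]> bytesPf() {
    return new DefaultKafkaProducerFactory<>(kafkaProperties.buildProducerProperties());
}

private KafkaTemplate<String, byte[]> bytesTemplate() {
    // ByteArraySerializer writes the original record's bytes to the DLT unchanged
    return new KafkaTemplate<>(bytesPf(), Collections.singletonMap(
            ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class));
}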
I am using a Redis cache (via the Jedis client), and I would like to use ValueOperations#multiGet, which takes a Collection of keys, and returns a List of objects from the cache, in the same order. My question is, what happens when some of the keys are in the cache, but others are not? I am aware that underneath, Redis MGET is used, which will return nil for any elements that are not in the cache.
I cannot find any documentation of how ValueOperations will interpret this response. I assume they will be null, and can certainly test it, but it would be dangerous to build a system around undocumented behavior.
For completeness, here is how the cache client is configured:
#Bean
public RedisConnectionFactory redisConnectionFactory() {
JedisConnectionFactory redisConnectionFactory = new JedisConnectionFactory();
redisConnectionFactory.setHostName(address);
redisConnectionFactory.setPort(port);
redisConnectionFactory.afterPropertiesSet();
return redisConnectionFactory;
}
#Bean
public ValueOperations<String, Object> someRedisCache(RedisConnectionFactory cf) {
RedisTemplate<String, Object> redisTemplate = new RedisTemplate<>();
redisTemplate.setConnectionFactory(cf);
redisTemplate.setDefaultSerializer(new GenericJackson2JsonRedisSerializer());
redisTemplate.afterPropertiesSet();
return redisTemplate.opsForValue();
}
I am using spring-data-redis:2.1.4
So, is there any documentation around this, or some reliable source of truth?
After some poking around, it looks like the answer has something to do with the serializer used, in this case GenericJackson2JsonRedisSerializer. Not wanting to dig too much, I simply wrote a test validating that any (nil) values returned by Redis are converted to null:
@Autowired
ValueOperations<String, SomeObject> valueOperations

@Test
void multiGet() {
    // Given
    SomeObject someObject = SomeObject
            .builder()
            .contentId("key1")
            .build()
    valueOperations.set("key1", someObject)

    // When
    List<SomeObject> someObjects = valueOperations.multiGet(Arrays.asList("key1", "nonexisting"))

    // Then
    assertEquals(2, someObjects.size())
    assertEquals(someObject, someObjects.get(0))
    assertEquals(null, someObjects.get(1))
}
So, in Redis, this:
127.0.0.1:6379> MGET "\"key1\"" "\"nonexisting\""
1) "{\"#class\":\"some.package.SomeObject\",\"contentId\":\"key1\"}"
2) (nil)
will result in a List of {SomeObject, null}.
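Since missing keys surface as null entries in the returned list, callers may want to filter them out; a small sketch (plain Java, names assumed):

// Drop the null slots that multiGet returns for keys absent from Redis.
List<SomeObject> present = someObjects.stream()
        .filter(Objects::nonNull)
        .collect(Collectors.toList());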
I have seen answers in a couple of threads, but they didn't work out for me, and since my problem occurs only occasionally, I am asking this question in case anyone has any idea.
I am using Jedis version 2.8.0, Spring Data Redis version 1.7.5, and Redis server version 2.8.4 for our caching application.
I have multiple caches that are saved to and read from Redis, using the Spring Data Redis APIs to save and get data.
All saves and gets work fine, but I occasionally get the exception below:
org.springframework.data.redis.RedisConnectionFailureException: Cannot get Jedis connection; nested exception is redis.clients.jedis.exceptions.JedisConnectionException: Could not get a resource from the pool
org.springframework.data.redis.connection.jedis.JedisConnectionFactory.fetchJedisConnector(JedisConnectionFactory.java:198)
org.springframework.data.redis.connection.jedis.JedisConnectionFactory.getConnection(JedisConnectionFactory.java:345)
org.springframework.data.redis.core.RedisConnectionUtils.doGetConnection(RedisConnectionUtils.java:129)
org.springframework.data.redis.core.RedisConnectionUtils.getConnection(RedisConnectionUtils.java:92)
org.springframework.data.redis.core.RedisConnectionUtils.getConnection(RedisConnectionUtils.java:79)
org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:191)
org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:166)
org.springframework.data.redis.core.AbstractOperations.execute(AbstractOperations.java:88)
org.springframework.data.redis.core.DefaultHashOperations.get(DefaultHashOperations.java:49)
My redis configuration class:
@Configuration
public class RedisConfiguration {

    @Value("${redisCentralCachingURL}")
    private String redisHost;

    @Value("${redisCentralCachingPort}")
    private int redisPort;

    @Bean
    public StringRedisSerializer stringRedisSerializer() {
        return new StringRedisSerializer();
    }

    @Bean
    JedisConnectionFactory jedisConnectionFactory() {
        JedisConnectionFactory factory = new JedisConnectionFactory();
        factory.setHostName(redisHost);
        factory.setPort(redisPort);
        factory.setUsePool(true);
        return factory;
    }

    @Bean
    public RedisTemplate<String, Object> redisTemplate() {
        RedisTemplate<String, Object> redisTemplate = new RedisTemplate<>();
        redisTemplate.setConnectionFactory(jedisConnectionFactory());
        redisTemplate.setExposeConnection(true);
        // No serializer required, all serialization done during impl
        redisTemplate.setKeySerializer(stringRedisSerializer());
        // redisTemplate.setHashKeySerializer(stringRedisSerializer());
        redisTemplate.setHashValueSerializer(new GenericSnappyRedisSerializer());
        redisTemplate.afterPropertiesSet();
        return redisTemplate;
    }

    @Bean
    public RedisCacheManager cacheManager() {
        RedisCacheManager redisCacheManager = new RedisCacheManager(redisTemplate());
        redisCacheManager.setTransactionAware(true);
        redisCacheManager.setLoadRemoteCachesOnStartup(true);
        redisCacheManager.setUsePrefix(true);
        return redisCacheManager;
    }
}
Has anyone faced this issue, or does anyone have any idea why this might happen?
We were facing the same problem with RxJava: the application was running fine, but after some time no connections could be acquired from the pool anymore. After days of debugging we finally figured out what caused the problem:
redisTemplate.setEnableTransactionSupport(true)
somehow caused spring-data-redis to not release connections. We needed transaction support for MULTI/EXEC, but in the end we changed the implementation to get rid of this problem.
Still we don't know if this is a bug or wrong usage on our side.
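For reference, one way to run MULTI/EXEC without enabling transaction support on the template is a SessionCallback, which scopes the commands to a single connection that is released when the callback returns; a sketch with placeholder keys, not the implementation we ended up with:

List<Object> txResults = redisTemplate.execute(new SessionCallback<List<Object>>() {
    @Override
    @SuppressWarnings("unchecked")
    public <K, V> List<Object> execute(RedisOperations<K, V> operations) {
        operations.multi();
        operations.opsForValue().set((K) "someKey", (V) "someValue");
        return operations.exec(); // the connection goes back to the pool after the callback
    }
});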
I moved from RedisTemplate to plain Jedis and added the pool configuration below; I don't see the exception anymore:
jedisPoolConfig.setMaxIdle(30);
jedisPoolConfig.setMinIdle(10);
The same settings can be applied in the RedisTemplate setup through its connection factory:
jedisConnectionFactory.getPoolConfig().setMaxIdle(30);
jedisConnectionFactory.getPoolConfig().setMinIdle(10);
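A sketch of wiring those pool settings into the connection factory bean for the template-based setup (field names assumed; API as in the Spring Data Redis 1.x line used in the question):

@Bean
JedisConnectionFactory jedisConnectionFactory() {
    JedisPoolConfig poolConfig = new JedisPoolConfig();
    poolConfig.setMaxIdle(30); // keep up to 30 idle connections ready
    poolConfig.setMinIdle(10); // never shrink below 10 idle connections
    JedisConnectionFactory factory = new JedisConnectionFactory(poolConfig);
    factory.setHostName(redisHost);
    factory.setPort(redisPort);
    factory.setUsePool(true);
    return factory;
}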
The problem is with the Redis configuration.
For me, the culprit was the property below, which I was using for my local environment; when I commented it out, the issue was resolved:
#spring.redis.database=12
The correct properties are:
spring.redis.sentinel.master=mymaster
spring.redis.password=
spring.redis.sentinel.nodes=localhost:5000
I fixed mine by changing this in my application.yml file:
redis:
  password: ${REDIS_SECRET_KEY: null}
to this:
redis:
  password: ${REDIS_SECRET_KEY:}